- Work with the Product Owner and functional SMEs to design and implement modern, scalable data solutions using a range of new and emerging technologies on the Google Cloud Platform
- Build and deliver data solutions using GCP products and offerings
- Ensure that the data architecture is scalable and maintainable.
- Help the team upskill in GCP data technologies
- Gather requirements and business process knowledge to transform data in a way that is geared towards the needs of end users
- Investigate data to identify potential issues within ETL pipelines, notify end users, and propose appropriate solutions
What you bring to the table:
- 3-5 years of experience with strong proficiency in SQL, Python, and BigQuery
- Experience with GCP data architecture and data warehousing
- Demonstrated expertise in developing, testing, and implementing ETL pipelines
- Exposure to machine learning techniques and data modeling
- Strong communication skills with the ability to interact effectively with technical and non-technical stakeholders
- Ability to design and implement highly scalable, secure, and fault-tolerant systems
- Experience with distributed systems and cloud-based technologies
- Familiarity with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub
- Experience with version control systems (e.g., Git)
- Excellent analytical and problem-solving skills
Good to Have: Knowledge of dbt and Dataform; experience working in Agile environments.
Perks:
- Health insurance benefits
- Paid time off
- Training & Career Development: Professional development plans, leadership workshops, mentorship programs, and more!
- Free Snacks, hot beverages, and catered lunches on Fridays
- Culture built on our core values: Drive, Innovation, Respect, and Agility
- Night Shift Premium
- Provident Fund