
Quest Global

GCP Data Engineer Lead

Job Description

Applicants can submit their CV directly by email to Sunil Chandran [Confidential Information].

Experience: 6-12 Years

Responsibilities:

  • Design, develop, and maintain scalable and efficient data pipelines to extract, transform, and load (ETL) data from various sources into data lakes and data warehouses.
  • Design and develop microservices.
  • Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
  • Implement data integration and data quality processes to ensure the accuracy and reliability of data for analytics and reporting.
  • Optimize data storage, processing, and querying performance for large-scale datasets.
  • Enable advanced analytics and machine learning capabilities on the data platform.
  • Continuously monitor and improve data quality and data governance practices.
  • Stay up to date with the latest data engineering trends, technologies, and best practices.

Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 8+ years of proven experience in data engineering, data warehousing, and ETL processes.
  • Proficiency in data engineering tools and technologies such as SQL, Python, Spark, Hadoop, DBT, Airflow, Apache Kafka, and Presto.
  • Solid experience with table formats such as Delta Lake or Apache Iceberg.
  • Design and development experience with batch and real-time streaming infrastructure and workloads.
  • Solid experience implementing data lineage, data quality, and data observability for big data workflows.
  • Solid experience designing and developing microservices and distributed architecture.
  • Hands-on experience with GCP ecosystem and data lakehouse architectures.
  • Strong experience implementing Databricks.
  • Strong experience with container technologies such as Docker and Kubernetes.
  • Strong understanding of data modeling, data architecture, and data governance principles.
  • Excellent experience with DataOps principles and test automation.
  • Familiarity with data processing and querying using distributed systems and NoSQL databases.
  • Ability to optimize and tune data processing and storage for performance and cost-efficiency.
  • Excellent problem-solving skills and the ability to work on complex data engineering challenges.
  • Strong communication and collaboration skills to work effectively with cross-functional teams.
  • Previous experience mentoring and guiding junior data engineers is a plus.
  • Relevant certifications in data engineering or cloud technologies are desirable.

Nice to have:

  • Experience working in a Data Mesh architecture.
  • Supply Chain domain experience.

Special requirements:

  1. Support US Data platform teams
  2. Monitor and manage platform infrastructure
  3. Assist with platform deployment and maintenance
  4. Support Insights, Control Tower, and other production needs
  5. Support data pipeline automation and efficient data integration

Skills required: Strong experience with platform support, operations, and cloud infrastructure management; strong understanding of Insights, Control Tower, and other products that the data platform supports; expertise in data engineering.

More Info

Industry: Other

Function: Technology

Job Type: Permanent Job

Date Posted: 11/07/2024

Job ID: 84127885
