Ford Motor Company

Senior Data Engineer

Job Description

At Ford Motor Credit Company, we are modernizing our enterprise Core Platforms and integrating a new lending platform into the Google Cloud Data Platform (Data Factory) to improve Data, Analytics, and AI/ML capabilities and to enhance customer experience, regulatory compliance, and operational efficiency, all enabled by Google Cloud.

This role is for a GCP Data Engineer who can integrate core data from the new North America lending platforms into Data Factory (GCP BigQuery) and build upon the existing analytical data, including merging historical data from legacy platforms with data ingested from the new platforms. You will analyze and manipulate large datasets supporting the enterprise, activating data assets to enable platforms and analytics on GCP. You will be responsible for deep-dive analysis of current-state Receivables and Originations data in the data warehouse, for impact analysis related to the Ford Credit North America modernization, and for providing solutions for implementation. You will also design the transformation and modernization on GCP, landing data from source applications, integrating it into subject areas, and building data marts and products in GCP.
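
As an illustration of the consolidation work described above, the sketch below merges newly ingested platform records into a consolidated BigQuery table that also holds migrated legacy history, using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders, not an actual Ford Credit schema.

    # Illustrative sketch only: merge new-platform records into a consolidated
    # BigQuery table that also contains migrated legacy history.
    # Project, dataset, table, and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

    merge_sql = """
    MERGE `my-gcp-project.receivables.consolidated` AS target
    USING `my-gcp-project.receivables.new_platform_daily` AS source
    ON target.account_id = source.account_id
    WHEN MATCHED THEN
      UPDATE SET target.balance = source.balance,
                 target.updated_at = source.updated_at
    WHEN NOT MATCHED THEN
      INSERT (account_id, balance, updated_at)
      VALUES (source.account_id, source.balance, source.updated_at)
    """

    # Run the MERGE as a standard BigQuery query job and wait for it to finish.
    job = client.query(merge_sql)
    job.result()
    print(f"Job {job.job_id} finished with state {job.state}")

The same pattern extends to incremental loads: keying the MERGE on a stable business identifier keeps re-runs of the job idempotent.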

Experience with large-scale solutions and with operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate the ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Responsibilities

  • Design and build production data engineering solutions that deliver reusable patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Dataform, Astronomer, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Artifact Registry, GCP APIs, Cloud Build, and App Engine, as well as real-time data streaming platforms such as Apache Kafka, GCP Pub/Sub, and Qlik Replicate (a minimal streaming sketch of this pattern appears after this list).
  • Collaborate with stakeholders and cross-functional teams to gather and define data requirements, ensuring alignment with business objectives.
  • Design and implement batch, real-time streaming, scalable, and fault-tolerant solutions for data ingestion, processing, and storage.
  • Perform necessary data mapping, impact analysis for changes, root cause analysis, data lineage activities, and document information flows.
  • Develop and maintain documentation for data engineering processes, standards, and best practices, ensuring knowledge transfer and ease of system maintenance.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures and provide production support by addressing production issues as per SLAs.
  • Implement an enterprise data governance model and actively promote data protection, sharing, reuse, quality, and standards to ensure the integrity and confidentiality of data.
  • Work in an agile product team to deliver code frequently using Test Driven Development (TDD), continuous integration, and continuous deployment (CI/CD).
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
  • Continuously enhance your FMCC domain knowledge, stay current on the latest data engineering practices, and contribute to the company's technical direction while maintaining a customer-centric approach.
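
The first responsibility above combines Pub/Sub, Dataflow, and BigQuery; a minimal Apache Beam streaming sketch of that pattern follows. It is an assumption-laden illustration, not the team's actual pipeline: the topic, table, and schema are placeholders, and the Dataflow runner and project would be supplied as pipeline options at launch time.

    # Minimal Apache Beam streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
    # Topic, table, and schema are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # e.g. pass --runner=DataflowRunner --project=... --region=... on the command line
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/loan-events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:lending.loan_events",
                schema="account_id:STRING,amount:NUMERIC,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )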

Qualifications


Essential:

  • 5+ years of experience designing and implementing data warehouses and ETL processes, delivering high-quality data solutions.
  • 5+ years of complex SQL development experience
  • 2+ years of experience with programming languages and frameworks such as Python, Java, or Apache Beam.
  • Experienced cloud engineer with 3+ years of GCP expertise, specializing in taking cloud infrastructure and applications to production-scale solutions.
  • In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to batch and real-time data processing, leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage services such as Cloud Storage (a minimal Cloud Composer/Airflow DAG sketch appears after this list).
  • Experience with DevOps tools such as Tekton, GitHub, Terraform, and Docker.
  • Expert in designing, optimizing, and troubleshooting complex data pipelines.
  • Experience developing microservice architectures on a container orchestration framework.
  • Experience in designing pipelines and architectures for data processing.
  • Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques.
  • Self-directed; works independently with minimal supervision and adapts to ambiguous environments.
  • Evidence of a proactive problem-solving mindset and willingness to take the initiative.
  • Strong prioritization, collaboration & coordination skills, and ability to simplify and communicate complex ideas with cross-functional teams and all levels of management.
  • Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
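
Since Cloud Composer/Airflow appears repeatedly above, the sketch below shows a minimal Airflow DAG scheduling one daily BigQuery transformation via BigQueryInsertJobOperator. The DAG id, schedule, and SQL are hypothetical placeholders, shown only to indicate the expected level of familiarity.

    # Minimal Cloud Composer / Airflow DAG sketch: one daily BigQuery task.
    # DAG id, schedule, and SQL are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="daily_receivables_refresh",
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        refresh_mart = BigQueryInsertJobOperator(
            task_id="refresh_receivables_mart",
            configuration={
                "query": {
                    # Placeholder stored procedure; any DDL/DML statement works here.
                    "query": "CALL `my-gcp-project.receivables.refresh_mart`()",
                    "useLegacySql": False,
                }
            },
        )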

Desired:


  • Professional Certification in GCP (e.g., Professional Data Engineer).
  • Master's degree in computer science, software engineering, information systems, data engineering, or a related field.
  • Data engineering or development experience gained in a regulated financial environment.
  • Experience with Teradata to GCP migrations is a plus.
  • Experience coaching and mentoring data engineers.
  • Experience with project management tools such as Atlassian Jira.
  • Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
  • Experience with data security, governance, and compliance best practices in the cloud.

More Info

Industry: Other

Function: Technology

Job Type: Permanent Job


Date Posted: 20/10/2024

Job ID: 97066745
