
Ford

Software Engineer


Job Description

We are seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading and implementing GCP data projects, preferably including implementation of a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for the Materials Management platform, which will exchange data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

  • Requires a bachelor's or foreign equivalent degree in computer science, information technology, or a technology-related field
  • 3 to 5 years of professional experience in:
    • Data engineering, data product development, and software product launches
  • 3 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
    • At least three of the following languages: Java, Python, Spark, Scala, SQL, with performance tuning experience
    • Data warehouses like Google BigQuery
    • Workflow orchestration tools like Airflow
    • Relational database management systems like MySQL, PostgreSQL, and SQL Server
    • Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
    • Microservices architecture to deliver large-scale real-time data processing applications
    • REST APIs for compute, storage, operations, and security
    • DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, and Docker
    • Project management tools like Atlassian JIRA
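As an illustration of the kind of batch pipeline this role builds, here is a minimal Python sketch of an extract-transform-load flow. The part records, field names, and the stdlib sqlite3 database (standing in for a warehouse like BigQuery) are all hypothetical, not taken from the posting:

```python
import sqlite3

# Hypothetical raw records from a legacy materials system (assumed shape).
RAW_PARTS = [
    {"part_no": " A-100 ", "qty": "25", "plant": "chennai"},
    {"part_no": "B-200", "qty": "7", "plant": "Dearborn"},
]

def extract():
    """Extract: read raw rows (an in-memory list stands in for a real source)."""
    return list(RAW_PARTS)

def transform(rows):
    """Transform: trim whitespace, cast types, normalize casing."""
    return [
        (r["part_no"].strip(), int(r["qty"]), r["plant"].title())
        for r in rows
    ]

def load(rows, conn):
    """Load: write cleaned rows to a table (sqlite3 stands in for a warehouse)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS parts (part_no TEXT, qty INTEGER, plant TEXT)"
    )
    conn.executemany("INSERT INTO parts VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT SUM(qty) FROM parts").fetchone()[0])  # 32
```

In a production GCP setting, the same three stages would typically map onto an orchestrated Dataflow or Dataproc job scheduled by Airflow, with BigQuery as the load target.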

Even better, you may have:

  • Experience in IDOC processing, APIs and SAP data migration projects.
  • Experience working in SAP S4 Hana environment
You may not check every box, or your experience may look a little different from what we've outlined, but if you think you can bring value to Ford Motor Company, we encourage you to apply!

Responsibilities

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
  • Build ETL pipelines to ingest data from heterogeneous sources into our system.
  • Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
  • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
  • Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
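The streaming side of the duties above can be sketched in the same spirit: a handler that parses a Pub/Sub-style JSON payload, drops malformed messages instead of crashing the pipeline, and emits a normalized record. The message schema and field names (material_id, quantity, source_system) are assumptions for illustration only:

```python
import json

def handle_message(payload: bytes):
    """Parse one Pub/Sub-style JSON message; return a normalized record or None.

    Field names (material_id, quantity, source_system) are hypothetical.
    """
    try:
        msg = json.loads(payload.decode("utf-8"))
    except (UnicodeDecodeError, json.JSONDecodeError):
        return None  # drop malformed payloads rather than crash the pipeline
    if "material_id" not in msg or "quantity" not in msg:
        return None  # drop messages missing required fields
    return {
        "material_id": str(msg["material_id"]).upper(),
        "quantity": int(msg["quantity"]),
        "source_system": msg.get("source_system", "unknown"),
    }

print(handle_message(b'{"material_id": "m-42", "quantity": "3"}'))
# {'material_id': 'M-42', 'quantity': 3, 'source_system': 'unknown'}
print(handle_message(b"not json"))  # None
```

In practice this logic would run inside a Pub/Sub subscriber or a Dataflow transform; keeping the per-message function pure, as here, makes it easy to unit-test and to tune for throughput.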

More Info

Industry: Other

Function: Technology

Job Type: Permanent Job


Date Posted: 07/08/2024

Job ID: 87787255



Last Updated: 09-08-2024 10:09:18 AM