Job Description
The IT Data Hub is a central data warehouse hosted on a cloud platform that ingests IT data from varied sources and curates it for end users and downstream applications across different business use cases.
This position is part of the IT Data Hub team, working on data pipelines to ensure that all Ford and Global Data Insights & Analytics (GDI&A) processes, patterns, and data protection requirements are met for every data request, in line with customer delivery needs.
Responsibilities
- Manage Extract, Transform, Load (ETL) processes: use programming languages and tools for data ingestion, configure pipelines, apply transformations, integrate data, and deliver it securely to customers.
- Work with the GDI&A team to understand and implement landing and curation patterns for building data products in the IT Data Hub.
- Work closely with GDI&A data factory teams to align with GDI&A standards and guidelines for data product development.
- Work with data architects to build data products that meet business use case and performance requirements.
- Create proof-of-concept (PoC) processes in Dev/QA for ingesting and curating data from new data sources.
- Maintain a feedback loop with Data Stewards on data issues, standards, and fitness for use. (Data stewardship is a subset of data engineering that includes responsibilities such as enabling data discovery for end users, protecting data, and restricting access to it.)
- Work with Data Stewards to ensure consumable views are accessible to end users and downstream applications.
- Communicate with end users to set expectations and ensure alignment on data accuracy, completeness, timeliness, and consistency.
- Participate actively in meetings with business and other IT stakeholders.
- Work with the Anchor to develop solutions for Features as required.
- Adhere to relevant Ford IT processes, and follow Agile software processes, culture, best practices, and techniques.
- Work on data product support requests and incidents, managing them through closure per SLA requirements.
- Contribute to continuous improvement and add value to the team.
Qualifications
- 4+ years of progressively responsible IT experience.
- Experience with relational SQL databases.
- Experience with a cloud data warehouse, preferably Google Cloud Platform (GCP), or with an on-premises big data system such as Hadoop (Apache, Hortonworks, Cloudera, MapR).
- AWS or Azure experience may also be considered, given an overall understanding of the data engineering discipline on cloud platforms.
- Awareness of and hands-on compliance with data privacy (CCPA, GDPR, etc.), security, legal, and contractual guidelines.
- Strong communication skills for working with business partners, the user community, and key stakeholders.
- Ability to build solid rapport with, and gain the confidence of, business partners.
- Ability to quickly and accurately scope projects, develop solid estimates, and interpret ambiguous data.
- (Preferred) Experience working in L2/L3 production support and maintenance.
- (Preferred) At least 6 months of experience working on an Agile team (Scrum or Kanban).
- (Preferred) Experience working in a global stakeholder environment.