Introduction
We are looking for candidates with 5+ years of experience for this role.
Job Description
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet business requirements.
Optimize data delivery and redesign infrastructure for greater scalability.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Kafka and Azure technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer
acquisition, operational efficiency, and other key business performance metrics.
Work with internal and external stakeholders to assist with data-related technical issues and
support data infrastructure needs.
Work closely with the business intelligence team, understand their requirements through APIs, check the backend Postgres and Snowflake data warehouses, and update the facts and dimensions accordingly (see the sketch below). A strong understanding of data warehouse concepts is needed.
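As an illustration only, the following minimal Python sketch shows one step of this kind of pipeline: consuming change events from a Kafka topic and upserting them into a Postgres dimension table. The topic, table, and column names and the connection string are hypothetical placeholders, not details from this posting.

    import json

    import psycopg2
    from kafka import KafkaConsumer

    # Hypothetical topic name; the real sources depend on the pipeline design.
    consumer = KafkaConsumer(
        "customer-updates",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    conn = psycopg2.connect("dbname=warehouse user=etl")  # placeholder DSN

    # Upsert keeps the dimension row in sync with the latest event
    # (hypothetical dim_customer table and columns).
    UPSERT_SQL = """
        INSERT INTO dim_customer (customer_id, name, segment)
        VALUES (%s, %s, %s)
        ON CONFLICT (customer_id)
        DO UPDATE SET name = EXCLUDED.name, segment = EXCLUDED.segment;
    """

    for message in consumer:
        record = message.value
        with conn, conn.cursor() as cur:
            cur.execute(UPSERT_SQL, (record["id"], record["name"], record["segment"]))

The same upsert pattern applies when the dimension lives in Snowflake rather than Postgres, typically via a MERGE statement.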
Primary Skills:
Databases, data warehouses, Kafka, AWS Data Lake, Postgres, Snowflake, APIs, data ingestion, Python, distributed systems, Apache Airflow.
MDM/PIM: Informatica, Databricks/Snowflake, Ataccama, Syndigo (PIM).
Secondary Skills:
MongoDB, OpenAPI, FastAPI, business intelligence.
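Since the secondary skills pair FastAPI with OpenAPI and business intelligence, below is a minimal FastAPI sketch of the kind of endpoint such work might involve; the route, model fields, and behavior are assumptions for illustration, not part of this posting. FastAPI generates an OpenAPI schema for such endpoints automatically.

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class DimensionUpdate(BaseModel):
        # Hypothetical payload; real fields depend on the BI team's requirements.
        table: str
        key: str
        attributes: dict

    @app.post("/dimension-updates")
    def receive_update(update: DimensionUpdate) -> dict:
        # In practice this would enqueue the change for the warehouse load job.
        return {"status": "accepted", "table": update.table}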
Role: Software Engineer/Programmer, Datawarehousing Consultants, Other Software/Hardware/EDP
Function: IT
Job Type: Permanent Job
Date Posted: 04/06/2024
Job ID: 80791367