Software Engineering, Business Information Systems, Cloud Data Services, Data Visualization
Role and Responsibilities
Design, develop, and deliver ADF pipelines across various integration points for the Accounting & Reporting Stream.
Create and maintain scalable data pipelines with PySpark, and ETL workflows using Azure Databricks and Azure Data Factory (see the sketch after this list).
Data Modelling and Architecture: Build and optimize data models and structures to support analytics and business requirements.
Performance Optimization: Monitor, tune, and troubleshoot pipeline performance to ensure efficiency and reliability.
Collaboration: Partner with business analysts and stakeholders to understand data needs and deliver actionable insights.
Governance: Implement data governance practices, ensuring data quality, security, and compliance with regulations.
Documentation: Develop and maintain comprehensive documentation for data pipelines and architecture.
Testing: Test data pipelines and contribute to test automation.
Collaborate with cross-functional teams to understand data requirements and provide technical advice.
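For illustration only, the following is a minimal sketch of the kind of PySpark/Delta Lake ETL step this role involves on Azure Databricks. Table paths, table names, and column names are hypothetical; a real job would take these from ADF pipeline parameters or job configuration.

```python
# Minimal, illustrative PySpark ETL step (hypothetical paths, tables, and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("accounting-etl-sketch").getOrCreate()

# Extract: read raw postings landed by an upstream ADF copy activity (hypothetical location).
raw = spark.read.format("delta").load("/mnt/raw/accounting/postings")

# Transform: basic cleansing plus a daily aggregate per account.
daily = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("posting_date", F.to_date("posting_timestamp"))
       .groupBy("account_id", "posting_date")
       .agg(F.sum("amount").alias("total_amount"))
)

# Load: write a curated Delta table for downstream reporting.
(daily.write.format("delta")
      .mode("overwrite")
      .option("overwriteSchema", "true")
      .saveAsTable("curated.accounting_daily_totals"))
```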
Qualifications
Experience Level: Extensive experience (at least 4 years) with Azure Databricks, Unity Catalog, Azure Data Factory, Azure DevOps, Delta Lake, DevSecOps, and NFRs (Non-Functional Requirements); experience in the insurance domain is desirable.
Education Level: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
Azure Data Engineer Associate certification or similar, and experience with additional Azure services (e.g., Azure Synapse Analytics, Azure SQL Database).
Personal Attributes: Strong analytical skills, leadership capabilities, effective communication, and problem-solving ability.
Mandatory skills
Strong background in data engineering, with experience architecting complex data systems in large-scale environments.
Technical Skills: Proficiency in SQL, Azure Databricks, Azure Blob Storage, Azure Data Factory, programming languages such as Python or Scala, Logic Apps, and Azure Key Vault (a Key Vault access sketch follows this list).
Communication and Problem-Solving: Strong problem-solving skills and the ability to explain complex technical concepts to non-technical stakeholders.
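As a further illustration of the Python and Key Vault skills listed above, here is a minimal sketch of retrieving a secret from Azure Key Vault. The vault URL and secret name are hypothetical; on Databricks, a Key Vault-backed secret scope accessed via dbutils.secrets.get() is the more common pattern.

```python
# Illustrative only: fetch a storage secret from Azure Key Vault in Python
# (hypothetical vault URL and secret name).
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://example-vault.vault.azure.net/",
    credential=credential,
)

# Retrieve the secret value for use in a pipeline or notebook.
storage_key = client.get_secret("blob-storage-access-key").value
```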
Skill Set : Data Engineering, Azure Databricks, Azure Data Factory, Data Lake
Location : Bangalore, Karnataka, India
Date Posted: 30/09/2024
Job ID: 94472173