Job Title: Senior Snowflake Data Engineer
Location: HITEC City, Hyderabad, Telangana, India
Job Type: Full-time
Timings: UK Shift or US EST
Summary:
We are seeking an elite candidate with a strong understanding of data warehouse concepts and exceptional Snowflake skills, including expertise in ETL and ELT processes, specifically within the insurance industry. This role is designed for a highly motivated, self-disciplined professional with deep technical expertise and a strategic mindset who is willing to take direction from the USA team. The role will focus on developing, optimizing, and managing data solutions on Snowflake's cloud data platform to support analytics, reporting, and business intelligence initiatives. The ideal candidate will have a strong understanding of insurance data and processes, along with a background in data warehousing and ETL/ELT practices.
Key Responsibilities:
Data Pipeline Development & Management:
Design, build, and maintain efficient and scalable ETL/ELT data pipelines to integrate, transform, and load data into Snowflake from a variety of sources (internal and external).
Develop and optimize high-performance queries and ensure proper data processing using Snowflake's native features (e.g., clustering, materialized views, streams).
Automate data workflows using tools such as Apache Airflow, dbt, or custom orchestration frameworks.
Data Architecture & Modeling:
Collaborate with data architects and business teams to design and implement efficient data models within Snowflake, ensuring scalability and performance.
Apply best practices for data warehousing, including star/snowflake schemas, and help define the organization's data structure.
Ensure data quality, integrity, and compliance with governance standards.
Performance Optimization:
Continuously monitor data pipelines and workflows for efficiency, troubleshooting and resolving issues as needed.
Implement and optimize data partitioning, clustering, and query performance tuning to enhance processing speeds and reduce costs.
Collaboration & Mentorship:
Work closely with data scientists, analysts, and business intelligence teams to understand data needs and design pipelines that provide clean, reliable, and actionable data.
Mentor junior and mid-level data engineers, promoting best practices, code reviews, and knowledge sharing.
Cloud Infrastructure & Security:
Leverage cloud technologies (AWS, Azure, GCP) to integrate Snowflake with other systems and ensure the smooth flow of data.
Implement and enforce data security practices including encryption, access control, and monitoring to ensure compliance with organizational and regulatory standards.
Continuous Improvement:
Stay up to date with the latest developments in Snowflake and related cloud technologies, implementing improvements and optimizations where applicable.
Proactively identify bottlenecks, inefficiencies, and areas for improvement in data processes and provide solutions.
Qualifications:
Minimum of 10 years of proven experience building ETL/ELT pipelines and integrating data from various sources into data warehouses.
Minimum of 5 years of experience in data engineering, with at least 2-3 years of direct experience working with Snowflake.
Extensive experience in the insurance industry with knowledge of insurance data, processes, and regulations.
Strong background in working with cloud data platforms (AWS, Azure, or GCP) and integrating Snowflake with cloud storage (e.g., S3, Blob Storage, or Google Cloud Storage).
Technical Skills:
Strong proficiency in SQL, including experience with complex queries, stored procedures, and window functions.
Expertise in data modeling, data warehousing, and best practices for schema design (star, snowflake schemas, etc.).
Experience with data pipeline orchestration tools such as Apache Airflow, dbt, or similar.
Knowledge of Python, Java, or Scala for writing custom transformation logic and automation scripts.
Familiarity with data governance, security, and compliance frameworks in cloud environments.
Snowflake-Specific Skills:
Expertise in Snowflake's architecture, data storage, query optimization, and scaling strategies.
Experience with Snowflake features such as streams, tasks, materialized views, time travel, and zero-copy cloning.
Soft Skills:
Excellent communication skills with the ability to collaborate effectively with cross-functional teams.
Strong problem-solving skills with a focus on data quality and performance.
Ability to mentor and lead junior engineers while maintaining a focus on teamwork and shared goals.
Preferred Qualifications:
Experience with data integration and orchestration tools (e.g., Apache Airflow).
Familiarity with cloud data platforms and services (e.g., AWS, Azure, or GCP).
Insurance-related certifications (e.g., CPCU, ARe, ARM).
Why Join Us:
This elite role provides a unique opportunity to work at the intersection of data engineering, cloud technology, and insurance. You will play a key role in transforming and analyzing data for impactful business insights within the insurance industry, working alongside other top professionals in a dynamic, forward-thinking environment. If you are passionate about data and have a strong understanding of the insurance industry, we would love to have you on our team.
Industry: Other
Job Type: Permanent Job
Date Posted: 07/11/2024
Job ID: 99544529