Role Objective:
We are seeking highly skilled Data Engineers to join our innovations group. The selected candidates will lead projects to build accelerators, Points of View (PoVs), and Proofs of Concept (PoCs), and will conduct research on emerging technologies.
Key Responsibilities:
- Design, develop, and maintain end-to-end data analytics and technology solutions.
- Work with data warehouses and data lakes to ensure data accuracy, availability, and accessibility.
- Implement ELT/ETL processes using industry-standard tools and best practices.
- Collaborate with data scientists, analysts, and stakeholders to meet data needs.
- Develop and maintain comprehensive documentation for data processes, systems, and protocols.
- Stay updated with emerging technologies and industry trends to apply them in current and future projects.
Required Skills & Experience:
- 5+ years of experience in Data Engineering.
- 3+ years of strong programming experience with Python, PySpark, and SQL.
- Minimum of 2 years of experience with Snowflake; hands-on experience with SnowSQL, Snowpipe, Snowpipe Streaming, and Snowpark is mandatory.
- Knowledge of Machine Learning and Generative AI platforms.
- Clear understanding of data warehouse fundamentals, data modeling, and ELT/ETL processes.
- Experience with Apache Spark, Azure Databricks, and PySpark.
- Hands-on experience with cloud platforms such as Azure or AWS, and with Databricks.
- Strong verbal and written communication skills.
- Ability to learn and adapt quickly to new technologies and methodologies.
Preferred Skills:
- Experience with AI/ML lifecycle and related technologies.
- Familiarity with additional cloud platforms and data engineering tools.
Educational Qualifications:
- BE/B.Tech, MCA, M.Sc, or M.Tech from reputable institutions (preferably Tier 1/2, such as IITs, NITs, BITS Pilani, etc.).