- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to identify gaps in existing pipelines and resolve them
- Work with business stakeholders to understand reporting-layer needs and develop data models that fulfill them
- Help new joiners on the team resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines in schedulers (UC4 and Airflow)
Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 5+ years of total IT experience, including 3+ years on data warehouse/ETL projects
- At least one end-to-end implementation of a Snowflake cloud data warehouse and two end-to-end on-premises data warehouse implementations
- Expertise in Snowflake: data modelling, ELT using SQL, implementing stored procedures, and standard DWH and ETL concepts
- Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse
- Deep understanding of star and snowflake dimensional modelling
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Snowflake certification (SnowPro Core) desirable
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in an Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
- Self-motivated, collaborative, innovative, eager to learn, and hands-on