Job Title: Data Engineer
Experience: 6 to 12 Years
Location: Chennai, India
Skills: SQL, Informatica, AWS, Snowflake
About Us
Smartwork IT Services, a trusted partner in talent acquisition and product development, is recruiting for this position. Smartwork IT Services is known for its innovation in recruitment and for developing products such as SWITS ATS (Applicant Tracking System) and SWITS HRMS (Human Resource Management System). Our goal is to deliver exceptional value through comprehensive solutions that empower our clients with top-tier talent and efficient HR tools.
Job Description
Our client is looking for an experienced Data Engineer to join their team in Chennai. This role requires strong skills in SQL, Informatica, AWS, and Snowflake to support data-driven insights and to ensure the reliability, security, and quality of data systems.
Primary Skills
- SQL:
  - Advanced proficiency in SQL for data manipulation and querying.
  - Experience with complex SQL queries, stored procedures, and performance tuning.
- Informatica:
  - Hands-on experience with Informatica PowerCenter.
  - Proficiency in designing, developing, and deploying ETL processes using Informatica.
  - Knowledge of Informatica Cloud Data Integration (CDI) is a plus.
- AWS:
  - Strong understanding of AWS services such as S3, Redshift, RDS, Lambda, and Glue.
  - Experience in setting up and managing data pipelines on AWS.
  - Familiarity with AWS security best practices.
- Snowflake:
  - Experience with Snowflake data warehouse solutions.
  - Proficiency in Snowflake SQL and performance optimization.
  - Knowledge of data loading and unloading using Snowflake utilities.
Secondary Skills
- Data Warehousing: Understanding of data warehousing concepts and methodologies.
- Programming: Basic to intermediate knowledge of Python or Java for data processing and automation tasks.
- Data Visualization: Familiarity with tools like Tableau, Power BI, or QuickSight.
- Big Data Technologies: Knowledge of technologies such as Hadoop, Spark, or Kafka.
- DevOps: Experience with CI/CD pipelines, version control systems like Git, and containerization tools like Docker and Kubernetes.
Responsibilities
- Design, develop, and maintain ETL processes using Informatica.
- Build and manage data pipelines on AWS and Snowflake.
- Write complex SQL queries to extract, transform, and load data efficiently.
- Optimize and tune SQL queries for improved performance.
- Collaborate with data analysts and business stakeholders to understand data requirements.
- Ensure data quality and integrity across various sources.
- Implement data security best practices.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering or related roles.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.