Senior Data Engineer (candidates with less than 10 years of experience need not apply)
Remote Contract
Experience: 10+ years
Working time: USA EST time zone (4:00 PM to 1:00 AM IST)
Kindly send your resume to [Confidential Information]
Must-have: Mortgage domain, Matillion, SQL, Python, Snowflake, DBT, ETL Pipeline, and Managerial skills.
Good to have: Power BI, AWS, Azure DevOps, and Control-M
Job Description:
We are seeking a highly skilled and motivated Data Engineer to join our team. In this role, you will play a crucial part in designing, developing, and maintaining our data infrastructure to enable efficient data processing, analysis, and reporting. Your expertise in Snowflake, SQL, and Python will be essential in building and optimizing data pipelines to ensure the availability, scalability, and reliability of our data ecosystem.
Responsibilities:
Collaborate with cross-functional teams to understand data requirements and translate them into effective data solutions.
Design and implement robust, scalable data pipelines using Snowflake, SQL, and Python to extract, transform, and load (ETL) data from various sources.
Optimize data workflows for performance, reliability, and maintainability.
Develop and maintain data models, ensuring data accuracy, consistency, and integrity.
Work with large datasets, both structured and unstructured, to derive meaningful insights.
Identify and address data quality issues through data profiling, validation, and cleansing techniques.
Monitor and troubleshoot data pipelines to ensure data availability and accuracy.
Continuously improve data processes and automation to streamline data operations.
Collaborate with data scientists and analysts to support their data needs and facilitate data-driven decision-making.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Strong experience with Snowflake for data warehousing and analytics.
Proficiency in SQL for querying and manipulating data.
Extensive programming experience in Python for building data pipelines and automating data processes.
Solid understanding of data modeling concepts and database design principles.
Familiarity with data integration, transformation, and ETL techniques.
Experience with version control systems (e.g., Git) and CI/CD pipelines is a plus.
Excellent problem-solving skills and ability to work in a dynamic, collaborative environment.
Strong communication skills to collaborate effectively with technical and non-technical stakeholders.
Basic domain experience is a plus, particularly familiarity with domain-specific terminology and the ability to develop test plans tailored to the assets we manage.
Preferred Skills:
Experience with AWS services such as EC2 and S3.
Experience using Bash and Python.
Proficiency with AWS CLI.
Experience with batch job scheduling tools such as AutoSys.
Date Posted: 19/11/2024
Job ID: 100834017