You'll build and maintain systems for efficient data collection, storage, and processing, ensuring data pipelines are robust, scalable, and ready for seamless integration and analysis. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.
What You'll Do
- Design and implement complex, scalable enterprise data processing and BI reporting solutions.
- Build and optimize ETL pipelines for data integration and enhance data warehouse systems through architectural reviews.
- Ensure strict data compliance, security, and cost optimization.
- Re-architect data solutions for scalability, reliability, and resilience.
- Manage data schemas and data flows to ensure compliance, integrity, and security.
- Deliver end-to-end data solutions across multiple infrastructures and applications.
- Build strong partnerships with other teams to create valuable solutions.
Qualifications
- 1.5 to 2.5 years of experience in a data engineering role, with an engineering degree.
- Proficient in ETL/ELT pipeline implementation.
- Extensive experience handling large datasets, maintaining data quality, and automating ETL pipelines.
- Knowledge of data warehouses (Redshift, Snowflake, Databricks, Cloudera).
- Proficient in Python scripting, PySpark, and Spark.
- Experience with data ingestion, storage, and consumption.
- Skilled in SQL and data schema management.
- Domain knowledge of the pharma landscape is a must-have.
Skills: SQL, Python, ETL, cloud, AWS, Azure, data warehouse, Tableau