Design, implement, and maintain scalable cloud-based infrastructure solutions.
Perform bug fixes and configuration changes to maintain the stability and reliability of the data platform.
Programmatically automate complex manual workloads and provisioning.
Develop and implement automation scripts for deploying and managing cloud resources for distribution pipelines (a minimal sketch follows this list).
Monitor and optimize the performance of cloud-based infrastructure to ensure
maximum uptime and availability.
Collaborate with other engineers and stakeholders to identify and resolve technical
issues.
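As an illustration of the automation-scripting responsibility above, a minimal Python sketch using the AWS SDK (boto3) is shown below. The bucket name, region, and tag are hypothetical assumptions for illustration only, not details taken from this posting.

    # Minimal illustrative sketch only; bucket name, region, and tag are assumptions.
    import boto3

    def provision_pipeline_bucket(bucket_name: str, region: str = "ap-south-1") -> None:
        """Create and tag an S3 bucket for a distribution pipeline if it does not exist."""
        s3 = boto3.client("s3", region_name=region)
        existing = {b["Name"] for b in s3.list_buckets()["Buckets"]}
        if bucket_name in existing:
            return  # idempotent: nothing to do on re-runs
        s3.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )
        s3.put_bucket_tagging(
            Bucket=bucket_name,
            Tagging={"TagSet": [{"Key": "team", "Value": "data-platform"}]},
        )

    if __name__ == "__main__":
        provision_pipeline_bucket("example-distribution-raw")  # hypothetical name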
General prior experience with software development
Preferred skills and qualifications:
Bachelor's or Master's degree in Computer Science or Engineering
5+ years of professional experience in designing, building, and scaling services,
especially using AWS cloud infrastructure
3+ years of experience with data engineering technologies (see the sketch after this list):
- Python & PySpark development
- Apache Spark or Databricks for large-scale data processing
- AWS Glue Data Catalog
- Trino (Presto) or AWS Athena for distributed SQL queries
- Apache Iceberg or Apache Hive for table management
- Apache Airflow for workflow orchestration, plus ArgoCD for GitOps-style deployment
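To make the stack above concrete, here is a brief, hypothetical PySpark sketch; the database and table names are assumptions, and it presumes a Spark session already configured with a Glue-backed metastore and an Iceberg catalog.

    # Illustrative sketch only; database and table names are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

    # Read a table registered in the metastore (e.g. the AWS Glue Data Catalog),
    # aggregate daily counts, and append the result to an Iceberg table.
    events = spark.table("analytics.raw_events")  # hypothetical source table
    daily = (
        events
        .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
        .agg(F.count("*").alias("event_count"))
    )
    daily.writeTo("analytics.daily_event_counts").append()  # DataFrameWriterV2 (Iceberg)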
3+ years of experience with Kubernetes, Docker, and various strategies for automated cluster deployment
3+ years of experience building CI/CD pipelines in GitHub or similar technologies
3+ years of experience with observability toolsets (Datadog, Prometheus, Grafana, etc.)
Experience developing a configurable data platform that enables ETL solutions
Excellent communication and collaboration skills
Ability to work independently
Location: Bangalore
Experience: 5-10yrs
Date Posted: 29/06/2024
Job ID: 83466471