We are looking for a Data Engineer (DevOps) who can perform the roles below.
Responsibilities:
- Manage and automate deployment pipelines using Docker, Kubernetes, and CI/CD tools such as Jenkins or GitLab CI; experience creating pipelines and working with cloud technologies such as Azure is highly valuable.
- Build and maintain robust systems using Docker/Kubernetes deployment, CI/CD automation, and cloud technologies.
- Take on integration work where needed: integrate systems and applications, design message queues with Kafka, and ensure seamless communication between components.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Deliver data engineering solutions using Airflow/ETL deployment, Kafka, Python, and SQL.
Requirements:
- Good understanding of and experience in Docker/Kubernetes deployment; Airflow/ETL deployment experience is preferred.
- Experience in CI/CD automation and creating pipelines (GitLab/Jenkins).
- Hands-on experience with Kafka.
- Experience working with cloud technologies (Azure preferred).
- Experience in Python and SQL.
- Strong problem-solving skills and the ability to work with large datasets.
- Good communication skills and the ability to work in a collaborative team environment.
- Knowledge of Agile methodologies.
- Experience working with internal and external stakeholders in an international environment.