We are looking for a highly skilled Senior Data Engineer with expertise in Full Stack Development and a deep understanding of Google Cloud Platform (GCP) to join our team. In this role, you will design, implement, and optimise data pipelines, leverage GCP services, and contribute to full stack application development. Your work will be crucial in building scalable, data-driven solutions that empower the organisation to make informed decisions.
Key Responsibilities:
- Data Engineering on GCP:
- Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Implement ETL processes that ensure the accurate and timely movement of data across various systems within the GCP ecosystem.
- Optimise data architectures for performance, cost efficiency, and reliability, leveraging GCP's suite of tools.
- Ensure data security, governance, and compliance within the GCP environment.
- Implement CI/CD pipelines using GCP tools such as Cloud Build, and manage containerised applications.
- Full Stack Development:
- Develop and deploy full stack applications on GCP data services, ensuring robust, user-friendly interfaces and backend systems; implement APIs and microservices that integrate seamlessly where required.
- Contribute to data visualisations that provide actionable insights within the application.
- Maintain and optimise existing full stack solutions, ensuring they meet performance and scalability requirements.
- Cross-functional Collaboration:
- Work closely with data scientists and other stakeholders to translate business requirements into technical solutions.
- Lead the data engineering aspects of cross-functional projects, ensuring data solutions are integrated seamlessly with other applications.
- Mentor and guide junior engineers in both data engineering and cloud practices, fostering a culture of continuous learning.
Qualifications:
- Education:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
- Experience:
- 3-4 years of experience in data engineering, with a strong focus on GCP services like BigQuery, Dataflow, and Pub/Sub.
- Proven experience in full stack development, with proficiency in both front-end and back-end technologies.
- Hands-on experience with SQL, NoSQL databases, and cloud-native data warehousing.
- Proficiency in programming languages such as Python.
- Extensive experience with cloud infrastructure management, particularly within the GCP ecosystem.
- Skills:
- Deep understanding of GCP data services, architecture, and best practices.
- Strong knowledge of RESTful API design, microservices architecture, and containerisation.
- Proficiency in front-end frameworks (e.g., React, Angular) and back-end frameworks (e.g., Node.js, Flask).
- Experience with CI/CD pipelines, Kubernetes, and cloud-native application development.
- Excellent problem-solving skills and the ability to optimise complex cloud-based systems.
- Strong communication and collaboration skills, with the ability to lead projects and work effectively in a team environment.
Certifications:
- GCP certifications (e.g., Associate Cloud Engineer or Professional Data Engineer).