The AWS/Azure/GCP Architect at Koantek builds secure, highly scalable big data solutions that achieve tangible, data-driven outcomes while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS/Azure/GCP architecture. This role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.
- Expert-level knowledge of data frameworks, data lakes and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Expert-level hands-on coding experience in Spark with Scala, Python, or PySpark
- In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib
- Experience with IoT, event-driven, and microservices architectures in the cloud
- Experience with private and public cloud architectures, their pros/cons, and migration considerations
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- 10+ years of consulting experience, with a minimum of 4+ years of experience in data engineering, data platforms, and analytics
- Delivered projects with hands-on development experience on Databricks
- Knowledge of at least one cloud platform (AWS, Azure, or GCP)
- Deep experience with distributed computing on Spark, including knowledge of Spark runtime internals and Unity Catalog
- Familiarity with CI/CD for production deployments
- Familiarity with optimization for performance and scalability
- Completed data engineering professional certification and required classes