The Data Engineer sits within the Technical Solutions umbrella, whose mission is to provide technical support to our customers throughout the lifecycle of Komodo's relationship with them.
The Data Engineer will play a crucial role, responsible for technical aspects of this work. You will interact and collaborate with SAs, SEs, Product Owners, and Customer Success to understand and articulate technical requirements to FDE team members.
Looking back on your first 12 months at Komodo Health, you will have:
- Obtained an understanding of Komodo's Healthcare Map, platform, and suite of products
- Delivered creative software solutions spanning design, development, and technical troubleshooting, thinking beyond routine or conventional approaches to build solutions and break down technical problems
- Developed secure, high-quality production code, and reviewed and debugged code written by others
- Identified opportunities to eliminate or automate the remediation of recurring issues to improve the overall operational stability of software applications and systems
You will accomplish these outcomes through the following responsibilities:
- Building trusting and influential relationships and engaging with stakeholders to gather technical requirements
- Collaborating with architects and developers to provide technical design guidance to align with strategy and applicable technical standards
- Working independently to evaluate and make strategic decisions that will address specific solution design needs
What you bring to Komodo Health (required):
- Minimum of 4 years' experience in professional software and data engineering on customer-facing products
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Experience designing and developing large-scale, multi-tiered, embedded, or distributed software applications, tools, systems, and services using Python, Spark, Hadoop, etc.
- Proficiency in developing complex data pipelines, ETLs, and workflows on cloud platforms optimized for high volumes of healthcare data
- Expertise in designing modern data architecture and knowledge of tools like Spark, Kafka, Databricks, and Snowflake
Additional skills and experience we'd prioritize (nice to have):
- Knowledge of computer infrastructure, such as networking, firewalls, load balancers, and auto-scaling
- Experience with AWS services, e.g., CloudFormation, Lambda, EC2, VPC, S3, DynamoDB, SNS, SQS, ECS, EKS, Route53, ELB/ALB
- Experience building and deploying applications as containers using Docker and Kubernetes
- Strong familiarity with DevOps practices such as CI/CD, application resiliency, and security
- Good understanding of AI/ML concepts and use cases
Industry certifications such as:
- AWS Certified Solutions Architect - Professional
- AWS Certified Developer - Associate
- Spark or Airflow certifications
What We Offer