About SGK
SGK is a global packaging and brand experience company. From idea to implementation, we deliver packaging solutions and brand experiences that give brands the freedom to speak louder, scale faster and grow stronger. We simplify marketing and amplify brands to deliver value. SGK is a Matthews International company.
Position Summary
SGK seeks an AI Data and MLOps Engineer to join our Production IT team and AI Lab. Reporting to the Director of Digital Content Strategic Solutions, A.I. & Cloud Governance, you'll work on innovative projects at the intersection of AI, data science, MLOps, and cloud technologies. You'll design, develop, and maintain the data pipelines and infrastructure that support AI-powered solutions in AWS cloud environments. Your role will involve implementing MLOps practices, automating model deployment, and setting up monitoring systems to ensure seamless integration and strong performance of AI models and applications.
Key Responsibilities
- Design and implement scalable data pipelines on AWS for ingesting, transforming, and storing large volumes of data
- Develop data warehouses, data lakes, and other storage solutions on AWS, ensuring data quality, security, and accessibility
- Collaborate with AI developers and data scientists to understand and meet data requirements for AI models
- Implement data governance and management best practices, ensuring compliance with privacy and security regulations
- Optimize data pipelines and infrastructure for performance, cost-effectiveness, and reliability
- Monitor and troubleshoot data pipelines, ensuring high availability and minimal downtime
- Design and implement CI/CD pipelines for automated model training, testing, and deployment
- Automate model deployment and scaling using containerization (Docker) and orchestration tools (Kubernetes)
- Develop and implement monitoring systems to track AI model performance and manage model drift
- Use infrastructure automation tools (Terraform, AWS CloudFormation) for cloud resource management
- Implement model versioning and experiment tracking (MLflow, DVC) for traceability and reproducibility
- Collaborate cross-functionally to integrate AI models into existing applications and services
- Stay updated with the latest advancements in data engineering, AI, and cloud technologies
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field
- Proven experience in data engineering, with a focus on AWS data pipelines and infrastructure
- Strong proficiency in Python, Java, or Scala; experience with Apache Spark and Hadoop
- Expertise in AWS data services (S3, Glue, Athena, Redshift, EMR)
- Knowledge of data warehousing, modeling, and management best practices
- Experience with data streaming technologies (Apache Kafka, AWS Kinesis)
- Familiarity with AI and machine learning concepts and frameworks (TensorFlow, PyTorch, scikit-learn)
- Experience with CI/CD tools, Docker, Kubernetes, and AWS CloudFormation
- Knowledge of model monitoring tools (Prometheus, Grafana) and MLOps frameworks (Kubeflow, MLflow, AWS SageMaker)
- Understanding of security best practices for data and AI model deployment
- Strong problem-solving, analytical, and communication skills