Job Title: AI Architect (Senior)
Location: India (Noida preferred; Chennai or Bangalore optional)
Job Type: Full-time
Band: E3/4
Job Description
We are seeking an experienced Platform Architect with a strong background in AI/ML, LLMs, SLMs, Gen-AI, and data engineering to support the design, development, and co-creation of cutting-edge AI solutions, with a strong focus on hands-on execution and rapid prototyping. This role requires expertise in building real-world solutions that can be quickly demonstrated to clients, leveraging open-source tools and partner solutions to showcase impactful use cases across industry verticals.
The ideal candidate is agile, hands-on, and capable of creating demos on the fly. They should be able to design solutions, build demos, work with customer data to develop use cases, and combine open-source elements with partner solutions (ISVs, SaaS, technology majors) to deliver value-based proofs of concept. The candidate should have strong conceptualization skills, the ability to process data and build quick real-life demos, and excellent presentation skills to explain solutions to clients. The Architect should also work proactively with internal stakeholders, partners, and clients, and support responses to RFIs and RFPs, clearly demonstrating the ability to think out of the box and to build solution frameworks and architectural approaches that stand out as uniquely differentiated.
Key Responsibilities
- Design, prototype, and architect scalable AI and ML platform solutions using LLMs, ML models, and data engineering pipelines to solve business challenges in product and services operations.
- Support proactive client conversations and responses to RFx, working as a team to evolve highly differentiated and innovative approaches that meet and exceed customer needs.
- Work directly with customer data to build use cases and deliver quick, functional demos, demonstrating tangible outcomes with minimal turnaround time.
- Create and deliver demos on the fly to showcase capabilities and solutions.
- Develop and implement use-cases by working with customer data.
- Collaborate with open-source and partner solutions (ISVs, SaaS, Technology Partners) to create proof-of-concept (POC) projects that illustrate value in real-life scenarios.
- Lead bootcamp-style innovation workshops and competitions (e.g., innovation days), showcasing conceptualized solutions through rapid prototyping and demos.
- Architect seamless integrations of AI outputs with business systems and dashboards, focusing on output formats such as JSON, arrays, and HTML, and on UI/UX design for demos.
- Develop and optimize AI pipelines, ensuring quick iteration from concept to demo, including data preprocessing, model training, and evaluation for impactful results.
- Stay informed on the latest tools, frameworks, and products in Platform, AI/GenAI, LLM/SLM, and data engineering, ensuring solutions leverage the most appropriate technologies.
- Present AI solutions clearly and effectively to both technical and non-technical stakeholders, demonstrating strong communication and presentation skills.
- Participate in innovation-day competitions and similar events to showcase tangible solutions.
- Provide technical leadership and guidance to development teams.
- Ensure the scalability, performance, and security of platform solutions.
- Stay updated with the latest industry trends and technologies.
Qualifications
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 10+ years of hands-on experience in full-stack engineering, including AI and data engineering, with proficiency in Python, C#, and architecture design.
- 5+ years in technical leadership, setting project direction and delivering AI solutions in real-world scenarios.
- Proven experience as a Platform Architect or similar role deploying AI systems in production, across vertical use cases, including customer service and operational improvements.
- Strong knowledge of natural language generation, AI/ML, LLMs, machine learning algorithms, Gen-AI and data engineering pipelines.
- Familiarity with open-source platforms and partner solutions (e.g., SymphonyAI, NVIDIA, Intel OpenVINO) for quickly delivering results is preferred.
- Experience with AI frameworks (PyTorch, TensorFlow, Hugging Face) and platforms (AWS, GCP, Azure), including containerization.
- Experience building rapid POCs and demos, with the ability to conceptualize and deliver tangible outcomes in short timeframes.
- Hands-on experience with open-source elements and partner solutions.
- Excellent problem-solving, analytical, and presentation skills; capable of explaining technical concepts and demoing AI solutions to a diverse audience.
- Strong conceptualization and demo-building skills.
- Experience in participating in innovation-day competitions or similar events.
- Ability to work in a fast-paced and agile environment.
Preferred Skills
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and the OpenAI/Gen-AI stack; knowledge of Intel's OpenVINO, Geti, SceneScape and/or NVIDIA's Omniverse, CUDA Toolkit, the TensorFlow framework, or other similar technologies is preferred.
- Strong AI/ML and GenAI capabilities: developing and deploying AI/ML models and algorithms, implementing GenAI solutions for various use cases, working with frameworks such as TensorFlow and PyTorch, and integrating AI/ML solutions with existing systems and platforms.
- Experience in self-supervised learning, transfer learning, and reinforcement learning applied to service industry use cases.
- Knowledge of AI ethics and responsible AI practices, with a focus on delivering fair and secure AI solutions.
- Familiarity with vector databases such as Pinecone and other cutting-edge data engineering tools.
- Knowledge of designing and implementing digital twin / asset-performance capabilities is preferred: building real-time simulation models that mirror physical assets, integrating digital twin solutions with IoT and other data sources, and optimizing asset performance and predictive maintenance using digital twin technology.
- Solid data engineering experience: designing and implementing data pipelines and ETL processes, working with large datasets to extract, transform, and load data for analysis, ensuring data quality, integrity, and security, and optimizing data storage and retrieval for performance and scalability.
- Knowledge of data privacy and security regulations.
- Familiarity with DevOps practices and tools.
Skills: C#, LLMs, Intel, Docker, GCP, Gen-AI, containerization, natural language generation, DevOps, artificial intelligence, SLM, AWS, TensorFlow, presentation skills, data engineering, Python, PyTorch, Azure, analytical skills, ML, AI/ML, AI frameworks, architecture development, NVIDIA, ETL, demo-building skills, open-source platforms, problem-solving, machine learning algorithms, architects, architecture design