Strong working knowledge of data modeling, databases (both SQL and NoSQL), the software development lifecycle and its practices, unit testing, and functional programming.
10+ years of experience in data architecture, data engineering, or related roles.
5+ years of experience building scalable enterprise data warehousing and modeling solutions on-premises and/or in the cloud (AWS, GCP, Azure).
Expertise in data structures, distributed computing, and manipulating and analyzing complex, high-volume data from a variety of internal and external sources.
Expertise in cloud platforms such as Azure, AWS, or Google Cloud and their data services (e.g., Azure Data Lake, AWS S3, BigQuery).
Hands-on experience with data integration tools (e.g., Azure Data Factory, Talend, Informatica).
Experience developing ETL designs and data models for structured, unstructured, and streaming data sources.
Experience with real-time data streaming technologies such as Kafka, Kinesis, or Azure Event Hubs.
Proficiency with big data tools and frameworks such as Apache Spark, Databricks, Snowflake, or Hadoop.
Experience with SQL and NoSQL databases (e.g., SQL Server, Snowflake, Cosmos DB, DynamoDB).
Solid knowledge of scripting and programming languages such as Python, Java, or Scala.
Ability to design secure data solutions that comply with regulations such as GDPR, HIPAA, and CCPA.
Experience implementing and enforcing data governance frameworks, including data cataloging, lineage, and access controls.