Experience building enterprise-scale data warehouse and database models end-to-end.
Proven hands-on experience with Snowflake and related ETL technologies.
Experience working with Tableau, Power BI or other BI tools.
Experience with native AWS technologies for data and analytics, such as Redshift, S3, Lambda, Glue, EMR, Kinesis, SNS and CloudWatch.
Experience with NoSQL databases (e.g., MongoDB, Elasticsearch).
Experience working with relational databases, including writing and optimising SQL queries for analytics and reporting.
Experience developing scalable data applications and reporting frameworks.
Experience working with message queues, preferably Kafka or RabbitMQ.
Ability to write code in Python, Java, Scala or other languages.
Qualifications:
5+ years' experience architecting enterprise DW/Data Lake solutions across multiple platforms.
Experience writing high-quality, maintainable SQL on large datasets.
Expertise in designing and implementing scalable data pipelines (e.g., ETL) and processes in Data Warehouse/Data Lake environments to support dynamic business demand for data.
Experience building and optimising logical data models and data pipelines, delivering high-quality, testable data solutions that adhere to SLAs.
Excellent knowledge of and experience with query optimisation and tuning.
Knowledgeable about a variety of strategies for ingesting, modelling, processing, and persisting data.