You have at least 5 years of hands-on technical experience building, optimizing, and implementing data pipelines and data architecture
You have hands-on experience wrangling, exploring, and analyzing data to answer specific business questions and identify opportunities for improvement
You are a highly driven professional who thrives in a fast-paced, dynamic role where delivering solutions that exceed high expectations is the measure of success
You have a passion for providing both formal and informal mentorship
You have strong communication and interpersonal skills
You have a deep understanding of data governance and data privacy best practices
The ideal candidate will have technical knowledge of the following:
Big data tools (e.g. Hadoop, Spark, Kafka)
Relational SQL and NoSQL databases (e.g. Postgres, MySQL, SQL Server, Cassandra, MongoDB)
Data pipeline and workflow management tools (e.g. Azkaban, Oozie, Luigi, Airflow)
Stream-processing systems (e.g. Storm, Spark Streaming)
Programming and scripting languages (e.g. Python, Java, C++, Scala)