Unlock yourself. Take your career to the next level.
At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science, empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation places us uniquely in the market.
Who are you
You are smart and collaborative, and you take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.
What will you be doing at Atrium
In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner with customers to advise on, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering.
As a Lead Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
In this role, you will:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, dbt, Python, AWS, and Big Data tools
- Develop ELT processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure the accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing
In this role, you will have:
- Bachelor's degree in Computer Science, Software Engineering, or equivalent combination of relevant work experience and education
- 4-8 years of experience delivering consulting services to medium and large enterprises, with implementations that included Data Warehousing or Big Data consulting for mid-to-large-sized organizations
- Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
- Strong experience with Snowflake and Data Warehouse architecture preferred
- SnowPro Core certification is highly desired
- Hands-on experience with Python (pandas, DataFrames, functions)
- Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Hands-on experience with Snowflake data loads
- Experience with data analysis and data processing using Python & Snowflake
- Experience creating user-defined functions in Snowflake using SnowSQL
- Experience with SQL optimization and workload management in Snowflake
- Strong experience with Apache Airflow and API integrations
- Experience with leading and mentoring junior data engineers
- Solid experience with at least one ETL/ELT tool (dbt, MuleSoft, Fivetran, Airflow, Airbyte, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
- Nice to have: experience with Docker, dbt, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
- Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
- Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
- Strong presentation and communication skills
Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week; others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!
At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer and all qualified applicants will receive consideration for employment.