
Wrike

Seasoned Data Engineer - GCP/AWS, Python, SQL


Job Description

  • Own data assets and data pipelines that provide actionable insights into customer, product, GTM, and other key business functions
  • Design, develop and maintain scalable data pipelines and transformations using data from a variety of engineering and business systems (e.g., Salesforce/CPQ, NetSuite, Marketo)
  • Collaborate with Analysts to improve the data models that feed business intelligence tools, increasing data accessibility and driving adoption of data
  • Deploy ML models together with Data Science teams following ML lifecycle best practices, and improve our AI infrastructure
  • Implement processes and systems to manage data quality, ensuring production data is always accurate and meets SLAs
Experience Requirements
  • Work experience building and maintaining data pipelines in data-heavy environments (Data Engineering, Backend with an emphasis on data processing, Data Science with an emphasis on infrastructure)
  • Strong knowledge of Python required
  • Advanced working SQL knowledge and experience working with relational databases
  • 7+ years of experience with Data Warehousing Solutions (BigQuery, Redshift, Snowflake, Vertica, or similar)
  • 7+ years of experience with Data Pipeline Orchestration (Airflow, Dagster, Prefect, or similar)
  • Confidence in using Git, CI/CD, and containerization
  • Google Cloud Platform, AWS or Azure experience
  • Database architecture experience
Desired Skills
  • Experience with major B2B vendor integrations (Salesforce/CPQ, NetSuite, Marketo, etc.)
  • Good understanding of Data Modelling (Kimball, Inmon, SCDs, Data Vault, Anchor)
  • Knowledge of Python data libraries (Pandas, SciPy, NumPy, scikit-learn, TensorFlow, PyTorch)
  • Experience with Data Quality Tools, Monitoring and Alerting
  • Experience with Enterprise Data Governance, Master data management, Data Privacy and Security (GDPR, CCPA)
  • Familiarity with Data Streaming and CDC (Google Pub/Sub, Google DataFlow, Apache Kafka, Kafka Streams, Apache Flink, Spark Streaming, or similar)
  • Experience with building analytic solutions in a B2B SaaS environment
  • Experience partnering with go-to-market, sales, customer success, and marketing teams
Interpersonal Skills
  • Good communication, a collaborative demeanor, and the ability to work in distributed, cross-functional, multinational teams while articulating a clear point of view
  • Fostering a fun and productive team environment

More Info

Industry: Other

Function: Data Engineering

Job Type: Permanent Job

Location: Bengaluru / Bangalore


Date Posted: 09/06/2024

Job ID: 81258233

