Job Description
About
We are a digital-first technology services firm specializing in accelerating
business transformation and delivering human-centric digital experiences.
We have been meeting customers wherever they are in the digital lifecycle
and helping them outperform their competition through speed and
innovation. We bring together distinct core competencies in AI,
analytics, app development, cloud, commerce, CX, data, DevOps, IoT,
mobile, quality engineering and UX, and our deep expertise in BFSI,
healthcare, and life sciences, to help businesses capitalize on the unlimited
opportunities digital offers. Our reputation is built on a comprehensive
suite of engineering services, a dedication to solving clients' toughest
technology problems, and a commitment to continuous improvement.
Backed by Goldman Sachs Asset Management and Everstone Capital, the
firm now has a global presence of 15 offices (and 10 delivery centers)
across four continents.
Roles & Responsibilities
Design, develop, and maintain scalable data pipelines and ETL
processes using Databricks and PySpark (a minimal sketch follows this list).
Implement data transformation and integration processes using
PySQL.
Optimize and enhance the performance of the Databricks
environment.
Collaborate with data scientists, analysts, and business
stakeholders to understand data requirements and deliver
effective solutions.
Ensure data quality and integrity through robust data validation
and testing procedures (see the validation sketch after this list).
Develop and maintain comprehensive documentation for data
systems and processes.
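To make the pipeline responsibility concrete, here is a minimal sketch of a Databricks/PySpark ETL job. The landing path, column names, and target table (/mnt/landing/orders/, analytics.daily_order_totals) are hypothetical, and the posting's "PySQL" is read here as SQL executed through PySpark's spark.sql; treat this as an illustration of the pattern, not the team's actual pipeline.

    # Minimal ETL sketch: extract raw orders, transform, and publish a Delta table.
    # All table and path names below are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

    # Extract: read raw JSON files from a hypothetical landing zone
    raw = spark.read.format("json").load("/mnt/landing/orders/")

    # Transform: basic cleanup with the DataFrame API
    orders = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_ts"))
        .filter(F.col("amount") > 0)
    )

    # ... and an aggregation expressed in SQL, run through PySpark
    orders.createOrReplaceTempView("orders_clean")
    daily_totals = spark.sql(
        """
        SELECT order_date,
               COUNT(*)    AS order_count,
               SUM(amount) AS total_amount
        FROM orders_clean
        GROUP BY order_date
        """
    )

    # Load: write the result as a Delta table for downstream consumers
    (
        daily_totals.write.format("delta")
        .mode("overwrite")
        .saveAsTable("analytics.daily_order_totals")
    )

On Databricks the same steps would typically be scheduled as a job or workflow task; the mix of DataFrame API and spark.sql shown here reflects the Databricks/PySpark/SQL combination the role calls for.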
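For the data quality and validation responsibility, a lightweight sketch of the kind of checks involved is shown below. The specific rules (non-null keys, positive amounts, unique order_id) are illustrative assumptions, not the team's actual validation suite.

    # Lightweight data-quality checks on a cleaned orders DataFrame.
    # The rules below are illustrative; real pipelines would define their own.
    from pyspark.sql import DataFrame
    from pyspark.sql import functions as F

    def validate_orders(df: DataFrame) -> None:
        """Raise ValueError if the cleaned orders data violates basic expectations."""
        total = df.count()
        if total == 0:
            raise ValueError("orders dataset is empty")

        null_keys = df.filter(F.col("order_id").isNull()).count()
        if null_keys > 0:
            raise ValueError(f"{null_keys} rows have a null order_id")

        bad_amounts = df.filter(F.col("amount") <= 0).count()
        if bad_amounts > 0:
            raise ValueError(f"{bad_amounts} rows have a non-positive amount")

        distinct_keys = df.select("order_id").distinct().count()
        if distinct_keys != total:
            raise ValueError("duplicate order_id values found")

    # Typically called between the transform and load steps, e.g.:
    # validate_orders(orders)

Running such checks before the write step keeps bad records out of published tables, which is the intent behind the validation and testing bullet above.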
Qualifications
Bachelor's degree in Computer Science, Information Technology,
or a related field.
3+ years of experience in data engineering or a similar role.
Strong expertise in Databricks, including the use of PySpark and
PySQL.
Proficiency in SQL and experience with other programming
languages such as Python.
Familiarity with ETL tools and data integration techniques.
Experience with cloud platforms (e.g., AWS, Azure, GCP) and
related services.
Solid understanding of data warehousing concepts and best
practices.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.
Our Perks and Benefits
Our benefits and rewards program has been thoughtfully designed to
recognize your skills and contributions, elevate your learning/upskilling
experience and provide care and support for you and your loved ones. As
an Apexer, you get continuous skill-based development, opportunities for
career advancement, and access to comprehensive health and well-being
benefits and assistance.
We Also Offer
Group Health Insurance covering family of 4
Term Insurance and Accident Insurance
Paid Holidays & Earned Leaves
Paid Parental Leave
Learning & Career Development
Employee Wellness
Notes from Hiring Manager
Skills: GCP, SQL, Mobile, UX, AI, ETL, DevOps, Azure, Python, IoT, Documentation, Databricks, Cloud, Healthcare, PySQL, Data, Commerce, Analytics, BFSI, PySpark, AWS, CX, Life Sciences, ETL Tools, Quality Engineering, App Development, Databricks Environment, Everstone Capital, Data Warehousing, Data Validation, Data Validation and Testing, Goldman Sachs Asset Management, Data Quality and Integrity, Enhance the Performance of Databricks, Hadoop, Scala, Spark