KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Should have experience in data and analytics and have overseen end-to-end implementation of data pipelines on cloud-based data platforms.
Strong programming skills in Python and PySpark; Java and/or Scala are good to have.
Experience writing SQL, structuring data, and applying sound data storage practices.
Experience using PySpark for data processing and transformation.
Experience building stream-processing applications (Spark Streaming, Apache Flink, Kafka, etc.).
Experience maintaining and developing CI/CD pipelines based on GitLab.
You have been involved in assembling large, complex structured and unstructured datasets that meet functional and non-functional business requirements.
Experience working with cloud data platforms and services.
Conduct code reviews, maintain code quality, and ensure best practices are followed.
Debug and upgrade existing systems.
Some knowledge of DevOps is nice to have.
Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Qualifications
Bachelor's degree in computer science or a related field.
Experience in Snowflake and knowledge of transforming data using dbt (data build tool).
Experience with AWS and API integration in general, along with knowledge of data warehousing concepts.
Excellent communication and team collaboration skills.