About KPMG in India:
KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of member firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Job Description
This job opportunity is with KPMG India.
Location: Bangalore
Experience: 12 to 16 years
Job Summary:
We are seeking a highly skilled Senior Google BigQuery Data Engineer to lead the design, development, and optimization of our data platform. The ideal candidate will have a deep understanding of BigQuery's architecture, capabilities, and best practices, including columnar storage, nested and repeated fields, and separation of compute and storage. A proven track record of delivering complex data warehousing solutions on the Google Cloud Platform (GCP) is essential.
Responsibilities:
- Architect, design, and implement scalable data solutions leveraging Google BigQuery as the core data warehouse platform, utilizing features such as partitioning, clustering, and materialized views for optimal performance.
- Develop and optimize complex SQL queries and data pipelines using BigQuery for efficient data ingestion, transformation, and loading, including leveraging batch and streaming ingestion methods.
- Utilize BigQuery ML or integrate with other GCP ML services to build predictive models and derive insights from data.
- Implement data governance and security best practices within the BigQuery environment, including data encryption, access controls, and auditing.
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into BigQuery-based solutions, considering factors like data volume, query complexity, and latency.
- Mentor and guide junior data engineers in BigQuery best practices, including query optimization, performance tuning, and data modeling techniques.
- Lead data warehousing projects focused on BigQuery implementation and optimization, considering cost-efficiency, scalability, and maintainability.
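To make the BigQuery features named above concrete, here is a minimal sketch of the kind of DDL involved: a table using time partitioning, clustering, and a nested/repeated field, plus a pre-aggregating materialized view. All dataset, table, and column names are hypothetical examples, not part of any actual KPMG engagement.

```python
def partitioned_table_ddl(dataset: str, table: str) -> str:
    """Build a CREATE TABLE statement demonstrating time partitioning,
    clustering, and a nested/repeated field (ARRAY of STRUCT)."""
    return f"""
CREATE TABLE `{dataset}.{table}` (
  event_ts TIMESTAMP,
  customer_id STRING,
  region STRING,
  line_items ARRAY<STRUCT<sku STRING, qty INT64, price NUMERIC>>
)
PARTITION BY DATE(event_ts)       -- prunes scanned data by date
CLUSTER BY region, customer_id    -- co-locates rows for common filters
OPTIONS (partition_expiration_days = 365);
""".strip()


def materialized_view_ddl(dataset: str, view: str, table: str) -> str:
    """Build a materialized view that pre-aggregates order counts,
    so repeated dashboard queries avoid rescanning the base table."""
    return f"""
CREATE MATERIALIZED VIEW `{dataset}.{view}` AS
SELECT DATE(event_ts) AS day, region, COUNT(*) AS orders
FROM `{dataset}.{table}`
GROUP BY day, region;
""".strip()


if __name__ == "__main__":
    print(partitioned_table_ddl("sales_dw", "orders"))
    print(materialized_view_ddl("sales_dw", "daily_orders_mv", "orders"))
```

Partitioning and clustering together limit how much data each query scans, which is the main cost and performance lever in BigQuery's separated compute/storage model.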
Qualifications:
- 12+ years of experience in data warehousing and business intelligence.
- In-depth expertise in Google BigQuery, including advanced query optimization, performance tuning, and data modeling techniques.
- Strong SQL skills and proficiency in Python or other scripting languages for data processing.
- Proven experience in designing and implementing BigQuery data models and schemas, considering normalization, denormalization, and data partitioning strategies.
- Hands-on experience with BigQuery data ingestion and extraction methods, including batch loading, streaming ingestion, and external tables.
- Knowledge of other GCP services relevant to data engineering (e.g., Dataflow, Cloud Storage, Looker) and their integration with BigQuery.
- Strong problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.
- Preferred: Google Cloud certifications (e.g., Certified Data Engineer) and experience with machine learning or data science.
Pre-Sales Support:
- Collaborate cross-functionally to understand client requirements and design tailored solutions.
- Prepare and deliver technical presentations and product demonstrations to potential clients.
- Assist in the preparation of proposals and RFP responses, ensuring technical accuracy and feasibility.
Project Delivery:
- Manage and oversee the delivery of data analytics and data warehousing projects from inception to completion.
- Ensure projects are delivered on time, within scope, and within budget.
- Coordinate with cross-functional teams to ensure successful project execution and stakeholder satisfaction.
- The ideal candidate should have managed and delivered complex, large-scale data science projects.