Information Technology,
Information and Communications Technology (ICT),
Cloud Data Services,
Telecommunications,
Information Services
Responsibilities:
Knowledge of AWS pipelines to develop ETL processes for data movement to Redshift; experience mapping source-to-target rules and fields.
Strong skills in writing complex queries with nested joins and derived tables.
Translate functional and technical requirements into detailed architecture and design
Coding and testing complex system components
Design, build, and maintain data pipelines
Analyze, test, and implement physical database designs.
Ensure data quality and integrity
Document the environment and maintain current patches
Monitor and optimize data performance
Ensure data recovery, maintenance, data integrity, and space requirements for physical databases are met through the formulation and monitoring of policies, procedures, and standards relating to database management.
Audit the current environments and provide capacity planning and best practices for future production/development/test environments, which may include establishing new standards and procedures.
Conduct performance assessment and tuning related to the database system.
Provide technical expertise when working with system administrators, technical managers, developers, and architects.
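The query-writing skill listed above (nested joins and derived tables) can be sketched with a small example; the tables, columns, and data below are hypothetical, shown on an in-memory SQLite database for illustration only.

```python
import sqlite3

# Hypothetical schema and data, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER, region TEXT);
CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA');
INSERT INTO orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# A derived table (per-customer totals) joined back to customers:
# the subquery in the FROM clause is the "derived table".
rows = cur.execute("""
SELECT c.region, t.total
FROM customers AS c
JOIN (SELECT customer_id, SUM(amount) AS total
      FROM orders
      GROUP BY customer_id) AS t
  ON t.customer_id = c.id
ORDER BY t.total DESC
""").fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 75.0)]
```

The same derived-table pattern carries over to Redshift SQL, which also supports it via common table expressions (`WITH` clauses).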
Must Have:
Must have hands-on experience with Amazon Redshift.
Core knowledge of integration and architecture.
Amazon Web Services (AWS).
Any of the following: AWS EC2, S3, VPC, VPN, AWS ECS, AWS Aurora, AWS RDS, AWS Route 53.
S3 and database experience are a must.
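The S3-to-Redshift movement listed in the requirements is typically done with Redshift's COPY command issued over a SQL connection. A minimal sketch of building such a statement is below; the table name, bucket, and IAM role ARN are hypothetical placeholders, not values from this posting.

```python
def build_copy_sql(table: str, s3_uri: str, iam_role: str) -> str:
    """Return a Redshift COPY statement that loads CSV data from S3.

    All arguments are caller-supplied placeholders; in a real pipeline the
    statement would be executed against Redshift via a SQL driver.
    """
    return (
        f"COPY {table} "
        f"FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

sql = build_copy_sql(
    "analytics.orders",                             # hypothetical target table
    "s3://example-bucket/exports/orders.csv",       # hypothetical source object
    "arn:aws:iam::123456789012:role/RedshiftLoad",  # hypothetical IAM role
)
print(sql)
```

COPY authenticating via an attached IAM role (rather than embedded keys) is the commonly recommended pattern for S3 loads into Redshift.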
Bachelor of Technology (B.Tech/B.E.) or Bachelor of Computer Applications (B.C.A.)
Date Posted: 05/08/2024
Job ID: 87554011
IT Consulting Services