Responsible for building and maintaining high-performance data systems that enable deeper insights for all parts of our organization
Responsible for developing ETL/ELT pipelines for both batch and streaming data
Responsible for data flows for real-time and analytics use cases
Responsible for improving data pipeline performance by applying industry best practices and data-parallel processing techniques
Responsible for the documentation, design, development and testing of Hadoop reporting and analytical applications
Responsible for technical discussions and finalizing requirements by communicating effectively with stakeholders
Responsible for converting functional requirements into detailed technical designs
Responsible for adhering to Scrum timelines and delivering accordingly
Responsible for preparing Unit/SIT/UAT test cases and logging the results
Responsible for planning and tracking the implementation to closure
Drive enterprise-wide initiatives for the usage of external data
Envision an enterprise-wide Entitlements platform and align it with the Bank's NextGen technology vision
Continually look for process improvements
Propose new ways of doing things
Suggest novel ways to fulfil requirements
Help elaborate requirements where necessary
Coordinate between the technical teams of the various systems for smooth project execution, from technical requirements discussion, overall architecture design and technical solution discussions through build, unit testing, regression testing, system integration testing, user acceptance testing, go-live, user verification testing and rollback (if required)
Prepare a technical plan with clear milestone dates for technical tasks, which will be an input to the PM's overall project plan
Coordinate on a need basis with technical teams across technology that are not directly involved in the project, for example: firewall/network teams, DataPower teams, EDMP, OAM, OIM, ITSC, GIS teams, etc.
Support the change management process
Work alongside PSS teams and ensure proper KT sessions are provided to the support teams
Identify any risks within the project and get them recorded in Risk wise after discussion with the business and manager
Ensure project delivery is seamless, with zero to negligible defects
Requirements
5+ years of hands-on working experience with Hadoop, Hive, Spark, Python and PySpark
Hands-on experience with workflow schedulers like NiFi/Control-M
Hands-on experience with Unix shell scripting
Experience with Kafka and Spark Streaming
Experience with data loading tools like Sqoop
Experience with and understanding of object-oriented programming
Motivation to learn new techniques for programming, debugging and deploying
Self-starter, with excellent self-study skills and growth aspirations, capable of working without direction and able to deliver technical projects from scratch
Excellent written and verbal communication skills; a flexible attitude and the ability to perform under pressure
Ability to lead and influence direction and strategy of technology organization
Test-driven development, commitment to quality and a thorough approach to work
A good team player with ability to meet tight deadlines in a fast-paced environment
Guide junior developers and share best practices
Cloud certification (any one of Azure/AWS/GCP) will be an added advantage
Must have knowledge and understanding of Agile principles
Must have a good understanding of the project life cycle
Must have sound problem analysis and resolution abilities
Good understanding of external and internal data management, and the implications of cloud usage in the context of external data
Date Posted: 19/06/2024
Job ID: 82257027