About Oportun
Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.
Working at Oportun
Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.
Position Overview
As a Sr. Data Engineer at Oportun, you will be a key member of our team, responsible for designing, developing, and maintaining sophisticated software and data platforms that advance the engineering group's charter. Your mastery of a technical domain enables you to take on business problems and solve them with technical solutions. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. In this role, you will have the opportunity to lead the technology effort for large initiatives (cross-functional, multi-month projects), from technical requirements gathering through successful delivery of the product.
Responsibilities
Data Architecture and Design
- Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements.
- Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.
Data Pipeline Development and Optimization
- Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Optimize data pipelines for performance, reliability, and scalability.
Database Management and Optimization
- Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security.
- Implement and manage ETL processes for efficient data loading and retrieval.
Data Quality and Governance
- Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations.
- Drive initiatives to improve data quality and documentation of data assets.
Mentorship and Leadership
- Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth.
- Lead and participate in code reviews, ensuring best practices and high-quality code.
Collaboration and Stakeholder Management
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs.
- Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value.
Performance Monitoring and Optimization
- Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability.
Common Requirements
- You have a strong understanding of a business or system domain with sufficient knowledge & expertise around the appropriate metrics and trends. You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective solutions.
- You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
- You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability. You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
- You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team.
- You take ownership of (customer) issues, including initial troubleshooting, identification of root cause and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
- You set the benchmark for responsiveness and ownership and overall accountability of engineering systems.
- You independently drive and lead multiple features, contribute to one or more large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed. You keep your lead/EM updated on your work and that of the team so they can share status with stakeholders, including escalating issues when needed.
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
- Proficiency in programming languages such as Python/PySpark and Java or Scala.
- Expertise in big data technologies such as Hadoop, Spark, Kafka, etc.
- In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MariaDB, NoSQL databases).
- Experience and expertise in building complex end-to-end data pipelines.
- Experience with workflow orchestration and job scheduling using tools such as Jenkins, Airflow, or Databricks.
- Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
- Ability to mentor junior team members.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Amazon Redshift, S3, Azure SQL Data Warehouse).
- Strong leadership, problem-solving, and decision-making skills.
- Excellent communication and collaboration abilities.
- Familiarity or certification in Databricks is a plus.
We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.
California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/.
We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).