Staff Data Engineer

November 19


Oportun

Financial Services • Responsible Lending • Pre-Paid Debit Cards • Serving the Underbanked • Data Analytics

1001 - 5000 employees

Founded 2006

💳 Fintech

💸 Finance

Description

• Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach.
• We provide intelligent borrowing, savings, and budgeting capabilities, empowering members with the confidence to build a better financial future.
• Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually.
• We celebrate and nurture our inclusive culture through our employee resource groups.
• As a Staff Data Engineer, you will be responsible for designing, developing, and maintaining sophisticated software and data platforms.
• This includes technical leadership, architectural design, data pipeline development, database management, project management, data quality governance, and collaboration.
• Your expertise will guide junior engineers, ensuring high-quality, scalable solutions.

Requirements

• Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
• 10+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
• Ability to identify opportunities for optimization and efficiency gains within data pipelines, architecture, and infrastructure.
• Act as a technical thought leader for the team, providing guidance and mentorship to senior team members.
• Ability to take ownership of critical projects and initiatives, providing project leadership and ensuring successful delivery through effective project management and communication.
• Engage in rigorous code and data-quality reviews, offering valuable feedback to maintain best practices, quality, performance, and maintainability.
• Lead the team in designing and building complex end-to-end data pipelines (a minimal PySpark sketch follows this list).
• Proficiency in programming languages such as Python/PySpark and Java/Scala.
• Expertise in big data technologies such as Hadoop, Spark, and Kafka.
• In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MySQL, NoSQL databases).
• Guide the team in orchestrating and scheduling jobs using CI/CD and orchestration tools such as Jenkins and Airflow (see the Airflow sketch after this list).
• Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
• Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
• Strong leadership, problem-solving, and decision-making skills.
• Excellent communication and collaboration abilities.
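To make the pipeline responsibilities above concrete, here is a minimal, hypothetical PySpark sketch of the kind of batch ETL work the role describes. The bucket paths, column names, and aggregation are invented for illustration; they are not part of the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("member-savings-example").getOrCreate()

# Read raw events from a hypothetical S3 location.
events = spark.read.parquet("s3://example-bucket/raw/loan_events/")

# Simple data-quality gate: drop records missing required keys.
clean = events.dropna(subset=["member_id", "event_ts"])

# Aggregate interest saved per member (illustrative column names).
summary = (
    clean.groupBy("member_id")
         .agg(F.sum("interest_saved").alias("total_interest_saved"))
)

# Write a query-friendly output table.
summary.write.mode("overwrite").parquet("s3://example-bucket/marts/member_savings/")
```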
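And a sketch of how such a job might be scheduled with Airflow, one of the orchestration tools the posting names. The DAG id, schedule, and callable are assumptions for the example, not details from the role.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_member_savings_job():
    # In practice this might submit the PySpark job above to a cluster.
    print("submitting member-savings pipeline")

with DAG(
    dag_id="member_savings_daily",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # run once per day (Airflow 2.x style)
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_pipeline",
        python_callable=run_member_savings_job,
    )
```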

