Data Engineer

October 20


Wizeline

UX Design • Product/Platform Development • Cloud & DevOps • Data • Artificial Intelligence

1001 - 5000 employees

Founded 2014

🏢 Enterprise

☁️ SaaS

🤖 Artificial Intelligence

💰 $43M Series B in March 2018

Description

• Wizeline is a global digital services company helping mid-size to Fortune 500 companies build, scale, and deliver high-quality digital products and services.
• We thrive in solving our customers' challenges through human-centered experiences, digital core modernization, and intelligence everywhere (AI/ML and data).
• We help them succeed in building digital capabilities that bring technology to the core of their business.
• Wizeline prioritizes a culture of diversity and development for its nearly 2,000-person team spread across the globe.
• We believe great technology comes from a mix of talents and perspectives.
• Our core values of ownership, innovation, community, and inclusivity are central to our work.
• Wizeline is invested in its employees' growth, offering opportunities to create personalized career paths and develop in-demand skills.
• We even have a free education program, Wizeline Academy, to help both employees and the broader community upskill in tech.

Requirements

• Advanced Databricks.
• Consulting mindset.
• Data modeling.
• Strong general programming skills.
• Solid experience with Python. If not proficient in Python, we expect the candidate to be proficient in other languages and to demonstrate the ability to learn new ones very quickly.
• Experience with Scala is good to have.
• Experience with Spark.
• Solid engineering foundations (good coding practices, good architectural design skills).
• Experience working with SQL in advanced scenarios that require heavy optimization.
• 5+ years of experience with large-scale data engineering, with an emphasis on analytics and reporting.
• Experience building cloud-scalable, real-time, high-performance data lake solutions.
• Proficiency in designing and implementing ETL (Extract, Transform, Load) processes on large volumes of data (terabytes, requiring distributed processing).
• Experience developing solutions on cloud services (AWS, GCP, or Azure).
• Advanced English level.


