The best things happen at Kueski! #Kueskilife #Kueskiway
Fintech • Finance • Development • Technology • Data
501 - 1000
💰 $23.3M Venture Round on 2022-10
July 30
• Design robust, scalable data-driven solutions to support long-term needs
• Develop complex data processing systems to meet data consumers' requirements
• Create CI/CD pipelines to automate production deployments and monitoring
• Develop infrastructure as code for automated, clean, and organized infrastructure deployments
• Apply data cleansing techniques to improve data quality and facilitate consumption
• Show initiative and proactively mentor junior team members
• Demonstrate mastery of the data stack and a deep understanding of data platform components
• Design complex distributed systems using batch and streaming fundamentals
• Collaborate effectively with technical leaders, architects, and stakeholders
• Focus discussions on the important aspects and guide others
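One responsibility above, applying data cleansing techniques, can be sketched in plain Python. This is a minimal illustration only; the null tokens, field names, and rules are assumptions for the example, not Kueski's actual pipeline:

```python
# Minimal data-cleansing sketch: trim whitespace, normalize
# null-like tokens to None, and drop exact duplicate records.
# All rules and field names here are illustrative assumptions.

NULL_TOKENS = {"", "n/a", "null", "none", "-"}

def clean_record(record: dict) -> dict:
    """Trim string fields and map null-like tokens to None."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()
            if value.lower() in NULL_TOKENS:
                value = None
        cleaned[key] = value
    return cleaned

def clean_records(records: list[dict]) -> list[dict]:
    """Clean every record, then drop exact duplicates (order preserved)."""
    seen = set()
    result = []
    for rec in map(clean_record, records):
        key = tuple(sorted(rec.items(), key=lambda kv: kv[0]))
        if key not in seen:
            seen.add(key)
            result.append(rec)
    return result
```

In a production pipeline this logic would typically run inside a distributed engine such as Spark rather than plain Python, but the cleansing rules themselves look the same.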
• Proven ability to transform requirements into working software in production
• Ability to collaborate effectively within an interdisciplinary team
• Capable of leading medium-to-complex projects requiring multiple iterations and different technologies
• Strong understanding of a well-organized Software Development Lifecycle (SDLC)
• Proficient at integrating non-functional requirements into solution delivery
• Experience in designing, implementing, testing, and deploying production-ready code
• Expertise in designing and solving data processing problems
• Skilled in building and debugging Spark-based applications
• Ability to collaborate with data consumers to apply large-scale data analytics and machine-learning techniques
• Strong technical background in programming, preferably with Java and Python; Scala is a plus
• Proficiency in traditional SQL query development
• Solid experience with and understanding of software design patterns
• Solid experience with database engines and distributed programming engines for data processing, particularly Spark
• Proven experience in integrating, designing, and building data pipelines
• Solid experience deploying and maintaining data pipelines in production environments
• Knowledge of big data AWS cloud services: S3, EC2, Glue, Athena, Lambda, and EMR
• Nice to have: Experience developing high-level streaming solutions
• Experience working with distributed message brokers in production
• Experience with non-relational databases, preferably Apache Cassandra
• Experience implementing data warehousing using data modeling techniques
• Knowledge of data architecture fundamentals
• Knowledge of data governance, with active participation in a governance program
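The "traditional SQL query development" requirement above can be illustrated with Python's built-in sqlite3 module. The table and column names are invented for this sketch and carry no connection to Kueski's actual schemas:

```python
# Hypothetical SQL query development example using the stdlib
# sqlite3 module; the `loans` table and its data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [(1, "ana", 500.0), (2, "ana", 300.0), (3, "luis", 200.0)],
)

# Aggregate total loan amount per customer, largest first.
rows = conn.execute(
    """
    SELECT customer, SUM(amount) AS total
    FROM loans
    GROUP BY customer
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('ana', 800.0), ('luis', 200.0)]
```

The same GROUP BY / ORDER BY pattern carries over directly to warehouse engines such as Athena, which the posting also lists.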
• Mission-driven culture focused on customer value, teamwork, humility, and integrity
• Role clarity, career growth, and a personal development plan
• Competitive salary, medical insurance, and ample flexible time off
• Competitive stock options, with a commitment to inclusivity and diversity
Apply Now
July 27
11 - 50
Design and maintain scalable data management systems for global near-shore BPO services.