machine learning • data science • deep learning • artificial intelligence • consulting
51 - 200
May 21, 2023
Airflow
AWS
Big Data
CI/CD
Cloud
Deep Learning
ETL
GCP
Google Cloud Platform
Kafka
Kubernetes
Machine Learning
NoSQL
Numpy
Pandas
Python
Scikit-Learn
Spark
SQL
• Manage projects: interact with customers and understand the scope of work
• Brainstorm and design solutions for customer use cases, and use your expertise to advise on possible solution paths and obstacles to avoid
• Act as a bridge between the technology team and the product/business side; handle scope creep and prioritize resources or tasks when needed
• Lead teams: work with data engineers and machine learning engineers
• Help the team with code development, technical follow-up, and code review
• Take part in hiring for technical roles (exam review and technical interviews)
• Regularly discuss issues, arrange meetings, and give peer feedback; help others reach their technical career objectives
• Research and develop new technologies to improve Mutt Data’s toolset and best practices; we invite you to contribute your knowledge of infrastructure, code, frameworks, etc.
• Propose, define, present, evangelize, prototype, build, and maintain data systems
• Develop proofs of concept, produce machine learning models, and build dashboards, APIs, and data platforms
• Build ETL processes for a wide variety of data sources (relational databases, NoSQL, web services, flat files, etc.) as projects require (a minimal sketch follows this list)
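The following is a minimal, illustrative sketch of the kind of ETL process mentioned above, not an actual Mutt Data pipeline: it extracts a hypothetical orders.csv flat file, transforms it with pandas, and loads a rollup table into a local SQLite database. All file, column, and table names are made up for illustration.

```python
# Illustrative ETL sketch: flat file -> pandas transform -> relational load.
# The source file, columns, and target table are hypothetical examples.
import sqlite3

import pandas as pd


def run_etl(csv_path: str = "orders.csv", db_path: str = "warehouse.db") -> None:
    # Extract: read the flat-file source into a DataFrame.
    raw = pd.read_csv(csv_path, parse_dates=["order_date"])

    # Transform: drop incomplete rows and derive a monthly revenue rollup.
    clean = raw.dropna(subset=["order_id", "amount"])
    monthly = (
        clean.assign(month=clean["order_date"].dt.to_period("M").astype(str))
        .groupby("month", as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "revenue"})
    )

    # Load: write the aggregated table into the target database.
    conn = sqlite3.connect(db_path)
    try:
        monthly.to_sql("monthly_revenue", conn, if_exists="replace", index=False)
    finally:
        conn.close()


if __name__ == "__main__":
    run_etl()
```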
• Software engineering and development experience
• Experience building data or ML pipelines
• Team leadership and client-facing experience
• Advanced Python knowledge
• Solid knowledge of databases and SQL
• Ability to interpret and implement customers’ technical requirements
• Experience building analytical data systems on modern data warehouses (BigQuery, Redshift, Snowflake) or data lakes (Databricks, AWS S3, Presto, EMR, Glue, etc.)
• Great capacity for teamwork
• Good command of AWS (or GCP)
• A keen sense of code hygiene: review, documentation, testing, CI/CD
• Proficiency in the DAG stack: DBT + Airflow + Great Expectations (a minimal DAG is sketched after this list)
• Experience developing data pipelines with Spark, Python, and SQL
• Experience with stream processing tools (Kafka, Kinesis, Spark Streaming, etc.)
• Proficiency in the hypermodern Python stack: state-of-the-art tools like Poetry, formatters (black), linters (flake8, pylint, etc.), testing libraries (pytest, hypothesis, etc.), type checking, static analysis, and tooling for continuous integration and delivery
• Proficiency in Python’s scientific stack (numpy, pandas, jupyter, matplotlib, scikit-learn)
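As a rough sketch of the DAG stack mentioned above (assuming Airflow 2.4+): the DAG id, schedule, and task callables below are hypothetical stubs, and the validation step only marks where data-quality checks such as Great Expectations would run in a real pipeline.

```python
# Illustrative Airflow DAG sketch: a daily extract -> validate -> load flow.
# DAG name, schedule, and task bodies are placeholders for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull data from a source system (stubbed out here).
    print("extracting source data")


def validate():
    # Run data-quality checks; in practice this is where a tool like
    # Great Expectations would assert expectations on the extracted data.
    print("validating extracted data")


def load():
    # Load validated data into the warehouse (stubbed out here).
    print("loading into warehouse")


with DAG(
    dag_id="daily_sales_etl",         # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule="@daily",                # requires Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> validate_task >> load_task
```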
• Mutt Week! An additional week of vacation per year
• Paid AWS and GCP certification exams and materials
• Your birthday off
• In-company English lessons
• Paid social events
• Worknmates coworking spaces
• Referral bonuses
• Remote-first culture: flexible working hours and flexible working location
• Annual Mutters' Day
• Annual Mutters' Trip
Apply Now