December 6
Airflow
Apache
AWS
Azure
Cloud
Django
Docker
ElasticSearch
ETL
Flask
Google Cloud Platform
MongoDB
Next.js
NoSQL
Python
RabbitMQ
React
SQL
Go
About Us:
• Aquaticode builds artificial intelligence solutions for aquaculture.
• Our core competency lies at the intersection of biology and artificial intelligence, utilizing specialized imaging technology to detect, identify, and predict traits of aquatic species.
• We value commitment and creativity in building real-world solutions that benefit humanity.

Key Responsibilities:
• Develop, maintain, and optimize data pipelines and workflows to support ML research and model development.
• Design and implement scalable data architectures for handling large datasets used in ML models.
• Collaborate closely with ML researchers and data scientists to understand data requirements and ensure data availability and quality.
• Work with databases and data integration processes to prepare and transform data for ML experiments.
• Utilize MongoDB and other NoSQL databases to manage unstructured and semi-structured data.
• Write efficient, reliable, and maintainable code in Python and SQL for data processing tasks.
• Implement data validation and monitoring systems to ensure data integrity and performance.
• Support the deployment of ML models by integrating data solutions into production environments.
• Ensure the scalability and performance of data systems through rigorous testing and optimization.
Requirements:
• Proficiency in English (spoken and written).
• Strong experience in Python and SQL.
• Hands-on experience with data processing in Apache Airflow (see the sketch after this list).
• Experience working with databases, including MongoDB (NoSQL) and relational databases.
• Understanding of data modeling, ETL processes, and data warehousing concepts.
• Experience with cloud platforms like AWS, GCP, or Azure.
• Experience with other NoSQL databases like InfluxDB, Elasticsearch, or similar technologies.
• Experience with backend frameworks like FastAPI, Flask, or Django.
• Knowledge of containerization tools like Docker.
• Familiarity with messaging queues like RabbitMQ.
• Understanding of DevOps practices and experience with CI/CD pipelines.
• Experience with front-end development (e.g., React, Next.js).
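For orientation only, here is a minimal sketch of the kind of Airflow data-processing work the listing describes, assuming Airflow 2.4+ with the TaskFlow API. The DAG name (fish_imaging_etl), the record fields, and the validation rule are hypothetical illustrations, not details taken from the posting.

# Minimal sketch of an extract-validate-load pipeline, assuming Airflow 2.4+.
# All dataset names and fields below are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def fish_imaging_etl():
    """Extract raw imaging metadata, validate it, and load it for ML training."""

    @task
    def extract() -> list[dict]:
        # Placeholder: in practice this might query MongoDB or object storage.
        return [
            {"specimen_id": 1, "length_mm": 142.0},
            {"specimen_id": 2, "length_mm": None},
        ]

    @task
    def validate(records: list[dict]) -> list[dict]:
        # Drop records with missing measurements before they reach training data.
        return [r for r in records if r["length_mm"] is not None]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for a warehouse or feature-store write.
        print(f"Loading {len(records)} validated records")

    load(validate(extract()))


fish_imaging_etl()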
November 20
Forbes Advisor seeks a Data Warehouse Engineer/Developer to build data warehouse solutions, working in a team to enhance business intelligence using industry standards.
November 14
10,000+ employees
Data Engineer to build data architecture and pipelines at BrassCraft.
November 14
Data Engineer for Masco, focusing on data architecture and infrastructure development.
November 10
Develop data integration pipelines for an online retail technology company.
November 10
Expand and optimize data architecture for Tangoe's product development.