Data Engineer

December 6


Nacre Capital

Artificial Intelligence • Start-Ups • Deep Learning • Technology • Innovation

Description

About Us

Aquaticode builds artificial intelligence solutions for aquaculture. Our core competency lies at the intersection of biology and artificial intelligence: we use specialized imaging technology to detect, identify, and predict traits of aquatic species. We value commitment and creativity in building real-world solutions that benefit humanity.

Position Overview

We are seeking a talented Data Engineer with experience supporting Machine Learning (ML) research. The ideal candidate has a strong background in building robust data pipelines and workflows for ML projects, along with an eagerness to learn new technologies. This role requires proficiency in data processing technologies and an understanding of the data needs specific to ML research.

Key Responsibilities

• Develop, maintain, and optimize data pipelines and workflows to support ML research and model development.
• Design and implement scalable data architectures for handling the large datasets used in ML models.
• Collaborate closely with ML researchers and data scientists to understand data requirements and ensure data availability and quality.
• Work with databases and data integration processes to prepare and transform data for ML experiments.
• Use MongoDB and other NoSQL databases to manage unstructured and semi-structured data.
• Write efficient, reliable, and maintainable Python and SQL code for data processing tasks.
• Implement data validation and monitoring systems to ensure data integrity and performance.
• Support the deployment of ML models by integrating data solutions into production environments.
• Ensure the scalability and performance of data systems through rigorous testing and optimization.

Requirements

• Proficiency in English (spoken and written).
• Strong experience with Python and SQL.
• Hands-on experience with data processing in Apache Airflow.
• Experience working with databases, including MongoDB (NoSQL) and relational databases.
• Understanding of data modeling, ETL processes, and data warehousing concepts.
• Experience with cloud platforms such as AWS, GCP, or Azure.
• Experience with other NoSQL databases such as InfluxDB, Elasticsearch, or similar technologies.
• Experience with backend frameworks such as FastAPI, Flask, or Django.
• Knowledge of containerization tools such as Docker.
• Familiarity with message queues such as RabbitMQ.
• Understanding of DevOps practices and experience with CI/CD pipelines.
• Experience with front-end development (e.g., React, Next.js).

