Mosaic is a Strategic Finance Platform that transforms the way business gets done
51 - 200
2 days ago
Airflow
Amazon Redshift
Apache
AWS
Azure
Cloud
Docker
ETL
Hadoop
Java
Kafka
Kubernetes
MySQL
Oracle
Postgres
Python
Scala
Spark
SQL
Go
• Design, develop, and maintain scalable and efficient data pipelines to process large volumes of data from various sources.
• Collaborate with stakeholders and other backend engineers to understand data requirements and deliver high-quality data solutions.
• Optimize and maintain data infrastructure, ensuring reliability, scalability, and performance.
• Implement best practices for data management, including data governance and data quality.
• Develop and maintain ETL processes to integrate data from multiple heterogeneous sources into a unified data warehouse.
• Monitor and troubleshoot data pipeline issues, ensuring data integrity and availability.
• Strong communication and collaboration skills, with the ability to work effectively in a distributed team across various time zones.
• Demonstrated ability to manage data projects from start to finish, effectively negotiating requirements and deliverables with key stakeholders.
• 5+ years of experience in data engineering or a related field working with data in a high-volume environment.
• Proficiency in programming languages such as Python, Java, or Scala.
• Extensive experience with SQL and database technologies (e.g., PostgreSQL, MySQL, Oracle).
• Familiarity with data orchestration tools (e.g., Apache Airflow), data transformation tools (e.g., Spark), dimensional modeling (e.g., star schema), metadata, indexing, dependencies, and data workflows to support data analytics and data science.
• Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (e.g., AWS, Azure, Google Cloud).
• Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
• Solid understanding of data modeling, data architecture, and database design principles.
• Excellent problem-solving skills and attention to detail.
• Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
2 days ago
51 - 200
Build and maintain data infrastructure for Everstream Analytics' cloud-native platform.
2 days ago
11 - 50
Transform the K-1 industry through scalable data engineering and machine learning support.
2 days ago
201 - 500
Senior Data Engineer to implement and evolve Brightwheel's data platform.
🇺🇸 United States – Remote
💰 $55M Series C on 2021-02
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🗽 H1B Visa Sponsor
3 days ago
51 - 200
Build and maintain Everstream's data infrastructure on AWS.