July 29
• Responsible for the design, maintenance, and ongoing enhancement of Pattern's data analytics pipeline
• Design, develop, and maintain our data warehouse and ETL pipelines using technologies like Scala, PostgreSQL, Airflow, and Python to ensure the accuracy, consistency, and reliability of our data
• Operate within Pattern's data warehouse, writing queries against large volumes of structured and unstructured information in PostgreSQL
• Build tools to monitor the health and responsiveness of Pattern's analytic products
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions
• Bachelor's degree in Mathematics, Economics, Data Science/Analysis, Computer Science, or a related field, or equivalent certifications
• Proven experience as a Data Engineer or in a similar role at an early-stage startup
• 5+ years working with SQL data sources, preferably PostgreSQL, with demonstrated success on both OLTP and OLAP workloads
• Strong emphasis on data quality, monitoring, and transparency within reporting products
• Passion for using data to solve problems, uncover new insights, and trace underlying problems
July 19
1001 - 5000
Develop data flows and analytics solutions for enterprise data platform.
July 18
10,000+
Design and implement data warehouse solutions for healthcare analytics.
🇺🇸 United States – Remote
💵 $85.3k - $136k / year
💰 $2B Post-IPO Debt on 2022-05
⏰ Full Time
🟠 Senior
🚰 Data Engineer
July 18
11 - 50
Design and implement data infrastructure for blockchain protocols at Risk Labs.
July 17
11 - 50
Architect and scale data infrastructure for UMA and Across protocol ecosystems.