Pattern Data is a company that specializes in providing AI-powered legal technology solutions, particularly focused on mass tort litigation. Their platform leverages artificial intelligence to automate and expedite the review of medical records, helping legal teams build evidence more efficiently and accurately. By digitizing, analyzing, and categorizing client data, Pattern Data helps improve law firm operations and enhances strategic decision-making. Trusted by leading law firms, their software significantly speeds up case reviews, settlements, and overall litigation processes.
July 29, 2024
• Responsible for the design, maintenance, and ongoing enhancement of Pattern’s data analytics pipeline
• Design, develop, and maintain our data warehouse and ETL pipelines using technologies like Scala, PostgreSQL, Airflow, and Python to ensure the accuracy, consistency, and reliability of our data (a rough sketch of such a pipeline follows this list)
• Operate within Pattern’s data warehouse, writing queries against large volumes of structured and unstructured information in PostgreSQL
• Build tools to monitor the health and responsiveness of Pattern’s analytic products
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions
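The ETL work described above would typically run as Airflow DAGs written in Python that load into a PostgreSQL warehouse. The snippet below is a minimal sketch of what one such pipeline could look like, not Pattern’s actual code: the DAG name, connection ID, table, and sample data are all hypothetical.

```python
# Hypothetical Airflow DAG illustrating a small extract-transform-load pipeline
# that writes to a PostgreSQL warehouse. Names and data are illustrative only.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_records_etl():
    @task
    def extract() -> list[dict]:
        # Pull raw rows from a source system; stubbed with static data here.
        return [{"record_id": 1, "pages": 120}, {"record_id": 2, "pages": 85}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Apply a basic data-quality check before loading; a real pipeline
        # would enforce richer validation and monitoring here.
        return [r for r in rows if r["pages"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Write the cleaned rows into the warehouse via a configured connection.
        hook = PostgresHook(postgres_conn_id="warehouse_db")  # hypothetical connection ID
        hook.insert_rows(
            table="analytics.record_summary",  # hypothetical table
            rows=[(r["record_id"], r["pages"]) for r in rows],
            target_fields=["record_id", "pages"],
        )

    load(transform(extract()))


example_records_etl()
```

In practice the extract step would read from upstream sources rather than static data, and the connection would be defined in Airflow’s connection store rather than hard-coded.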
• Bachelor's degree in Mathematics, Economics, Data Science/Analysis, Computer Science, or a related field, or equivalent certifications
• Proven experience as a Data Engineer or in a similar role at an early-stage startup
• 5+ years working with SQL data sources, preferably PostgreSQL, with demonstrated success across both OLTP and OLAP workloads
• Strong emphasis on data quality, monitoring, and transparency within reporting products
• Passion for using data to solve problems, uncover new insights, and trace underlying issues
July 18, 2024
Design and implement data infrastructure for blockchain protocols at Risk Labs
July 17, 2024
Architect and scale data infrastructure for UMA and Across protocol ecosystems.
June 22, 2024
Join City of Hope to design, optimize, and integrate health data architectures for better healthcare outcomes.
June 22, 2024
Design health data systems at City of Hope. Lead projects and mentor junior team members.
Discover 100,000+ Remote Jobs!
We use powerful scraping tech to scan the internet for thousands of remote jobs daily. It runs 24/7 and costs money to operate, so we charge for access to keep the site running.
Of course! You can cancel your subscription at any time with no hidden fees or penalties. Once canceled, you’ll still have access until the end of your current billing period.
Other job boards only list jobs from companies that pay to post, which means you miss out on jobs from companies that don't want to pay. Remote Rocketship, on the other hand, scrapes the internet for jobs and doesn't accept payments from companies, so we have thousands more listings!
New jobs are constantly being posted. We check each company website every day to ensure we have the most up-to-date job listings.
Yes! We’re always looking to expand our listings and appreciate any suggestions from our community. Just send an email to Lior@remoterocketship.com. I read every request.
Remote Rocketship is a solo project by me, Lior Neu-ner. I built this website for my wife when she was looking for a job! She was having a hard time finding remote jobs, so I decided to build her a tool that would search the internet for her.