Business • Finance • Investing • Technology • Politics
201 - 500
💰 $200M Corporate Round on 2022-02
5 days ago
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing transformations for greater scalability, etc.
• Use the infrastructure and services required for optimal extraction, transformation, and loading of data from a wide variety of data sources, using GCP and AWS services.
• Prepare data for analytics and CRM team members to help them build and optimize our product.
• Work with stakeholders, including the Product, Data, and Design teams, to assist with data-related technical issues and support their data requirements.
• Bachelor’s degree and a minimum of 5 years of experience working successfully in globally distributed teams
• Must have experience with Spark, Kafka, and Python
• Experience with cloud storage and computing for data pipelines in GCP (GCS, BigQuery, Composer, etc.)
• Experience writing pipelines in Airflow to orchestrate data workflows
• Experience analyzing data from third-party providers such as Google Analytics and Google Ads
• Strong analytical skills for working with unstructured datasets
• Experience manipulating, processing, and extracting value from large, disconnected datasets
• Experience with software engineering practices in data engineering (e.g., release management, testing) and the corresponding tooling (dbt, Great Expectations, …)
• Experience with data governance, privacy, and security; basic ML exposure and know-how; supporting teams as they adopt guidelines and adhere to good practices
• Independently fulfill incoming tracking requests as assigned by your PO
• Support the team in championing web analytics and data literacy within the organization, and work to enhance our capabilities through training and support as needed
• Dynamically adapt to fluid situations to deliver the intended results
• Strong interpersonal and conflict-resolution skills
• Ability to form productive relationships quickly and create the spheres of influence required for transformational change
• Excellent verbal and written communication skills
• Day off on the 3rd Friday of every month (one long weekend each month)
• Monthly Wellness Reimbursement Program to promote health and well-being
• Paid paternity and maternity leave
1001 - 5000
Collect data on chemical substances from various regulatory and industry sources.
October 25
1001 - 5000
Data Engineer at TwiningsOvo focusing on building data pipelines and modeling.
October 24
1001 - 5000
Data Engineer to migrate Oracle data warehouse to Snowflake at Duck Creek.
🇮🇳 India – Remote
💰 $230M Private Equity Round on 2020-06
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
October 19
2 - 10
Develop scalable data solutions at Saaf Finance for mortgage technology.