Data Engineer

13 hours ago

Apply Now

EverCommerce

SaaS • Software • Services • SMB

1001 - 5000

💰 Private Equity Round on 2019-07

Description

• Build, maintain, and improve major components of our data infrastructure, such as our data loader, data ingestion, ETL tools, and connections with downstream destinations
• Build and grow our enterprise data lake infrastructure for reporting and analytics
• Work closely with the BI engineering team and support them in establishing analytics and decision-support capabilities that enable the business teams to make data-driven decisions, driving acquisition growth and customer monetization and retention
• Develop an internal web and product tracking strategy and work with internal organizations to ensure proper implementation and execution of that strategy
• Increase the efficiency of old business processes and aid in the development of new business processes to continually improve data collection processes, integrity, and reliability
• Own various special data-focused projects to support planning/forecasting, ongoing operations, and corporate development, as necessary
• Extract and manipulate data from a variety of systems and offline reporting for use in reporting, utilizing cloud-based and/or on-premise analytics tools to analyze data, as necessary
• Exercise an inquisitive mindset to transform business questions into actionable data exploration exercises, as well as take reported data and shape it into actionable business questions
• Develop an understanding of each operating business's role within the larger EverCommerce organization; maintain advanced knowledge of their data quality, references (definitions, sources), strengths, and areas for improvement

Requirements

• Bachelor’s degree in Computer Science or another related field with a top-tier academic background, or equivalent experience
• 5+ years of post-undergraduate relevant work experience in a highly analytical environment; SaaS, payments, and/or technology industry experience preferred
• Strong Python and SQL skills
• Experience with cloud platforms and data tools such as AWS, Redshift, Glue, Athena, DataZone, S3, Spark, Snowflake, and Fivetran
• Experience with data lake technologies such as Apache Iceberg, Glue Catalog, and Polaris Catalog
• Experience with data workflow tools such as Airflow and dbt
• Experience with other tools and software such as Docker, Ansible, Terraform, or similar infrastructure-as-code tooling
• Exceptional written and verbal communication skills, with the ability to present complex technical concepts to non-technical stakeholders
• Proven leadership abilities, with experience in mentoring and team management
• Extreme attention to detail with a strong sense of ownership and organization
• Low ego, “no-job-too-small” attitude with a willingness to shift from high-level critical thinking to the mundane tasks required to complete a job
• Ability to protect extremely confidential information

Benefits

• Flexibility to work where/how you want – in-office, remote, or hybrid
• Continued investment in your professional development
• Robust health and wellness benefits
• 401k with up to a 4% match
• Monthly wellness stipend
• Flexible and generous time off (FTO)
• Employee Stock Purchase Program
• Student Loan Repayment Program


