April 13
• Experience with data transformation tools, orchestration tools, and data warehouses; Snowflake, Airflow, and dbt preferred
• In-depth understanding of data systems, processes, and structural principles
• Knowledge of data modeling methods such as dimensional modeling, Data Vault, star schema, etc.
• Knowledge of data mining and segmentation techniques
• Effective communication skills (written and verbal) to clearly articulate complex cloud reports to management and other IT development partners
• Exceptional team building and leadership skills
• Experience with large-scale, secure, and high-availability solutions in cloud environments such as Google Cloud (GCP)
• Good understanding of GCP services such as IAM and Google Cloud Storage buckets
• Develop the best cloud architecture solutions to help key stakeholders accomplish their strategic goals
• Develop data solutions and processes to store, transform, and retrieve company information
• Build data models and develop an Enterprise Data Model at large scale
• Work with a team of cloud data engineers to install and configure information systems and ensure functionality
• Analyze structural requirements for new software, applications, and processes
• Migrate data from legacy systems to cloud solutions
• Design conceptual and logical data models and flowcharts
• Improve system performance by conducting tests, troubleshooting, and integrating new elements
• Optimize new and existing data systems
• Work with security and operations teams to define security and backup procedures
• Coordinate with key stakeholders of all data workstreams across departments to identify future needs and requirements
March 20
501 - 1000
Senior Analytics Engineer at PandaDoc optimizing data systems with Snowflake and dbt.
🇺🇸 United States – Remote
💵 $125k - $150k / year
💰 Series C on 2021-09
⏰ Full Time
🟠 Senior
📊 Analytics Engineer
🗽 H1B Visa Sponsor
November 11, 2023
11 - 50
Analytics Engineer for an insurance management platform, enhancing data models.