Responsibilities:
• Architect, design, document, and implement data pipelines for Snowflake using dbt and Airflow.
• Ensure correctness and completeness of the transformed data.
• Monitor and triage technical challenges.
• Evaluate technical solutions and share MVPs or PoCs.
• Develop relationships with external stakeholders on data and security issues.
• Review work from other tech team members and provide growth feedback.
• Implement data performance and data security policies aligned with governance objectives.
Requirements:
• 4+ years of IT experience with a major focus on data warehouse/database projects.
• Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
• Experience with data platforms such as Snowflake, Oracle, SQL Server, and MDM.
• Expertise in writing SQL and database objects: stored procedures, functions, and views.
• Hands-on experience with ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, Attunity, GoldenGate, APIs, Apache Airflow).
• Experience in data modeling and relational database design.
• Well versed in applying SCD, CDC, and DQ/DV frameworks.
• Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
• Good to have: experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
• Good to have: strong programming/scripting skills (Python, PowerShell, etc.).
• Good to have: knowledge of developing financial models and forecasts to support financial planning and decision-making processes.
• Experience analyzing and interpreting financial data to provide valuable insights and support strategic decision-making.
• Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs).
• Excellent written and oral communication and presentation skills for presenting architecture, features, and solution recommendations.