Ziply Fiber is a leading provider of fiber-optic internet services in the Pacific Northwest, offering high-speed internet plans that cater to both residential and business customers. With a focus on transparency and customer satisfaction, Ziply Fiber provides no annual contracts, no data caps, and a variety of multi-gig internet options designed for modern households and businesses. Their services include whole home WiFi, streaming TV, and phone services, making them a one-stop solution for connectivity needs.
April 2
🥔 Idaho – Remote
🦬 Montana – Remote
+2 more states
💵 $114.7k - $154.2k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
Airflow
Amazon Redshift
Apache
AWS
Azure
BigQuery
Cassandra
Cloud
Distributed Systems
Docker
DynamoDB
ETL
GraphQL
Hadoop
Java
Kafka
Kubernetes
Linux
MongoDB
MySQL
Node.js
NoSQL
Oracle
Postgres
Python
Scala
Spark
SQL
Tableau
Terraform
Unix
The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs. This role involves working with structured and unstructured data sources, optimizing data workflows, and ensuring high data reliability and quality. The ideal candidate will be proficient in modern data engineering tools and cloud platforms, bringing innovative solutions to a fast-paced environment with a diverse data infrastructure.
Essential Duties and Responsibilities:
• Design, develop, and maintain scalable data pipelines for ingestion, transformation, and storage of large datasets
• Optimize data models for analytics and business intelligence reporting
• Build and maintain data infrastructure, ensuring performance, reliability, and scalability
• Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions
• Implement best practices for data governance, security, and compliance
• Work with structured and unstructured data, integrating sources including databases, APIs, and streaming platforms
• Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting
• Automate data workflows to increase efficiency and reduce manual intervention
• Mentor and train junior engineers, fostering a culture of learning and innovation
• Develop and maintain documentation for data engineering processes and workflows
• A Bachelor’s degree in Computer Science, Engineering, or a related field is required
• Minimum of eight (8) years of experience in data engineering, ETL development, or related fields
• Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.)
• Experience with big data processing frameworks such as Spark, Hadoop, Flink, and Apache Hudi
• Familiarity with Linux/Unix environments and their scripting tools
• Proficiency in programming languages such as Python, Java, or Scala for data engineering tasks
• Hands-on experience with cloud platforms such as Microsoft Azure and its data services, including Azure Data Factory, Azure Synapse Analytics, and Azure Databricks
• Experience with data warehouses such as Snowflake, Redshift, BigQuery, or Azure SQL Data Warehouse
• Familiarity with workflow orchestration tools such as Apache Airflow or Azure Data Factory
• Knowledge of data modeling, schema design, and data architecture best practices
• Strong understanding of data governance, security, and compliance standards
• Ability to work independently in a remote environment and collaborate effectively across teams
• Experience with Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager (ARM) templates
• Knowledge of containerization and orchestration technologies such as Docker, Kubernetes, and Azure Kubernetes Service (AKS)
• Exposure to GraphQL and RESTful APIs for data retrieval and integration
• Familiarity with NoSQL databases such as MongoDB, DynamoDB, Cassandra, or Azure Cosmos DB
• Experience with real-time analytics databases such as Apache Pinot
• Experience with data transformation tools such as dbt, AWS Glue, or Alteryx
• Experience with metadata management and data discovery tools such as Apache DataHub
• Experience with data visualization tools such as Tableau, Power BI, or Looker
• Experience with version control software such as GitLab
• Comprehensive benefits, including medical, dental, and vision coverage, a 401(k), a flexible spending account, paid sick leave, and paid time off
• Parental leave
• Quarterly performance bonus
• Training, career growth, and education reimbursement programs
April 2
Senior Data Engineer for Tilia, building and maintaining warehouse infrastructure for decision-making.
April 2
As a Senior Data Architect, you'll design data architecture for MSPbots' BI and AI solutions. Collaborate with product teams to optimize data strategy and quality.
April 1
SailPoint, a leader in identity security, seeks a Senior Data Engineer to build scalable AI solutions.
🇺🇸 United States – Remote
💵 $116.9k - $217.1k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
April 1
Join AmeriSave as a Senior Data Engineer to design and maintain data warehouse solutions for analytics.
🇺🇸 United States – Remote
💵 $100k - $170k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
April 1
Senior Data Engineer at Archer Education enhancing data capabilities through modernizing infrastructure and building a centralized data lake.