Data Engineer

July 22

Apply Now

Tiger Analytics

AI & Analytics for today’s business challenges.

Machine Learning • Predictive Analytics • Forecasting • Optimization • Natural Language Processing

1001 - 5000 employees

Description

• Play a crucial role in building scalable, cost-effective data pipelines, data lakes, and analytics systems.
• Implement data ingestion processes to collect data from various sources, including databases, streaming data, and external APIs.
• Develop ETL (Extract, Transform, Load) processes to transform and cleanse raw data into a structured, usable format for analysis.
• Manage and optimize data storage solutions, including Amazon S3, Amazon Redshift, and other AWS storage services.
• Use AWS services such as AWS Glue, Amazon EMR, and AWS Lambda to process and analyze large datasets.
• Continuously monitor and optimize data pipelines and infrastructure for performance, cost efficiency, and scalability.
• Collaborate with data scientists, analysts, and other stakeholders to integrate AWS-based solutions into data analytics and reporting platforms.
• Maintain thorough documentation of data engineering processes, data flows, and system configurations.
• Design AWS-based solutions that scale to accommodate growing data volumes and changing business requirements.
• Implement cost-effective solutions by optimizing resource usage and recommending cost-saving measures.
• Diagnose and resolve AWS-related issues to minimize downtime and disruptions.
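The ETL responsibilities above can be sketched in miniature as three stages. This is an illustrative sketch only; the function names and the in-memory "warehouse" are stand-ins for real sources and targets such as Amazon S3 or Redshift, which are not shown here:

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV text into records (a stand-in for a
    # database, streaming source, or external API).
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    # Transform: cleanse and normalize raw records into a
    # structured, usable format.
    cleaned = []
    for row in rows:
        amount = row.get("amount", "").strip()
        if not amount:  # drop records missing a required field
            continue
        cleaned.append({
            "user_id": row["user_id"].strip(),
            "amount": round(float(amount), 2),
        })
    return cleaned


def load(rows: list[dict], destination: list) -> None:
    # Load: append structured rows to the target store (a plain
    # list here; S3 or Redshift in a real pipeline).
    destination.extend(rows)


raw = "user_id,amount\n u1 ,19.50\nu2,\nu3,5.25\n"
warehouse: list = []
load(transform(extract(raw)), warehouse)
```

In production the same extract/transform/load split maps naturally onto AWS Glue jobs or Lambda functions triggered by new objects landing in S3.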

Requirements

• A bachelor's degree in computer science, information technology, or a related field is typically required.
• AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Big Data - Specialty are highly beneficial.
• Proficiency in programming languages such as Python, Java, or Scala for data processing and scripting; shell scripting and Linux knowledge are preferred.
• Strong knowledge of AWS database services such as Amazon Redshift, Amazon RDS, and NoSQL databases.
• Experience with AWS Glue or other ETL tools for data transformation.
• Proficiency in version control systems such as Git.
• Strong analytical and problem-solving skills to address complex data engineering challenges.
• Effective communication and collaboration skills for working with cross-functional teams.
• Knowledge of machine learning concepts is a plus.
