November 7
Airflow
AWS
Azure
Cloud
ETL
Google Cloud Platform
Hadoop
HBase
HDFS
Java
JavaScript
MapReduce
Maven
MySQL
Node.js
Oracle
Python
Shell Scripting
Unix
Go
• Analyze, program, debug, and modify software enhancements and/or new products.
• Participate in the development of data warehouse designs.
• Work in an agile, Scrum-driven environment to deliver new and innovative products.
• Design applications, write code, and develop, test, debug, and document work and results.
• Keep up to date with relevant technology to maintain and improve functionality of authored applications.
• Design and implement reusable frameworks, libraries, Java components, and product features in collaboration with business and IT stakeholders.
• Ingest data from various structured and unstructured data sources into Hadoop and other distributed Big Data systems.
• Support the sustainment and delivery of an automated ETL pipeline.
• Validate data extracted from sources using scripts, other automated capabilities, logs, and queries.
• Monitor and report on data flow through the ETL process.
• Troubleshoot production support issues post-deployment and propose solutions.
• Mentor junior engineers on the team in their development.
• B.S. or M.S. in Computer Science (or equivalent experience)
• Three years of related industry experience
• Experience in back-end programming (e.g., Java, JavaScript, Python, Node.js), OOAD, and ETL tools
• Experience with at least one database technology (e.g., Vertica, Oracle, Netezza, MySQL, BigQuery)
• Experience working with large-scale databases
• Knowledge of and experience with Unix/Linux platforms and shell scripting
• Experience writing Pig Latin scripts, MapReduce jobs, HiveQL, etc.
• Good knowledge of database structures, theories, principles, and practices
• Familiarity with data loading tools like Flume and Sqoop
• Knowledge of workflow schedulers like Oozie and Airflow
• Analytical and problem-solving skills applied to the Big Data domain
• Proven understanding of Hadoop (Dataproc), HBase, Hive, and Pig
• Knowledge of cloud providers like AWS, GCP, and Azure
• Ability to write high-performance, reliable, and maintainable code
• Expertise in version control tools like Git
• Good grasp of multi-threading and concurrency concepts
• Effective analytical, troubleshooting, and problem-solving skills
• Strong customer focus, ownership, urgency, and drive
• Healthcare
• 401(k) savings plan
• Company holidays
• Vacation
• Sick time
• Parental leave
• Employee assistance program
November 7
Develop core product for an employee engagement platform.
🇺🇸 United States – Remote
💰 $4.3M Seed Round on 2022-04
⏰ Full Time
🟡 Mid-level
🟠 Senior
🔙 Backend Engineer
November 7
Backend developer to build technology for virtual events at Active Theory.
November 7
ASP.NET developer for Camelot Unchained at Unchained Entertainment.
🇺🇸 United States – Remote
💰 $15M Funding Round on 2022-11
⏰ Full Time
🟡 Mid-level
🟠 Senior
🔙 Backend Engineer
November 6
Develop software solutions for SmartAsset's financial advice platform.
November 5
Design full-stack solutions for RTX's classified Digital Services team.