• Career Renew is recruiting a Big Data Engineer - Crypto for one of its clients; candidates need to be based within CET +/-4 timezones.
• We are the fastest Telegram bot on Solana, with over $10 billion in traded volume.
• We empower traders with advanced on-chain trading tools like DCA orders, limit orders, and wallet copy-trading, offering a seamless, innovative experience.
• We are synonymous with speed, innovation, and cutting-edge trading solutions.
• This is a unique opportunity to lead and build the data infrastructure for our project, collaborating with an elite team to shape a product that directly impacts thousands of active users in a fast-growing ecosystem.
• We are looking for a Big Data Engineer to take ownership of our data architecture, ensuring scalability, low latency, and reliability.
• The ideal candidate will lead the design and implementation of data pipelines, real-time processing systems, and analytics platforms that support trading decisions and insights.
• Maintain a scalable, high-performance data architecture tailored for real-time trading data, trading events, and analytics.
• Identify and integrate the most effective big data tools and frameworks to handle the ingestion, processing, and storage of Solana-based blockchain data.
• Build and maintain stream-processing systems using tools like Apache Kafka, Spark Streaming, or Flink for real-time price feeds and trading events.
• Design and optimize storage solutions using a combination of in-memory databases (e.g., Redis) for active trading data and scalable databases (e.g., Cassandra, ClickHouse) for analytics.
• Monitor, troubleshoot, and optimize data pipeline performance to handle high-throughput scenarios such as trading spikes.
• Implement caching strategies and horizontal-scaling solutions to maintain low latency and high availability.
• Deploy monitoring systems (e.g., Prometheus, ELK Stack) to oversee system health, data flow, and anomalies.
• Work closely with engineering, product, and analytics teams to align data solutions with business goals.
• Resolve issues in the big data ecosystem and ensure high availability and reliability.
• Proficiency in distributed computing principles and large-scale data management for financial or trading systems.
• Proficiency in tools like Kafka, Spark, and Flink.
• Strong expertise in stream-processing frameworks like Spark Streaming, Apache Flink, or Storm.
• Proficiency in TypeScript, with 5+ years of experience.
• Proficiency in ETL tools and frameworks such as Apache NiFi, Airflow, or Flume.
• Remote Flexibility: Work from anywhere while contributing to a high-impact role.
• Growth Opportunities: Be a key player in defining our data infrastructure.
• Challenging Projects: Work with cutting-edge technologies and tackle complex data challenges.
• Collaborative Culture: Join a team that values innovation, expertise, and efficiency.