Data Engineer
Numeus is a diversified digital asset investment firm built to the highest institutional standards, combining synergistic businesses across Alpha Strategies, Trading, and Asset Management.
Numeus was founded by successful executives with decades of experience across the finance, blockchain and technology industries, with a shared passion for digital assets. Our values are grounded in an open approach based on connectivity, collaboration, and partnerships across the digital asset ecosystem. People and technology are at the core of everything we do.
We are seeking a skilled and experienced Data Engineer. This is a high-impact role with the opportunity to contribute meaningfully to the development and management of our data platform. The position requires strong technical capabilities and a proven ability to collaborate effectively with quantitative researchers to process, analyze, and manage the large and diverse corpus of datasets that enables our quantitative investment process.
Key Responsibilities:
- Work closely with quantitative researchers to understand their data requirements
- Build out and improve the efficiency and accessibility of our Python-based, petabyte-scale data lake
- Develop high-throughput, near real-time event-driven datasets using live and historical market data
- Orchestrate the training, validation, and deployment of machine learning-based trading models
- Design, develop, and optimize robust data pipelines, enabling ETL processes that support quantitative research, analysis, alpha forecasting, and execution
- Implement rigorous and automated data quality assurance measures to ensure the accuracy, reliability, and consistency of data inputs and outputs
Skillset and Qualifications:
- 3+ years of experience in a related field or role; previous experience at a quantitative hedge fund is highly desirable
- Bachelor’s degree in Computer Science or related field
- Proficiency in programming languages such as Python and Rust, along with Linux and Docker
- Experience with open source data tools such as Apache Arrow and distributed computing tools such as Ray or Dask
- Demonstrated expertise in designing, optimizing, and automating data pipelines, data modeling, ETL processes, data warehousing, and data governance
- Experience building event-driven applications using technologies such as Protobuf, Kafka, and Schema Registry
- Experience with AWS and its data offerings
- Proven ability to collaborate effectively with quantitative researchers and other stakeholders, translating their requirements into technical solutions
- Strong analytical and problem-solving skills, capable of analyzing complex datasets, identifying patterns, and extracting meaningful insights
- Highly detail-oriented and meticulous
- Ability to thrive in a fast-paced, dynamic environment, adapting to changing priorities and emerging technologies
- Strong verbal and written communication skills
- Ability to travel periodically between our other offices in NYC and Zug, Switzerland
Are you keen to work in a well-resourced startup environment, where your ideas, experience, and drive to find creative solutions make a difference? We’d like to hear from you.