Our client is a global investment management firm that runs a diversified portfolio of systematic and quantitative strategies across financial markets, seeking high-quality, uncorrelated returns for clients. They have deep expertise in trading, technology and operations and attribute their success to rigorous scientific research. As a technology- and data-driven firm, they design and build their own cutting-edge systems, from high-performance trading platforms to large-scale data analysis and compute farms. With offices around the globe, they emphasize true global collaboration by aligning investment, technology and operations teams functionally around the world.
The Data Development group manages the lifecycle of data used by investment teams for trading, backtesting and research. It works with quants and tech teams to integrate, process and serve data from vendors and public sources in the firm's data infrastructure (alpha data and cross-asset referential data).
They are currently seeking new data developers to join their growing teams. You will be part of a learning culture where teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued.
As a data developer, you would join one of the Alpha Data teams. These are fast-paced Python development teams working closely with quantitative researchers to design, build, test and maintain data pipelines that onboard new data sets for research on new trading strategies. They own the entire pipeline: ingesting data from the outside world, transforming it into timeseries of actionable insights, and designing the data models exposed to quantitative researchers. They also support these pipelines in production during live trading and contribute to the data platform by building new frameworks, libraries and full-stack services used to build data pipelines.
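To give a flavor of the ingest-transform-serve flow described above, here is a minimal, purely illustrative Python sketch. All names (`Observation`, `ingest`, `transform`, `run_pipeline`) are hypothetical and do not reflect the firm's actual codebase or APIs.

```python
# Hypothetical sketch of a small data pipeline: parse raw vendor rows,
# then order them into a clean timeseries for downstream research code.
# Names are illustrative only, not the firm's API.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class Observation:
    """One point in the timeseries exposed to researchers."""
    as_of: date
    symbol: str
    value: float


def ingest(raw_rows):
    """Ingest step: turn raw vendor rows (here, CSV-like dicts) into typed records."""
    for row in raw_rows:
        yield Observation(
            as_of=date.fromisoformat(row["date"]),
            symbol=row["symbol"],
            value=float(row["value"]),
        )


def transform(observations):
    """Transform step: order by symbol and date so consumers see a clean timeseries."""
    return sorted(observations, key=lambda o: (o.symbol, o.as_of))


def run_pipeline(raw_rows):
    """Run the full pipeline: ingest raw rows, then transform into a timeseries."""
    return transform(ingest(raw_rows))


series = run_pipeline([
    {"date": "2024-01-02", "symbol": "ABC", "value": "101.5"},
    {"date": "2024-01-01", "symbol": "ABC", "value": "100.0"},
])
```

Real pipelines in such a role would add validation, scheduling and storage, but the shape (ingest raw data, normalize it, expose a typed data model) is the same.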
- Acquire a deep understanding of the data requirements of investment research teams to deliver the right solutions.
- Design, implement, test, optimize and troubleshoot Python data pipelines, frameworks and services.
- Collaborate with and influence technologists and investment researchers to ensure the data pipelines and platform meet constantly evolving requirements.
- Work closely with data operations and data platform developers to improve the data platform and reduce technical debt.
- Write and review technical documents, such as requirements docs for researchers, design docs to propose new platform solutions and production support runbooks.
- Bachelor’s degree in Computer Science, Software Engineering or related subject.
- 2+ years’ development experience with Python.
- Practical knowledge of common protocols and tools for transferring data (e.g. FTP, SFTP, HTTP APIs, AWS S3).
- Excellent communication skills.
Nice to have:
- 2+ years designing, testing, optimizing and troubleshooting data-intensive applications.
- Experience analyzing and organizing data.
- Experience with big data frameworks, databases, distributed systems, Cloud or Web development.