My client is a global quantitative and systematic investment manager operating in all liquid asset classes across the world. A technology- and data-driven group, they take a scientific approach to investing: combining data, research, technology, and trading expertise has shaped a collaborative mindset that enables them to solve the most complex challenges. Their culture of innovation continuously drives their ambition to deliver high-quality returns for investors.
The role:
- Develop ETL pipelines to integrate and test very large alternative datasets for the Commodities desk in collaboration with quant researchers and data engineering teams.
- Architect, deploy and manage cloud-based systems for storing and exploring very large alternative datasets in collaboration with the AWS infrastructure team.
- Monitor, support, debug and extend existing Commodities trading and research infrastructure together with researchers and support engineers.
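To give a flavour of the first responsibility, here is a minimal sketch of an extract-transform-load pipeline feeding a SQL store. The dataset, field names, and table schema are invented for illustration and are not details of the actual role; SQLite stands in for whatever database the desk actually uses.

```python
import json
import sqlite3

# Hypothetical raw feed: numeric fields arrive as strings, as is common
# with vendor-supplied alternative data.
RAW_FEED = json.loads("""[
    {"ticker": "CL", "price": "78.4", "volume": "1200"},
    {"ticker": "NG", "price": "2.71", "volume": "950"}
]""")

def extract():
    """Extract: read raw records (here, from an inline JSON string)."""
    return RAW_FEED

def transform(records):
    """Transform: cast string fields to proper numeric types."""
    return [
        (r["ticker"], float(r["price"]), int(r["volume"]))
        for r in records
    ]

def load(rows, conn):
    """Load: write the cleaned rows into a SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS prices (ticker TEXT, price REAL, volume INTEGER)"
    )
    conn.executemany("INSERT INTO prices VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total_volume = conn.execute("SELECT SUM(volume) FROM prices").fetchone()[0]
print(total_volume)  # 2150
```

In a production pipeline each stage would be independently testable and monitored, which is where the collaboration with the data engineering team comes in.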
Requirements:
- Comfortable in Python, in particular with its numerical libraries (NumPy, pandas, Matplotlib, etc.).
- Basic knowledge of AWS.
- Basic knowledge of databases (e.g. SQL).
- Development practices – version control with Git, unit testing, etc.
- A quantitative mindset.
- Team player and collaborative attitude.
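As an illustration of the Python and testing requirements together, here is a sketch of the kind of small numerical utility a candidate might write and unit-test. The function and data are hypothetical, not taken from the role description.

```python
import numpy as np

def rolling_mean(prices, window):
    """Simple moving average via convolution; returns an array of
    length len(prices) - window + 1."""
    kernel = np.ones(window) / window
    return np.convolve(prices, kernel, mode="valid")

# A unit-test-style check, as one might write under pytest:
result = rolling_mean(np.array([1.0, 2.0, 3.0, 4.0]), window=2)
assert np.allclose(result, [1.5, 2.5, 3.5])
```

Version-controlling such utilities in Git with tests alongside them is the working style the "Development practices" bullet points at.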
Nice to have:
- Experience creating dashboards or using data visualization software (e.g. Tableau, Dash).
- In-depth AWS experience (e.g. DynamoDB, RDS, S3, Lambda, AWS CDK).
- Advanced database knowledge (query optimisation, relational vs non-relational databases, etc.).
- Parallel computation.
- Experience working with geographic data (e.g. geopandas, xarray).
- Financial knowledge is a plus but not required.
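On the "parallel computation" item above, a minimal sketch of the pattern: fan independent chunks of work out across a pool and gather the results. The chunking and the per-chunk function are invented for illustration; Python's standard-library `concurrent.futures` is used here, though the desk may rely on other tooling.

```python
from concurrent.futures import ThreadPoolExecutor

def checksum(chunk):
    """Stand-in for a per-chunk computation on a large dataset."""
    return sum(chunk)

# Hypothetical dataset split into three independent chunks.
chunks = [range(0, 100), range(100, 200), range(200, 300)]

# Map the chunks across a thread pool; results come back in order.
with ThreadPoolExecutor(max_workers=3) as pool:
    partials = list(pool.map(checksum, chunks))

total = sum(partials)
print(total)  # 44850, i.e. sum(range(300))
```

For CPU-bound work a process pool (or a distributed framework) would replace the thread pool, but the map-then-reduce shape stays the same.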