A Data Analytics and Platform team within the Institutional Securities Technology division is seeking a collaborative, hands-on developer to build best-in-class solutions for Data APIs, Data Governance, Data Quality, Security & Privacy, and Architecture. This role offers a unique opportunity to work with a state-of-the-art data stack built on open-source technologies aligned with our cloud strategy. We are looking for a proficient developer who is excited about innovation and has a proven track record of delivery.
Roles and responsibilities include:
- Work closely with quants and traders across multiple trading desks to design, develop, deploy, and support an innovative data science environment
- Design widely used and flexible APIs to our core data and functionality
- Analyze and optimize performance of large data workloads using compute clusters
- Stay up to date with emerging trends and tools in the data & analytics domain
- Provide support and design advice to users of the cross-asset platform
- Work closely with strategists and other stakeholders to help move their financial assets, valuation models and data pipelines to the platform
- Communicate efficiently across regions and functions
- At least 3 years of relevant experience would generally be expected to have developed the skills required for this role
What we're looking for: we have a number of roles available for technologists at a range of experience levels to join the team, who will need:
- Excellent problem-solving and code development skills
- Ability and interest to research, learn, and implement new data and analytics technologies and paradigms
- Enterprise-level software development practices
- Strong oral and written communication skills
- Strong teamworking ability in local and global teams
- Passion for continuous improvement, both personally and as a team

Skills that will help you in the role:
- Established experience with Python and the Python ecosystem
- Experience with performance optimization, concurrent programming, and microservices design
- Good knowledge of data processing libraries and stacks (Polars, Pandas, NumPy, Dask, Spark) and SQL/NoSQL databases (MongoDB)
- Knowledge of ETL and streaming pub/sub platforms (Kafka)
- Knowledge of SDLC management tools (Git, Jenkins, GitHub, Docker, Kubernetes, Autosys), application observability libraries (OpenTelemetry), and monitoring tools (Grafana, Loki, Tempo, Prometheus)
- Exposure to grid distributed applications (optional)
- Exposure to KDB or other time-series database technologies (optional)
- Exposure to cloud technologies (optional)
- Exposure to risk management systems and a wide range of financial instruments (optional)