Create Today. Predict Tomorrow.

20 September 2018
  • Software Engineering

Preamble

G-Research has been on an incredible journey over the past few years. We’re a Quant Research and Technology company in the heart of London. We use quantitative analysis and modelling to create software, simulations and forecasts of how financial markets and individual instruments within them behave (and will behave) under different circumstances.

Essentially, the whole effort is fed by structured and unstructured data from many sources, covering company performance, market performance, economic factors, social media sentiment and much more. Our Data Processing team streams tick data in real time to generate aggregates and advanced indicators, then stores that data and serves it to our real-time and research teams. We then apply complex Machine Learning to that data to predict future market behaviour.
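
To give a flavour of the kind of streaming aggregation involved, here is a minimal sketch in C# (the types and field names are hypothetical, not our production schema) that folds a tick stream into one-minute OHLCV bars:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical normalized tick: symbol, timestamp, price and size.
public readonly record struct Tick(string Symbol, DateTime Timestamp, decimal Price, long Size);

// A one-minute OHLCV bar built incrementally from a stream of ticks.
public sealed class MinuteBar
{
    public DateTime Start { get; }
    public decimal Open { get; private set; }
    public decimal High { get; private set; }
    public decimal Low { get; private set; }
    public decimal Close { get; private set; }
    public long Volume { get; private set; }

    public MinuteBar(Tick first)
    {
        // Truncate the first timestamp to the start of its minute.
        Start = new DateTime(first.Timestamp.Ticks - first.Timestamp.Ticks % TimeSpan.TicksPerMinute,
                             first.Timestamp.Kind);
        Open = High = Low = Close = first.Price;
        Volume = first.Size;
    }

    public void Add(Tick tick)
    {
        High = Math.Max(High, tick.Price);
        Low = Math.Min(Low, tick.Price);
        Close = tick.Price;
        Volume += tick.Size;
    }
}

// Folds a tick stream into per-symbol one-minute bars, yielding each bar
// as soon as a tick for the next minute arrives for that symbol.
public static class BarAggregator
{
    public static IEnumerable<MinuteBar> Aggregate(IEnumerable<Tick> ticks)
    {
        var open = new Dictionary<string, MinuteBar>();
        foreach (var tick in ticks)
        {
            if (open.TryGetValue(tick.Symbol, out var bar) &&
                tick.Timestamp - bar.Start < TimeSpan.FromMinutes(1))
            {
                bar.Add(tick);
            }
            else
            {
                if (bar is not null)
                    yield return bar;          // previous bar for this symbol is complete
                open[tick.Symbol] = new MinuteBar(tick);
            }
        }
        foreach (var remaining in open.Values)
            yield return remaining;            // flush bars still open at end of stream
    }
}
```

A production version would of course handle out-of-order ticks, market sessions and far richer indicators, but the shape of the problem is the same.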

We have one simple goal: to know more about the world than our competitors. Our vision and strategy require us to push the boundaries of modern software development. Think about it: what was considered 'bleeding edge' five years ago is known simply as software engineering today. What's really exciting is that the work we're doing now will be considered software engineering five years from now. We're incredibly proud to have put together a collective of smart and talented engineers to help us solve some of the most challenging Computer Science problems in the world.

The company has grown from 150 people five years ago to 700 employees in 2018. We're experiencing tremendous growth across our teams right now and need smart engineers who can solve tough, highly complex problems. We're aiming to grow at a similar rate over the next few years…

The Data Challenge

We develop systems that handle massive data sets, from simple pricing data to news stories and other unstructured data, all of which must be represented and distributed consistently and in a timely manner in order to make reliable future price predictions. High-availability and distribution requirements provide plenty of opportunity to introduce new technologies and to invent some of our own. Specialised APIs and bespoke systems reduce time to market for research and execution. Data is at the core of our business.

The Latency Challenge

We produce feed handlers for a wide variety of data sources, including exchange data for multiple asset classes and non-exchange data feeds. Our software transforms raw data in many formats into the common normalized data model used by our end users. Our platform is implemented in C++ and C#, and includes components running on Windows and Linux.
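
To illustrate what 'normalized' means here, a sketch of the idea (not our actual data model) is a common record type that every venue-specific handler maps its raw messages onto, so downstream consumers never need to know which exchange a tick came from:

```csharp
using System;

// Illustrative common model: each venue-specific feed handler maps its raw
// wire format onto this shape before publishing to downstream consumers.
public enum Side { Bid, Ask }

public readonly record struct NormalizedQuote(
    string Symbol,          // instrument identifier in a firm-wide convention
    string Venue,           // originating exchange or data source
    Side Side,
    decimal Price,
    long Size,
    DateTime ExchangeTime,  // timestamp assigned by the venue
    DateTime ReceiveTime);  // timestamp assigned when the handler received it

// Every feed handler exposes the same narrow interface, so the rest of the
// platform can treat all sources identically.
public interface IFeedHandler
{
    event Action<NormalizedQuote> QuotePublished;
    void Start();
    void Stop();
}
```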

We have expertise in using low-latency middleware to deliver high data volumes to many clients in real time. Our code is performance-critical, and we invest time in analysing and optimising our components at sub-microsecond precision.
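
As a rough illustration of measuring at that granularity (a sketch only, not our real tooling), .NET's Stopwatch exposes a hardware tick counter whose resolution is typically well under a microsecond:

```csharp
using System;
using System.Diagnostics;

public static class LatencyProbe
{
    // Times a hot-path operation many times and reports percentiles in
    // nanoseconds. Stopwatch.Frequency is ticks per second, so one tick is
    // usually on the order of 100 ns or less.
    public static void Measure(Action operation, int iterations = 100_000)
    {
        var samplesNs = new double[iterations];
        double nsPerTick = 1_000_000_000.0 / Stopwatch.Frequency;

        for (int i = 0; i < iterations; i++)
        {
            long start = Stopwatch.GetTimestamp();
            operation();
            long end = Stopwatch.GetTimestamp();
            samplesNs[i] = (end - start) * nsPerTick;
        }

        Array.Sort(samplesNs);
        Console.WriteLine($"p50 = {samplesNs[iterations / 2]:F0} ns, " +
                          $"p99 = {samplesNs[(int)(iterations * 0.99)]:F0} ns");
    }
}
```

In practice you would also subtract the cost of the timestamp calls themselves, pin the thread to a core and look at full latency distributions rather than a couple of percentiles, but the basic approach is the same.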

Our work brings us into contact with technical teams from markets around the world and gives us an in-depth understanding of how those markets function. We work closely with the researchers who use our data and with the other development teams around the company.

The Security Challenge

We’re developing state-of-the-art software controls and architecture to secure our research platform. We’re interested in secure communication, authentication and access control, and we build security libraries that are used across the company. We take great care over code quality, secure build and release processes, and runtime diagnostics.

Our engineers work with encryption, PKI, obfuscation, modular security API design, logging, and reverse-engineering and debugging tools such as IDA.
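
As a small example of the kind of building block such a library might wrap (this is a sketch using the standard .NET AesGcm API, not our in-house code), authenticated encryption of a payload looks like this:

```csharp
using System;
using System.Security.Cryptography;

public static class PayloadCrypto
{
    // Encrypts a payload with AES-GCM, returning nonce, ciphertext and tag.
    // In any real deployment the key would come from a proper key store or
    // PKI rather than being passed around directly.
    public static (byte[] Nonce, byte[] Ciphertext, byte[] Tag) Encrypt(byte[] key, byte[] plaintext)
    {
        byte[] nonce = RandomNumberGenerator.GetBytes(AesGcm.NonceByteSizes.MaxSize); // 12 bytes
        byte[] ciphertext = new byte[plaintext.Length];
        byte[] tag = new byte[AesGcm.TagByteSizes.MaxSize];                            // 16 bytes

        using var aes = new AesGcm(key);
        aes.Encrypt(nonce, plaintext, ciphertext, tag);
        return (nonce, ciphertext, tag);
    }

    public static byte[] Decrypt(byte[] key, byte[] nonce, byte[] ciphertext, byte[] tag)
    {
        byte[] plaintext = new byte[ciphertext.Length];
        using var aes = new AesGcm(key);
        aes.Decrypt(nonce, ciphertext, tag, plaintext); // throws if the tag does not verify
        return plaintext;
    }
}
```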

Some members of the Software Engineering team are currently working on a new distributed system to manage the secure build, test and encryption of the firm’s IP. As the company is growing fast, we expect to take on a wider range of projects in the future, including defining core libraries that will be integrated into multiple investment platforms.

Our recipe for growth has been:

Speed: To accomplish this hiring growth, it’s important to move quickly, testing raw intellectual horsepower and problem-solving capability rather than focusing on particular domain experience or knowledge of specific tools and frameworks. Hiring decisions are typically made within 24 to 48 hours of the candidate completing the process.

Interview Process: As you can imagine, we have an incredibly high bar for engineering talent and aim to test engineers across four fundamental areas of competency: Systems Design, Algorithms and CS Fundamentals, Practical Coding, and Culture Fit & Leadership capability. Our recruitment process aims to select the right interviewers to assess each candidate’s capabilities (for example, matching an interviewer of the appropriate seniority to the candidate).

Decision Making: Our collaborative hiring model draws on the entire team’s input, ensuring we hire not only the best engineers but also willing collaborators and team players – being a team player is a huge factor in any hiring decision.

Culture: Culture is very important at G-Research. We are cultivating an environment where engineers not only build great services but also have the opportunity to socialise and enjoy some downtime:

  • Various sports clubs
  • Marathons and Charity events
  • Games Room for social use
  • Monthly Drinks
  • Christmas party
  • Team Lunches
  • Sponsored events and HackDays
  • 10% Days
  • Mahjong Tournaments
  • Access to the Innovation Lab for POC projects and much more

Engineering Excellence: We are building the most advanced prediction engine in the financial markets. We have huge Software, Data and ML challenges in front of us and need curious, pragmatic, technology-agnostic engineers who enjoy tinkering with small optimisation problems and, at the same time, love solving complex scalability issues in systems of non-trivial complexity.

If you want to be a part of our growth, drop me an email: recruitmentteam@gresearch.com
