We tackle the most complex problems in quantitative finance by bringing scientific clarity to financial complexity.
From our London HQ, we unite world-class researchers and engineers in an environment that values deep exploration and methodical execution - because the best ideas take time to evolve. Together we're building a best-in-class platform to amplify our teams' most powerful ideas.
As part of our engineering team, you’ll shape the platforms and tools that drive high-impact research - designing systems that scale, accelerate discovery and support innovation across the firm.
The role
We’re seeking an experienced Elastic Data Engineer to lead the design, implementation and ongoing management of log ingestion pipelines and data strategy within the Elastic SIEM ecosystem.
You will ensure that ingested data is reliable, scalable and high-quality, supporting both security investigations and long-term analytics. You'll also be responsible for data governance, focusing on integration, retention, redaction and compliance requirements.
Reporting to the Security Engineering Manager, this role is made for someone keen to help shape the long-term data strategy within the security tooling estate, mentor junior engineers and collaborate across teams to deliver robust security data services.
Key responsibilities of the role include:
- Designing, implementing and maintaining scalable log ingestion pipelines into Elastic SIEM
- Defining and enforcing data governance policies, including integration, retention, redaction and access
- Optimising data models and indexing strategies for performance and cost-effectiveness
- Working with security teams to ensure data quality for detection, hunting and compliance needs
- Collaborating with platform engineers to ensure data flow reliability and system scalability
- Mentoring junior engineers on data engineering practices and tooling
- Contributing to the broader Azure Logging strategy and integrations as required
- Driving the ongoing development of a coherent security data strategy aligned to business and regulatory needs
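To give a flavour of the redaction work mentioned above: in an Elastic deployment this would typically be handled by an ingest pipeline processor, but the underlying logic can be sketched in a few lines of Python. Field names and masking tokens below are illustrative, not a description of G-Research's actual pipelines.

```python
import re

# Patterns for values a governance policy might forbid storing in cleartext
# (illustrative; a real policy would cover many more identifiers).
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")

def redact(event: dict) -> dict:
    """Return a copy of a log event with sensitive substrings masked."""
    out = {}
    for key, value in event.items():
        if isinstance(value, str):
            value = IPV4.sub("[REDACTED_IP]", value)
            value = EMAIL.sub("[REDACTED_EMAIL]", value)
        out[key] = value
    return out

event = {"message": "login failure for alice@example.com from 10.1.2.3"}
print(redact(event)["message"])
# -> login failure for [REDACTED_EMAIL] from [REDACTED_IP]
```

In production this step would run inside the ingestion path (for example as an Elastic ingest pipeline processor or a Logstash filter) so that sensitive values never reach the index.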
Who are we looking for?
The ideal candidate will have the following skills and experience:
- Strong experience in data engineering with the Elastic Stack, including ingest pipelines, Logstash, Beats, Kafka or similar
- Expertise in data modelling, indexing and optimisation within Elastic
- Proven ability to design scalable log collection and processing architectures
- Knowledge of data governance, compliance and retention strategies
- Familiarity with security data sources, such as cloud logs, endpoint logs and network telemetry
- Experience with cloud platforms, including Azure, AWS or GCP, and hybrid environments
- Track record of mentoring or developing junior engineers
Behavioural:
- Strategic Thinking: Able to develop and execute a long-term data strategy
- Precision: Ensures data quality, consistency and reliability at all times
- Communication: Capable of conveying complex data engineering concepts to security and non-security stakeholders
- Leadership: Supports and mentors junior colleagues to develop skills and independence
- Collaboration: Works across teams to ensure alignment of data practices with wider security and business goals
Why should you apply?
- Highly competitive compensation plus annual discretionary bonus
- Lunch provided (via Just Eat for Business) and dedicated barista bar
- 30 days’ annual leave
- 9% company pension contributions
- Informal dress code and excellent work/life balance
- Comprehensive healthcare and life assurance
- Cycle-to-work scheme
- Monthly company events
G-Research is committed to cultivating and preserving an inclusive work environment. We are an ideas-driven business and we place great value on diversity of experience and opinions.
We want to ensure that applicants receive a recruitment experience that enables them to perform at their best. If you have a disability or special need that requires accommodation, please let us know in the relevant section.