Leadtech Group

Data Engineer - Fintech

Posted Yesterday
Remote
5 Locations
Senior level

We are looking for an experienced Data Engineer with at least 5 years of professional experience and a solid technology background using Java or Python as a primary language. In this role, you will design, build, and maintain scalable, secure, and high-performance cloud-based data pipelines to support real-time and batch analytics within our payments platform. You will work closely with product owners and cross-functional engineering teams to translate business requirements into robust data models and ETL/ELT workflows. Your day-to-day work will include architecting and implementing Kafka-based streaming pipelines, processing event streams, and orchestrating data ingestion and transformation jobs on AWS. You will leverage Snowflake as our central data warehouse.
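For illustration only: the day-to-day streaming work described above tends to follow a consume-enrich-produce pattern. Below is a minimal sketch of that pattern using the confluent_kafka Python client; the broker address, topic names, and event fields are hypothetical assumptions, not details of our actual platform.

import json

from confluent_kafka import Consumer, Producer

# Hypothetical broker and consumer group; real values would come from config.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "payment-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["payments.raw"])  # hypothetical source topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue

        event = json.loads(msg.value())
        # Illustrative enrichment: tag each payment event with a risk bucket.
        event["risk_bucket"] = "high" if event.get("amount", 0) > 1000 else "low"

        # Re-publish the enriched event to a downstream topic.
        producer.produce("payments.enriched", json.dumps(event).encode("utf-8"))
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()

A production pipeline would add schema validation, delivery guarantees, and dead-letter handling; the sketch only shows the overall shape.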

A little bit about us:

Revup Payments is a fintech leader in payment orchestration, providing businesses with seamless access to global payment solutions for over four years. Specializing in revenue optimization, we offer card processing and alternative payment methods enhanced by smart routing, fraud prevention, and an intuitive dashboard. Backed by a team of payment and fraud experts, our all-in-one platform is designed to maximize revenue, reduce costs, and improve the payment experience, all through a single API integration.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines in AWS to support operational and analytical use cases.
  • Define and enforce best practices for data ingestion, cataloging, and lineage across our cloud infrastructure (AWS S3, Glue, EMR, Lambda, etc.).
  • Develop and maintain real-time processing applications using Kafka (Producers, Consumers, Streams API) or similar technologies to aggregate, filter, and enrich streaming data from multiple sources.
  • Define data schemas, partitioning strategies, and access patterns optimized for performance and cost.
  • Collaborate with development and analytics teams to understand and fulfill the company's data requirements.
  • Implement monitoring and alerting mechanisms to ensure the integrity and availability of data streams.
  • Work with the operations team to optimize the performance and efficiency of the data infrastructure.
  • Automate management and maintenance tasks of the infrastructure using tools such as Terraform, Ansible, etc.
  • Stay updated on best practices and trends in data architectures, especially in the realm of real-time data ingestion and processing.
  • Monitor and troubleshoot data workflows using tools such as CloudWatch, Prometheus, or Datadog, proactively identifying bottlenecks, ensuring pipeline reliability, and handling incident response when necessary (see the monitoring sketch after this list).
  • Ensure data quality and performance across all pipelines.
  • Define and test disaster recovery plans (multi-region backups, Kafka replication, Snowflake Time Travel) and collaborate with security/infra teams on encryption, permissions, and compliance.
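As noted in the monitoring bullet above, a basic alerting mechanism on AWS can be sketched with boto3 and CloudWatch custom metrics. Everything below (namespace, metric name, region, SNS topic ARN, threshold) is an illustrative assumption rather than a description of our actual setup.

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")  # assumed region

# Publish a custom metric datapoint whenever a pipeline run fails.
cloudwatch.put_metric_data(
    Namespace="PaymentsPipeline",  # hypothetical namespace
    MetricData=[{
        "MetricName": "FailedRuns",
        "Value": 1.0,
        "Unit": "Count",
    }],
)

# Alarm when more than 3 failures accumulate within a 5-minute window.
cloudwatch.put_metric_alarm(
    AlarmName="payments-pipeline-failures",
    Namespace="PaymentsPipeline",
    MetricName="FailedRuns",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=3,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:eu-west-1:123456789012:pipeline-alerts"],  # hypothetical SNS topic
)

In practice the metric would be emitted from the pipeline code or its orchestrator, and the alarm would route to whatever on-call tooling the team uses.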

Requirements

You're our perfect candidate if you have:

  • A Bachelor's degree in Computer Science, Software Engineering, or a related field (equivalent experience is valued).
  • At least 3 years of programming experience with Java or Python.
  • Experience in data engineering design and delivery with cloud-based data warehouse technologies, in particular Snowflake, Redshift, or BigQuery.
  • Experience with a wide range of database technologies such as DynamoDB, Postgres, and MongoDB.
  • Development experience with cloud services, especially Amazon Web Services (AWS).
  • Demonstrable experience in designing and implementing data pipeline architectures based on Kafka in cloud environments, preferably AWS.
  • Deep understanding of distributed systems and high availability design principles.
  • Experience in building and optimizing data pipelines using Apache Kafka and real-time processing frameworks such as Apache Flink or Apache Spark Streaming.
  • Excellent communication and teamwork skills.
  • Ability to independently and proactively solve problems.

Extra bonus if you have:

  • Experience with other streaming platforms such as Apache Pulsar or RabbitMQ.
  • Experience in database administration and performance tuning.
  • Familiarity with data lake architectures and technologies such as Amazon S3, Apache Hadoop, or Apache Druid.
  • Relevant certifications in cloud platforms such as AWS.
  • Understanding of serverless architecture and event-driven systems.
  • Previous professional experience in FinTech or online payment flows.
  • Experience with data visualization tools like Tableau, PowerBI, or Apache Superset.
  • Understanding of machine learning concepts and frameworks for real-time data analytics.
  • Previous experience in designing and implementing data governance and compliance solutions.

Benefits

What We Offer:

  • Competitive compensation package, including health insurance and performance bonuses.
  • Opportunities for professional growth and development in a high-growth fintech environment.
  • Collaborative and innovative culture focused on making an impact in the global payments industry.
  • Flexible working environment with support for work-life balance.
  • Full remote work.

Top Skills

Ansible
Apache Flink
Apache Kafka
Spark
AWS
BigQuery
CloudWatch
Datadog
DynamoDB
EMR
Glue
Java
Lambda
MongoDB
Postgres
Prometheus
Python
Redshift
S3
Snowflake
Terraform
