WalletConnect

Senior Data Engineer

Posted Yesterday
In-Office or Remote
Hiring Remotely in United Kingdom
Senior level
The Senior Data Engineer will design and maintain scalable data pipelines, ensure data quality, and develop backend services to support WalletConnect's payments infrastructure and analytics.
About WalletConnect

WalletConnect is one of the core infrastructure teams in Web3 — we build the connectivity layer that lets wallets and apps communicate securely across blockchains. Since launching in 2018, we’ve grown into a network of 75,000+ apps, 700+ wallets, and 50+ million users. Our mission is to power the financial internet by making digital ownership and payments interoperable and accessible.

We’ve recently launched WalletConnect Pay — a payments solution that lets merchants and payment providers leverage blockchain rails for new payment experiences, like stablecoin checkout, payouts, and deposits.

Backed by $38M from investors like Union Square Ventures, Shopify, Coinbase Ventures, Circle Ventures, and Uniswap Labs, we’re a global, remote-first team that values openness, simplicity, innovation, and ownership.

Why Now

We’re entering our most ambitious chapter yet. WalletConnect Pay is an end-to-end crypto and stablecoin payment method built on the world’s largest wallet network — already embedded in Stripe, Coinbase Commerce, Shopify, MoonPay, Shift4, and BitPay, with a landmark partnership with Ingenico bringing stablecoin payments to 40M+ terminals across 120+ countries.

As WalletConnect scales into payments, data is becoming foundational — not optional. From wallet connection success rates and transaction health to financial reconciliation, merchant reporting, and operational monitoring, our data systems must be as reliable as the payment flows they track. We’re building a next-generation data platform to support real-time payments infrastructure, developer tooling, and product analytics. This hire joins at the moment we’re moving from MVP pipelines to scalable production architecture, and we need a senior engineer who can help design and operate the data foundation behind a global payments network.

The Role

This is a Senior Data Engineer position on the Data Engineering team. You’ll work closely with the Data Engineering Lead to design and scale the data platform powering WalletConnect’s payments infrastructure and ecosystem analytics.

This role sits at the intersection of data engineering and backend systems. You’ll build and operate event-driven data pipelines, maintain real-time processing systems, and ensure data quality across high-volume transactional workflows. You’ll also build backend services that expose data to internal platforms and APIs, so teams across the company can consume it.

Day-to-day, you’ll work with Python, SQL, ClickHouse, Airflow, and dbt within an event-driven architecture. You’ll own the systems you build end-to-end — including monitoring, reliability, and data correctness — and collaborate with product, engineering, and infrastructure teams to support new data use cases as the platform grows.

Responsibilities

Data Platform & Pipeline Engineering

  • Design, build, and operate scalable data pipelines for WalletConnect payments and platform data — event-driven, near real-time processing across high-volume transaction flows.
  • Maintain and evolve the real-time data platform, moving it from MVP pipeline to scalable production architecture.
  • Model and process transactional and ledger-style data — financial reconciliation, merchant reporting, wallet fee breakdowns, and settlement tracking.
  • Contribute to improving the overall data platform architecture as requirements grow.

Data Quality & Observability

  • Ensure data accuracy, freshness, and observability across critical workflows — payments flow correctness, wallet connection success rates, and transaction health.
  • Build monitoring and alerting that meets the reliability bar required for financial data systems.
  • Own data quality across the pipeline — from ingestion through transformation to downstream consumption.

Backend Services & Data Access

  • Develop backend services that expose data to internal platforms, APIs, and downstream systems so teams across the company can consume it.
  • Support downstream use cases including financial reconciliation, merchant reporting, operational monitoring, and product analytics.
  • Collaborate with product, engineering, and infrastructure teams to onboard new data use cases and ensure clean integration points.

Ownership & Operations

  • Own and operate the systems you build, including deployment, monitoring, and reliability.
  • Collaborate closely with the Data Engineering Lead on architecture decisions and platform evolution.
  • Contribute to engineering standards, documentation, and operational playbooks for the data platform.

Tech Stack

Our current stack includes Python, SQL, ClickHouse, Airflow, dbt, Grafana, and Preset / Deepnote for data visualization. The architecture is event-driven with near real-time processing. Rust is a plus but not required. We’re also experimenting with AI-assisted monitoring and data tooling as part of our platform evolution.

What You Bring

Must-Haves

  • 5+ years of experience in data engineering or data infrastructure — you’ve built, shipped, and operated production data systems.
  • Strong experience building data pipelines and production data platforms — ingestion, transformation, modelling, and serving.
  • Comfortable using AI-assisted development tools/agents, with strong discipline in reviewing, validating, and productionizing generated code.
  • Expertise in SQL and large-scale data processing.
  • Experience with modern data stack tools: Airflow, dbt, ClickHouse, BigQuery, Snowflake, Athena, or similar.
  • Experience working with event-driven architectures and real-time or near real-time data processing.
  • Strong programming skills in Python or similar backend languages.
  • Experience designing systems with data quality, monitoring, and reliability in mind — you treat data correctness as a first-class concern.
  • Ability to operate independently and own production systems end-to-end in a remote, async team.

Nice-to-Haves

  • Experience in fintech, payments, or financial infrastructure — reconciliation, settlement, ledger reporting, or merchant data.
  • Experience working with high-volume event data at scale.
  • Familiarity with ledger-style or transactional data models.
  • Exposure to Web3, blockchain, or crypto ecosystems — on-chain data, wallet analytics, or token transaction flows.
  • Rust experience or willingness to learn.
  • Experience building data platforms in fast-growing startups where you’ve shaped the architecture from early stages.

Benefits

  • Fully remote position with a budget for your home office or work environment.
  • Regular team offsites to incredible locations around the world.
  • Opportunities to travel to conferences and community events.
  • Generous PTO and parental leave.
  • Meaningful Learning & Development budget.
  • Competitive compensation package including salary, equity, and potentially tokens.
  • Healthcare coverage for US-based team members.
  • The chance to build at the forefront of onchain payments with one of the most recognised companies in Web3.

Top Skills

Airflow
ClickHouse
dbt
Grafana
Python
Rust (Optional)
SQL


