At Ensono, our purpose is to be a relentless ally, disrupting the status quo and enabling our clients to Do Great Things. As a trusted technology adviser and managed services provider, we help clients navigate continuous change and embrace innovation.
We deliver world-class hybrid cloud, infrastructure, mainframe transformation, data, IDAM, and cloud-native solutions, simplifying complex business challenges and creating new pathways to success. Headquartered in the USA and backed by private equity, Ensono has a strong and growing presence in the UK and Europe.
What is the role about?
This is a hands-on technical role within our Data & AI competency. You will join a cross-functional team of highly skilled, like-minded professionals, helping build and deliver modern data solutions for our clients – enabling them to realise the business value of their cloud investments.
As a Data Engineering Consultant, you will design, build, and maintain data pipelines and lakehouse architectures that power analytics, AI/ML, and operational decision-making. You will work across ingestion, transformation, storage, and serving layers – delivering solutions that are scalable, reliable, and cost-efficient.
What You’ll Deliver
- Cleaned, validated datasets and production-ready data pipeline code
- Contributions to requirements documents, data inventories, and technical specifications
- Data quality checks and monitoring dashboards for pipeline health
- Documentation supporting data governance, lineage, and cataloguing
What You’ll Be Doing
- Building and maintaining data ingestion pipelines (batch, streaming, and micro-batch) using modern cloud-native tools and frameworks
- Developing and optimising transformations within lakehouse and medallion architecture patterns (bronze, silver, gold layers)
- Working with data platforms and frameworks such as Databricks and Apache Spark, and cloud-native services on AWS and/or Microsoft Azure
- Implementing data quality checks, validation rules, and automated testing to ensure pipeline reliability
- Supporting data governance and compliance activities, including cataloguing, lineage tracking, and access control
- Collaborating with analysts, data scientists, and business stakeholders to understand requirements and deliver fit-for-purpose data products
- Contributing to technical documentation, architecture decision records, and runbooks
- Participating in code reviews, pair programming, and knowledge-sharing sessions to raise team capability
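To illustrate the kind of data quality checks and validation rules mentioned above, here is a minimal sketch in plain Python. The rule names, sample records, and quarantine pattern are hypothetical illustrations, not Ensono's actual tooling; production pipelines would typically express these checks in Spark or a dedicated quality framework.

```python
# Minimal sketch of row-level data quality validation for a pipeline.
# All rule names and sample records are hypothetical.

def validate(record, rules):
    """Return the names of the rules this record fails."""
    return [name for name, check in rules.items() if not check(record)]

rules = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

records = [
    {"customer_id": "C001", "amount": 42.5},
    {"customer_id": "", "amount": -3.0},
]

# Split rows into clean and quarantined sets, as a bronze-to-silver
# promotion step might, keeping the failed rule names for monitoring.
clean = [r for r in records if not validate(r, rules)]
quarantined = [(r, validate(r, rules)) for r in records if validate(r, rules)]
```

Keeping each rule as a small named predicate makes the checks easy to unit-test individually and to surface on a pipeline-health dashboard.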
What You’ll Bring
- Strong development skills in Python and SQL; experience writing clean, testable, well-documented code
- Hands-on experience building data pipelines and ETL/ELT workflows using tools such as Apache Spark, Databricks, or equivalent
- Understanding of lakehouse and data warehouse architecture patterns, including star schemas, medallion architecture, and data modelling best practices
- Experience working with cloud platforms (AWS, Azure, or GCP), including cloud storage, compute, and managed data services
- Familiarity with common big data file formats (e.g. Parquet, Delta, Avro) and concepts such as partitioning, compression, and columnar storage
- Good understanding of software engineering best practices: version control (Git), CI/CD, code review, SOLID principles, and DRY
- Clear communication skills – able to explain technical concepts to non-technical audiences and collaborate effectively across disciplines
- A proactive, self-starting attitude with a genuine interest in understanding the business context behind the data
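As a rough picture of the medallion (bronze, silver, gold) layering referenced above, the sketch below walks raw rows through the three layers using plain Python. The sample data and cleaning steps are assumptions for illustration only; in practice each layer would be a Delta table processed with Spark or Databricks.

```python
# Hedged sketch of medallion-style layering (bronze -> silver -> gold).
# Sample data is invented; real pipelines operate on lakehouse tables.

bronze = [  # raw ingested events, kept as-is (schema-on-read)
    {"id": "1", "country": "uk", "value": "10"},
    {"id": "2", "country": "UK", "value": "5"},
    {"id": "2", "country": "UK", "value": "5"},  # duplicate ingest
]

# Silver: deduplicated, standardised, and typed records.
seen = set()
silver = []
for row in bronze:
    if row["id"] in seen:
        continue  # drop duplicate by business key
    seen.add(row["id"])
    silver.append({
        "id": row["id"],
        "country": row["country"].upper(),  # normalise casing
        "value": int(row["value"]),         # enforce numeric type
    })

# Gold: business-ready aggregate (total value per country).
gold = {}
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0) + row["value"]
```

The point of the layering is that raw data is never mutated: bronze preserves the source of truth, while silver and gold can be rebuilt from it if cleaning or aggregation logic changes.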
Desirable Skills & Experience
- Experience with Databricks (Delta Live Tables, Unity Catalog, Workflows) or similar lakehouse platforms
- Familiarity with Infrastructure as Code tools (e.g. Terraform, CloudFormation) and container technologies (Docker, Kubernetes)
- Exposure to data governance, cataloguing, or data quality frameworks
- Experience with TDD/BDD testing practices for data pipelines
- Knowledge of streaming technologies (e.g. Kafka, Kinesis, Spark Structured Streaming)
- Additional programming experience in Scala or Java
- Relevant cloud or data certifications (e.g. Databricks Data Engineer, AWS Data Analytics, Azure Data Engineer)
Personal & Leadership Qualities
At Ensono Digital, we place as much value on how you work as on what you deliver. The following qualities reflect our expectations for Consultants and are aligned with our Personal and Leadership Frameworks:
- Coachable and curious: You actively seek feedback, learn from experienced colleagues, and reflect on how to improve.
- Team-oriented: You contribute positively to team culture, support new joiners, and collaborate openly – building trust through reliability and approachability.
- Ownership mindset: You take responsibility for your work, communicate blockers early, and deliver to the quality and timelines expected.
- Adaptable learner: You embrace new tools, methods, and feedback. You take initiative in your own development and are energised by the pace of change in cloud and data.
- Business-aware: You seek to understand the “why” behind your work – connecting your technical delivery to client goals and business value.
- Clear communicator: You express ideas clearly in writing and conversation, listen actively, and ask good questions to build shared understanding.
Ensono Belfast, Northern Ireland Office
Belfast, United Kingdom