
Fractal

AWS Data Engineer

Reposted 4 Days Ago
In-Office
London, Greater London, England
Senior level

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

AWS Data Engineer

London (Hybrid)

12 month FTC

Fractal is a strategic AI partner to Fortune 500 companies with a vision to power every human decision in the enterprise. Fractal is building a world where individual choices, freedom, and diversity are the greatest assets. An ecosystem where human imagination is at the heart of every decision. Where no possibility is written off, only challenged to get better. We believe that a true Fractalite is the one who empowers imagination with intelligence. Fractal has been featured as a Great Place to Work by The Economic Times in partnership with the Great Place to Work® Institute and recognized as a ‘Cool Vendor’ and a ‘Vendor to Watch’ by Gartner.

About the role 

If you are an extraordinary developer who loves to push boundaries to solve complex business problems with creative solutions, we would like to talk with you. As an AWS Data Engineer in the insurance sector, you will work on the Technology team that delivers our Data Engineering offerings at large scale to our Fortune 500 clients worldwide. The role is responsible for innovating, building, and maintaining technology services.

Key Responsibilities: 

  • Consult on, design, build, and operationalize large-scale enterprise data solutions using AWS data and analytics services in combination with third-party technologies such as Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue, Snowflake, and Databricks

  • Analyse, re-architect, and re-platform on-premises data stores and databases to modern data platforms on the AWS cloud using AWS-native or third-party services

  • Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala

  • Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, DynamoDB, RDS, S3

  • Design and implement data engineering, ingestion, and curation functions on the AWS cloud using AWS-native services or custom programming

  • Perform detailed assessments of current-state data platforms and define an appropriate transition path to the AWS cloud as part of customer consultations and business proposals

  • Participate in client design workshops and provide trade-offs and recommendations towards building solutions

  • Mentor other engineers in coding best practices and problem solving
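
The ingestion-to-consumption pipeline work described above can be sketched at toy scale in plain Python. All names and fields below are hypothetical illustrations, not part of this role's actual codebase; a production version would run these stages as Glue/Spark jobs against S3, Redshift, or DynamoDB:

```python
import csv
import io

# Hypothetical three-stage pipeline: ingest -> curate -> consume.

def ingest(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into records (the ingestion layer)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def curate(records: list[dict]) -> list[dict]:
    """Clean and standardise records (the curation layer)."""
    curated = []
    for rec in records:
        if not rec.get("policy_id"):          # drop rows missing the business key
            continue
        curated.append({
            "policy_id": rec["policy_id"].strip(),
            "premium": float(rec["premium"]),  # enforce a numeric type
        })
    return curated

def consume(records: list[dict]) -> dict:
    """Aggregate for consumption (e.g. a reporting table)."""
    return {
        "row_count": len(records),
        "total_premium": sum(r["premium"] for r in records),
    }

raw = "policy_id,premium\nP-1,100.50\nP-2,200.25\n,999.00\n"
summary = consume(curate(ingest(raw)))
print(summary)  # {'row_count': 2, 'total_premium': 300.75}
```

The point of the staged structure is that each layer can be tested and re-run independently, which is the same property the AWS services above provide at scale.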

Technical Requirements: 

  • 6+ years’ experience in the industry

  • 3 or more years of hands-on experience with AWS services, especially AWS Glue

  • Experience and knowledge of big data architectures, both in the cloud and on premises

  • Working experience with AWS Athena, Glue (PySpark), EMR, DynamoDB, Redshift, Kinesis, Lambda, Apache Spark, Databricks on AWS, and Snowflake on AWS

  • Proficient in AWS Redshift, S3, Glue, Athena, DynamoDB

  • AWS Certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics

  • Capability to design production-grade dbt pipelines using reusable macros, incremental models, and SCD Type 2 implementations.

  • Deep knowledge of platform-specific tuning, such as clustering keys in Snowflake or dist/sort keys in Redshift and table transformation using dbt.

  • Proficiency in Python for building custom Airflow operators, reusable ingestion frameworks, and complex workflow automation.

  • Detailed understanding of AWS data-sharing capabilities for sharing data across accounts and Redshift instances

  • Working experience with Agile methodology across all phases of the SDLC, applying software engineering principles to build scalable solutions

  • Strong hands-on experience with AWS Glue, including automation build-out and integration with other AWS services, using Spark

  • Good knowledge of Spark and Python

  • Experience building and delivering proofs of concept that address specific business needs, using the most appropriate techniques, data sources, and technologies

  • Experience partnering with executive stakeholders as a trusted advisor as well as enabling technical implementers

  • Hands-on administration skills with AWS-managed Kafka configuration, including creating topics and controlling consumers and producers

  • Review and decompose the existing Java application so it can be replaced with either a Mule service built by the Mule squad or a Python Lambda service within the Operational Data squad; ideally suited to someone with prior Java development experience

  • Be an integral part of large-scale client business development and delivery engagements
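
The SCD Type 2 requirement above is commonly met with dbt snapshots; the underlying merge logic can be sketched in plain Python as follows (field names such as `key`, `attrs`, and `valid_from`/`valid_to` are hypothetical, chosen to mirror dbt's snapshot columns):

```python
from datetime import date

# Hypothetical SCD Type 2 merge: preserve full history of dimension changes
# by closing out the old row and inserting a new current row.

def scd2_merge(dim_rows: list[dict], incoming: dict, today: date) -> list[dict]:
    """Apply one incoming record to an SCD Type 2 dimension table."""
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["valid_to"] is None:
            if row["attrs"] == incoming["attrs"]:
                return dim_rows          # no change: nothing to do
            row["valid_to"] = today      # close out the old version
            break
    dim_rows.append({
        "key": incoming["key"],
        "attrs": incoming["attrs"],
        "valid_from": today,
        "valid_to": None,                # open-ended = current row
    })
    return dim_rows

dim = [{"key": "C-1", "attrs": {"city": "London"},
        "valid_from": date(2023, 1, 1), "valid_to": None}]
dim = scd2_merge(dim, {"key": "C-1", "attrs": {"city": "Leeds"}},
                 date(2024, 6, 1))
# dim now holds two versions: the closed-out London row and the current Leeds row
```

In dbt itself this pattern is declared rather than hand-coded, but understanding the merge semantics is what lets you debug incremental models and snapshots when they misbehave.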

Education:

B.E./B.Tech./M.Tech. in Computer Science, a related technical degree, or equivalent

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?  Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!

Top Skills

Airflow
AWS
Databricks
dbt
DynamoDB
EMR
Glue
Java
Kinesis
Lambda
Python
Redshift
Scala
Snowflake
Spark
