Senior Data Engineer
Full-time · Shanghai

About this role

Welcome to Haleon. We’re a purpose-driven, world-class consumer company putting everyday health in the hands of millions. In just three years since our launch, we’ve grown, evolved and are now entering an exciting new chapter – one filled with bold ambitions and enormous opportunity.

Our trusted portfolio of brands – including Sensodyne®, Panadol®, Advil®, Voltaren®, Theraflu®, Otrivin®, and Centrum® – leads in resilient and growing categories. What sets us apart is our unique blend of deep human understanding and trusted science.

Now it’s time to realise the full potential of our business and our people. We do this through our Win as One strategy. It puts our purpose – to deliver better everyday health with humanity – at the heart of everything we do. It unites us, inspires us, and challenges us to be better every day, driven by our agile, performance-focused culture.

These responsibilities include some of the following:

  • Individual contributor with strong vision and technical skills.
  • Strong, specialised data engineering skills, applied in collaboration with business and technical stakeholders.
  • Be part of a team accountable for setting Data Engineering quality and standards for the entire organization. This aspect of the role will focus on…
    • driving compute and storage optimization and cost efficiency.
    • ensuring all deliverables align with approved architecture and engineering design patterns and standards.
    • helping design and implement extremely high levels of data platform automation (infrastructure and software) to support the required levels of operational stability and scalability.
  • Very strong experience with the following core Azure data platform components, or very strong adjacent experience from other platforms:
    • Distributed compute technologies such as Databricks & Spark plus knowledge of underlying programmatic frameworks.
    • Modern data streaming technologies such as Kafka in association with Event Hubs. 
    • Cloud Storage Technologies (Data Lake Store, Blob Storage, etc). 
    • Azure DW/Synapse 
    • Azure Functions and other adjacent data relevant components. 
    • IaC / Automated delivery technology such as Azure DevOps, Terraform, Ansible, etc. 
    • Azure Kubernetes Service or other Containerized delivery technology. 
    • Workflow management technologies such as Apache Airflow.
  • Fluency with agile and DevOps methodologies and automation practices.
  • Fluency with Data Streaming patterns and hybrid real-time / lambda integration.
  • Thought leadership in data management, data analytics and data science, leveraging deep technical competency and interpersonal skills to drive business value and results.
  • Responsible for ensuring new data technologies and innovations are integrated into the organization, advising and recommending data architecture strategies, decisions, processes, tools, standards and methodologies.
  • Active involvement in all stages of the project lifecycle – from ideation to industrialization – in an Agile development environment. Discover and develop promising new technologies collaboratively, and create Proof-of-Concepts (POCs), Proof-of-Values (POVs) and Minimum Viable Products (MVPs).
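
To give a flavour of the workflow-management bullet above: tools such as Apache Airflow ultimately execute pipeline tasks in dependency order. As a purely illustrative, stdlib-only Python sketch (the task names are hypothetical, not part of this role):

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical pipeline: each task maps to the tasks it depends on,
# mirroring how an Airflow DAG wires operators together.
pipeline = {
    "ingest": set(),
    "clean": {"ingest"},
    "harmonize": {"clean"},
    "publish": {"harmonize", "clean"},
}

def run_order(dag: dict) -> list:
    """Return a valid execution order for the task graph."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(pipeline)  # dependencies always come before dependents
```

In a real orchestrator the scheduler also handles retries, backfills and parallelism; the dependency graph above is just the core idea.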

Why you?

Basic Qualifications:

We are looking for professionals with these required skills to achieve our goals:

  • Bachelor’s degree
  • Experience with Continuous Improvement tools
  • Experience with data extract, transform, and load (ETL) processes and controls
  • Experience working aligned to Software Engineering best practices

Preferred Qualifications:

If you have the following characteristics, it would be a plus:

  • Experience building and operating GxP-regulated technology in Healthcare
  • 5+ years utilizing Agile Product Management (Scrum and SAFe) methodologies
  • Strong stakeholder management experience

We are looking for professionals with these skills to achieve our goals. If you have them, we would like to speak to you.

  • Expertise with Data Engineering or Site Reliability Engineering
  • Expertise with non-imperative paradigms – Scala, Haskell, F#, TypeScript or OPA Rego
  • Minimum 2 years working on Big Data platforms, preferably Spark
  • Minimum 3 years deploying solutions on Cloud Platforms, preferably Azure or GCP
  • Infrastructure-as-Code experience: Terraform, Ansible or Cloud templates (Azure, GCP)
  • Expertise with container technologies: Kubernetes, Helm or Docker
  • Professional DevOps experience: Jenkins, Azure DevOps, CI/CD or JUnit
  • Ability to design and implement logging, tracing, and application monitoring systems
  • Experience building and maintaining APIs
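
As a small illustration of the logging-and-tracing bullet above (a sketch of one common pattern, not a prescribed implementation): structured JSON log records carrying a correlation ID make logs easy to ship to and query in a monitoring system. Stdlib-only Python, with hypothetical logger and field names:

```python
import json
import logging
import uuid

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            # A correlation/trace ID threaded through one pipeline run
            # lets you group all records belonging to that run.
            "trace_id": getattr(record, "trace_id", None),
        })

logger = logging.getLogger("pipeline")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

trace_id = str(uuid.uuid4())
logger.info("ingest started", extra={"trace_id": trace_id})
```

Production setups typically layer a tracing standard (e.g. OpenTelemetry) on top, but the structured-record-plus-correlation-ID shape is the same.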

Preferred Qualification:

  • Minimum 10 years as a full-time software engineer
  • Bachelor’s Degree in Engineering, Mathematics, Statistics, or Computer Science

Additional Qualifications:

  • Data streaming experience with technologies like Apache Kafka
  • Cryptography / Cyber Security experience
  • Experience operating in a highly regulated and secure environment

Additional Job Description

Senior Data Engineer

Why Consumer Healthcare?

At Haleon, we are on an incredible journey as we create a new, standalone, world-leading company with a 100% single-minded focus on everyday health. We are doing this at a time when the work we do has never mattered more. With the COVID pandemic, people are increasingly looking for ways to manage their own health and wellbeing and to take care of their families. This is where we come in. With category-leading brands such as Sensodyne, Voltaren and Centrum, built on trusted science and human understanding, and combined with our passion, knowledge and expertise, we are uniquely placed to deliver better everyday health to millions of people around the world and grow a strong, successful business. This is an opportunity to be part of something special.

This role will provide YOU the opportunity to lead key activities to progress YOUR career.  These responsibilities include some of the following.

  • Designing automated infrastructure that creates new auto-healing capabilities
  • Creation and integration of storage technology and DFS independence into the solution landscape
  • Data Pipeline development leveraging DevOps standards
  • Use of Continuous Integration (CI) and Continuous Deployment (CD) to build Data Engines
  • Creation of secure and private anonymization data systems using declarative programming languages that will interface between Data Silos, Data Engines and Graph Databases. These systems are fundamental for executing AI/ML workflows to accelerate drug discovery and to optimize the manufacturing processes
  • Creation of holistic (e.g. integrated) data views through the ingestion, cleaning, linking, harmonization and contextualization of multiple systems. These views will enable our AI/ML work on complex high-value, multi-root cause problems
  • Active involvement in all stages of the project lifecycle – from ideation to industrialization – in an Agile development environment. Discover and develop promising new technologies collaboratively, and create Proof-of-Concepts (POCs), Proof-of-Values (POVs) and Minimum Viable Products (MVPs).
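
To illustrate the anonymization bullet above (purely a sketch — the systems described use declarative languages and are far more involved): keyed pseudonymization maps the same raw identifier to the same opaque token, so records can still be linked across data silos without exposing the identifier itself. Stdlib-only Python, with a hypothetical key and identifier:

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would be fetched from a
# key vault, never hard-coded.
SECRET_KEY = b"example-key-from-vault"

def pseudonymize(identifier: str, key: bytes = SECRET_KEY) -> str:
    """Deterministically map an identifier to an opaque token.

    HMAC-SHA256 keeps the mapping stable (so joins across silos still
    work) while making it infeasible to invert without the key.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

token = pseudonymize("record-0042")
```

Because the mapping is deterministic per key, two silos sharing the key can join on the token; rotating the key severs that linkability.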


Job Posting End Date

2026-02-20

Equal Opportunities

Haleon are committed to mobilising our purpose in a way that represents the diverse consumers and communities who rely on our brands every day. It guides us in creating an inclusive culture, where different backgrounds and views are valued and respected – all in support of understanding and best serving the needs of our consumers and unleashing the full potential of our people. It’s important to us that Haleon is a place where all our employees feel they truly belong.

During the application process, we may ask you to share some personal information, which is entirely voluntary. This information ensures we meet certain regulatory and reporting obligations and supports the development, refinement, and execution of our inclusion and belonging programmes that are open to all Haleon employees. 

The personal information you provide will be kept confidential, used only for legitimate business purposes, and will never be used in making any employment decisions, including hiring decisions.

Adjustment or Accommodations Request

If you require a reasonable adjustment or accommodation or other assistance to apply for a job at Haleon at any stage of the application process, please let your recruiter know by providing them with a description of specific adjustments you are requesting. We’ll provide all reasonable adjustments to support you throughout the recruitment process and treat all information you provide us in confidence. 

Note to candidates

The Haleon recruitment team will contact you using a Haleon email account (@haleon.com). If you are not sure whether the email you received is from Haleon, please get in touch.

Other facts

Tech stack
Data Engineering, Azure, DevOps, Agile, Big Data, Kubernetes, Terraform, Apache Kafka, Cloud Platforms, APIs, Continuous Integration, Continuous Deployment, Data Streaming, Software Engineering, Data Management, Data Analytics

About Haleon

Delivering better everyday health with our superior brands from Sensodyne to Centrum. Made using trusted ingredients and backed by science, our products are recommended by healthcare professionals. #WeAreHaleon 💚

Team size: 10,001+ employees
LinkedIn: Visit
Industry: Consumer Services
Founding Year: 2022

What you'll do

  • The Senior Data Engineer will be responsible for driving compute and storage optimization, ensuring deliverables align with architecture standards, and implementing high levels of data platform automation. They will also be involved in all stages of the project lifecycle in an Agile environment.


Frequently Asked Questions

What does a Senior Data Engineer do at Haleon?

As a Senior Data Engineer at Haleon, you will drive compute and storage optimization, ensure deliverables align with architecture standards, and implement high levels of data platform automation. You will also be involved in all stages of the project lifecycle in an Agile environment.

Why join Haleon as a Senior Data Engineer?

Haleon is a world-leading consumer health company with a single-minded focus on everyday health.

Is the Senior Data Engineer position at Haleon remote?

The Senior Data Engineer position at Haleon is based in Shanghai, China. Contact the company through Clera for specific work arrangement details.

How do I apply for the Senior Data Engineer position at Haleon?

You can apply for the Senior Data Engineer position at Haleon directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process. You can also learn more about Haleon on their website.