Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM
Full-time · India

Summary

Location

India

Type

Full-time


About this role

Before you apply to a job, select your language preference from the options available at the top right of this page.

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

We're seeking a talented Data Engineer with hands-on experience in the Google Cloud data ecosystem and a proven track record of delivering production data pipelines. You'll be instrumental in designing, building, and maintaining our data infrastructure and pipelines, enabling critical insights and supporting data-driven initiatives across the organization.

Responsibilities

  • Data Pipeline Development: Design, build, and optimize robust and scalable data pipelines to ingest, transform, and load data from various sources into our data warehouse and knowledge graphs. 

  • Cloud Data Stack Expertise: Implement and manage data solutions using Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Spanner, and Dataproc, as well as Azure cloud services.

  • Knowledge Graph Engineering: Develop and maintain data models, ingest data, and create efficient queries within Neo4j and/or Stardog. Leverage your expertise to build and expand our enterprise knowledge graph. 

  • Data Quality & Governance: Implement best practices for data quality, data validation, and data governance, ensuring data accuracy, consistency, and reliability. 

  • Performance Optimization: Continuously monitor and optimize the performance of data pipelines and database queries, identifying and resolving bottlenecks. 

  • Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver effective data solutions. 

  • Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and knowledge graph schemas. 
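The responsibilities above center on ingesting messages (for example, from Pub/Sub), validating them, and loading rows into a warehouse such as BigQuery. As a rough illustration only — the function name `to_bq_row`, the `REQUIRED_FIELDS` set, and the event schema below are hypothetical, not taken from this posting — a minimal transform-and-validate step might look like:

```python
import json

# Illustrative required schema for an incoming event (an assumption,
# not part of the job posting).
REQUIRED_FIELDS = {"event_id", "timestamp", "payload"}

def to_bq_row(message: bytes) -> dict:
    """Parse a Pub/Sub-style JSON message into a warehouse-ready row.

    Raises ValueError when required fields are missing, mirroring the
    data-validation responsibility described above.
    """
    record = json.loads(message)
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    # Flatten the nested payload into top-level columns for the table.
    row = {"event_id": record["event_id"], "ingest_ts": record["timestamp"]}
    row.update(record["payload"])
    return row

msg = json.dumps({"event_id": "e1", "timestamp": "2024-01-01T00:00:00Z",
                  "payload": {"status": "delivered"}}).encode()
print(to_bq_row(msg))
# → {'event_id': 'e1', 'ingest_ts': '2024-01-01T00:00:00Z', 'status': 'delivered'}
```

In a real Dataflow pipeline this logic would live inside a Beam `DoFn`, with invalid messages routed to a dead-letter destination rather than raised as exceptions.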

Required Qualifications

Education:

Bachelor’s degree in Computer Science, Engineering, or a related quantitative field.

Experience:  

  • 5+ years of professional experience as a Data Engineer or in a similar role. 

  • Strong hands-on experience with Google Cloud Platform (GCP) data services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer (Apache Airflow). 

  • Proficiency in Python for data manipulation and pipeline orchestration. 

  • Experience with SQL and data warehousing concepts. 

  • Familiarity with data modeling techniques for relational databases.

  • Experience with version control systems (e.g., Git).

  • Experience providing application support.

  • Experience with Terraform or other Infrastructure-as-Code (IaC) tools.

Preferred Qualifications

  • Experience with other GCP services.

  • Knowledge of streaming data technologies (e.g., Kafka, Google Cloud Dataflow streaming). 

  • Familiarity with data governance tools and principles. 

  • Certifications in Google Cloud Platform data engineering. 


Employee Type: Permanent


UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Other facts

Tech stack
Data Engineering, Google Cloud Platform, BigQuery, Dataflow, Pub/Sub, Cloud Storage, Python, SQL, Data Warehousing, Data Modeling, Version Control, Terraform, Knowledge Graphs, Data Quality, Data Governance, Performance Optimization, Collaboration

About UPS

Operating in more than 200 countries and territories, we’re committed to moving our world forward by delivering what matters. Beginning as a small messenger service, UPS was started by two enterprising teenagers and a $100 loan. Now, we’re almost 500,000 UPSers strong, with operations around the globe.

As a transportation and logistics leader, we are proud to offer innovative solutions to our customers—both big and small. We also support the communities we serve. Just take a look at The UPS Foundation’s social impact report!

Headquartered in Atlanta, we can be found on the web at ups.com and about.ups.com. Job seekers can visit upsjobs.com to learn more. Our active social media channels include Facebook, Instagram, Twitter, YouTube, and TikTok.

Facebook: www.facebook.com/ups
Instagram: www.instagram.com/ups/
Twitter: www.twitter.com/ups
TikTok: UPS
YouTube: www.youtube.com/ups

Website
https://about.ups.com/
The UPS Foundation’s social impact report:
https://about.ups.com/us/en/social-impact/reporting/the-ups-foundations-social-impact-report.html
Career Site
upsjobs.com

Team size: 10,001+ employees
LinkedIn: Visit
Industry: Truck Transportation
Founding Year: 1907

What you'll do

  • The Data Engineer will design, build, and maintain data infrastructure and pipelines to support data-driven initiatives. Responsibilities include developing data pipelines, managing cloud data solutions, and ensuring data quality and governance.

Frequently Asked Questions

What does a Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM do at UPS?

As a Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM at UPS, you will design, build, and maintain data infrastructure and pipelines to support data-driven initiatives. Responsibilities include developing data pipelines, managing cloud data solutions, and ensuring data quality and governance.

Why join UPS as a Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM?

UPS is a leading transportation and logistics company.

Is the Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM position at UPS remote?

The Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM position at UPS is based in India. Contact the company through Clera for specific work arrangement details.

How do I apply for the Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM position at UPS?

You can apply for the Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM position at UPS directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process. You can also learn more about UPS on their website.