
Summary

Location

Oakland

Salary

$122k - $194k

Type

full-time


About this role

Requisition ID # 169960 

Job Category: Information Technology 

Job Level: Individual Contributor

Business Unit: Electric Operations

Work Type: Hybrid

Job Location: Oakland

 

 

 

Position Summary

 

PG&E is seeking a Data Engineer to lead the development of a modern, cloud-native data platform that supports vegetation management operations, regulatory compliance, and advanced analytics. This role is central to transforming fragmented legacy systems into a unified, audit-ready Snowflake Data Lakehouse, enabling scalable, secure, and transparent data access across the organization.


 

The engineer will design and implement robust ELT pipelines using Informatica as the primary tool, alongside a suite of AWS services including Step Functions, Fargate, Lambda, DynamoDB, and S3. The role spans structured, semi-structured, and unstructured data domains, integrating sources such as Salesforce, SAP, SharePoint, ArcGIS, and remote sensing imagery (LiDAR, Ortho imagery, surface reflectance, etc.). 
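As an illustration of the kind of orchestration described above (not PG&E's actual implementation), an ELT flow on these services is often expressed as an AWS Step Functions state machine in Amazon States Language: a Lambda extract, a Fargate transform, and a load into Snowflake. All ARNs, state names, and resource names below are hypothetical placeholders, and a real definition would also carry retry, error-handling, and network configuration:

```python
import json

# Hypothetical Amazon States Language definition for an ELT flow:
# extract from a source system via Lambda, transform on a Fargate task,
# then load the staged S3 data into Snowflake. Every ARN is a placeholder.
ELT_STATE_MACHINE = {
    "Comment": "Illustrative ELT pipeline: extract -> transform -> load",
    "StartAt": "ExtractFromSource",
    "States": {
        "ExtractFromSource": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-west-2:123456789012:function:extract-salesforce",
            "Next": "TransformOnFargate",
        },
        "TransformOnFargate": {
            "Type": "Task",
            # The .sync suffix makes Step Functions wait for the ECS task to finish.
            "Resource": "arn:aws:states:::ecs:runTask.sync",
            "Parameters": {
                "LaunchType": "FARGATE",
                "Cluster": "etl-cluster",          # placeholder cluster name
                "TaskDefinition": "transform-task",  # placeholder task definition
            },
            "Next": "LoadIntoSnowflake",
        },
        "LoadIntoSnowflake": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-west-2:123456789012:function:snowflake-copy-into",
            "End": True,
        },
    },
}

if __name__ == "__main__":
    # Serialized definition could then be deployed, e.g. via boto3's
    # stepfunctions create_state_machine call.
    print(json.dumps(ELT_STATE_MACHINE, indent=2))
```

The same extract/transform/load shape applies whether the source is Salesforce, SAP, SharePoint, ArcGIS, or staged imagery; only the extract and transform steps change.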

 

PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, particular skills, education, licenses or certifications, experience, market value, geographic location, collective bargaining agreements, and internal equity. Although we estimate the successful candidate hired into this role will be placed towards the middle or entry point of the range, the decision will be made on a case-by-case basis related to these factors. This job may also be eligible to participate in PG&E's discretionary incentive compensation programs.
 
A reasonable salary range is:
Bay Area Minimum: $122,000
Bay Area Mid-point: $158,000
Bay Area Maximum: $194,000
•    This position follows a hybrid work model, requiring employees to report to their assigned office location at least one to two days per week. The remaining days may be worked remotely, depending on business needs. The headquarters is located in Oakland, CA. 
•    The first round interview for this role will be conducted in person at our Oakland headquarters.

 

Job Responsibilities

•    Designs and builds infrastructure that allows big data to be accessed and analyzed.
•    Collaborates with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver high-quality, analytics-ready datasets.
•    Supports the deployment of machine learning models by enabling feature pipelines, model input/output data flows, and integration with platforms such as SageMaker or Foundry.
•    Resolves application programming analysis problems of moderate to complex scope within procedural guidelines; may seek assistance from the supervisor or more skilled programmers/analysts on unusual or especially complex issues that cross multiple functional/technology areas.
•    Works on complex data- and analytics-centric problems of moderate impact that require in-depth analysis and judgment to obtain results or solutions.
•    Plans work to meet assigned general objectives; progress is reviewed upon completion, and solutions may provide an opportunity for creative/non-standard approaches.
•    Communicates recommendations orally and in writing.
•    Mentors and guides less experienced colleagues.
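For context on "enabling feature pipelines," the work usually means turning raw operational records into model-ready inputs. A minimal, stdlib-only sketch under invented assumptions (the field names and the 365-day threshold are hypothetical, not PG&E's schema or policy):

```python
from datetime import date

def build_features(inspection: dict, as_of: date) -> dict:
    """Derive model-ready features from a raw vegetation-inspection record.

    The input schema here is invented for illustration; a production
    pipeline would pull records from the lakehouse and validate them.
    """
    last_inspected = date.fromisoformat(inspection["last_inspection_date"])
    days_since = (as_of - last_inspected).days
    return {
        "asset_id": inspection["asset_id"],
        "days_since_inspection": days_since,
        # Simple density feature: trees observed per circuit mile.
        "tree_density_per_mile": inspection["tree_count"] / inspection["span_miles"],
        "overdue": days_since > 365,  # hypothetical compliance threshold
    }

# Example record (values are made up)
features = build_features(
    {
        "asset_id": "A-100",
        "last_inspection_date": "2023-01-15",
        "tree_count": 120,
        "span_miles": 2.0,
    },
    as_of=date(2024, 1, 15),
)
```

Outputs of a function like this would feed model training and inference flows on a platform such as SageMaker or Foundry.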

 

Qualifications

Minimum:

 

•    BA/BS in Computer Science, Management Information Systems, related field of study, or equivalent experience.
•    5 years of experience with data engineering/ETL ecosystems such as Palantir Foundry, Spark, Informatica, SAP BODS, or OBIEE.

 

Desired:

 

•    Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
•    Familiarity with business intelligence tools such as Power BI, Tableau, or Foundry for data visualization and reporting. 
•    Knowledge of software engineering principles such as unit testing, CI/CD, and source control.
•    Experience working with geospatial data and tools such as ArcGIS.
•    Experience building data migration pipelines in Informatica.
•    Experience with machine learning algorithm deployment.
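The "unit testing, CI/CD, and source control" bullet, in practice, means that transformations like the ones this role writes ship with tests that run in the pipeline. A small hedged example using Python's built-in unittest; the cleanup rule and the "CKT-" prefix are illustrative, not a real PG&E convention:

```python
import unittest

def normalize_circuit_id(raw: str) -> str:
    """Illustrative cleanup rule: trim whitespace, uppercase, and strip a
    hypothetical legacy 'CKT-' prefix so IDs join consistently across systems."""
    cleaned = raw.strip().upper()
    if cleaned.startswith("CKT-"):
        cleaned = cleaned[len("CKT-"):]
    return cleaned

class NormalizeCircuitIdTest(unittest.TestCase):
    # Run with: python -m unittest <module>
    def test_strips_prefix_and_whitespace(self):
        self.assertEqual(normalize_circuit_id("  ckt-0421 "), "0421")

    def test_plain_id_unchanged(self):
        self.assertEqual(normalize_circuit_id("0421"), "0421")
```

A CI/CD setup would run tests like these on every commit before the transformation is promoted.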

Other facts

Tech stack
Data Engineering, ETL, Informatica, AWS, Snowflake, Machine Learning, Data Visualization, ArcGIS, Salesforce, SAP, DynamoDB, S3, Feature Pipelines, Cloud-Native, Big Data, Data Lakehouse

About Pacific Gas And Electric Company

Pacific Gas and Electric Company, incorporated in California in 1905, is one of the largest combination natural gas and electric utilities in the United States. Based in San Francisco, the company is a subsidiary of PG&E Corporation.

There are approximately 20,000 employees who carry out Pacific Gas and Electric Company's primary business—the transmission and delivery of energy. The company provides natural gas and electric service to approximately 15 million people throughout a 70,000-square-mile service area in northern and central California.

Fast Facts

Service area stretches from Eureka in the north to Bakersfield in the south, and from the Pacific Ocean in the west to the Sierra Nevada in the east

141,215 circuit miles of electric distribution lines and 18,616 circuit miles of interconnected transmission lines

42,141 miles of natural gas distribution pipelines and 6,438 miles of transportation pipelines

5.1 million electric customer accounts

4.3 million natural gas customer accounts

Team size: 10,001+ employees
Industry: Utilities

What you'll do

  • The Data Engineer will lead the development of a cloud-native data platform for vegetation management operations and regulatory compliance. Responsibilities include designing and implementing ELT pipelines and collaborating with cross-functional teams to deliver analytics-ready datasets.


Frequently Asked Questions

What does Pacific Gas And Electric Company pay for a Data Engineer, Senior?

Pacific Gas And Electric Company offers a competitive compensation package for the Data Engineer, Senior role. The salary range is USD 122k - 194k per year. Apply through Clera to learn more about the full compensation details.

What does a Data Engineer, Senior do at Pacific Gas And Electric Company?

As a Data Engineer, Senior at Pacific Gas And Electric Company, you will lead the development of a cloud-native data platform for vegetation management operations and regulatory compliance. Responsibilities include designing and implementing ELT pipelines and collaborating with cross-functional teams to deliver analytics-ready datasets.

Why join Pacific Gas And Electric Company as a Data Engineer, Senior?

Pacific Gas And Electric Company is a leading Utilities company. The Data Engineer, Senior role offers competitive compensation.

Is the Data Engineer, Senior position at Pacific Gas And Electric Company remote?

The Data Engineer, Senior position at Pacific Gas And Electric Company is based in Oakland, California, United States. The role follows a hybrid work model: employees report to the Oakland office at least one to two days per week, with the remaining days worked remotely depending on business needs.

How do I apply for the Data Engineer, Senior position at Pacific Gas And Electric Company?

You can apply for the Data Engineer, Senior position at Pacific Gas And Electric Company directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process. You can also learn more about Pacific Gas And Electric Company on their website.