Clera - Your AI talent agent
Blue Cross of Idaho

Principal Data Architect

full-time • Bozeman • $147k - $192k

Summary

Location

Bozeman

Salary

$147k - $192k

Type

full-time

Experience

10+ years

Company links

Website · LinkedIn

About this role

Blue Cross of Idaho is seeking a Principal Data Architect to join our Data Strategy & Engineering team's Cloud Enterprise Data Platform (EDP) journey. As we modernize and optimize our Data & Analytics program by consolidating legacy data platforms onto a re-designed AWS & Snowflake foundation, we will enable a suite of leading capabilities to support the full spectrum of business needs with key cloud and big data technologies, including AI/ML, streaming data, data lake and data warehouse, and both self-service and delivered reporting. This important role will help lead our EDP architecture, roadmap, and integration strategy, as well as play a key role in building these capabilities.

We'll look to you to demonstrate a robust track record of leading data engineering and platform enablement projects and enhancement efforts on Snowflake. As a subject matter expert on Snowflake capabilities, services, tools, and best practices for data engineering, data warehousing, and data delivery, you will help drive our key initiatives and high-value projects, including the migration of our existing on-prem data warehouse platforms (Data Vault/WhereScape) onto Snowflake. You'll also integrate with operational systems hosted in AWS, in Azure, as SaaS, and on-prem, using the best-fit integration technologies and methodologies. Working with an internal data platform engineering/enablement team augmented by third-party onshore/offshore project resources, you'll also partner very closely with Data Governance, Product, Security, IT engineering teams, and other business teams to build secure and sustainable solutions.

Location: Strong preference for a flexible hybrid arrangement (onsite at the Meridian, Idaho campus plus local work-from-home); there may be opportunity for fully remote work from a mutually acceptable location. #LI-Remote; #LI-Hybrid

Responsibilities:

  • Define multi-tenant enterprise data architecture for the platforms

  • Lead data architecture practice and represent Data Engineering at the enterprise level

  • Technical subject matter expert in the data ecosystem, including AWS/Snowflake, providing input into architecture, platforms, and development strategies and methodologies – includes mentorship of engineering and platform teams via design reviews, code reviews, etc.

  • Define data engineering and data platform integration frameworks and standards. Mentor, coach, and build a learning program to onboard new team members quickly and efficiently and ensure standards are understood and followed.

  • Facilitate cross-team collaboration for defining and building enterprise data management architecture from principles to tools, oversee cross-functional adoption of the new architecture, and enable a new level of engineering efficiency when working with data.

  • Direct the strategy and implementation for migration from the existing data warehouse platforms onto the new Enterprise Data Platform (EDP) on AWS/Snowflake.

  • Lead technical direction of the team, driving the necessary changes and recommending appropriate technology choices working collaboratively with Architecture, Platform, DevOps, Security, and Project teams; influence technical direction with expert input into project decisions.

  • Lead the shift towards DevSecOps processes for the Data & Analytics delivery functions, emphasizing continuous integration, release management, and automated testing to maximize development agility and improve time to market.

  • As the data platform product manager, drive the data platform technical roadmap to prioritize and build new features that serve the business teams and standardize templates for technical work

  • Interface with key business functions (e.g., Compliance, Actuarial, Marketing, Operations, Clinical) and IT Analysis teams to assess business functional requirements and translate them into data and integration requirements.

  • Guide onsite/onshore/offshore technical resources from consulting partners, communicate architecture standards and best practices, establish a high-impact development process, and drive excellence in all deliverables.

  • Establish daily cadence with project team to prioritize and execute work items through adoption of Agile principles and processes.

  • Manage relationships with external vendors to determine technical competence and identify integration opportunities.

  • Be a hands-on practitioner for end-to-end delivery of AWS/Snowflake platforms, capabilities, and content.

Success Factors:

  • Ability to understand, drive, and deliver technology solutions in AWS and Snowflake.

  • Demonstrated understanding of data architecture and components, especially for data warehousing, including Snowflake, dbt data transformations, Airflow, data cataloging tools like Alation, integration platforms like Boomi/Fivetran, BI tools like Tableau/Power BI/Sigma, and AI/ML tools like Dataiku/Cortex/Snowflake Intelligence.

  • Demonstrated understanding of and experience with modern Data Warehouse concepts, including Data Lake/Data Warehouse/Data Mart implementations, SQL-based transformations, and ELT methodology.

  • Understanding of Data Mesh principles, treating data as a product and emphasizing domain-specific data ownership and governance. Ability to foster and work in decentralized ownership of data.

  • Solid understanding of DevSecOps and Agile methodology; comfortable with Jira, AWS DevOps, and GitLab.

  • Ability to lead project and technical teams and deliver solutions.

  • Ability to partner with Data Solution Architects in building APIs and other integrations required.

  • Ability to mentor Snowflake administrators in role/policy administration, replication, zero-copy cloning, etc., and to provide technical leadership to Snowflake-related project and support teams.

  • Demonstrated planning, coordination, and execution skills.

  • Ability to conduct POCs and produce demonstrable work products when adopting new features.

  • Ability to successfully collaborate with business partners.

  • Knowledgeable and skilled in Snowflake-related technologies, including platforms and tools for meeting business use cases for all forms of analytics and AI/ML functionality.

  • Ability to prioritize well, communicate clearly, and drive a high level of focus and excellence in deliverables. Communication is diplomatic, accurate, and concise, across internal and external organizations.

  • Ability to collaborate, propose solutions, own data platform items related to architecture reviews, and advise on the best course of action.

  • Conscientious, reliable, and inquisitive, with a keen desire to learn not just the knowledge necessary for the job but also the underlying reasons and drivers.

Required Education: Bachelor’s degree in Computer Science, Information Systems, or equivalent work experience is required (two years’ relevant work experience is equivalent to one year of college). Certifications such as Snowflake Pro and/or AWS-related certifications are highly preferred.

Required Experience: 10+ years of experience in the Data & Analytics field, including:

  • 2+ years working with hybrid (onshore + offshore) data teams.

  • 5+ years designing and building data storage capabilities, such as a data warehouse or data lake, including 3+ years in an AWS/Snowflake-specific architecture, development, or support capacity.

Overall experience should also span:

  • Understanding of large-scale computing solutions, including software design and development, and database architectures.

  • Implementation experience involving batch, streaming, event-driven, and API integrations on the data platform.

  • Knowledge of AWS/Snowflake cloud security, orchestration, management, and data management (in particular metadata management and data quality checks), including role-based and policy-based access controls.

  • Knowledge of FinOps, preferably involving Snowflake.

  • Ability to build data pipelines as part of integrations design.

  • Strong knowledge of ELT, ETL, CDC, API, messaging, streaming, and all forms of data ingestion techniques applicable to cloud-based Data & Analytics.

  • Strong skills in SQL

  • DevSecOps and Agile projects

  • NoSQL; relational and non-relational platforms; multiple file formats, including Parquet, Avro, JSON, XML, CSV, etc.

Preferred Experience:

  • dbt

  • Boomi, Mule or other integration platforms

  • Python, Snowpark, SageMaker, Snowflake Intelligence

  • Tableau, Power BI or Sigma

  • Data science platforms, such as Dataiku

  • Designing and building frameworks, API interfaces for efficient data extraction

  • Working in environments where data privacy and protection are critical (healthcare, HIPAA, etc.).

As of the date of this posting, a good faith estimate of the current pay range is: $147,983 - $192,969. The position is eligible for an annual incentive bonus (variable depending on company and employee performance). The pay range for this position takes into account a wide range of factors including, but not limited to, specific competencies, relevant education, qualifications, certifications, relevant experience, skills, seniority, performance, shift, travel requirements, internal equity, geography, business or organizational needs, and alignment with market data. At Blue Cross of Idaho, it is not typical for an individual to be hired at or near the top range for the position. Compensation decisions are dependent on factors and circumstances at the time of offer.

We offer a robust package of benefits including paid time off, paid holidays, community service and self-care days, medical/dental/vision/pharmacy insurance, 401(k) matching and non-contributory plan, life insurance, short and long term disability, education reimbursement, employee assistance plan (EAP), adoption assistance program and paid family leave program.

We will adhere to all relevant state and local laws concerning employee leave benefits, in line with our plans and policies.

Reasonable accommodations

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.

What you'll do

  • The Principal Data Architect will lead the architecture, roadmap, and integration strategy for the Cloud Enterprise Data Platform, focusing on migrating existing data warehouse platforms to Snowflake. This role involves collaborating with various teams to build secure and sustainable data solutions.

About Blue Cross of Idaho

Since 1945, we’ve taken our role as an Idaho-based health insurance company to heart. While the health insurance marketplace has experienced lots of change in recent years, our mission has remained the same. We’re driven to help connect Idahoans to quality healthcare that is affordable and to build strong networks and services with our members in mind. Blue Cross of Idaho is a leader in the health insurance industry, addressing healthcare costs and delivering exceptional customer experiences through innovative tools and services. We are dedicated to attracting and developing leaders who share our vision of transforming the healthcare experience of the communities we serve in Idaho and beyond.

Ready to join Blue Cross of Idaho?

Take the next step in your career journey

Frequently Asked Questions

What does Blue Cross of Idaho pay for a Principal Data Architect?

Blue Cross of Idaho offers a competitive compensation package for the Principal Data Architect role. The salary range is USD 148k - 193k per year. Apply through Clera to learn more about the full compensation details.

What does a Principal Data Architect do at Blue Cross of Idaho?

As a Principal Data Architect at Blue Cross of Idaho, you will lead the architecture, roadmap, and integration strategy for the Cloud Enterprise Data Platform, focusing on migrating existing data warehouse platforms to Snowflake. The role involves collaborating with various teams to build secure and sustainable data solutions.

Is the Principal Data Architect position at Blue Cross of Idaho remote?

The posting states a strong preference for a flexible hybrid arrangement (onsite at the Meridian, Idaho campus plus local work-from-home), with possible fully remote work from a mutually acceptable location. Contact the company through Clera for specific work arrangement details.

How do I apply for the Principal Data Architect position at Blue Cross of Idaho?

You can apply for the Principal Data Architect position at Blue Cross of Idaho directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process.
