Clera - Your AI talent agent


Attunement

Founding Engineer

On-site • San Francisco • $150K - $300K + early equity (meaningful ownership)

Summary

Location

San Francisco

Salary

$150K - $300K

Equity

Early equity (meaningful ownership)

Workplace

On-site

Experience

3+ years

Visa

Will sponsor

Company links

Website

This position is no longer available

This job listing has been removed by the employer and is no longer accepting applications.


About this role

Founding Engineer at Attunement (W24)

$150K - $300K

We automate compliance for behavioral health, saving clinicians hours and protecting millions in revenue.

San Francisco, CA, US

Full-time

Will sponsor

About Attunement

Attunement (YC W24) is engineering observability and accountability into AI for behavioral health. We’re building secure infrastructure that connects clinical data pipelines, model outputs, and audit systems so every AI-assisted decision in care is traceable and explainable. Clinics using Attunement stay continuously audit-ready, protect revenue, and set a new bar for transparency in digital mental-health tools.

About the role

Skills: Python, React, Software Security, Amazon Web Services (AWS)

Location: Onsite / San Francisco

Stage: Seed

Type: Full-time, founding team

attunement.ai: Engineering Observability and Accountability into AI for Behavioral Health

Attunement is building the compliance infrastructure for AI in behavioral health. Our goal is to make AI systems in clinical settings auditable, explainable, and accountable by design. Today, clinics using Attunement cut audit preparation time by 80% and documentation costs by 40%. We are building the technical standard for safety and integrity in AI-assisted behavioral care.

What You'll Do

As an early engineer, you’ll design and implement the technical foundation for compliant and reliable AI in healthcare.

You’ll build systems with our forward deployment engineer and product designer to make compliance and transparency operational.

Your work will include:

  • The core compliance intelligence layer: secure, explainable, and continuously learning from real clinical workflows.
  • Data pipelines that connect with EHRs and healthcare APIs (FHIR, HL7) to create real-time, auditable feedback loops.
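To make the "auditable feedback loop" idea concrete: one common building block for audit systems like the ones described here is an append-only, hash-chained log, where each entry commits to the previous one so that any after-the-fact edit is detectable. This is an illustrative sketch only (the `AuditLog` class and field names are hypothetical), not Attunement's actual implementation:

```python
import hashlib
import json

class AuditLog:
    """Append-only, hash-chained log: each entry's hash covers the previous
    entry's hash, so modifying any past event breaks the chain."""

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(event, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        """Recompute the chain from the start; False means tampering."""
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["event"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.append({"model": "note-summarizer-v2", "input_id": "obs-123", "decision": "flag"})
log.append({"model": "note-summarizer-v2", "input_id": "obs-124", "decision": "pass"})
print(log.verify())  # True
log.entries[0]["event"]["decision"] = "pass"  # simulate tampering
print(log.verify())  # False
```

The same property is what lets a clinic demonstrate to an auditor that its record of AI-assisted decisions has not been altered since the decisions were made.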

You Might Be Right for This

You’ve built production systems end-to-end, backend to frontend, in security-sensitive or regulated environments (HIPAA, SOC 2, or similar).

You’ve worked with healthcare data standards (FHIR, HL7, or EHR integrations) and understand the nuance of data lineage, auditability, and interoperability.

You have experience with LLMs or ML Ops, particularly in designing explainability, safety, or audit systems around AI models.

You’re fluent in React / Next.js and Python / FastAPI (or equivalent frameworks), with strong fundamentals in database architecture, API design, and observability.

You care deeply about reliability, data integrity, and user trust.

(Bonus) You have a background or strong interest in clinical psychology, AI safety, or human-centered systems design, and you want to build software that genuinely improves human wellbeing.
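For candidates less familiar with the healthcare data standards mentioned above: FHIR resources are plain JSON documents, and "data lineage" work often starts with pulling the provenance-relevant fields out of them. A minimal sketch (the `extract_lineage` helper and the sample resource are invented for illustration; the LOINC code 44261-6 is the PHQ-9 total score):

```python
import json

# Minimal FHIR R4 Observation (illustrative, not real patient data)
observation_json = json.dumps({
    "resourceType": "Observation",
    "id": "obs-123",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "44261-6",
                         "display": "PHQ-9 total score"}]},
    "subject": {"reference": "Patient/pt-42"},
    "valueInteger": 14,
})

def extract_lineage(raw: str) -> dict:
    """Pull the fields an audit trail would record: what was measured,
    for whom, and under which coding system."""
    obs = json.loads(raw)
    coding = obs["code"]["coding"][0]
    return {
        "resource": f'{obs["resourceType"]}/{obs["id"]}',
        "code": f'{coding["system"]}|{coding["code"]}',
        "display": coding["display"],
        "subject": obs["subject"]["reference"],
        "value": obs.get("valueInteger"),
        "status": obs["status"],
    }

print(extract_lineage(observation_json)["code"])  # http://loinc.org|44261-6
```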

Why this matters

This role shapes how AI systems are integrated into healthcare. You’ll collaborate with a founding team with backgrounds in neuroscience, AI safety, and clinical psychology to define the technical and ethical standards for responsible AI in clinical environments. You’ll have meaningful ownership, early equity, and the opportunity to influence not only the product architecture but also the principles that govern how AI supports human decision-making in care.

Technology

Attunement (YC W24) is building the observability and accountability layer for AI in behavioral health: real-time infrastructure that makes model decisions explainable, auditable, and compliant by design. Engineers here work at the intersection of ML ops, healthcare data, and human-centered safety, defining how trustworthy AI is built and deployed in clinical systems.




Frequently Asked Questions

What does Attunement pay for a Founding Engineer?

Attunement offers a competitive compensation package for the Founding Engineer role: USD 150K - 300K per year, plus early equity (meaningful ownership). Apply through Clera to learn more about the full compensation details.

What does a Founding Engineer do at Attunement?

As a Founding Engineer at Attunement, you will design and implement the technical foundation for compliant and reliable AI in healthcare; build systems with the forward deployment engineer and product designer to make compliance and transparency operational; and develop the core compliance intelligence layer, which is secure, explainable, and continuously learning from real clinical workflows.

Is the Founding Engineer position at Attunement remote?

The Founding Engineer position at Attunement is based in San Francisco, United States and is on-site. Contact the company through Clera for specific work arrangement details.

How do I apply for the Founding Engineer position at Attunement?

You can apply for the Founding Engineer position at Attunement directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process.
© 2026 Clera Labs, Inc.