Clera - Your AI talent agent
Meta

AI Research Engineer, PAR Media

full-time • Menlo Park • $88.46/hour – $257,000/year

Summary

Location

Menlo Park

Salary

$88.46/hour – $257,000/year

Type

full-time

Experience

2-5 years

Company links

Website • LinkedIn

About this role

We are seeking AI Researchers to join the Product and Applied Research (PAR) Media group within Meta Superintelligence Labs (MSL). As a member of the PAR Media group, you will drive innovation in image and video understanding, generation, and narrative creation at an unprecedented scale. We own the research, development, and deployment of cutting-edge multimodal models across Meta AI, FoA, and the entire Meta creator and developer ecosystem. Our work directly powers product roadmaps with flexible, state-of-the-art solutions designed to lead, not follow. We partner closely with AI product teams across Meta to translate our research into impactful, real-world experiences. This means we're not just building technology; we're building the future of how people create, communicate, and connect. If you're passionate about advancing the future of AI-driven media experiences and eager to make a tangible impact on billions of users, we invite you to join us on this journey.

Responsibilities

  • Contribute to the training of next-generation multimodal foundation models, advance their capabilities in understanding, generation, and grounding, and enable them for downstream product use-cases
  • Support creative data sourcing, high-quality pre/mid/post-training data curation, and scale and optimize data pipelines for multimodal large language models (LLMs)
  • Lead, collaborate on, and execute research that pushes forward the state of the art in multimodal reasoning and generation, and prioritize research that can be directly applied to Meta’s product development


Minimum Qualifications

  • Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
  • 2+ years of industry research experience in LLM/NLP, computer vision, or related AI/ML models
  • Experience as a formal technical lead, leading major technical initiatives with cross-functional (XFN) impact, and/or influencing strategy across multiple teams
  • Skilled in model training, data, or inference & efficiency for image, video, and/or related multimodal models
  • Proficiency in media generation, understanding, and/or grounding
  • Programming experience in Python and hands-on experience with frameworks like PyTorch or Spark


Preferred Qualifications

  • Experience working on frontier-quality/state-of-the-art Large Media Models
  • Master's degree or PhD in Computer Science, AI/ML, or a relevant technical field
  • Demonstrated significant industry influence in the field of AI and/or published research in leading peer-reviewed conferences (e.g., ACL, NeurIPS, ICML, ICLR, AAAI, KDD, CVPR, ICCV)


$88.46/hour to $257,000/year + bonus + equity + benefits

What you'll do

  • Contribute to the training of next-generation multimodal foundation models, support data curation for large language models, and lead research initiatives that advance multimodal reasoning and generation applicable to Meta’s product development.

About Meta

Meta's mission is to build the future of human connection and the technology that makes it possible. Our technologies help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. For a full listing of our jobs, visit https://www.metacareers.com

Ready to join Meta?

Take the next step in your career journey

Frequently Asked Questions

What does Meta pay for an AI Research Engineer, PAR Media?

Meta offers a competitive compensation package for the AI Research Engineer, PAR Media role. The pay range is $88.46 per hour to $257,000 per year. Apply through Clera to learn more about the full compensation details.

What does an AI Research Engineer, PAR Media do at Meta?

As an AI Research Engineer, PAR Media at Meta, you will contribute to the training of next-generation multimodal foundation models, support data curation for large language models, and lead research initiatives that advance multimodal reasoning and generation applicable to Meta’s product development.

Is the AI Research Engineer, PAR Media position at Meta remote?

The AI Research Engineer, PAR Media position at Meta is based in Menlo Park, California, United States. Contact the company through Clera for specific work arrangement details.

How do I apply for the AI Research Engineer, PAR Media position at Meta?

You can apply for the AI Research Engineer, PAR Media position at Meta directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process.
© 2026 Clera Labs, Inc. • Terms • Privacy • Help

Join Clera's Talent Pool

Get matched with similar opportunities at top startups

This role is hosted on Meta's careers site.
Join our talent pool first to get notified about similar roles that match your profile.