Clera - Your AI talent agent
Codvo.ai

Databricks Solution Architect (Remote)

Full-time • Pune

Summary

Location

Pune

Type

full-time

Experience

10+ years

Company links

Website | LinkedIn

About this role

Job Title: Databricks Solution Architect
Experience: 10+ years

About Us

At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.

About the role

We are seeking a highly experienced Solutions Architect to lead the design and delivery of end-to-end data and analytics solutions on the Databricks platform. You will translate complex business needs into scalable, secure, and cost-efficient data lakehouse architectures, collaborate with cross-functional teams, and guide customers from concept through implementation and adoption.

What you'll do (Responsibilities)

  • Engage with business stakeholders to understand goals, data sources, and analytic use cases; translate them into a holistic Databricks-based solution.
  • Design scalable data lakehouse architectures on Databricks (Delta Lake, Databricks SQL, Unity Catalog, Delta Live Tables) that support data ingestion, cleansing, modeling, governance, security, and analytics.
  • Lead technical architecture decisions and produce high-quality artifacts (reference architectures, solution blueprints, data models, data governance models, and integration plans).
  • Architect data pipelines end-to-end (ingestion, transformation, storage, cataloging) with best practices for reliability, observability, and cost optimization.
  • Enable data science and ML workflows on Databricks (MLflow, feature store, notebooks, AutoML) and design end-to-end MLOps strategies.
  • Ensure data governance, security, and compliance (IAM, encryption, Unity Catalog, data masking, lineage, access controls).
  • Collaborate closely with data engineers, data scientists, software engineers, and DevOps to deliver production-ready solutions; implement CI/CD for data and ML pipelines.
  • Lead customer-facing activities: workshops, solution demos, proofs of concept, and responses to RFPs/RFIs; provide strategic guidance on platform adoption and ROI.
  • Mentor and coach junior architects and engineers; develop training materials and run knowledge-sharing sessions.
  • Monitor performance, optimize SQL and Spark workloads, manage cluster configurations, and drive cost/performance improvements.

What you'll bring (Required qualifications)

  • 8+ years of experience in solutions/enterprise architecture or senior data engineering roles; 3+ years of hands-on experience with the Databricks platform and Spark-based architectures.
  • Deep expertise in Databricks components: Delta Lake, Unity Catalog, Databricks SQL, Delta Live Tables, notebooks, and orchestration patterns.
  • Strong cloud experience (AWS, Azure, or GCP) with data storage and compute services (e.g., S3, Blob Storage, ADLS, GCS, Redshift, BigQuery, Synapse, EMR/Databricks on cloud).
  • Proficiency in data integration and orchestration tools (e.g., Apache Airflow, dbt, Kafka, Spark Structured Streaming).
  • Advanced SQL and programming skills (Python or Scala); ability to prototype and review data pipelines, models, and analytics solutions.
  • Excellent communication and stakeholder management skills; ability to present complex technical concepts to both technical and non-technical audiences.
  • Experience delivering large-scale data lakehouse migrations/transformations, performance tuning, and cost optimization.
  • Databricks certification(s) or equivalent demonstrable expertise; willingness to obtain relevant certifications if not already held.

Preferred qualifications

  • Experience with ML and MLOps on Databricks (MLflow, feature stores, model registry, CI/CD for ML).
  • Domain expertise in industries such as financial services, healthcare, retail, or telecommunications.
  • Familiarity with data governance, privacy regulations, and security frameworks (e.g., GDPR, HIPAA, SOC 2).
  • Familiarity with real-time data processing and streaming architectures.
  • Prior experience in pre-sales or solutioning for customers, including building compelling ROI stories and technical demos.

About you (soft skills and capabilities)

  • Strategic thinker with a hands-on mindset; comfortable operating at both business and technical levels.
  • Strong analytical, problem-solving, and decision-making capabilities.
  • Collaborative team player who can lead without authority and influence stakeholders.
  • Comfortable working in a fast-paced, client-facing environment with travel as needed.

How to apply

  • Submit your resume/CV and a brief cover letter outlining your Databricks projects and impact.
  • Include links to relevant work (e.g., public case studies, GitHub repositories, or portfolio demos) if available.
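To make the pipeline-architecture responsibility above concrete, here is a minimal sketch of a medallion-style (bronze/silver/gold) refinement flow in plain Python. On Databricks this would typically be implemented with PySpark and Delta Lake tables; here plain dicts stand in for tables so the staging logic is easy to follow, and all table and field names (`user_id`, `amount`, etc.) are illustrative, not from a real system.

```python
# Minimal medallion-architecture sketch: bronze (raw) -> silver (cleansed)
# -> gold (business aggregate). Plain Python stands in for PySpark/Delta;
# the staging pattern is the same on Databricks.

from datetime import datetime

def to_silver(bronze_rows):
    """Cleanse raw (bronze) rows: drop malformed records, normalize types."""
    silver = []
    for row in bronze_rows:
        if not row.get("user_id") or row.get("amount") is None:
            continue  # a real pipeline would quarantine these for review
        silver.append({
            "user_id": str(row["user_id"]),
            "amount": float(row["amount"]),
            "ts": datetime.fromisoformat(row["ts"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleansed rows into a business-level (gold) summary."""
    totals = {}
    for row in silver_rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + row["amount"]
    return totals

bronze = [
    {"user_id": 1, "amount": "10.5", "ts": "2024-01-01T09:00:00"},
    {"user_id": 1, "amount": 4.5, "ts": "2024-01-01T10:00:00"},
    {"user_id": None, "amount": 99, "ts": "2024-01-01T11:00:00"},  # malformed
]

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'1': 15.0}
```

The point of the layering is that each stage has a single, testable contract: silver guarantees clean, typed records; gold guarantees business-ready aggregates.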

What you'll do

  • Lead the design and delivery of end-to-end data and analytics solutions on the Databricks platform, translating complex business needs into scalable, secure, and cost-efficient data lakehouse architectures. Responsibilities include architecting data pipelines, enabling data science/ML workflows, ensuring data governance, and leading customer-facing activities such as workshops and solution demos.
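The CI/CD-for-ML work mentioned in this role often reduces to a promotion gate: a candidate model is moved to production only if it beats the current production model on a held-out metric. A minimal, library-free sketch of that gate follows; in a real Databricks setup MLflow's model registry would track versions and stages, and the names and thresholds below are illustrative assumptions.

```python
# Minimal model-promotion gate, the core of CI/CD for ML pipelines.
# A plain dict stands in for a model registry; all names are illustrative.

def promote_if_better(registry, candidate_name, candidate_auc, min_gain=0.01):
    """Promote the candidate to 'production' only if it beats the current
    production model's AUC by at least `min_gain` (guards against churn
    from marginal, possibly noisy improvements)."""
    current = registry.get("production")
    if current is None or candidate_auc >= current["auc"] + min_gain:
        registry["production"] = {"name": candidate_name, "auc": candidate_auc}
        return True
    return False

registry = {"production": {"name": "churn-v3", "auc": 0.81}}

# A clearly better candidate is promoted:
promoted = promote_if_better(registry, "churn-v4", 0.84)
print(promoted, registry["production"]["name"])  # True churn-v4

# A marginal candidate is rejected, keeping production stable:
rejected = promote_if_better(registry, "churn-v5", 0.845)
print(rejected)  # False
```

The `min_gain` threshold is a design choice: it trades faster iteration for stability of the production model.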

About Codvo.ai

At Codvo.ai, we specialize in leveraging artificial intelligence, cloud, and data to solve complex business problems and drive innovation. We deliver solutions that not only meet but exceed your unique business needs, fostering smarter, more productive teams. Here's why our approach has earned widespread acclaim from our clients:

67 Customer NPS: Our Net Promoter Score is a testament to the high level of satisfaction and loyalty among our clients. It underscores our ability to deliver quality and value through our specialized services, making us a preferred partner for businesses looking to leverage AI and data for competitive advantage.

78 Employee NPS: The satisfaction and engagement of our team directly influence the quality of service we provide. Our high employee NPS signifies a motivated, dedicated team that's committed to excellence. This positive work culture ensures that we can deliver exceptional AI-first engineering and enterprise data application services to you.

Our approach goes beyond traditional software development; we're dedicated to partnering with you to harness the power of AI and data. The combination of our high trial and engagement success rates, extensive experience, and positive feedback from both clients and employees positions us as more than just a service provider. We're your trusted ally in navigating the complexities of today's digital landscape, committed to transforming your vision into a reality with cutting-edge AI and data solutions.

Ready to join Codvo.ai?

Take the next step in your career journey

Frequently Asked Questions

What does a Databricks Solution Architect (Remote) do at Codvo.ai?

As a Databricks Solution Architect (Remote) at Codvo.ai, you will lead the design and delivery of end-to-end data and analytics solutions on the Databricks platform, translating complex business needs into scalable, secure, and cost-efficient data lakehouse architectures. Responsibilities include architecting data pipelines, enabling data science/ML workflows, ensuring data governance, and leading customer-facing activities such as workshops and solution demos.

Is the Databricks Solution Architect (Remote) position at Codvo.ai remote?

The Databricks Solution Architect (Remote) position at Codvo.ai is based in Pune, India. Contact the company through Clera for specific work arrangement details.

How do I apply for the Databricks Solution Architect (Remote) position at Codvo.ai?

You can apply for the Databricks Solution Architect (Remote) position at Codvo.ai directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process.
© 2026 Clera Labs, Inc.