DE&A - Core - Advanced Data Engineering - Ab Initio
Full-time · Bangalore South

Summary

Location

Bangalore South

Type

Full-time


About this role

Job Title
Senior Ab Initio Developer – Data Engineering & Analytics (AI Integration & BI Enablement)
Location
Bangalore, Pune (Hybrid)
Role Summary
We are seeking a seasoned Ab Initio senior developer to design, build, and optimize enterprise-grade data pipelines and metadata-driven frameworks on the Ab Initio platform (GDE, Co>Operating System, EME, Metadata Hub, Conduct>IT). You will work hand-in-hand with the AI/ML and BI/Reporting teams to deliver trusted, well-governed, and performant data for analytics, machine learning, and executive dashboards. You will also lead integration patterns between Ab Initio and modern BI tools, automate data quality controls, and establish standards for observability, lineage, and governance.

Key Responsibilities
Ab Initio Engineering & Platform Excellence
  • Design and implement ETL/ELT pipelines using Ab Initio GDE, Co>Operating System, Conduct>IT, and EME with a focus on scalability, reliability, and maintainability.
  • Develop reusable components, graphs, and plans; enforce coding standards, modularization, and metadata-driven designs.
  • Configure and maintain scheduling, parameterization, error handling, restartability, and performance tuning (parallelism, partitioning, vectorization).
Data Quality, Governance & Lineage
  • Implement DQ rules, profiling, reconciliation checks, and threshold-based alerts within Ab Initio (e.g., Data Quality/Metadata Hub).
  • Integrate with data catalog/lineage tools; ensure EME version control, audit trails, and regulatory compliance (PII handling, encryption, masking).
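The threshold-based reconciliation checks described above can be illustrated with a minimal sketch in plain Python. This is not Ab Initio code; the function and parameter names (`check_reconciliation`, `tolerance`) are hypothetical and only show the general pattern of comparing source and target counts against an alert threshold:

```python
# Illustrative sketch of a threshold-based reconciliation check,
# similar in spirit to the DQ alerting described above.
# Names here are hypothetical, not part of any Ab Initio API.

def check_reconciliation(source_count: int, target_count: int,
                         tolerance: float = 0.01) -> dict:
    """Compare source vs. target row counts and raise an alert flag
    when the relative drift exceeds the tolerance (default 1%)."""
    if source_count == 0:
        raise ValueError("source_count must be non-zero")
    drift = abs(source_count - target_count) / source_count
    return {
        "source": source_count,
        "target": target_count,
        "drift": drift,
        "alert": drift > tolerance,
    }

# 0.15% drift stays under the 1% threshold, so no alert is raised.
result = check_reconciliation(1_000_000, 998_500)
```

In production such a check would typically run inside the pipeline itself (or in Metadata Hub DQ rules), with breaches routed to an alerting channel rather than returned as a dict.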
Integration with AI & BI
  • Partner with AI engineers and data scientists to provision feature-ready datasets (batch/near-real-time) and standardize feature-store and MLOps handoffs.
  • Establish integration patterns between Ab Initio and BI tools (e.g., Power BI, Tableau, Qlik, Looker) via warehouse/semantic layers, materialized views, and data marts.
  • Optimize data refresh SLAs and semantic consistency for downstream BI; design CDC/streaming ingestion (Kafka, CDC tools) where applicable.
Architecture & Cloud/Data Platforms
  • Contribute to data architecture decisions: warehouse/lakehouse design, star/snowflake schemas, conformed dimensions, and SCD strategies.
  • Orchestrate data movement across on-prem and cloud platforms (e.g., Azure, AWS, GCP) while maintaining cost, performance, and security posture.
  • Evaluate and implement APIs, microservices, or data product contracts for cross-team consumption.
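As a rough illustration of the SCD strategies mentioned above, an SCD Type 2 update (expire the current dimension record and open a new dated version) can be sketched in Python. The record layout and function name are hypothetical; in practice this logic would live in an Ab Initio graph or a warehouse MERGE statement:

```python
from datetime import date

# Hypothetical SCD Type 2 sketch: each version of a dimension row
# carries effective dates and a current-record flag.

def scd2_upsert(history: list, key: str, new_attrs: dict,
                as_of: date) -> list:
    """Expire the open record for `key` if its attributes changed,
    then append a new current version."""
    for row in history:
        if row["key"] == key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return history           # no change: keep as-is
            row["is_current"] = False    # expire the old version
            row["end_date"] = as_of
    history.append({
        "key": key, "attrs": new_attrs,
        "start_date": as_of, "end_date": None, "is_current": True,
    })
    return history

rows = []
rows = scd2_upsert(rows, "cust-1", {"city": "Pune"}, date(2024, 1, 1))
rows = scd2_upsert(rows, "cust-1", {"city": "Bangalore"}, date(2024, 6, 1))
# rows now holds two versions: the expired Pune record and the
# current Bangalore record.
```

The same pattern maps onto surrogate keys and CDC feeds: each change event drives one expire-and-insert cycle against the dimension table.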
Stakeholder Management & Delivery
  • Work with product owners, solution architects, and business analysts to refine requirements and translate them into technical designs.
  • Lead code reviews, mentor junior engineers, and drive DevOps practices (CI/CD pipelines for Ab Initio artifacts, automated testing).
  • Produce high-quality documentation (design specs, runbooks, lineage maps, SLAs) and contribute to operational readiness.

Required Skills & Experience
  • 8–12+ years of hands-on experience in Ab Initio (GDE, Co>Operating System, EME, Metadata Hub, Conduct>IT).
  • Deep expertise in ETL performance tuning, parallelism, graph optimization, error handling, and restartability.
  • Strong SQL skills and data modeling (dimensional/star/snowflake, SCD, surrogate keys, CDC).
  • Experience integrating Ab Initio outputs with BI tools (e.g., Power BI/Tableau/Qlik), data warehouses (e.g., Azure Synapse, Snowflake, BigQuery, Redshift), and lakehouses (e.g., Delta Lake).
  • Solid understanding of data quality, lineage, metadata management, and governance controls.
  • Familiarity with cloud services (Azure/AWS/GCP), storage formats (Parquet/ORC/Avro), and security (KMS, encryption, RBAC).
  • Exposure to AI/ML data needs: feature engineering workflows, dataset versioning, reproducibility, and MLOps handoff patterns.
  • Experience with scripting (Unix shell, Python) and DevOps/CI-CD (Git, pipelines, artifacts).
  • Strong communication and stakeholder skills; ability to lead design reviews and mentor.

Nice-to-Have (Preferred)
  • Experience with Kafka, Debezium, or other streaming/CDC tools.
  • Hands-on with feature stores (e.g., Feast) or ML pipelines (Azure ML, SageMaker).
  • Knowledge of data catalog tools (e.g., Purview, Collibra, Alation) and lineage integration.
  • Familiarity with policy-as-code and data contracts for governed integrations.
  • Exposure to containerization (Docker/Kubernetes) for ancillary services.

At Zensar, we’re “experience-led everything”. We are committed to conceptualizing, designing, engineering, marketing, and managing digital solutions and experiences for over 130 leading enterprises. We are a company driven by a bold purpose: Together, we shape experiences for better futures. Whether for our clients, our people, or the world around us, this belief powers everything we do. At the heart of our culture is ONE with Client - a set of four core values that reflect who we are and how we work: One Zensar, Nurturing, Empowering, and Client Focus.

Part of the $4.8 billion RPG Group, we’re a community of 10,000+ innovators across 30+ global locations, including Milpitas, Seattle, Princeton, Cape Town, London, Zurich, Singapore, and Mexico City. Explore Life at Zensar and join us to Grow. Own. Achieve. Learn. and become the best version of yourself.

We believe the best work happens when individuality is celebrated, growth is encouraged, and well-being is prioritized. We are an equal employment opportunity (EEO) and affirmative action employer, committed to creating an inclusive workplace. All qualified applicants will be considered without regard to race, creed, color, ancestry, religion, sex, national origin, citizenship, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status.

Other facts

Tech stack
Ab Initio, ETL, Data Quality, Data Governance, SQL, Data Modeling, Cloud Services, AI Integration, BI Tools, Data Architecture, DevOps, Scripting, Performance Tuning, Metadata Management, Streaming, MLOps, Containerization

About Zensar

Zensar stands out as a premier technology consulting and services company, embracing an ‘experience-led everything’ philosophy. We are creators, thinkers, and problem solvers passionate about designing digital experiences that are engineered into scale-ready products, services, and solutions to deliver superior engagement to high-growth companies. This full lifecycle capability – from experience to engineering to engagement – is what makes us unique. This integrated approach also means that we harness the power of technology, creativity, and insight to deliver impact — ensuring our work focuses not just on technology but also on the people who use it.

Part of the $4.8 billion RPG Group, Zensar is headquartered in Pune, India. Our 10,000+ employees work across 30+ locations worldwide, including Seattle, Princeton, Cape Town, London, Singapore, and Mexico City. As an organization, we are diverse and multi-dimensional and unite across geographies and skill sets to deliver products and services that are value-driven, environmentally conscious, and human-centered.

To know more, visit us at www.zensar.com.

Team size: 10,001+ employees
LinkedIn: Visit
Industry: IT Services and IT Consulting
Founding Year: 2001

What you'll do

  • The role involves designing, building, and optimizing data pipelines on the Ab Initio platform while collaborating with AI/ML and BI teams. Responsibilities include implementing data quality controls, establishing governance standards, and leading integration patterns with BI tools.

Ready to join Zensar?

Take the next step in your career journey

Frequently Asked Questions

What does a DE&A - Core - Advanced Data Engineering - Ab Initio do at Zensar?

As a DE&A - Core - Advanced Data Engineering - Ab Initio at Zensar, you will design, build, and optimize data pipelines on the Ab Initio platform while collaborating with AI/ML and BI teams. Responsibilities include implementing data quality controls, establishing governance standards, and leading integration patterns with BI tools.

Why join Zensar as a DE&A - Core - Advanced Data Engineering - Ab Initio?

Zensar is a leading IT Services and IT Consulting company.

Is the DE&A - Core - Advanced Data Engineering - Ab Initio position at Zensar remote?

The DE&A - Core - Advanced Data Engineering - Ab Initio position at Zensar is based in Bangalore South, Karnataka, India; the posting also lists Pune as a location with a hybrid arrangement. Contact the company through Clera for specific work arrangement details.

How do I apply for the DE&A - Core - Advanced Data Engineering - Ab Initio position at Zensar?

You can apply for the DE&A - Core - Advanced Data Engineering - Ab Initio position at Zensar directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process. You can also learn more about Zensar on their website.