DataOps Engineer
Full-time · Singapore

About this role

As a DataOps Engineer, you will operate large-scale big data platforms across hybrid (on-premises and cloud) environments, enabling reliable analytics and data-driven use cases. You will work closely with data engineers, data scientists, infrastructure, security, and business stakeholders to ensure data quality, platform stability, and operational excellence. This role focuses on building, running, and optimizing production-grade data platforms and pipelines, with strong ownership of infrastructure, automation, reliability, and operations. 

Key Responsibilities


1. Manage On-Prem and Cloud Data Platforms

  • Maintain and support on-premises clusters, including compute, storage, networking, and system configurations.
  • Provision, configure, and manage cloud infrastructure, including AWS S3, EMR, Redshift, and RDS.
  • Monitor platform performance, capacity, and availability; implement operational alerts, monitoring, and runbooks.
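The capacity and alerting work in the bullets above can be illustrated with a minimal sketch: a pure threshold check that an alerting pipeline could call, plus a live check against a mounted filesystem. The 85% threshold and the path are placeholder values, not details from this role's actual runbooks.

```python
import shutil


def capacity_alert(used_bytes: int, total_bytes: int, threshold: float = 0.85) -> bool:
    """Return True when storage usage crosses the alert threshold."""
    return used_bytes / total_bytes >= threshold


def check_path(path: str = "/") -> bool:
    """Live check against a mounted filesystem (path is a placeholder)."""
    usage = shutil.disk_usage(path)
    return capacity_alert(usage.used, usage.total)
```

In practice a check like this would feed a monitoring system (Prometheus, CloudWatch, etc.) rather than be called ad hoc.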


2. Build, Operate & Support Data Pipelines

  • Develop and support ETL/ELT pipelines using PySpark and Airflow, ensuring reliable ingestion, transformation, and data loading.
  • Operate pipelines across hybrid environments, handling job execution, retries, basic performance tuning, and failure recovery.
  • Validate, clean, and standardize datasets; monitor pipeline health, data freshness SLAs, and failure patterns, and perform root cause analysis and remediation.
  • Support data storage platforms such as S3, PostgreSQL, Redshift, and MongoDB from an operational and platform perspective.
  • Automate DataOps and MLOps workflows using CI/CD pipelines (e.g., GitLab CI/CD, Jenkins).
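The job-execution, retry, and failure-recovery responsibilities above can be sketched as a small retry wrapper with exponential backoff. The function name and backoff values are illustrative; real pipelines would typically rely on the scheduler's built-in retry settings (e.g. Airflow task retries) rather than hand-rolled loops.

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def run_with_retries(task: Callable[[], T], max_attempts: int = 3,
                     base_delay: float = 1.0) -> T:
    """Run a pipeline task, retrying with exponential backoff on failure.

    Re-raises the last exception once all attempts are exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise
            # Back off: base_delay, 2*base_delay, 4*base_delay, ...
            time.sleep(base_delay * 2 ** (attempt - 1))
```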

3. Security, Compliance & Governance

  • Implement secure access controls (IAM, VPC, security groups), encryption, backups, and disaster recovery mechanisms.
  • Partner with infrastructure and security teams to ensure compliance with PDPA, GDPR, and internal governance policies.
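As one illustration of the least-privilege access controls described above, a minimal IAM policy might scope a role to read-only access on a single S3 bucket. The bucket name here is a hypothetical placeholder, not one used by this role:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyAnalyticsBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-analytics-bucket",
        "arn:aws:s3:::example-analytics-bucket/*"
      ]
    }
  ]
}
```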

Requirements

  • Degree in Computer Science, Software Engineering, Data Science, or equivalent experience.
  • 2–5+ years of hands-on experience in DevOps, Data Platform Engineering, or Infrastructure Engineering, supporting big data or analytics platforms.
  • Experience managing on-premises and cloud infrastructure, including compute, storage, and network configuration.
  • Hands-on experience with cloud and/or hybrid environments supporting big data pipelines, with Spark / PySpark and Airflow at an operational level (job execution, basic tuning, failure handling).
  • Knowledge of containerization and orchestration, including Docker fundamentals and Kubernetes (deployments, services, ingress, scaling, resource limits).
  • Strong understanding of networking fundamentals, including VPC design, routing, DNS, firewalls, and load balancing.
  • Knowledge of security concepts, such as IAM, access control, and compliance basics.
  • Experience with CI/CD pipelines and automation using tools such as GitLab CI/CD, Jenkins, or similar.
  • Basic understanding of MLOps concepts from a platform and infrastructure support perspective.
  • Strong troubleshooting and problem-solving skills in production environments.
  • Comfortable working with cross-functional teams (data engineers, ML engineers, security, product).
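The Kubernetes knowledge called out above (deployments, scaling, resource limits) typically shows up in manifests like the following sketch; the service name, image, and sizing are placeholder values:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: etl-worker              # hypothetical service name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: etl-worker
  template:
    metadata:
      labels:
        app: etl-worker
    spec:
      containers:
        - name: etl-worker
          image: registry.example.com/etl-worker:1.0   # placeholder image
          resources:
            requests:           # scheduler guarantees
              cpu: "500m"
              memory: 512Mi
            limits:             # hard caps enforced at runtime
              cpu: "1"
              memory: 1Gi
```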

Other facts

Tech stack
DataOps, Big Data, Cloud Infrastructure, AWS, PySpark, Airflow, ETL, CI/CD, Docker, Kubernetes, Networking, Security, Compliance, Troubleshooting, Automation, Data Quality

About Cygnify

Cygnify is an on-demand, plug & play TA team on a month-to-month subscription, delivering unlimited global hires with no placement fees.

Our Talent Acquisition as a Service (TAaaS) offers companies instant access to a fully managed team of recruitment experts, cutting-edge AI tools, and a 100M+ candidate database.

All our monthly plans are transparent and flexible, with no lock-ins, supporting all roles, levels, and locations globally.

Press Play to supercharge your Talent Acquisition: streamline hiring with a single partner across every location, leveraging our deep market expertise, extensive networks, and proven success in securing top talent.

Avoid the high costs of growing an in-house team and agency placement fees. We have it all in our plug & play TA solution.

Team size: 11-50 employees
Industry: Business Consulting and Services
Founding Year: 2024

What you'll do

  • Manage on-prem and cloud data platforms, ensuring platform stability and operational excellence; build, operate, and support data pipelines; and implement security and compliance measures.

Ready to join Cygnify?

Take the next step in your career journey

Frequently Asked Questions

What does a DataOps Engineer do at Cygnify?

As a DataOps Engineer at Cygnify, you will manage on-prem and cloud data platforms, ensuring platform stability and operational excellence, and you will build, operate, and support data pipelines while implementing security and compliance measures.

Why join Cygnify as a DataOps Engineer?

Cygnify is a Business Consulting and Services company founded in 2024, delivering Talent Acquisition as a Service (TAaaS) backed by cutting-edge AI tools and a 100M+ candidate database.

Is the DataOps Engineer position at Cygnify remote?

The DataOps Engineer position at Cygnify is based in Singapore. Contact the company through Clera for specific work arrangement details.

How do I apply for the DataOps Engineer position at Cygnify?

You can apply for the DataOps Engineer position at Cygnify directly through Clera. Click the "Apply Now" button above to start your application. Clera's AI-powered platform will help match your profile with this opportunity and guide you through the application process. You can also learn more about Cygnify on their website.