Summary:
The Data Operations Engineer is responsible for ensuring the reliability, quality, and performance of enterprise data systems. This role oversees daily operations of data pipelines, resolves data issues, monitors system health, and supports end-to-end data workflows. The position requires strong analytical thinking, technical skills, and a proactive approach to improving data processes and platform stability.
Key Responsibilities:
- Operate and monitor data pipelines, ETL/ELT processes, and cloud data platforms.
- Manage data quality validation, anomaly detection, and reconciliation.
- Handle issue resolution, system alerts, and on-call support as required.
- Collaborate with cross-functional teams on data improvements.
- Maintain documentation, runbooks, dashboards, and operational metrics.
- Ensure adherence to data governance, security, and compliance standards.
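A typical day-to-day task in this role is the reconciliation mentioned above: confirming that a load landed the expected number of records. A minimal sketch in Python, assuming a simple row-count comparison and an illustrative 1% tolerance (both are assumptions, not a prescribed process):

```python
# Hypothetical reconciliation check: compare source and target row
# counts after a pipeline run and flag drift beyond a tolerance.
# The 1% default tolerance is an illustrative assumption.

def reconcile(source_count: int, target_count: int, tolerance: float = 0.01) -> bool:
    """Return True when the target row count is within `tolerance`
    (as a fraction of the source count) of the source row count."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance

# Example: a load delivered 99,500 of 100,000 source rows (0.5% drift).
print(reconcile(100_000, 99_500))  # within the 1% tolerance -> True
print(reconcile(100_000, 98_000))  # 2% drift -> False
```

In practice the counts would come from warehouse queries and a failed check would raise an alert rather than print, but the comparison logic is the same.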
Qualifications:
Education:
- Bachelor's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field.
Experience:
- 13+ years of experience in Data Operations, Data Engineering, or a related technical role.
- Experience working with cloud platforms (GCP, AWS, Azure).
- Experience supporting ETL/ELT pipelines, data warehouses, or data lake platforms.
Technical Knowledge:
- Familiarity with SQL and data manipulation.
- Understanding of data modeling, ETL processes, and cloud-native data tools.
- Exposure to scripting (Python, Shell) is an advantage.
- Knowledge of monitoring tools, alerting systems, and workflow orchestration.
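The anomaly detection and monitoring knowledge listed above can be illustrated with a short, self-contained sketch: flagging a daily record count that deviates sharply from recent history. The sample data and the 3-sigma threshold are illustrative assumptions, not a specified tool or policy:

```python
# A minimal z-score anomaly check on daily record counts: flag today's
# count when it deviates from the trailing history by more than
# `threshold` standard deviations. Sample values are made up.
from statistics import mean, stdev

def is_anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    """Return True when `today` is more than `threshold` standard
    deviations away from the mean of `history`."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

daily_counts = [10_050, 9_980, 10_120, 9_900, 10_060]
print(is_anomalous(daily_counts, 10_010))  # within normal range -> False
print(is_anomalous(daily_counts, 4_200))   # sudden drop -> True
```

Production alerting systems apply the same idea with richer baselines (seasonality, trends), but a trailing-window z-score is a common starting point for pipeline volume monitoring.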