The Data Operations Engineer’s primary responsibility is to support all data pipelines and maintain their service level agreements (SLAs) by monitoring our infrastructure, data stores, and ETL/ELT tasks. This role works to improve the reliability, scalability, and efficiency of data pipelines developed by our data engineering and analytics teams. Other critical responsibilities include assisting during outages by performing root cause analyses (RCAs) and conducting impact analysis during changes. This role will also support our DBA with database administrative tasks and assist with Moda's third-party applications.
• Assist our DBA with monitoring, maintaining, and administering Moda's SQL/NoSQL data stores in our OLTP/OLAP environments
• Monitor, provision, and maintain our ETL/ELT infrastructure using a combination of Ansible, Terraform, and Docker
• Document ETL/ELT outages via JIRA; assist with RCA, impact analysis, and resolution
• Deploy, test, run, and schedule ETL/ELT pipelines via Airflow
• Assist with data profiling of new and existing data sources
• Assist with administering and troubleshooting third-party applications such as Tableau, Looker, and ANT
Necessary qualifications for success:
• Bachelor’s degree or equivalent experience; a degree in an engineering or computer science field is preferred
• Experience in at least one scripting language (shell, Python, etc.)
• Strong experience with SQL for data profiling
• Experience with at least one ETL/ELT orchestration framework (Airflow, Luigi, Pinball, etc.)
• Experience in a Linux environment on AWS with provisioning, configuration management, and code deployment (Ansible, Terraform)
• Demonstrated ability to adapt quickly, learn new skill sets, and understand operational challenges
• Excellent interpersonal skills, including relationship building with diverse, global, cross-functional teams
• Proven ability to troubleshoot and problem solve in complex systems