Talent.com

Senior Data Engineer – DevOps [GitLab, Terraform]

First Citizens Bank • Raleigh, North Carolina, US

Overview

This is a remote role that may only be hired in the following locations: NC, AZ, TX.

We are seeking an experienced DevOps Engineer to design, build, and maintain CI/CD pipelines, infrastructure automation, and deployment workflows supporting our data engineering platform. This role focuses on infrastructure as code, configuration management, cloud operations, and enabling data engineers to deploy reliably and rapidly across AWS and Azure environments.

Responsibilities

CI/CD Pipeline & Deployment Automation

  • Design and implement robust CI/CD pipelines using Azure DevOps or GitLab; automate build, test, and deployment processes for data applications, dbt Cloud jobs, and infrastructure changes.
  • Build deployment orchestration for multi-environment (dev, QA, UAT, production) workflows with approval gates, rollback mechanisms, and artifact management.
  • Implement GitOps practices for infrastructure and application deployments; maintain version control and audit trails for all changes.
  • Optimize pipeline performance, reduce deployment times, and enable fast feedback loops for rapid iteration.
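
A minimal sketch of the kind of pipeline these bullets describe, written for GitLab CI (the stage names, environment names, and deploy script are illustrative assumptions, not taken from this posting):

```yaml
# Illustrative .gitlab-ci.yml: build/test/deploy with a manual approval
# gate before production. All job and script names are hypothetical.
stages:
  - build
  - test
  - deploy

build:
  stage: build
  script:
    - python -m build            # package the data application
  artifacts:
    paths:
      - dist/                    # artifact handed to later stages

test:
  stage: test
  script:
    - pytest

deploy_qa:
  stage: deploy
  environment: qa
  script:
    - ./scripts/deploy.sh qa

deploy_prod:
  stage: deploy
  environment: production
  when: manual                   # approval gate: a human triggers this job
  script:
    - ./scripts/deploy.sh production
```

The `when: manual` job is one common way to model an approval gate; rollback would typically re-run a deploy job against a previously built artifact.
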

Infrastructure as Code (IaC) & Cloud Operations

  • Design and manage Snowflake, AWS, and Azure infrastructure using Terraform; ensure modularity, reusability, and consistency across environments.
  • Provision and manage cloud resources across AWS and Azure.
  • Implement tagging strategies and resource governance; maintain Terraform state management and implement remote state backends.
  • Support multi-cloud architecture patterns and ensure portability between AWS and Azure where applicable.
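
As a sketch of the remote-state and tagging practices listed above, in Terraform's own HCL (the bucket, table, region, and tag values are hypothetical):

```hcl
# Hypothetical S3 remote state backend with DynamoDB state locking.
terraform {
  backend "s3" {
    bucket         = "example-terraform-state"
    key            = "data-platform/prod/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "example-terraform-locks"   # prevents concurrent applies
    encrypt        = true
  }
}

# Default tags applied to every AWS resource for governance and
# cost attribution, so individual resources cannot forget them.
provider "aws" {
  region = "us-east-1"
  default_tags {
    tags = {
      environment = "prod"
      owner       = "data-platform"
      managed_by  = "terraform"
    }
  }
}
```
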
Configuration Management & Infrastructure Automation

  • Deploy and manage Ansible playbooks for configuration management, patching, and infrastructure orchestration across cloud environments.
  • Utilize Puppet for infrastructure configuration, state management, and compliance enforcement; maintain Puppet modules and manifests for reproducible environments.
  • Automate VM provisioning, OS hardening, and application stack deployment; reduce manual configuration and ensure environment consistency.
  • Build automation for scaling, failover, and disaster recovery procedures.
Snowflake Cloud Operations & Integration

  • Automate Snowflake provisioning, warehouse sizing, and cluster management via Terraform; integrate Snowflake with CI/CD pipelines.
  • Implement Infrastructure as Code patterns for Snowflake roles, permissions, databases, and schema management.
  • Build automated deployment workflows for dbt Cloud jobs and Snowflake objects; integrate version control with Snowflake changes.
  • Monitor Snowflake resource utilization, costs, and performance; implement auto-suspend/auto-resume policies and scaling strategies.
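
One way to express warehouse sizing and auto-suspend/auto-resume as code is the community Snowflake Terraform provider (Snowflake-Labs/snowflake); the warehouse name and sizing values below are illustrative assumptions:

```hcl
# Sketch of a Snowflake warehouse managed via Terraform.
resource "snowflake_warehouse" "transform" {
  name              = "TRANSFORM_WH"
  warehouse_size    = "XSMALL"
  auto_suspend      = 60      # seconds idle before suspending (cost control)
  auto_resume       = true    # resume automatically on the next query
  min_cluster_count = 1
  max_cluster_count = 2       # allow scale-out under query concurrency
}
```

Keeping warehouse definitions in Terraform means sizing changes go through the same review and CI/CD flow as any other infrastructure change.
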
Python Development & Tooling

  • Develop Python scripts and tools for infrastructure automation, cloud operations, and deployment workflows.
  • Build custom integrations between CI/CD systems, cloud platforms, and Snowflake; create monitoring and alerting automation.
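
A small, self-contained example of the kind of Python tooling this covers: a retry helper with exponential backoff for flaky cloud API calls (the function and its parameters are illustrative, not part of any named library):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def retry(call: Callable[[], T], attempts: int = 3, base_delay: float = 1.0,
          sleep: Callable[[float], None] = time.sleep) -> T:
    """Retry a flaky call with exponential backoff (1s, 2s, 4s, ...).

    `sleep` is injectable so tests can capture delays instead of waiting.
    """
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            sleep(base_delay * (2 ** attempt))
    raise RuntimeError("unreachable")
```

Wrapping cloud SDK calls this way keeps transient throttling errors out of deployment logs without hiding genuine failures.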
Monitoring, Logging & Observability

  • Integrate monitoring and logging solutions (Splunk, Dynatrace, CloudWatch, Azure Monitor) into CI/CD and infrastructure stacks.
  • Build automated alerting for infrastructure health, deployment failures, and performance degradation.
  • Implement centralized logging for applications, infrastructure, and cloud audit trails; maintain log retention and compliance requirements.
  • Create dashboards and metrics for infrastructure utilization, deployment frequency, and change failure rates.
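
The deployment-frequency and change-failure-rate metrics mentioned above can be computed from plain deploy records; this Python sketch assumes a simple record shape (`day`, `failed`) that is illustrative, not a real API:

```python
from datetime import date

def deployment_metrics(deploys: list[dict]) -> dict:
    """Compute deploys per active day and change failure rate.

    Each record is a dict like {"day": date(...), "failed": bool}.
    """
    total = len(deploys)
    if total == 0:
        return {"deploys_per_day": 0.0, "change_failure_rate": 0.0}
    days = {d["day"] for d in deploys}              # distinct deploy days
    failures = sum(1 for d in deploys if d["failed"])
    return {
        "deploys_per_day": total / len(days),
        "change_failure_rate": failures / total,
    }
```

Feeding a rolling window of these records into a dashboard gives the trend lines for deployment frequency and change failure rate.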
Data Pipeline & Application Deployment

  • Support deployment of data processing jobs, Airflow DAGs, and dbt Cloud transformations through automated pipelines.
  • Implement blue-green or canary deployment patterns for zero-downtime updates to data applications.
  • Build artifact management workflows (Docker images, Python packages, dbt artifacts); integrate with Artifactory or cloud registries.
  • Collaborate with data engineers on deployment best practices and production readiness reviews.
Disaster Recovery & High Availability

  • Design backup and disaster recovery strategies for data infrastructure; automate backup provisioning and testing.
  • Implement infrastructure redundancy and failover automation using AWS/Azure native services.
Documentation & Knowledge Sharing

  • Maintain comprehensive documentation for infrastructure architecture, CI / CD workflows, and operational procedures.
  • Create runbooks and troubleshooting guides for common issues; document infrastructure changes and design decisions.
  • Establish DevOps best practices and standards; share knowledge through documentation, lunch-and-learns, and mentoring.
Qualifications

Bachelor's degree and 4 years of experience in data engineering, big data technologies, and cloud platforms; OR high school diploma or GED and 8 years of experience in data engineering, big data technologies, and cloud platforms.

Preferred:

Technical/Business Skills:

  • CI/CD tools: Azure DevOps Pipelines or GitLab CI/CD (hands-on pipeline development)
  • Infrastructure as Code: Terraform (AWS and Azure providers), production-grade experience
  • Configuration Management: Ansible and/or Puppet, with the ability to write playbooks/manifests and manage infrastructure state
  • Cloud platforms: AWS (EC2, S3, RDS, VPC, IAM, Lambda, Glue, Lake Formation) and Azure (VMs, App Services, Blob Storage, Cosmos DB, networking)
  • Python programming: scripting, automation, API integration, and tooling development
  • Snowflake: operational knowledge of warehouse management, cost optimization, and cloud integration
  • Git/GitLab/GitHub: version control, branching strategies, and repository management
  • Linux/Unix system administration and command-line proficiency
  • Networking fundamentals: VPCs, subnets, security groups, DNS, load balancing
  • Scripting languages: Bash, Python, or similar for automation
  • 5+ years in DevOps, Platform Engineering, or Infrastructure Engineering
  • 3+ years hands-on with Terraform and Infrastructure as Code
  • 3+ years with CI/CD tools (Jenkins, GitLab CI, Azure DevOps, or similar)
  • 2+ years with configuration management tools (Ansible, Puppet, or similar)
  • 2+ years supporting cloud platforms (AWS and/or Azure in production)
  • 1+ years with Python automation and scripting
  • Experience supporting or integrating with Snowflake or modern data warehouses
  • Financial banking experience is a plus
  • Must have one or more certifications in the relevant technology fields

Functional Skills/Core Competencies:

  • Strong automation mindset: identify and eliminate manual toil
  • Systems thinking: understand full deployment pipelines and infrastructure dependencies
  • Comfortable with continuous learning of new tools and cloud services
  • Ability to balance speed of delivery with stability and safety
  • Team Player: support peers, team, and department management
  • Communication: excellent verbal, written, and interpersonal communication skills
  • Problem Solving: excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality
  • Partnership and Collaboration: develop and maintain partnerships with business and IT stakeholders
  • Attention to Detail: ensure accuracy and thoroughness in all tasks

#LI-XG1

Benefits are an integral part of total rewards, and First Citizens Bank is committed to providing a competitive, thoughtfully designed, and quality benefits program to meet the needs of our associates. More information can be found at benefits.

