Data Engineer

Full Time · Remote

Location

United States

Posted

3 days ago

Salary

Not specified

Skills

Python, SQL, Apache Spark, ETL, ELT, Data Modeling, Docker, Kubernetes, CI/CD, Infrastructure as Code, GitOps, Azure, Data Governance, Data Lineage, Message Brokers, Stream Processing, Data Cataloging, NIST 800-53, FedRAMP, FISMA, IAM, Encryption, Logging, Vulnerability Management

Job Description

This description is a summary of our understanding of the job description. Click the 'Apply' button to learn more.

Role Description

Bowhead seeks a Data Engineer to join our team in Mobile, AL. The Data Engineer will not only build scalable data pipelines, but also help operate and secure our cloud environment. This role blends data engineering, cloud platform administration, and cyber compliance to ensure our data platforms are reliable, auditable, and compliant with federal security standards. This position is fully remote.

Responsibilities

  • Data Engineering
    • Design, build, and maintain ETL/ELT pipelines for batch and streaming workloads.
    • Develop data processing and automation solutions in Python.
    • Write and optimize SQL for transformations, data modeling, and performance tuning.
    • Build and manage real-time data ingestion using message brokers and stream processing frameworks.
    • Inventory and catalog data assets — tracking lineage, classification, and ownership to support governance and compliance.
    • Troubleshoot data quality, pipeline, and performance issues across the platform.
  • Platform Operations
    • Administer and support cloud resources including identity, networking, storage, and compute.
    • Implement and maintain CI/CD pipelines for data workloads and infrastructure deployments.
    • Configure and manage infrastructure using IaC and GitOps practices.
    • Containerize and deploy workloads using Docker and Kubernetes.
    • Monitor system health, performance, and reliability; respond to incidents and outages.
    • Automate workflows across data pipelines, infrastructure provisioning, and operational processes.
    • Manage access controls, backups, logging, patching, and environment hardening.
  • Cyber Compliance
    • Implement and maintain security controls including logging, encryption, IAM, and configuration baselines aligned to NIST 800-53.
    • Support FedRAMP/FISMA compliance activities including control implementation, evidence collection, POA&M tracking, and audit preparation.
    • Work with security and governance teams to ensure environments meet required federal standards.
    • Assist with vulnerability management, remediation, and security documentation.

Qualifications

  • BA/BS in a relevant technical field preferred. Years of experience may substitute for the education requirement.
  • Five or more (5+) years of relevant technical experience.
  • Proficiency in Python for data processing and automation.
  • Strong SQL skills across relational and non-relational databases.
  • Experience with distributed data processing frameworks (e.g., Spark) including streaming workloads.
  • Familiarity with message brokers and event-driven ingestion patterns.
  • Experience with data cataloging and inventory for governance and compliance.
  • Production cloud experience (Azure preferred).
  • Experience with Docker, Kubernetes, CI/CD pipelines, and IaC/GitOps practices.
  • Ability to design scalable solutions and troubleshoot complex data and system issues.
  • Experience supporting security and compliance in cloud environments — controls implementation, logging, IAM, encryption, and audit support aligned to NIST 800-53 or equivalent frameworks.

Nice to Have

  • Experience with Databricks, Delta Lake, or lakehouse architectures.
  • Familiarity with cloud-native governance and security tooling.
  • Experience with Terraform, Bicep, or similar IaC tools.
  • Prior experience supporting FedRAMP or FISMA authorization processes.
  • Experience with agentic coding tools and AI-assisted development workflows.
  • Experience with Navigation data and GIS.

Tech Stack

  • Python
  • SQL
  • Spark (batch & streaming)
  • Message Brokers
  • Docker
  • Kubernetes
  • CI/CD
  • IaC/GitOps
  • Cloud Platforms (Azure preferred)
  • Data Lakehouse
  • NIST 800-53 / FedRAMP

Physical Demands

  • Must be able to lift up to 25 pounds.
  • Must be able to stand and walk for prolonged amounts of time.
  • Must be able to twist, bend and squat periodically.

Security Clearance

Must be able to obtain a security clearance at the Public Trust level. U.S. citizenship is required for a Secret clearance at this location.

