Data Engineer
Location: United States
Posted: 3 days ago
Salary: Not specified

Job Description
Bowhead seeks a Data Engineer to join our Mobile, AL team; this position is fully remote. The Data Engineer will not only build scalable data pipelines but also help operate and secure our cloud environment. This role blends data engineering, cloud platform administration, and cyber compliance to ensure our data platforms are reliable, auditable, and compliant with federal security standards.
Responsibilities
Data Engineering
- Design, build, and maintain ETL/ELT pipelines for batch and streaming workloads.
- Develop data processing and automation solutions in Python.
- Write and optimize SQL for transformations, data modeling, and performance tuning.
- Build and manage real-time data ingestion using message brokers and stream processing frameworks.
- Inventory and catalog data assets, tracking lineage, classification, and ownership to support governance and compliance.
- Troubleshoot data quality, pipeline, and performance issues across the platform.
Platform Operations
- Administer and support cloud resources including identity, networking, storage, and compute.
- Implement and maintain CI/CD pipelines for data workloads and infrastructure deployments.
- Configure and manage infrastructure using IaC and GitOps practices.
- Containerize and deploy workloads using Docker and Kubernetes.
- Monitor system health, performance, and reliability; respond to incidents and outages.
- Automate workflows across data pipelines, infrastructure provisioning, and operational processes.
- Manage access controls, backups, logging, patching, and environment hardening.
Cyber Compliance
- Implement and maintain security controls including logging, encryption, IAM, and configuration baselines aligned to NIST 800-53.
- Support FedRAMP/FISMA compliance activities including control implementation, evidence collection, POA&M tracking, and audit preparation.
- Work with security and governance teams to ensure environments meet required federal standards.
- Assist with vulnerability management, remediation, and security documentation.
Qualifications
- BA/BS in a relevant technical field preferred; years of experience may substitute for the education requirement.
- Five (5) or more years of relevant technical experience.
- Proficiency in Python for data processing and automation.
- Strong SQL skills across relational and non-relational databases.
- Experience with distributed data processing frameworks (e.g., Spark) including streaming workloads.
- Familiarity with message brokers and event-driven ingestion patterns.
- Experience with data cataloging and inventory for governance and compliance.
- Production cloud experience (Azure preferred).
- Experience with Docker, Kubernetes, CI/CD pipelines, and IaC/GitOps practices.
- Ability to design scalable solutions and troubleshoot complex data and system issues.
- Experience supporting security and compliance in cloud environments, including controls implementation, logging, IAM, encryption, and audit support aligned to NIST 800-53 or equivalent frameworks.
Nice to Have
- Experience with Databricks, Delta Lake, or lakehouse architectures.
- Familiarity with cloud-native governance and security tooling.
- Experience with Terraform, Bicep, or similar IaC tools.
- Prior experience supporting FedRAMP or FISMA authorization processes.
- Experience with agentic coding tools and AI-assisted development workflows.
- Experience with navigation data and GIS.
Tech Stack
- Python
- SQL
- Spark (batch & streaming)
- Message Brokers
- Docker
- Kubernetes
- CI/CD
- IaC/GitOps
- Cloud Platforms (Azure preferred)
- Data Lakehouse
- NIST 800-53 / FedRAMP
Physical Demands
- Must be able to lift up to 25 pounds.
- Must be able to stand and walk for prolonged amounts of time.
- Must be able to twist, bend, and squat periodically.
Security Clearance
Must be able to obtain a security clearance at the Public Trust level. U.S. citizenship is a requirement for a Secret clearance at this location.