Data Engineer
Location
Arizona
Posted
5 days ago
Salary
$107.5K - $204.5K / year
Job Description
Job Requirements
- Typically requires a university degree or equivalent experience and a minimum of 8 years of prior relevant experience, or an advanced degree in a related field and a minimum of 5 years of experience
- Experience developing and optimizing ETL processes
- Experience with Python for data transformation
- Proficiency in API integration (REST) and associated exchange security patterns
- Experience with cloud platforms, services and tools
- The ability to obtain and maintain a U.S. government issued security clearance is required.
- U.S. citizenship is required, as only U.S. citizens are eligible for a security clearance.
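The ETL and REST-integration skills listed above can be illustrated with a minimal extract-transform-load sketch in Python. The endpoint URL, field names, and normalization rules here are hypothetical placeholders, not part of the posting:

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/records"  # hypothetical REST endpoint

def extract(url: str = API_URL) -> list[dict]:
    """Pull raw records from a REST endpoint (GET, JSON body)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def transform(records: list[dict]) -> list[dict]:
    """Normalize field names/types and drop rows missing a primary key."""
    out = []
    for r in records:
        if r.get("id") is None:
            continue  # skip rows without a usable key
        out.append({
            "id": r["id"],
            "name": str(r.get("name", "")).strip().title(),
            "amount_usd": round(float(r.get("amount", 0)), 2),
        })
    return out

def load(rows: list[dict], path: str) -> None:
    """Write transformed rows as newline-delimited JSON."""
    with open(path, "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
```

In a production pipeline the same three stages would typically be scheduled and monitored by an orchestrator, with authentication on the extract step per the security patterns the role mentions.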
Benefits
- Medical, dental, vision, life insurance
- Short-term disability
- Long-term disability
- 401(k) match
- Flexible spending accounts
- Flexible work schedules
- Employee assistance program
- Employee Scholar Program
- Parental leave
- Paid time off
- Holidays
More Data Engineer Jobs
Database & Data Sustainment Engineer
Dine Development Corporation
The engineer will manage and maintain Oracle database environments across development, test, and production systems, focusing on monitoring performance, implementing patching, and supporting data modernization efforts. Key duties include developing database scripts, troubleshooting performance issues, and ensuring compliance with security and access control requirements.
Databricks Data Engineer
McKesson
This role involves designing and operating reliable, scalable data workflows on the Databricks platform, focusing heavily on process monitoring, job optimization, and ensuring data quality. Key tasks include building and maintaining batch and streaming data pipelines using Databricks, Spark, and Delta Lake for cloud analytics workloads.
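The data-quality responsibility described above can be sketched platform-agnostically in plain Python. The rule set, field names, and threshold are hypothetical; in a Databricks job this check would typically be expressed over Spark DataFrames rather than plain dicts:

```python
from typing import Iterable

def quality_report(rows: Iterable[dict],
                   required: tuple[str, ...] = ("order_id", "event_ts"),
                   max_fail_ratio: float = 0.01) -> dict:
    """Count rows failing basic completeness checks and decide pass/fail.

    A row fails when any required field is missing or None. The batch
    passes when the failure ratio stays at or below max_fail_ratio.
    """
    total = failed = 0
    for row in rows:
        total += 1
        if any(row.get(field) is None for field in required):
            failed += 1
    ratio = failed / total if total else 0.0
    return {"total": total, "failed": failed, "passed": ratio <= max_fail_ratio}
```

A pipeline might run this on each micro-batch and route failing batches to a quarantine table instead of the analytics sink.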