eSimplicity

An engineering firm that delivers high-quality Healthcare IT, Cybersecurity, and Telecommunication solutions.

Data Engineer III

Data Engineer · Full Time · Remote · Team 51-200 · Since 2016 · H1B: No Sponsor

Location

United States

Posted

15 hours ago

Salary

$113K - $127K / year


Job Description


About Us:

eSimplicity is a modern digital services company that partners with government agencies to improve the lives and protect the well-being of all Americans, from veterans and service members to children, families, and seniors. Our engineers, designers, and strategists cut through complexity to create intuitive products and services that equip federal agencies with solutions to courageously transform today for a better tomorrow.


Responsibilities:

  • Develop, expand, and optimize our data and data pipeline architecture, as well as data flow and collection for cross-functional teams.
  • Support software developers, database architects, data analysts, and data scientists on data initiatives, and ensure the data delivery architecture is consistent across ongoing projects.
  • Create new pipelines and maintain existing ones; update Extract, Transform, Load (ETL) processes; create new ETL features; and build proofs of concept (PoCs) with Redshift Spectrum, Databricks, AWS EMR, SageMaker, etc.
  • Implement, with support from project data specialists, large-dataset engineering: data augmentation, data quality analysis, data analytics (anomalies and trends), data profiling, data algorithms, and data maturity models; develop data strategy recommendations.
  • Operate large-scale data processing pipelines and resolve business and technical issues related to processing and data quality.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements, including re-designing data infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies.
  • Build analytical tools that use the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
  • Work with data, design, product, and government stakeholders, assisting them with data-related technical issues.
  • Write unit and integration tests for all data processing code.
  • Work with DevOps engineers on continuous integration (CI), continuous delivery (CD), and infrastructure as code (IaC).
  • Read specifications and translate them into code and design documents.
  • Perform code reviews and develop processes for improving code quality.
  • Perform other duties as assigned.

Requirements

Required Qualifications:

  • All candidates must pass public trust clearance through the U.S. Federal Government. This requires candidates either to be U.S. citizens or to pass clearance through the Foreign National Government System, which requires having lived within the United States for at least 3 of the previous 5 years and holding a valid, non-expired passport from their country of birth along with appropriate visa/work-permit documentation.
  • Minimum of 8 years of experience as a Data Engineer or in hands-on software development, including at least 4 years using Python, Java, and cloud technologies for building and maintaining data pipelines.
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, Business, or a related scientific or technical discipline required. 
  • OR In lieu of a degree, candidates may qualify with 10 years of general information technology experience, including at least 8 years of specialized experience.
  • Expert data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
  • Self-sufficient and comfortable supporting the data needs of multiple teams, systems, and products.
  • Experienced in designing data architecture for shared services, scalability, and performance.
  • Experienced in designing data services, including APIs, metadata, and data catalogs.
  • Experienced in data governance processes to ingest (batch, stream), curate, and share data with upstream and downstream data users.
  • Ability to build and optimize data sets, ‘big data’ pipelines, and architectures.
  • Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions.
  • Excellent analytic skills for working with unstructured datasets.
  • Ability to build processes that support data transformation, workload management, data structures, dependency management, and metadata.
  • Demonstrated understanding of and experience with software and tools, including: big data tools such as Spark and Hadoop; relational databases including MySQL and Postgres; workflow management and pipeline tools such as Apache Airflow and AWS Step Functions; AWS cloud services including Redshift, RDS, EMR, and EC2; stream-processing systems such as Spark Streaming and Storm; and functional/object-oriented programming languages including Scala, Java, and Python.
  • Flexible and willing to accept a change in priorities as necessary.
  • Ability to work in a fast-paced, team-oriented environment.
  • Experience with Agile methodology, using test-driven development.
  • Experience with GitHub and Atlassian Jira/Confluence.
  • Excellent command of written and spoken English.

Desired Qualifications:

  • Federal Government contracting work experience.
  • Databricks certification, Google Professional Data Engineer certification, IBM Certified Data Engineer – Big Data certification, or Cloudera CCP Data Engineer certification.
  • Centers for Medicare and Medicaid Services (CMS) or Health Care Industry experience
  • Experience with healthcare quality data including Medicaid and CHIP provider data, beneficiary data, claims data, and quality measure data.

Working Environment:
eSimplicity supports a remote work environment operating within the Eastern time zone so we can work with and respond to our government clients. Expected hours are 9:00 AM to 5:00 PM Eastern unless otherwise directed by your manager.

Occasional travel for training and project meetings. It is estimated to be less than 5% per year.


Benefits:
We offer highly competitive salaries and full healthcare benefits.


Equal Employment Opportunity:
eSimplicity is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender, age, status as a protected veteran, sexual orientation, gender identity, or status as a qualified individual with a disability.
