Digible, Inc.

Digible provides advanced digital marketing and technology solutions for the multifamily housing industry.

Senior Data Engineer

Data Engineer · Full Time · Remote · Team: 51-200 · H1B: No Sponsorship

Location

United States

Posted

15 days ago

Salary

$125K - $155K / year

Bachelor's Degree · 5 yrs experience · English · Docker · Python · SQL

Job Description

  • Own our Bronze data layer: build and maintain ingestion pipelines from third-party APIs, transform via dbt, and orchestrate via Prefect
  • Partner with Product and Engineering on Silver-layer modeling, ensuring data is clean, documented, and governed as it moves toward consumption
  • Enable upstream teams to own their Gold-layer data well, establishing best practices, governance standards, pipeline automation, and tooling so teams can build confidently without it becoming the wild west
  • Troubleshoot pipeline failures and data quality issues, driving toward root cause and long-term fixes
  • Contribute to platform evolution by identifying opportunities to optimize, refactor, or scale our data infrastructure
  • Stay informed on developments in the modern data stack and introduce tools and processes that improve our development workflows
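For context, the Bronze → Silver layering described above can be sketched in plain Python. This is only an illustration of the medallion pattern; the posting says the real transforms run in dbt and are orchestrated by Prefect, and all function and field names here are hypothetical.

```python
# Illustrative sketch of a Bronze -> Silver flow (medallion pattern).
# Assumption: the real stack uses dbt + Prefect; this stdlib-only example
# just shows the layering idea with made-up field names.
import json
from datetime import datetime, timezone

def ingest_bronze(raw_api_payload: str) -> list[dict]:
    """Bronze layer: land raw API records as-is, plus load metadata."""
    records = json.loads(raw_api_payload)
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [{**r, "_loaded_at": loaded_at} for r in records]

def transform_silver(bronze_rows: list[dict]) -> list[dict]:
    """Silver layer: deduplicate on id and keep only documented fields."""
    seen: set[int] = set()
    silver = []
    for row in bronze_rows:
        if row["id"] in seen:
            continue  # drop duplicate records from the raw feed
        seen.add(row["id"])
        silver.append({"id": row["id"], "name": row["name"].strip().title()})
    return silver

payload = '[{"id": 1, "name": " acme lofts "}, {"id": 1, "name": "acme lofts"}]'
bronze = ingest_bronze(payload)
silver = transform_silver(bronze)
print(silver)  # [{'id': 1, 'name': 'Acme Lofts'}]
```

The point of the split is that Bronze preserves the raw feed (plus load metadata) for replay, while Silver applies the cleaning and governance rules the role is responsible for.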

Job Requirements

  • 5+ years of data engineering experience, including at least 2 years in a senior capacity
  • Strong proficiency with SQL, Python, ELT patterns, and data modeling
  • Hands-on experience with modern data stack tools (Snowflake, dbt, orchestration frameworks, BI tools)
  • Strong proficiency with Git and version control practices
  • Experience focused on system resilience, observability, and operational excellence across data platforms
  • Experience with Docker and DevOps practices (especially for the Platform Operations focus area)
  • Demonstrated fluency with AI-assisted development tools in your engineering workflow
  • Experience working with modestly sized, fast-paced teams
  • Strong communication skills and ability to partner across engineering and product teams
  • Working knowledge of iterative, value-focused technical delivery

Benefits

  • 4-Day Work Week (32 Hour Work Week)
  • WFA (Work From Anywhere) OR 1 Day / Week Remote
  • Discretionary Bonus
  • 3 weeks of PTO, plus sick leave and bereavement leave
  • 11 paid holidays
  • 401(k) + Match
  • 75% employer paid health benefits (Medical, Dental, and Vision)
  • Mental and Physical Wellness Reimbursement Benefit
  • $1000/year travel fund for employees who have been with Digible 3+ years
  • Paid Parental Leave
  • Dog-Friendly Office
  • Company-Wide Social Events
  • Weekly Lunches and Snacks for in-office employees!
