Data Engineer

Full Time · Remote · Team size: 11-50

Location

United States

Posted

31 days ago

Salary

Not specified


Job Description


Role Description

As a Data Engineer, you will:

  • Build and maintain scalable, best-in-class data infrastructure and pipelines that serve as core components of a multi-tenant data platform.
  • Ensure our data pipelines and data warehouse are optimized for accuracy, performance, and accessibility.
  • Manage architecture frameworks and participate in the development of data, experimentation, and analytics solutions in collaboration with cross-functional partners in the Product and Engineering organizations.
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
  • Test and clearly document data assets and warehouse implementations so that others can easily understand how data methodologies are defined and implemented.
  • Design data integrations and a data quality framework.
  • Work closely with Product and Engineering teams to develop a strategy for long-term data platform architecture.
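The data-quality monitoring responsibility above boils down to running automated checks on a batch before publishing it. A minimal sketch of that pattern, with invented column names and thresholds (this is an illustration, not Chord's actual tooling):

```python
# Toy data-quality gate: return human-readable failures for a batch of rows.
# Column names ("order_id", "updated_at") and the freshness threshold are
# hypothetical; a real pipeline would load these rules from configuration.
from datetime import datetime, timedelta, timezone

def check_batch(rows: list[dict], max_age_hours: int = 24) -> list[str]:
    """Return a list of failure messages; an empty list means the batch passes."""
    failures: list[str] = []
    if not rows:
        failures.append("batch is empty")
        return failures
    if any(r.get("order_id") is None for r in rows):
        failures.append("null order_id found")
    newest = max(r["updated_at"] for r in rows)
    if datetime.now(timezone.utc) - newest > timedelta(hours=max_age_hours):
        failures.append("data is stale")
    return failures

batch = [
    {"order_id": 1, "updated_at": datetime.now(timezone.utc)},
    {"order_id": None, "updated_at": datetime.now(timezone.utc)},
]
print(check_batch(batch))  # ['null order_id found']
```

In practice, checks like these are wired into the orchestrator so a failing batch blocks downstream consumers instead of silently publishing bad data.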

Qualifications

  • Demonstrated ability to build, manage, and optimize core data infrastructure at scale in a multi-tenant environment.
  • A propensity to independently identify opportunities for optimization and drive forward high-impact projects with minimal guidance.
  • Proficiency in SQL and strong programming skills in Python, with experience in Bash scripting for automation and workflow management.
  • Deep knowledge of and experience with Snowflake and its advanced features, such as Snowpipe, storage integrations, stages, streams, and tasks.
  • Experience with the AWS ecosystem, including securely deploying and managing applications with services such as ECS and Lambda.
  • Experience building and maintaining custom ingestion pipelines using tools like dlt or requests.
  • Experience ingesting data from third-party APIs through custom pipelines built with Dagster and Snowflake (Snowpipe, streams, and tasks).
  • Proficiency with workflow orchestration tools (Dagster or similar tooling like Airflow or Prefect) and data transformation tools (dbt).
  • Experience with DataOps tools, such as Docker, GitHub Actions, and Terraform.
  • Experience with AI or ML is a plus.
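The custom-ingestion bullets above center on one recurring pattern: walking a cursor-paginated third-party API to exhaustion before staging rows for the warehouse. A stdlib-only sketch of that core loop (the function names and the fake API are hypothetical; a real pipeline would use dlt or requests against a live endpoint and land rows via Snowpipe):

```python
# Minimal cursor-pagination ingestion loop, the core of many custom pipelines.
# fetch_page is injected so the loop can be tested without a network call.
from typing import Callable, Iterator, Optional

Page = tuple[list[dict], Optional[str]]  # (records, next_cursor)

def ingest_all(fetch_page: Callable[[Optional[str]], Page]) -> Iterator[dict]:
    """Yield every record from a cursor-paginated API until the cursor is exhausted."""
    cursor: Optional[str] = None
    while True:
        records, cursor = fetch_page(cursor)
        yield from records
        if cursor is None:
            break

# Fake three-page API standing in for a real third-party endpoint.
_PAGES: dict[Optional[str], Page] = {
    None: ([{"id": 1}, {"id": 2}], "p2"),
    "p2": ([{"id": 3}], "p3"),
    "p3": ([{"id": 4}], None),
}

rows = list(ingest_all(lambda cursor: _PAGES[cursor]))
print(len(rows))  # 4 records gathered across three pages
```

Injecting the page-fetcher keeps the pagination logic pure, which is what makes pipelines like this easy to unit-test inside an orchestrator such as Dagster.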

Requirements

  • Excited to build foundational data infrastructure that powers many e-commerce brands.
  • Energized by the opportunity to abstract repeated data problems into platform-level solutions.
  • Passionate about working cross-functionally across engineering, product, and data teams.
  • Motivated by working in a fast-paced and iterative environment.
  • Excited by the opportunity to be an early, critical member of a rapidly growing organization.
  • Personally aligned with our mission to make commerce accessible.

Benefits

  • An investment in your physical and mental well-being: we cover 100% of employee medical benefits, with 69% dependent coverage.
  • Flexible PTO; we encourage you to take the time you need to be your best self at work.
  • An onboarding package and an annual work-from-home stipend to ensure you have everything you need to be successful while working remotely.
  • Generous parental leave with a customizable transition-back-to-work program.
  • The benefits of working from home, with opportunities to spend quality time with the team at Chord in-person events throughout the year.
  • The chance to make an impact! We’re an early-stage company, which means there is space to champion ideas and to create and lead initiatives at any level of the organization.
  • This is a full-time, salaried position that includes equity.
