Senior Data Engineer
Location: United States
Posted: 6 days ago
Salary: Not specified
Job Description
Genius is looking for a remote Senior Data Engineer to help build the ultimate music companion. Your work will span millions of pages of lyrics, annotations, and music metadata, powering Genius.com, our public API, and our partner integrations, with no shortage of interesting data challenges to solve. We're looking for engineers who thrive in complex backend systems: folks who can build robust data pipelines, optimize query performance at scale, and keep critical services running smoothly.
As a Senior Data Engineer for Genius, you will play a key role in strengthening our data infrastructure, helping us process, transform, and serve the music knowledge that powers everything we do. You will work across our Rails backend, PostgreSQL databases, and Python-based data pipelines to improve data reliability, throughput, and accuracy. Additionally, you'll work to improve the stability, performance, and scalability of Genius' backend services, which is no small feat for a platform serving millions of users daily!
Location Requirement: Candidates must be based in Los Angeles, California, or Seattle, Washington to be considered for this role.
What You’ll Do
- Build and maintain data pipelines using Python and Airflow to ingest, transform, and enrich music metadata from internal and external sources
- Proactively identify and fix infrastructure bottlenecks to scale backend services to tens of thousands of requests per minute
- Architect database query patterns and migrations in PostgreSQL, ClickHouse, and BigQuery that scale to tables with over a billion rows
- Design and implement backend APIs in Ruby on Rails that serve data reliably and performantly to our frontend and partner integrations
- Take ownership over the systems you build, proactively identifying and surfacing performance, reliability, and maintainability improvements
- During your on-call rotation, serve as the backstop for backend quality, stability, and performance: triage incoming issues to surface the most urgent problems, and respond to incidents as they arise to keep services online
- Work directly with stakeholders, including product owners, data analysts, and other engineers across the company, to uncover and address business needs
Qualifications
- Hands-on experience building and maintaining ETL/ELT pipelines using Python and workflow orchestration tools like Apache Airflow
- Deep proficiency with PostgreSQL and relational databases
- Strong experience with Ruby on Rails (or a similar "batteries-included" framework)
- A product-first mindset, with the ability to drive projects from ideation to launch while maintaining focus on data quality, system reliability, and cross-functional alignment
- Comfort working cross-functionally with frontend engineers, data analysts, product, and other teams
- A passion for staying current with new technologies, frameworks, and industry best practices
Requirements
- At least 4 years of hands-on experience in a backend or data engineering capacity, preferably in a product-driven environment
Education (Preferred)
- Bachelor's degree in Computer Science, Engineering, or a related technical field