Gusto

1001-5000 employees

Human Resources
Payroll
SaaS
Financial Services
Software
About Gusto

Gusto is a leading cloud-based platform that provides small and medium-sized businesses with comprehensive payroll, benefits, and human resource management solutions. Founded in 2011, the company aims to simplify the complexities of running a business by offering an all-in-one platform that automates payroll processing, employee onboarding, benefits administration, and compliance management. Gusto's mission is to create a world where work empowers a better life by helping businesses take care of their teams with ease and confidence. The platform integrates with various accounting and time-tracking tools, making it a popular choice for businesses seeking streamlined HR and financial operations.

3 months ago

Senior Data Engineer

Full-time
Senior
Data Engineer

📋 Description
  • The Data Engineering team at Gusto builds tools and systems that make data consistent, user-friendly, and helpful, enabling teams to make data-driven decisions and provide customized experiences to customers.
  • The Senior Data Engineer will build reliable, scalable, high-quality data systems, partnering with analytics, product, and engineering teams to deliver data solutions that drive business and customer impact.
  • The role requires experience in building and maintaining data pipelines, ETL workflows, data modeling, schema design, and working on cloud platforms like Snowflake, Redshift, BigQuery, or Databricks.
  • Responsibilities include implementing CI/CD pipelines, automated testing, data observability, performance optimization, and leveraging AI and automation in data engineering.
  • The role is based in various US cities with hybrid work expectations, and compensation varies by location.

🎯 Requirements
  • 8+ years of industry experience in data engineering building scalable data pipelines and data products
  • Strong proficiency in SQL and Python
  • Proven experience building and maintaining robust data pipelines and ETL workflows, with hands-on dbt experience
  • Hands-on experience ingesting data from diverse sources, including APIs, databases, SaaS applications, and event streams
  • Strong foundation in data modeling, schema design, and data quality best practices
  • Experience working on cloud platforms like Snowflake, Redshift, BigQuery, or Databricks
  • Experience implementing CI/CD pipelines, automated testing, and data observability
  • Familiarity with monitoring, alerting, and incident response for production-grade data pipelines
  • Proven ability to optimize performance and cost across data workflows and storage systems
  • Functional understanding of leveraging AI and automation in data engineering
  • Strong problem-solving, communication, and collaboration skills, with a proven ability to work effectively in cross-functional teams

🏖️ Benefits
  • Competitive salary based on location and experience
  • Hybrid work model requiring 2-3 days in the office per week in Denver, San Francisco, or New York City
  • Support for remote work, provided a reliable internet connection
  • Inclusive hiring practices and commitment to diversity
  • Equal opportunity employer
  • Reasonable accommodations for applicants with disabilities and veterans
  • Security and privacy protections