We sell fleet management software to mid-size trucking companies in the US and Canada. It is not a sexy product category. Our customers are not venture-backed startups. They are third-generation family logistics businesses trying to manage 80 trucks, comply with ELD mandates, and reduce their fuel costs. We have 2,300 paying customers, $18M ARR, and no outside investors. We have been profitable since year three. We are not trying to IPO. We are trying to build a durable business that serves a specific market well for a long time.

Our data engineering team is two people. They built a working pipeline. It handles our current volume. It will not handle three times our current volume, which is where we expect to be in 18 months at our current growth rate.

We need a senior engineer to join that team, understand what exists, and help build what comes next — specifically a migration from our current self-managed Postgres-based analytics layer to a Snowflake-based data warehouse, with Airflow replacing the bespoke scheduling scripts we have used for too long.
Responsibilities
Lead the migration of our analytics pipeline from self-managed Postgres to Snowflake
Replace our bespoke scheduling scripts with documented, maintainable Airflow DAGs
Design and implement data quality monitoring and alerting across our pipeline
Document the current data architecture, data lineage, and transformation logic — we will give you time to do this properly
Work with our product and customer success teams to scope and build new analytical capabilities for the product
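To give a concrete flavor of the data quality monitoring mentioned above, here is a minimal sketch of the kind of check we want in the pipeline. The check names, thresholds, and structure are illustrative only, not our actual codebase:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def check_freshness(last_loaded_at: datetime, max_lag: timedelta,
                    now: datetime) -> CheckResult:
    """Flag a table that has not received new data within its expected window."""
    lag = now - last_loaded_at
    return CheckResult(
        name="freshness",
        passed=lag <= max_lag,
        detail=f"last load {lag} ago (threshold {max_lag})",
    )

def check_row_count(actual: int, expected_min: int) -> CheckResult:
    """Flag a daily load that is suspiciously small, e.g. a partial telemetry batch."""
    return CheckResult(
        name="row_count",
        passed=actual >= expected_min,
        detail=f"{actual} rows (expected at least {expected_min})",
    )
```

Checks like these would run as tasks at the end of each Airflow DAG and route failures to Slack or email — during business hours, per the no-on-call policy below.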
Requirements
5+ years of data engineering with at least two years owning production pipelines end-to-end
Python for pipeline development, data transformation, and operational tooling — clean, tested, documented
SQL at depth: complex queries, performance tuning, and data modeling for analytics workloads
Snowflake for cloud data warehousing — you've designed schemas, managed costs, and built access control models in production
Airflow for pipeline orchestration — you've built DAGs that other engineers can maintain without needing to ask you how they work
ETL pipeline design and implementation — you understand the trade-offs between transformation approaches
Experience with IoT or telemetry data is useful given the nature of our product
Benefits
Profitable, stable company with no investor pressure to grow at the expense of quality
Fully remote
$118,000 – $138,000 base salary + profit sharing
No on-call for data pipelines — if something breaks at 2am, it waits until morning