About The Role
We are hiring a Data Engineer to build and maintain compliant, privacy-safe data infrastructure that powers audience development and marketing intelligence systems. The role ensures that all data pipelines operate on aggregated, historical signals and never on real-time incident or individual-level data.
Key Responsibilities
- Build and maintain ETL/ELT pipelines
- Develop reusable inference layers
- Create audit-ready segmentation workflows
- Implement governance controls (lineage, validation, logging)
- Perform geospatial aggregation (ZIP, county, corridor)
- Collaborate with Growth and Compliance teams
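To make the geospatial-aggregation responsibility above concrete, here is a minimal sketch of ZIP-level aggregation with small-group suppression, a common privacy-safe pattern. Column names (`zip_code`, `event_id`) and the group-size threshold are illustrative assumptions, not details from this posting:

```python
import pandas as pd

def aggregate_by_zip(df: pd.DataFrame, min_group_size: int = 25) -> pd.DataFrame:
    """Roll individual rows up to ZIP-level counts, suppressing small groups.

    Column names (`zip_code`, `event_id`) are hypothetical.
    """
    agg = (
        df.groupby("zip_code", as_index=False)
          .agg(event_count=("event_id", "count"))
    )
    # Drop ZIPs below the minimum group size so no small cohort is exposed
    # in the aggregated output.
    return agg[agg["event_count"] >= min_group_size].reset_index(drop=True)
```

The same pattern extends to county or corridor rollups by swapping the grouping key.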
Success Criteria (First 60–90 Days)
- Reliable, monitored pipelines
- Governed inference layer library
- Standardized segmentation workflows
- Improved audience segment performance
Required Qualifications
- Strong SQL and data modeling skills
- Python proficiency
- Cloud platforms (AWS, GCP, Azure)
- Data warehouses (Snowflake, BigQuery, Redshift)
- Data governance experience
- Aggregated geospatial data experience
Compliance Expectations
- Understand the distinction between inferred attributes and directly known (collected) data
- Use only aggregated historical data (45+ days old)
- No real-time or individual-level data usage
- Vendor diligence and audit logging
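The 45-day rule above could be enforced as a simple pipeline guard. This is a sketch under assumed names (`is_usable`, a UTC-aware record timestamp), not an implementation prescribed by the posting:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

MIN_AGE_DAYS = 45  # per the compliance expectation: aggregated historical data only

def is_usable(record_timestamp: datetime, now: Optional[datetime] = None) -> bool:
    """Return True only if the record is at least MIN_AGE_DAYS old.

    `record_timestamp` is assumed to be timezone-aware (UTC).
    """
    now = now or datetime.now(timezone.utc)
    return (now - record_timestamp) >= timedelta(days=MIN_AGE_DAYS)
```

In practice a guard like this would run as a validation step on every batch, with failures written to the audit log rather than silently dropped.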
Nice to Have
- Marketing segmentation experience
- Privacy and consumer-protection regulations (CCPA, GDPR, TCPA)
- dbt, Airflow, Dagster, Spark
Compensation
- Based on experience, skills, and location; competitive within the global market.