Hypertrade

Data Engineer

3-5 Years
  • Posted 3 days ago
Job Description

Open Position

Data Engineer at Hypertrade (Bangkok / Hybrid): build and scale the data pipelines that power Retail AI products used by leading retailers across APAC and Africa.

About this role

  • Hypertrade isn't another analytics shop; we're building the operating system for modern retail.
  • We turn chaotic, high-volume retail data into real-time clarity for leading retailers and global brands, powering A2, CIC, and next-gen Agentic Workflows.
  • Join our core Data Engineering team to design and run production-grade ingestion and transformation pipelines across 10+ retail environments.
  • You'll work with Python, Airflow, GCP, and Bigtable, and build systems that are used every day to drive decisions on the shop floor and across digital platforms.

What you will do

  • Design, build, and maintain scalable ETL/ELT pipelines in Python and Apache Airflow for multi-tenant retail clients.
  • Ingest and process POS, inventory, pricing, promotion, and loyalty data into our GCP-based data platform (Bigtable and related services).
  • Monitor, debug, and optimize workflows for performance, reliability, and cost; handle incidents and drive root-cause analysis.
  • Implement data quality checks, validation, and reconciliation to ensure trustworthy, analytics-ready datasets.
  • Collaborate with Data Science, Analytics, and Product teams to shape data models and enable new features in our retail analytics products.
  • Contribute to standards for version control (Git/GitLab), CI/CD, and documentation (Confluence), and work in an Agile setup using JIRA.
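To give a flavor of the data-quality work described above, here is a minimal, hypothetical sketch of a validation-and-reconciliation step for POS records. All names (`PosRecord`, `validate`, `reconcile`) are illustrative only and do not reflect Hypertrade's actual codebase:

```python
from dataclasses import dataclass

@dataclass
class PosRecord:
    """One point-of-sale line item (illustrative schema)."""
    store_id: str
    sku: str
    qty: int
    unit_price: float

def validate(records):
    """Split records into clean rows and rejects, each reject paired with a reason."""
    clean, rejects = [], []
    for r in records:
        if not r.store_id or not r.sku:
            rejects.append((r, "missing key field"))
        elif r.qty <= 0 or r.unit_price < 0:
            rejects.append((r, "implausible quantity or price"))
        else:
            clean.append(r)
    return clean, rejects

def reconcile(clean, expected_total, tolerance=0.01):
    """Compare ingested revenue against an upstream control total."""
    ingested = sum(r.qty * r.unit_price for r in clean)
    return abs(ingested - expected_total) <= tolerance
```

In production such checks would typically run as Airflow tasks between ingestion and load, routing rejects to a quarantine table for review rather than silently dropping them.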

What we are looking for

  • 3-5 years of experience as a Data Engineer or similar, working with large datasets and production pipelines.
  • Strong Python skills for data processing and automation, plus solid experience with Apache Airflow (or similar orchestration tools).
  • Good understanding of ETL/ELT, data warehousing concepts, and data modeling across relational and/or NoSQL systems.
  • Experience with at least one major cloud platform; GCP (BigQuery, Cloud Storage, Pub/Sub, etc.) is a strong plus.
  • Skillful with Git-based workflows (Git/GitLab), code reviews, and Agile collaboration using JIRA and Confluence.
  • Clear communication in English and a problem-solving, ownership mindset.

Nice to have

  • Background in retail or ecommerce data (POS, inventory, pricing, promotions, loyalty).
  • Experience with Big Data tools, containerization (Docker), and CI/CD for data pipelines.
  • Exposure to monitoring/observability for data systems (logging, metrics, alerting).
  • Familiarity with how BI and Data Science teams consume data and measure impact.

How to apply

  • Share your CV and (optionally) your GitHub/GitLab profile, with a short note about your favorite data pipeline you've built and why, to your Hypertrade careers contact at ([Confidential Information]).

Job ID: 135469053