
ttb bank

Senior Data Engineer (Banking Data & Basel III, IV)

5-7 Years
  • Posted 20 days ago

Job Description

Job Summary

We are seeking a Senior Data Engineer with strong experience in banking data and Basel III/IV risk computation engines to design, build, and optimize the data pipelines that power regulatory capital, liquidity, and risk analytics. You will partner with Risk, Finance, and Compliance to deliver high-quality, auditable data for RWA, LCR, NSFR, Counterparty Credit Risk (e.g., SA-CCR), and Market Risk calculations, as well as downstream financial accounting. This role is pivotal in ensuring data integrity, lineage, and performance for regulatory reporting, stress testing, and management insights.

Responsibilities

Data Platform & Pipelines
  • Design, develop, and maintain scalable batch/streaming pipelines (e.g., Spark/Databricks) for risk, finance, and other bank-wide datasets.
  • Build robust integration layers for trade/transaction, positions/market data, reference data, counterparty/CRM, collateral, payments/GL, and liquidity sources.
  • Optimize data models for Basel III/IV computations (credit risk, market risk, and operational risk where applicable) and ensure low-latency access for reporting.

Basel III/IV Risk Engine Enablement
  • Engineer data foundations for RWA (SA/IRB), PD/LGD/EAD components, SA-CCR, CVA, market risk sensitivities and aggregation (e.g., delta/vega/curvature where relevant), and liquidity ratios (LCR/NSFR).
  • Implement controls, reconciliations, and audit trails for regulatory reporting and model risk management.
  • Partner with risk and data teams to operationalize data inputs/outputs for the Basel engine and ensure explainability of results.

Data Quality, Controls & Governance
  • Define and implement data quality rules, outlier detection, and exception workflows; drive root-cause analysis and remediation.
  • Ensure lineage, metadata, cataloging, and access controls aligned with Bank of Thailand (BOT) principles.
  • Enforce data privacy, protection, and SOX control requirements; support internal/external audits.

Architecture & Performance
  • Lead design reviews; champion best practices for data modeling (e.g., dimensional models, and data vault where applicable).
  • Tune pipelines, partitioning, caching, storage formats (e.g., Delta Lake), and query performance (SQL/PySpark).
  • Contribute to the target-state architecture for risk/finance data interoperability and reconciliation to the GL.

Collaboration & Leadership
  • Work closely with Market Risk, Credit Risk, Treasury, Finance, Model Validation, and Regulatory Reporting teams.
  • Translate regulatory/methodology changes into data/tech requirements and delivery plans.

Qualifications

  • Bachelor's or Master's in Computer Science, Data/Software Engineering, Information Systems, Financial Engineering, or related.
  • Relevant certifications a plus (e.g., Databricks, Azure/AWS).
  • 5+ years in data engineering within banking or capital markets; 5+ years working with risk/finance data domains.
  • Hands-on with Python and SQL at an advanced level; expert with Spark.
  • Strong experience on one or more clouds (Azure/AWS) and data platforms (e.g., Databricks).
  • Workflow/orchestration and transformation tools: Azure Data Factory or equivalents.
  • Streaming and event-driven architectures: Kafka/Event Hubs.
  • Proven delivery against Basel III/IV data needs: RWA (SA/IRB), SA-CCR, CVA, market risk sensitivities/aggregation, LCR/NSFR, and stress testing.
  • Demonstrated knowledge of BOT regulatory expectations and data governance (lineage, DQ, metadata, issue management).
  • Experience implementing controls and reconciliations for regulatory reporting and the GL.
  • Excellent communication; comfortable partnering with risk/finance stakeholders.

Preferred Qualifications

  • Experience integrating/feeding Basel engines (vendor or in-house): e.g., Moody's/RiskAuthority, Regnology/OneSumX, or custom engines.
  • Familiarity with market data (pricing curves, vol surfaces), reference data (instrument/counterparty mastering), and collateral/margin.
  • Knowledge of IRB model data (default history, EAD/LGD modeling data), backtesting, and model governance workflows.
  • Exposure to FRTB (SA/IMA) data requirements and sensitivities.
  • Data catalog/governance tooling; Azure secrets/keys management and role-based access patterns.
  • Containerization & DevOps: Docker, Kubernetes, CI/CD (Azure DevOps/GitHub Actions).
  • Financial accounting integration: reconciliations between risk, finance, and GL, sub-ledger data models, and IFRS mappings.

Job ID: 142102819