Treasury Data Layer

Status: Shipped

Domain: Data

Linear Projects: None tracked


What It Does

Palm builds on top of your existing TMS or data infrastructure to collect bank data. We plug into the connectivity you already run - TMS, ERP, data lake, or integration hub - and transform statements and transactions into structured, trusted inputs for forecasting, analytics, and decisions.

Rather than another integration project, Palm ingests from where your data already lands, then normalizes, enriches, validates, and models it for downstream use.
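The normalize-and-model step can be sketched as mapping each parsed source record onto one unified schema. This is an illustrative sketch, not Palm's actual data model: the field names (`account_id`, `booking_date`, etc.) and the fallback of value date to booking date are assumptions for the example.

```python
from dataclasses import dataclass
from datetime import date
from decimal import Decimal

# Hypothetical unified transaction record; field names are illustrative,
# not Palm's actual schema.
@dataclass
class NormalizedTransaction:
    account_id: str
    booking_date: date
    value_date: date
    amount: Decimal
    currency: str
    source_format: str

def normalize(raw: dict, source_format: str) -> NormalizedTransaction:
    """Map one parsed source record onto the unified schema."""
    return NormalizedTransaction(
        account_id=raw["account"],
        booking_date=date.fromisoformat(raw["booking_date"]),
        # Assumed rule for the sketch: fall back to booking date when the
        # source does not carry a separate value date.
        value_date=date.fromisoformat(raw.get("value_date", raw["booking_date"])),
        amount=Decimal(raw["amount"]),
        currency=raw["currency"],
        source_format=source_format,
    )
```

Whatever the real schema looks like, the design point is the same: every downstream model reads one shape, regardless of which bank or format produced the record.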


Capabilities

| Capability | Status | Notes |
| --- | --- | --- |
| Ingest from existing pipes | Shipped | S3/GCS/Azure Blob, SFTP, HTTPS APIs |
| Standard bank formats | Shipped | ISO 20022 CAMT, MT940/MT942, BAI2, CSV |
| Custom schema support | Shipped | Flexible parsing for non-standard formats |
| Entity resolution | Shipped | Map accounts to legal entities |
| Currency handling | Shipped | Multi-currency with rate source configuration |
| Value-date logic | Shipped | Proper handling of value vs. booking dates |
| Cash-pooling rules | Shipped | Pool-structure awareness in the data model |
| Multi-bank harmonization | Shipped | Unified schema across bank formats |
| Schema validation | Shipped | Pre-ingestion data quality checks |
| Freshness monitoring | Shipped | Track data arrival and staleness |
| Audit trails | Shipped | Full lineage from source to analytics |
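To make the format support concrete, here is a minimal sketch of parsing an MT940 `:61:` statement line. This is not Palm's actual parser; real-world files include variants (funds codes, supplementary details) that this regex deliberately ignores.

```python
import re
from datetime import datetime
from decimal import Decimal

# Minimal MT940 ":61:" statement-line parser (sketch only).
LINE_61 = re.compile(
    r"^:61:"
    r"(?P<value_date>\d{6})"     # YYMMDD value date
    r"(?P<booking_date>\d{4})?"  # optional MMDD booking date
    r"(?P<mark>C|D|RC|RD)"       # credit/debit (or reversal) mark
    r"(?P<amount>[\d,]+)"        # amount, comma as decimal separator
    r"(?P<type>[A-Z]\w{3})"      # transaction type code, e.g. NTRF
    r"(?P<reference>.+)$"        # customer reference
)

def parse_61(line: str) -> dict:
    m = LINE_61.match(line)
    if not m:
        raise ValueError(f"not a :61: line: {line!r}")
    amount = Decimal(m["amount"].replace(",", "."))
    if m["mark"] in ("D", "RC"):  # debits and reversed credits are negative
        amount = -amount
    return {
        "value_date": datetime.strptime(m["value_date"], "%y%m%d").date(),
        "amount": amount,
        "type": m["type"],
        "reference": m["reference"],
    }
```

For example, `parse_61(":61:2602030203D1250,00NTRFNONREF")` yields a debit of 1250.00 with a value date of 2026-02-03.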

Jobs Fulfilled

1. Get bank data into a usable format without massive integration projects

Desired Outcomes Addressed:

- [x] Minimize the time from decision to live data (weeks, not months)
- [x] Reduce dependency on IT for new bank connections
- [x] Increase flexibility to add new data sources without rebuilding

How Palm Addresses This:

- Connects to existing data landing zones (TMS exports, data lakes)
- Supports standard formats out of the box
- Custom parsers for non-standard formats
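One common way to keep custom parsers pluggable is a format registry, so adding a new data source means registering one function rather than rebuilding the pipeline. This is a hypothetical sketch of that pattern, not Palm's actual ingestion API; the names (`register_parser`, `ingest`, `"semicolon_csv"`) are invented for illustration.

```python
import csv
import io
from typing import Callable

# Hypothetical registry: format name -> callable turning raw file bytes
# into a list of record dicts.
PARSERS: dict[str, Callable[[bytes], list[dict]]] = {}

def register_parser(fmt: str):
    def wrap(fn):
        PARSERS[fmt] = fn
        return fn
    return wrap

@register_parser("semicolon_csv")
def parse_semicolon_csv(data: bytes) -> list[dict]:
    """Example custom parser for a non-standard semicolon-delimited export."""
    reader = csv.DictReader(io.StringIO(data.decode("utf-8")), delimiter=";")
    return list(reader)

def ingest(fmt: str, data: bytes) -> list[dict]:
    if fmt not in PARSERS:
        raise ValueError(f"no parser registered for format {fmt!r}")
    return PARSERS[fmt](data)
```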

2. Trust the data powering forecasts and reports

Desired Outcomes Addressed:

- [x] Minimize data quality issues that pollute downstream analytics
- [x] Increase confidence that all accounts are being captured
- [x] Reduce time spent investigating data discrepancies

How Palm Addresses This:

- Schema validation catches issues at ingestion
- Freshness monitoring ensures data is current
- Audit trails enable investigation when needed
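Catching issues at ingestion rather than in a report usually means per-record checks before anything is written. The rules below (required fields, numeric amount, ISO date, three-letter currency code) are common examples, not Palm's documented validation set.

```python
from datetime import date
from decimal import Decimal, InvalidOperation

# Illustrative pre-ingestion checks; the rule set is an assumption
# for the sketch, not Palm's actual validation configuration.
REQUIRED = ("account_id", "booking_date", "amount", "currency")

def validate_record(rec: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in REQUIRED if f not in rec]
    if errors:
        return errors
    try:
        Decimal(rec["amount"])
    except InvalidOperation:
        errors.append(f"amount not numeric: {rec['amount']!r}")
    try:
        date.fromisoformat(rec["booking_date"])
    except ValueError:
        errors.append(f"bad booking_date: {rec['booking_date']!r}")
    if len(rec["currency"]) != 3:
        errors.append(f"currency must be a 3-letter code: {rec['currency']!r}")
    return errors
```

Rejecting (or quarantining) a record with a non-empty error list at this stage is what keeps malformed data out of the silver/gold models downstream.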


Pain Points Addressed

| Pain Point | Addressed? | Notes |
| --- | --- | --- |
| Bank connectivity projects take months | Yes | Builds on existing infrastructure |
| Different banks, different formats | Yes | Multi-format support with harmonization |
| Data quality issues discovered too late | Yes | Validation at ingestion |
| No visibility into data freshness | Yes | Monitoring and alerts |
| Can't trace data back to source | Yes | Full audit trails |

What's NOT Included (Yet)

  • Direct bank API integrations (we connect to your existing connectivity)
  • Real-time/intraday data feeds (depends on upstream availability)
  • Self-service connector configuration UI
  • Data quality scoring/dashboards

How It Works (Technical)

| Component | Technology | Notes |
| --- | --- | --- |
| File ingestion | Airflow | DAGs for scheduled and event-driven ingestion |
| Format parsing | Python parsers | CAMT, MT940, BAI2, CSV with custom extensions |
| Data storage | PostgreSQL | palm schema for normalized data |
| Transformation | dbt | Silver/gold models for analytics |
| Monitoring | Airflow sensors | Freshness checks and alerting |

Key files/services:

- /backend/ingestion/ - Ingestion pipelines
- /backend/palm_dbt/ - Data transformation
- /db_schemas/palm_schema.sql - Core data model


Dependencies & Roadmap

  • All features depend on this - the data layer powers everything else
  • Roadmap: None currently planned

Last updated: 2026-02-03