Have you heard about SAP RPT yet?

Beyond the LLM: Meet SAP RPT, The Foundation Model for Business Data

Why treating your financial ledgers like Shakespeare is holding your AI back—and the new table-native breakthrough that fixes it.


Businesses run on structured data. From financial ledgers to supply chain logistics, tables are the backbone of the global economy. Yet, for years, applying AI to this data has had a fundamental limitation: The "One-to-One" Trap.

Until now, if you wanted to forecast demand, you built one model. If you wanted to predict late payments, you built another. A third was needed for sales opportunities. This fragmented approach is expensive, slow, and requires constant maintenance.

But what if you had a single, universal AI engine that could understand your relational business data and address virtually any predictive task instantly? Enter sap-rpt-1 (Relational Pretrained Transformer).

A Paradigm Shift: Table-Native AI

We are all familiar with Large Language Models (LLMs) like GPT-4. They are amazing at text. But business data isn't text—it's relational. It lives in rows and columns.

sap-rpt-1 applies the "foundation model" concept to tables. It is designed from the ground up to understand the relationships, structures, and semantics inherent in tabular enterprise data.

How It Works: In-Context Learning

Imagine hiring a brilliant consultant. You don't send them to a 6-month training camp. You just show them a few rows of data: "Here are customers who paid late, here are customers who paid on time." They immediately get the pattern.

That is In-Context Learning. You provide labeled table row examples in the prompt, and SAP RPT delivers the prediction for your query row instantly. No fine-tuning required.
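To make that concrete, here is a minimal sketch in Python of how a task like late-payment prediction can be framed as a handful of labeled context rows plus a query row. The model interface shown in the comments (RPTModel, predict) is hypothetical and only illustrates the idea; consult the official sap-rpt-1 repository for the actual API.

```python
# Illustrative sketch only: how in-context learning over table rows is framed.
# The model calls in the comments are hypothetical, not the real sap-rpt-1 API.
import pandas as pd

# Labeled context rows: historical invoices with a known outcome.
context = pd.DataFrame({
    "customer":           ["Acme GmbH", "Borealis AG", "Cobalt Ltd", "Delta SE"],
    "invoice_amount":     [12_500.00, 980.50, 44_200.00, 3_150.75],
    "payment_terms_days": [30, 14, 60, 30],
    "paid_late":          [True, False, True, False],   # the label column
})

# Query row: a new invoice whose label we want predicted.
query = pd.DataFrame({
    "customer":           ["Echo BV"],
    "invoice_amount":     [7_800.00],
    "payment_terms_days": [30],
})

# With a table-native foundation model, the context rows play the role of the
# few-shot examples in an LLM prompt -- no training or fine-tuning step:
#
#   model = RPTModel.load("sap-rpt-1")            # hypothetical interface
#   prediction = model.predict(context=context,   # labeled example rows
#                              query=query,       # rows to predict
#                              target="paid_late")
#
print(context)
print(query)
```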

The Table-Native Architecture

Figure 1: Simplified schematic of the 2D attention mechanism. Stacked 2D layers in the SAP RPT engine combine cross-column and cross-row attention over labeled context rows (e.g., Order: 16/8/24, Price: $1792 and Order: 18/8/24, Price: $500) and a query row whose value is unknown (Order: 20/8/24, Price: ???), producing the final classification or regression output.
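For readers who prefer code to diagrams, here is a toy PyTorch block illustrating the 2D attention idea from the schematic: one attention pass across the columns of each row, and one across the rows of each column. This is a conceptual sketch with made-up layer names and sizes, not SAP's actual architecture.

```python
# Toy illustration of "2D" table attention: attend across columns, then rows.
import torch
import torch.nn as nn

class TwoDAttentionBlock(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (rows, cols, dim) -- one embedding per table cell.
        # Cross-column attention: the cells of each row attend to one another.
        h = self.norm1(x)                   # (rows, cols, dim)
        h, _ = self.col_attn(h, h, h)       # rows act as the batch dimension
        x = x + h

        # Cross-row attention: the cells of each column attend to one another.
        h = self.norm2(x).transpose(0, 1)   # (cols, rows, dim)
        h, _ = self.row_attn(h, h, h)       # columns act as the batch dimension
        x = x + h.transpose(0, 1)
        return x

# Toy table: 2 labeled context rows + 1 query row, 4 columns, embedding dim 32.
cells = torch.randn(3, 4, 32)
out = TwoDAttentionBlock(dim=32)(cells)
print(out.shape)  # torch.Size([3, 4, 32])
```

Alternating the two attention directions lets information flow both within a row (relating a price to its order date) and across rows (relating the query row to the labeled context rows), which is what a plain text-sequence model cannot do natively.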

Beating the Specialists

You might think a "generalist" model would perform worse than a specialized model trained for one specific task. Surprisingly, the data shows the opposite. On SAP-specific benchmarks mirroring real business complexity, sap-rpt-1 outperformed task-specific models substantially.

Performance: SAP RPT vs. Narrow AI (Error Reduction)

Figure 2: Relative improvement of sap-rpt-1-large over narrow AI predictions.

We see up to 2.0x error reduction in complex domains like Materials Management. This means fewer supply chain hiccups and more accurate invoicing.

The Elephant in the Room: Why not just use GPT?

LLMs are versatile, so why not just dump your table data into a prompt and ask for a prediction? While possible, there are two major issues:

  1. Accuracy: LLMs process numbers as text strings, making them prone to "hallucinations" in calculations.
  2. Efficiency: Using an LLM for these tasks requires up to 100,000x more floating point operations (FLOPs) and takes 50x longer.

When benchmarked against state-of-the-art LLMs, the table-native SAP RPT model was drastically more accurate:

Performance: SAP RPT vs. State-of-the-Art LLM

Figure 3: Relative improvement of sap-rpt-1-large over LLM predictions.

In Sales scenarios, SAP RPT achieved a massive 3.5x improvement over leading LLMs.

Insights in Minutes, Not Months

This technology democratizes AI. An accountant can assign cost centers or a logistics manager can predict shipping delays simply by providing a few historical examples—without waiting months for a data science project.

Try it Yourself

You don't have to wait. Preview sap-rpt-1 now via a no-code UI or API.

Visit rpt.cloud.sap →

Also available as Open Source on HuggingFace and GitHub.
