core.solver.DataModel

core.solver.DataModel()

Analyzes the ‘Data Wall’ — the throughput bottleneck between storage and compute.

This solver models the data pipeline constraints, comparing the data demand of a workload (e.g., training tokens or high-resolution video frames) against the physical bandwidth of the storage hierarchy and IO interconnects.
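The comparison the solver performs can be sketched in a few lines. This is a minimal illustration of the data-wall idea, not the library's implementation; the `StorageSpec` class, field names, and numbers are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class StorageSpec:
    read_bandwidth_gbps: float   # sustained storage read bandwidth, GB/s
    interconnect_gbps: float     # IO interconnect bandwidth, GB/s

def data_wall_utilization(required_gbps: float, hw: StorageSpec) -> float:
    """Fraction of the pipeline's limiting bandwidth consumed by the workload.

    The effective bandwidth is the minimum of the storage and interconnect
    links; a value above 1.0 means the pipeline cannot keep compute fed.
    """
    effective = min(hw.read_bandwidth_gbps, hw.interconnect_gbps)
    return required_gbps / effective

# e.g. a workload needing 12 GB/s against an 8 GB/s storage array
hw = StorageSpec(read_bandwidth_gbps=8.0, interconnect_gbps=16.0)
print(data_wall_utilization(12.0, hw))  # 1.5 -> data-bound
```

Here the interconnect has headroom, but the storage array is the limiting link, so the workload is data-bound at 150% of available bandwidth.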

Literature Sources

1. Janapa Reddi et al. (2025), “Machine Learning Systems,” Chapter 4 (Data Engineering).
2. Beitzel et al. (2024), “The Data Wall: Scaling Laws for Data Ingestion in AI.”
3. Mohan et al. (2022), “Analyzing and Mitigating Data Bottlenecks in Deep Learning Training.”

Methods

| Name | Description |
| --- | --- |
| solve | Solves for data pipeline feasibility. |

solve

core.solver.DataModel.solve(workload_data_rate, hardware)

Solves for data pipeline feasibility.

Parameters

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| workload_data_rate | Quantity | The required data ingestion rate (e.g., TB/hour or GB/s). | required |
| hardware | HardwareNode | The hardware node with storage and interconnect specs. | required |

Returns

| Type | Description |
| --- | --- |
| Dict[str, Any] | Pipeline metrics including utilization and stall probability. |
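The shape of the returned dictionary can be illustrated with a self-contained sketch. The keys and the simple stall heuristic below are assumptions chosen for the example; they are not guaranteed to match the library's actual output.

```python
from typing import Any, Dict

def solve_sketch(workload_data_rate_gbps: float,
                 storage_bandwidth_gbps: float) -> Dict[str, Any]:
    """Illustrative stand-in for DataModel.solve(): returns pipeline
    metrics under a simple model where stalls begin once the required
    data rate exceeds the available bandwidth."""
    utilization = workload_data_rate_gbps / storage_bandwidth_gbps
    # Heuristic: the fraction of time compute waits on data grows with
    # the shortfall between demand and supply.
    stall_probability = max(0.0, 1.0 - 1.0 / utilization)
    return {
        "utilization": utilization,
        "stall_probability": stall_probability,
        "is_data_bound": utilization > 1.0,
    }

print(solve_sketch(12.0, 8.0))
# e.g. utilization 1.5, stall_probability ~0.33, is_data_bound True
```

A utilization below 1.0 yields a zero stall probability in this sketch, matching the intuition that a pipeline with bandwidth headroom never starves compute.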