The Non-Scalable Intercept: When Networks Defy Antitrust
Thresholds that create markets, not margins
This post introduces the non-scalable intercept – a coordination threshold in network industries that can be crossed only once without destroying value. When that threshold is singular, competition becomes incoherent: policy must shift from taming monopolies to governing (or publicly providing) the network core. Some markets require collective provision to exist at all.
Introduction
Brian Albrecht’s recent post “The Invisible Hand Screws Up Your Regression” highlights a key blind spot in applied economics: empirical methods recover slopes – relative effects across treated and control units – but miss intercepts, the system-wide level shifts that matter in interconnected markets. His airline merger example shows how networks make this acute.
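A toy simulation makes the distinction concrete. The numbers below are invented for illustration, not taken from Albrecht’s post: a system-wide level shift hits treated and control units alike, so a difference-in-differences comparison recovers only the relative effect and attributes none of the shared shift to the event.

```python
# Hypothetical illustration: an event produces a system-wide level shift that
# hits treated and control units alike, plus a relative effect on the treated
# group. Difference-in-differences recovers only the relative part.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000  # observations per group and period

level_shift = 2.0      # shared post-event shift: the missing intercept
relative_effect = 0.5  # treated-vs-control gap: the slope the regression sees

control_pre  = 1.0 + rng.normal(0, 0.1, n)
control_post = 1.0 + level_shift + rng.normal(0, 0.1, n)
treated_pre  = 1.2 + rng.normal(0, 0.1, n)
treated_post = 1.2 + level_shift + relative_effect + rng.normal(0, 0.1, n)

did = (treated_post.mean() - treated_pre.mean()) \
    - (control_post.mean() - control_pre.mean())

print(f"difference-in-differences estimate: {did:.2f}")           # ~0.5
print(f"system-wide level shift (invisible to the design): {level_shift}")
```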
His point is methodological. This essay asks: what if, in some network industries, the intercept is structurally non-replicable? Clearing it once creates viability; attempting it twice destroys that foundation. That’s the non-scalable intercept.
When Scalable Assumptions Fail
Most economic models proceed as though this level effect were scalable. Fixed costs may be large and coordination demanding, but replication is treated as conceptually straightforward. If demand is sufficient, entry is expected to occur. Competition may be imperfect or inefficient, yet it remains intelligible within the model. The regression overlooks the level shift, while theory quietly assumes it exists and can, in principle, be reproduced.
This assumption breaks down in a specific class of industries. Payment rails, telecom backbones, transport infrastructure – these defy that logic.
In these cases, viability depends on clearing a coordination threshold that can only be crossed once. Below it, production is not simply costly or inefficient; it is too fragmented to work. Above it, value appears all at once. Replication does not generate parallel output but instead splits the network into pieces that never add up to a functional whole.
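A stylized value function, with a threshold and functional form chosen purely for illustration (a Metcalfe-style assumption, not a claim about any particular network), shows why replication splits value rather than doubling it:

```python
# Stylized network value: zero below a critical mass, roughly quadratic above
# it. The threshold and the connection-counting form are illustrative only.
def network_value(users: int, critical_mass: int = 600) -> float:
    if users < critical_mass:
        return 0.0                   # below the threshold, the system does not function
    return users * (users - 1) / 2   # number of possible pairwise connections

whole = 1_000
print(network_value(whole))           # one network of 1,000 users: ~499,500
print(2 * network_value(whole // 2))  # two rivals of 500 users each: 0.0

# Even when both halves clear the threshold, their combined value falls short
# of a single network of the same total size.
print(2 * network_value(800))         # 639,200
print(network_value(1_600))           # 1,279,200
```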
Fragmentation, Not Competition
In practical terms, certain networks function only when a critical mass of users, endpoints, or connections is brought under a single coordinating structure. Fragmentation does not yield smaller competitors operating at reduced scale. It yields sub-scale systems whose combined value falls short of the original whole. Entry reshapes the system rather than disciplining it, and feasibility erodes in the process.
Payments provide the clearest illustration. A card network, clearinghouse, or real-time payment rail derives its usefulness from near-universal participation. Fragmentation raises costs on both sides of the market and reduces acceptance everywhere. Splitting destroys value instead of creating rivals.
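A back-of-the-envelope Monte Carlo, with invented assumptions of single-homing and uniform random choice of rail, illustrates the acceptance problem: when consumers and merchants each sit on one of several rival rails, the chance that a random consumer-merchant pair can transact at all falls with every additional rail.

```python
# Hypothetical two-sided acceptance under fragmentation: consumers and
# merchants each single-home on one of k rival rails, chosen uniformly at random.
import numpy as np

rng = np.random.default_rng(1)

def match_rate(k: int, pairs: int = 100_000) -> float:
    """Share of random consumer-merchant pairs that share a rail."""
    consumers = rng.integers(0, k, pairs)
    merchants = rng.integers(0, k, pairs)
    return float(np.mean(consumers == merchants))

print(match_rate(1))  # universal rail: every pair can transact (1.0)
print(match_rate(2))  # two rival rails: ~0.5
print(match_rate(3))  # three rails: ~0.33
```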
Similar dynamics appear in telecom backbones and spectrum coordination, as well as in rail networks, air traffic control, roads, and water systems. In each case, surplus is maximized around a single shared core.
Other cases lie closer to the boundary. Social networks may contract without immediate collapse, though only down to a point, after which connectivity fails. Advertising platforms are more ambiguous, as data portability and interoperability can partially relax coordination requirements. Transport systems vary by layer: airlines compete vigorously atop infrastructure that resists replication.
Interoperability sometimes eases coordination by allowing separate systems to communicate. But when reliability, clearing, or safety demands single-point control, interoperability merely shifts who runs the system – it doesn’t enable replication. The intercept remains non-scalable; only governance changes hands.
Across these examples, the mechanisms differ – network topology, indivisible coordination, common standards, pooled reliability – but they converge on a shared feature: below a critical scale, the object being produced ceases to function as a system.
What distinguishes these cases is not simply that duplication raises costs, a result well understood in the literature, but that replication can invalidate production itself.
Why Antitrust Misfires
This perspective helps explain why antitrust interventions in network industries often feel misaligned. Policy analysis tends to focus on slopes: price effects, margins, exclusionary behavior. Remedies aim to increase the number of firms performing the same activity side by side. That approach implicitly treats the intercept as scalable. When it is not, intervention targets the wrong margin.
Here Albrecht’s missing intercept takes on a different significance. Even flawless empirical work cannot justify remedies that assume post-intervention competition is technologically feasible. The regression cannot see the level shift, and the theory behind the remedy already assumes that multiple firms can replicate the system.
Coordination Before Competition
Once the intercept is understood as non-scalable, the policy problem reorganizes itself. Attention shifts from disciplining a dominant firm toward questions of governance: who controls the network core, under what constraints, and with what accountability.
Markets of this kind often require collective provision to exist at all. Sometimes regulated private provision suffices. In other cases, public ownership offers a cleaner solution, precisely because the network is foundational rather than competitive. The familiar trade-off between monopoly power and efficiency looks different once fragmentation becomes the primary source of inefficiency. Governance failures remain a risk, but they are the relevant risk.
Viewed this way, many antitrust failures in network industries stem from misdiagnosing the constraint, not from weak enforcement. Policy targets pricing and conduct where coordination thresholds actually bind. It mandates entry where fragmentation eliminates the value competition is meant to preserve.
The non-scalable intercept names this distinction. It links a methodological blind spot in empirical work to a structural feature of certain industries and suggests a different institutional sequence: secure coordination first, consider competition later – but only if the system architecture truly permits it at all.

