Ethereum Raises Blob Limit To 21, Increasing Data Throughput For Rollups


Ethereum rolled out its second Blob Parameter-Only (BPO) hard fork on January 7, 2026, lifting the blob limit to 21 and building on EIP-4844 (Proto-Danksharding) introduced with Dencun. The practical objective is to expand per-block data capacity so rollups can post more efficiently, lowering Layer-2 costs and reducing fee spikes that tend to appear when multiple rollups batch at the same time.

This upgrade is intentionally minimal in scope. Instead of introducing new protocol features, it adjusts data-availability parameters, signaling a scaling approach where capacity is increased by tuning values rather than redesigning architecture.

What BPO2 Changed on January 7

With BPO2, each block can now carry up to 2,688 KB (21 blobs × 128 KB, roughly 2.6 MB) of blob data, while individual blob units remain 128 KB. The fork also raised the blob target from 10 to 14 alongside the new blob limit of 21. Developers frame these settings as directly influencing throughput, pricing predictability, and the operational load placed on nodes and data-availability systems.
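The arithmetic behind these figures is straightforward. A quick sketch (the variable names are ours, not protocol constants):

```python
# Back-of-the-envelope check of the BPO2 capacity numbers cited above.
BLOB_SIZE_KB = 128                  # per-blob size, unchanged by BPO2
OLD_TARGET, NEW_TARGET = 10, 14    # blob target before and after the fork
NEW_LIMIT = 21                      # new per-block blob limit

max_kb = NEW_LIMIT * BLOB_SIZE_KB           # hard cap on blob data per block
target_kb = NEW_TARGET * BLOB_SIZE_KB       # capacity at the fee-neutral target
headroom_kb = (NEW_TARGET - OLD_TARGET) * BLOB_SIZE_KB  # added target capacity

print(max_kb, target_kb, headroom_kb)       # 2688 1792 512
```

So the 2,688 KB ceiling follows directly from 21 blobs of 128 KB each, and the target increase alone adds 512 KB of fee-neutral capacity per block.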

Andrew Gross summarized the intent by describing BPO2 as evidence that Ethereum scalability is now governed by parameters, making it possible to increase throughput simply by adjusting capacity values. Christine Erispe noted that the extra headroom allows more L2 batches to be processed in the same timeframe or at a lower marginal blob price, which reduces the chance of fee spikes and improves predictability for batch timing.
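The "lower marginal blob price" claim follows from how EIP-4844 prices blob space: the blob base fee grows exponentially only with usage *above* the target, so raising the target to 14 lets more batches land before the fee compounds. A minimal sketch of that mechanism, with the post-BPO2 target plugged in (the update fraction shown is the original Dencun value, kept here as a placeholder since BPO forks retune it):

```python
# Sketch of EIP-4844 blob base-fee mechanics with the post-BPO2 target of 14.
GAS_PER_BLOB = 131072                            # 128 KB of blob data, 1 gas/byte
TARGET_BLOB_GAS_PER_BLOCK = 14 * GAS_PER_BLOB    # BPO2 target of 14 blobs
MIN_BASE_FEE_PER_BLOB_GAS = 1                    # 1-wei floor
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477          # Dencun value, assumed here

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer approximation of factor * e^(numerator / denominator),
    as defined in the EIP-4844 specification."""
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (denominator * i)
        i += 1
    return output // denominator

def next_excess_blob_gas(parent_excess: int, parent_blob_gas_used: int) -> int:
    """Only blob gas used above the per-block target carries over as excess."""
    return max(parent_excess + parent_blob_gas_used - TARGET_BLOB_GAS_PER_BLOCK, 0)

def blob_base_fee(excess_blob_gas: int) -> int:
    return fake_exponential(MIN_BASE_FEE_PER_BLOB_GAS,
                            excess_blob_gas,
                            BLOB_BASE_FEE_UPDATE_FRACTION)

# While blocks stay at or below the 14-blob target, excess blob gas stays at
# zero and the blob base fee sits at its floor; only sustained demand above
# the target compounds the fee exponentially.
```

Under the old target of 10, a block carrying 12 blobs would have accumulated excess blob gas and pushed the fee up; under BPO2's target of 14, the same block leaves the fee at its floor, which is the mechanical basis for the reduced-fee-spike argument.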

Why It Matters for Rollups and Market Microstructure

Since the initial BPO adjustment in December, observers have pointed to reduced fee volatility on mainnet as rollups increasingly relied on blob posting. This January increase is designed to reinforce that trend by giving rollups more room to batch, which should lower per-transaction costs and make operator revenue models easier to forecast.

More capacity, however, comes with infrastructure implications. Higher blob throughput increases demands on node operators and data-availability systems, so node health metrics and monitoring remain a priority as rollups scale into the new limits. For liquidity providers and market makers, the downstream effect is mechanical: posting cadence and batch sizing can shift activity patterns and change slippage assumptions in on-chain pools as transaction flow becomes more efficiently packaged.

How It Connects to the 2026 Roadmap

BPO2 was discussed alongside other near-term capacity steps flagged in a mid-December All Core Developers call, including a possible increase in the protocol gas limit from around 60 million toward 80 million, enabled by improved blob capacity. Later in 2026, the Glamsterdam upgrade is expected to push the gas limit further and introduce Block Access Lists to support more parallel processing. Together, these items represent a practical test of whether parameter-based scaling can keep expanding throughput while preserving decentralization and security under growing demand.

Market participants will now watch how quickly rollups expand their batch sizes under the new parameters and whether the network’s operational overhead remains stable. The core question is whether steady, parameter-driven increases—paired with items like PeerDAS and zkEVM optimizations—can deliver sustained cost reductions without shifting risk onto node operators or weakening the network’s resilience.
