Breaking Down Blockchain Scalability: Why Computation, Data, and Erasure Coding Matter in State Scaling
Blockchain scalability isn’t a one-size-fits-all problem. Vitalik Buterin recently outlined a systematic framework for understanding the different layers of scalability challenges, ranking them by complexity. Understanding this hierarchy is crucial for evaluating Layer 2 solutions and future Ethereum scaling roadmaps. According to Odaily, Buterin structures the scalability challenge across three dimensions, each requiring fundamentally different technical approaches: computation, data, and state.
Computation: The Easiest Layer to Optimize
When it comes to scaling computation, the pathway is relatively straightforward. Computation is the most manageable component to enhance through parallel processing techniques. Block builders can provide “hints” that allow the network to execute operations more efficiently, significantly reducing computational bottlenecks. Another powerful approach involves replacing computationally intensive operations with cryptographic proofs—most notably zero-knowledge proofs, which enable verification without reprocessing the original calculations. These methods have proven effective because they don’t fundamentally alter the blockchain’s security assumptions.
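The "hints" idea can be illustrated with a toy scheduler: if a block builder declares which accounts each transaction touches, the executor can safely batch non-overlapping transactions for concurrent execution. This is a minimal sketch with hypothetical names and structure, not any client's actual implementation.

```python
# Toy sketch of builder-provided execution hints: each transaction
# declares the accounts it will touch, so transactions whose access
# sets don't overlap can be grouped into batches that could run in
# parallel. Illustrative only.

def group_parallel(txs):
    """Greedily batch transactions with disjoint declared access sets.
    Each batch could then be executed concurrently without conflicts."""
    batches = []  # list of (tx_list, set_of_touched_accounts)
    for tx, accessed in txs:
        for batch, touched in batches:
            if touched.isdisjoint(accessed):
                batch.append(tx)
                touched |= accessed
                break
        else:  # conflicts with every existing batch: start a new one
            batches.append(([tx], set(accessed)))
    return [batch for batch, _ in batches]

txs = [
    ("tx1", {"alice", "bob"}),
    ("tx2", {"carol"}),        # disjoint from tx1 -> same batch
    ("tx3", {"bob", "dave"}),  # conflicts with tx1 -> new batch
]
print(group_parallel(txs))  # [['tx1', 'tx2'], ['tx3']]
```

The key point is that the hints shift work to the block builder: the network gains parallelism without changing what a valid block means.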
Data Availability: Where Erasure Coding Enters the Picture
Data scaling presents a moderate difficulty curve. The real challenge emerges when systems must guarantee data availability—ensuring all historical transaction data remains accessible for verification and recovery. This is where innovations like erasure coding become essential. Erasure coding allows networks to store redundant data fragments across nodes, enabling recovery of the complete dataset even if some portions are temporarily unavailable. Projects like Ethereum’s PeerDAS implement erasure coding techniques to optimize how data is distributed and validated across the network.
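The core mechanism can be shown in a few lines: encode k data symbols as points on a polynomial over a prime field, publish n > k evaluations, and recover the original data from any k surviving shares via Lagrange interpolation. This is a deliberately tiny sketch with toy parameters; production systems such as PeerDAS use larger fields and heavily optimized encodings.

```python
# Minimal Reed-Solomon-style erasure coding sketch over a prime field.
# Toy parameters for illustration only.

P = 2**31 - 1  # small Mersenne prime as the field modulus

def lagrange_eval(shares, x):
    """Evaluate the unique polynomial through `shares` at x (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def encode(data, n):
    """Extend k data symbols (evaluations at points 0..k-1) to n shares."""
    k = len(data)
    shares = list(enumerate(data))
    shares += [(x, lagrange_eval(shares, x)) for x in range(k, n)]
    return shares

def recover(any_k_shares, k):
    """Rebuild the original k symbols from ANY k of the n shares."""
    return [lagrange_eval(any_k_shares, x) for x in range(k)]

data = [7, 13, 42, 99]      # k = 4 original symbols
shares = encode(data, 8)    # n = 8: tolerate losing any 4 shares
surviving = shares[2:6]     # pretend half the shares vanished
assert recover(surviving, 4) == data
```

Because any k of the n shares suffice, no single node needs the full dataset, yet the network as a whole can always reconstruct it.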
Beyond erasure coding, systems can employ data-splitting strategies and support “graceful degradation,” letting nodes with limited storage keep validating full-size blocks while holding only a fraction of the data. This democratizes participation by lowering hardware requirements while preserving network security and data integrity.
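A back-of-the-envelope calculation shows why sampling-based validation scales so well. With a 2x erasure-coded blob, an adversary must withhold more than half the chunks to make the data unrecoverable, so each random chunk a light node samples hits a missing chunk with probability above 1/2. The numbers below are illustrative assumptions, not protocol parameters.

```python
# Back-of-the-envelope for data availability sampling: if more than
# half of a 2x erasure-coded blob is withheld, each independent random
# sample fails with probability > 0.5, so detection confidence grows
# exponentially in the number of samples. Illustrative numbers only.

def detection_probability(samples, withheld_fraction=0.5):
    """Chance that at least one random query hits a withheld chunk."""
    return 1 - (1 - withheld_fraction) ** samples

for s in (8, 16, 30):
    print(f"{s} samples -> detection probability {detection_probability(s):.9f}")
```

A few dozen cheap queries give near-certainty, which is why low-storage nodes can still meaningfully enforce data availability.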
State: The Fundamental Bottleneck
State management represents the most formidable scalability challenge. To verify even a single transaction, nodes must have access to the relevant state—the cumulative account information, balances, and smart contract data. Even if architects redesign state as a Merkle tree with only the root hash stored on-chain, updating that root still requires some party to hold and process the underlying state data. Splitting state across different validators can help in theory, but such approaches demand significant architectural changes and often introduce new centralization risks, limiting their practical applicability.
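The root-hash point can be made concrete with a toy binary Merkle tree: committing to state on-chain is cheap, but recomputing the root after a single balance change still requires sibling hashes along the leaf's path, which some party must derive from the full state held off-chain. This is a minimal sketch, not how Ethereum's Merkle-Patricia trie actually works.

```python
# Minimal binary Merkle tree sketch: the chain stores only the root,
# but updating one leaf requires hashes derived from the full state,
# which must live somewhere off-chain. Toy illustration.
import hashlib

def h(*parts):
    return hashlib.sha256(b"".join(parts)).digest()

def merkle_root(leaves):
    """Root of a balanced binary Merkle tree (power-of-two leaf count)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def update_leaf(leaves, index, new_value):
    """Recompute the root after one change. Only O(log n) hashes change,
    but the sibling values come from the full state held off-chain."""
    leaves = list(leaves)
    leaves[index] = new_value
    return merkle_root(leaves)  # a real client would reuse cached siblings

state = [b"alice:100", b"bob:50", b"carol:7", b"dave:0"]
root_before = merkle_root(state)
root_after = update_leaf(state, 1, b"bob:60")
assert root_before != root_after
```

The asymmetry is the whole problem: verification against the root is cheap, but producing updated roots requires the state itself, so state cannot be compressed away the way computation and data can.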
The Strategic Principle: Trading Layers for Decentralization
Buterin’s analysis leads to a clear strategic principle: when possible, replace state with data without creating new centralization vectors. Similarly, when feasible, substitute computation for data—again, without compromising decentralization. This hierarchical thinking explains why solutions emphasizing erasure coding and data optimization continue gaining traction: they push scalability challenges down the stack to layers that are more tractable. The framework reveals that long-term Ethereum scalability depends not on solving state directly, but on clever engineering that shifts the burden toward data and computational layers where solutions already exist.