We're seeing a value chain forming around zero-knowledge proofs. To understand it, it's helpful to start with the proof lifecycle.

**The life of Pi**

**User intent:** It starts with a user intent: they want to swap a token on a zk-rollup, prove something about their identity, execute a derivative trade, etc.

**Proof request:** The application executes the transaction, typically in a zero-knowledge virtual machine (zkVM), and requests a proof.

**Proof generation:** Generating a proof is compute-intensive. Application developers can use their own prover, but they can also outsource the job to third-party provers like Succinct Labs, Gevulot, Bonsai and others.

**Verification:** Once the proof is generated, it needs to be verified somewhere. Right now, this typically happens on a blockchain: the application runs a smart contract on a destination chain, and that smart contract's job is to verify the proof.

**Settlement:** Once the proof is verified on-chain, it's settled. At this point, another component of the application that lives on the destination chain uses the proof to update the state of the application.

A simple bridging example: I want to send 10 USDC from a source chain (SC) to a destination chain (DC). I lock the tokens in a bridge contract on SC, and the bridge app generates a proof offchain. The proof is verified by a verification contract on DC. Once it is, another contract releases 10 USDC on DC.
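The bridging flow above can be sketched in code. This is a toy model under loud assumptions, not a real bridge: every function name is hypothetical, and the "proof" is stubbed as a hash commitment rather than an actual zero-knowledge proof.

```python
# Toy sketch of the five lifecycle steps in the bridging example.
# All names are hypothetical; the "proof" is a hash stand-in, not a real ZKP.
import hashlib

def lock_tokens(source_chain: dict, user: str, amount: int) -> dict:
    """Step 1-2: the user locks tokens in the bridge contract on SC."""
    source_chain["locked"][user] = source_chain["locked"].get(user, 0) + amount
    return {"user": user, "amount": amount}

def generate_proof(event: dict) -> dict:
    """Step 3: the bridge app (or an outsourced prover) proves the lock offchain."""
    payload = f'{event["user"]}:{event["amount"]}'.encode()
    return {"claim": event, "proof": hashlib.sha256(payload).hexdigest()}

def verify_proof(proof: dict) -> bool:
    """Step 4: the verification contract on DC checks the proof."""
    payload = f'{proof["claim"]["user"]}:{proof["claim"]["amount"]}'.encode()
    return proof["proof"] == hashlib.sha256(payload).hexdigest()

def release_tokens(dest_chain: dict, proof: dict) -> None:
    """Step 5: settlement -- another contract releases tokens once verified."""
    if verify_proof(proof):
        claim = proof["claim"]
        dest_chain["balances"][claim["user"]] = (
            dest_chain["balances"].get(claim["user"], 0) + claim["amount"]
        )

sc = {"locked": {}}
dc = {"balances": {}}
release_tokens(dc, generate_proof(lock_tokens(sc, "alice", 10)))
print(dc["balances"]["alice"])  # 10 USDC released on the destination chain
```

The point of the sketch is the separation of roles: locking, proving, verifying and settling are distinct steps that different actors in the value chain can specialise in.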

This simplified lifecycle is helpful to understand the value chain forming around zero-knowledge proofs.

**The proof value chain**

As with block construction in Ethereum, an infrastructure stack (and, along with it, a value chain) is forming, with actors specialising at each step of the process.

**zkVMs:** First is an ecosystem of programming languages and developer platforms to generate proofs. Because proofs are complex cryptographic constructs, there's demand for a simpler developer experience. Developer platforms like RISC Zero or SP-1 are zkVMs: they make it easy for developers to generate ZKPs for general computation, removing the need to deal with the low-level intricacies of circuits. In the background, they are essentially compilers from arbitrary code to provable circuits.
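To make the zkVM idea concrete, here is a minimal sketch of the developer experience these platforms aim for: write ordinary code, get back the result plus a receipt attesting to correct execution. The `ZkVM` class and its `prove`/`verify` methods are illustrative assumptions, not the actual APIs of RISC Zero or SP-1, and the "receipt" is a plain hash stand-in for a real succinct proof.

```python
# Illustrative sketch of a zkVM-style interface; not a real zkVM API.
import hashlib
import json

class ZkVM:
    def prove(self, program, inputs):
        output = program(*inputs)  # run the program "inside the VM"
        # A real zkVM emits a succinct cryptographic proof of the execution
        # trace; we stub it with a hash commitment to keep the sketch short.
        receipt = hashlib.sha256(
            json.dumps([program.__name__, inputs, output]).encode()
        ).hexdigest()
        return output, receipt

    def verify(self, program, inputs, output, receipt):
        expected = hashlib.sha256(
            json.dumps([program.__name__, inputs, output]).encode()
        ).hexdigest()
        return receipt == expected

def fibonacci(n: int) -> int:
    """An ordinary program the developer writes; no circuit knowledge needed."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

vm = ZkVM()
out, receipt = vm.prove(fibonacci, [10])
print(out)                                       # 55
print(vm.verify(fibonacci, [10], out, receipt))  # True
```

The developer only writes `fibonacci`; the compilation of that code into a provable circuit is the zkVM's job, hidden behind `prove`.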

**Prover markets:** Applications can generate their own proofs, but they ultimately aim to use decentralised networks of provers to be censorship-resistant and to ensure the service can't go down (liveness). They can run the prover set themselves, like a blockchain does with its validator set, but that job is likely to be taken on by marketplaces of provers like Succinct Labs, Bonsai or Gevulot.

**Proof aggregation:** Proofs generated by the network need to be verified. This is currently done by a contract on a layer 1 or layer 2 blockchain. But it's expensive. The cheapest kind of proof (Groth16) costs $20-30 (assuming $3000 ETH and 30 Gwei) to verify on Ethereum. Verifying a STARK proof costs around $180.
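As a back-of-envelope check on those figures (the gas numbers are rough assumptions: Groth16 on-chain verification is commonly cited at around 250k gas, a STARK verifier at around 2M gas):

```python
# Back-of-envelope verification costs at the prices assumed in the text.
# Gas figures are rough assumptions, not exact benchmarks.
ETH_PRICE_USD = 3000
GAS_PRICE_GWEI = 30
GWEI_PER_ETH = 1_000_000_000

def verification_cost_usd(gas_used: int) -> float:
    # USD cost = gas * gas price (gwei) * ETH price / gwei per ETH
    return gas_used * GAS_PRICE_GWEI * ETH_PRICE_USD / GWEI_PER_ETH

print(verification_cost_usd(250_000))    # Groth16: 22.5
print(verification_cost_usd(2_000_000))  # STARK:   180.0
```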

It's a key bottleneck for proof usage, so an emerging class of solutions centers on bringing the cost of verification down. The main approach is proof aggregation: combining multiple proofs into one. That single proof demonstrates the validity of all the original proofs together, splitting the verification cost across all the rolled-up proofs.
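The economics of aggregation are easy to see in numbers: on-chain verification of the aggregated proof is a roughly fixed cost, shared by every proof in the batch. Using the $180 STARK figure from above (batch sizes are illustrative):

```python
# Amortising one fixed on-chain verification across a batch of proofs.
# The $180 figure comes from the section above; batch sizes are illustrative.
AGGREGATED_VERIFICATION_COST_USD = 180.0

def cost_per_proof(batch_size: int) -> float:
    return AGGREGATED_VERIFICATION_COST_USD / batch_size

for n in (1, 10, 100, 1000):
    print(f"{n:>4} proofs in the batch -> ${cost_per_proof(n):.2f} each")
```

At a batch of 1000, the per-proof verification cost drops from $180 to $0.18, which is why aggregation scales so well with volume.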

**Proof settlement:** The proofs, aggregated or not, need to be verified on-chain to be used in smart contracts. This is currently done on L1s and L2s, but we're excited by the prospect of dedicated layers that focus on bringing verification costs down and enabling interoperability between proof-generating applications.

**Application:** The final part of the value chain is the application, where the user transacts and pays for the service. This is where the flow of money starts, and it flows forward through the value chain.

**Where's the money at?**

Multiple points of the value chain have network effects and defensibility.

The application layer owns the users, and thus the 'proof order flow'. That's a cornered resource, especially in crypto where there are only so many users.

Next down the chain are developer platforms. They benefit from some lock-in of the application developers who use them. That also gives them some power over the order flow, because whoever they integrate with will benefit from the stream of proofs they generate.

Proof markets have a strong network effect. Their job is to match proof requests with service providers who can compute them. Higher demand attracts more suppliers of compute resources for proofs, creating the typical marketplace virtuous cycle. The flywheel is compounded by further economies of scale, as volume means higher utilisation rates and thus lower costs. We expect centralisation around that layer, and already see fierce competition.

Proof verification is the most nascent layer, but it also benefits from strong network effects. Aggregation scales with volume: the more proofs you have to aggregate, the lower the costs and the latency (because, ceteris paribus, assuming a fixed cost per proof, you fill batches faster).

Settlement also scales with volume: if proofs are all settled in one place, that layer can become a canonical source of truth for proofs, and thus a trustless interoperability layer.

**Rising and sinking in the stack**

We expect the value chain to verticalise, and we're already seeing companies expand their feature offerings to that effect, starting at one layer and looking to integrate up and down the stack.

RISC Zero started out as a zkVM, letting developers generate proofs for Rust and C++ code. They then built Bonsai, a prover market to generate proofs on behalf of their users. It's a great synergy because they own the 'proof order flow' and can direct it to their own marketplace. Compiling is not particularly differentiated or defensible, but as we've seen, the prover market is a compelling value-accrual layer.

Succinct Labs made the same move in the opposite direction. They started out as a prover market, then went up the stack by creating SP-1, an open-source zkVM to compete with RISC Zero. I'm not privy to their decision making, but it makes sense to commoditise the zkVM layer to ensure that no single actor like RISC Zero owns the proof order flow and circumvents your marketplace. They are also going down the stack with a recently added aggregation layer, to further benefit from their 'proof order flow'.

Destination chains with a lot of unused blockspace, like Polygon, are also looking to move higher up the stack, starting with aggregation.

**Fin**

In conclusion, the zero-knowledge proof value chain is emerging and already verticalising, with companies expanding their offerings to provide more comprehensive solutions and capture more of the value.

Prover markets and aggregation/settlement seem to be the layers with the strongest network effects and long-term defensibility. This is where we're seeing the most competition at the moment.

But it's still early, and important questions, such as how value transfer mechanisms will work, remain open.

If you're thinking about these topics, write me.