Vertical Integration in the Zero-Knowledge Proof Value Chain

Network effects and value capture strategies

We're seeing a value chain forming around zero-knowledge proofs. To understand it, it's helpful to start with the proof lifecycle.

The life of Pi

The zero-knowledge proof lifecycle
  1. User intent: It starts with a user intent: a user wants to swap a token on a zk-rollup, prove something about their identity, execute a derivative trade, etc.

  2. Proof request: The application executes the transaction, typically in a zero-knowledge virtual machine (zkVM), and requests a proof.

  3. Proof generation: Generating a proof is compute-intensive. Application developers can use their own prover, but they can also outsource the job to third-party provers like Succinct Labs, Gevulot, Bonsai and others.

  4. Verification: Once the proof is generated, it needs to be verified somewhere. Right now, this typically happens on a blockchain. The application runs a smart contract on a destination chain, and that smart contract's job is to verify the proof.

  5. Settlement: Once the proof is verified on-chain, it's settled. At this point, some other component of the application that lives on the destination chain uses the proof to update the application's state.

A simple bridging example: I want to send 10 USDC from a source chain (SC) to a destination chain (DC). I lock the tokens in a bridge contract on SC, and the bridge app generates a proof off-chain. The proof is verified by a verification contract on DC. When it is, another contract releases 10 USDC on DC.
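The five lifecycle steps above can be sketched end to end. This is a toy model under loud assumptions: the function names (`lock`, `generate_proof`, `verify`, `release`) are made up for illustration, and a hash stands in for a real zero-knowledge proof.

```python
# Toy sketch of the bridge lifecycle. All names are illustrative, not a real
# bridge API; hashing stands in for actual proof generation/verification.
import hashlib

def lock(ledger: dict, user: str, amount: int) -> bytes:
    """Lock tokens on the source chain; return a commitment to the lock event."""
    ledger[user] = ledger.get(user, 0) - amount
    ledger["bridge"] = ledger.get("bridge", 0) + amount
    return hashlib.sha256(f"lock:{user}:{amount}".encode()).digest()

def generate_proof(commitment: bytes) -> bytes:
    """Stand-in for off-chain proof generation (e.g. via a zkVM or prover market)."""
    return hashlib.sha256(b"proof:" + commitment).digest()

def verify(commitment: bytes, proof: bytes) -> bool:
    """Stand-in for the verification contract on the destination chain."""
    return proof == hashlib.sha256(b"proof:" + commitment).digest()

def release(ledger: dict, user: str, amount: int) -> None:
    """Settlement: the destination-chain contract releases funds once verified."""
    ledger[user] = ledger.get(user, 0) + amount

# Walk-through: lock 10 USDC on SC, prove, verify on DC, release on DC.
source, dest = {"alice": 100}, {}
c = lock(source, "alice", 10)
p = generate_proof(c)
assert verify(c, p)
release(dest, "alice", 10)
```

The point of the sketch is the separation of concerns: only steps 1 and 3-5 touch a chain; proof generation happens wherever compute is cheapest.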

This simplified lifecycle is helpful to understand the value chain forming around zero-knowledge proofs.

The proof value chain

As with block construction in Ethereum, an infrastructure stack, and with it a value chain, is forming, with actors specialising at each step of the process.

The proof value chain

zkVMs: First is an ecosystem of programming languages and developer platforms for generating proofs. Because proofs are complex cryptographic tools, there's demand for a simpler developer experience. Developer platforms like RISC Zero or SP-1 are zkVMs: they make it easy for developers to generate ZKPs for general computation, removing the need to care about the low-level intricacies of circuits. Under the hood, they are essentially compilers from arbitrary code to provable circuits.

Prover markets: Applications can generate their own proofs, but they ultimately aim to use decentralised networks of provers for censorship resistance and to ensure the service can't go down (liveness). They can run the prover set themselves, like a blockchain does with its validator set, but that job is likely to be taken on by marketplaces of provers like Succinct Labs, Bonsai or Gevulot.

Proof aggregation: Proofs generated by the network need to be verified. This is currently done by a contract on a layer 1 or layer 2 blockchain. But it's expensive. The cheapest kind of proof (Groth16) costs $20-30 to verify on Ethereum (assuming $3,000 ETH and 30 Gwei). A STARK proof costs around $180.

It's a key bottleneck for proof usage, so an emerging class of solutions centers on bringing verification costs down. The main approach is proof aggregation: combine multiple proofs into one. That single proof demonstrates the validity of all the original proofs together, so the verification cost is split across all the rolled-up proofs.
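The economics of aggregation are simple division. A rough sketch, reusing the article's ~$25 Groth16 figure (illustrative only) and assuming the aggregate's on-chain verification cost stays roughly constant regardless of batch size:

```python
# Back-of-the-envelope amortisation: if one on-chain verification costs about
# the same whether it covers one proof or an aggregate of many, the per-proof
# cost falls roughly as 1/n. Figures are illustrative, not quoted prices.
def per_proof_cost(verification_cost: float, n_proofs: int,
                   aggregation_overhead: float = 0.0) -> float:
    """Cost each application pays when n proofs share one on-chain verification."""
    return (verification_cost + aggregation_overhead) / n_proofs

single = per_proof_cost(25.0, 1)     # $25.00 each, unaggregated
batched = per_proof_cost(25.0, 100)  # $0.25 each in a batch of 100
```

This is also why the layer scales with volume: a larger stream of proofs fills batches faster, which lowers both the amortised cost and the time each proof waits for its batch.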

Proof settlement: The proofs, aggregated or not, need to be verified on-chain to be used in smart contracts. This is currently done on L1 and L2, but we're excited by the prospect of dedicated layers that focus on bringing verification costs down, and enable interoperability between proof-generating applications.

Application: The final part of that value chain is the application where the user transacted, and paid for the service. This is where the flow of money starts, and it flows forward in the value chain.

Where's the money at?

Multiple points of the value chain have network effects and defensibility.

The application layer owns the users, and thus the 'proof order flow'. That's a cornered resource, especially in crypto where there are only so many users.

Next down the chain are developer platforms. They benefit from some lock-in of the application developers who use them. That also gives them power over the order flow: whoever they integrate with benefits from the stream of proofs they generate.

Proof markets have a strong network effect. Their job is to match proof requests to service providers who can compute them. Higher demand attracts more suppliers of compute resources for proofs, creating the typical marketplace virtuous cycle. The flywheel is compounded by economies of scale, as volume means higher utilisation rates and thus lower costs. We expect centralisation around that layer, and already see fierce competition.
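The matching job can be pictured with a toy model: requests carry a maximum fee, provers quote a price and have limited capacity, and the market assigns each request to the cheapest prover who can serve it. This is purely illustrative; real markets like Succinct, Bonsai or Gevulot use their own auction and assignment designs.

```python
# Toy prover-market matching: greedily serve the highest-fee requests first,
# assigning each to the cheapest prover with spare capacity. Illustrative only.
def match(requests, provers):
    """requests: [(request_id, max_fee)]; provers: [{'id', 'price', 'capacity'}]."""
    assignments = []
    for req_id, max_fee in sorted(requests, key=lambda r: -r[1]):
        candidates = [p for p in provers
                      if p["capacity"] > 0 and p["price"] <= max_fee]
        if not candidates:
            continue  # no prover can serve this request profitably
        best = min(candidates, key=lambda p: p["price"])
        best["capacity"] -= 1
        assignments.append((req_id, best["id"], best["price"]))
    return assignments
```

Even this toy version shows the flywheel: more requests make it worthwhile for more provers to join, and more provers mean fewer requests go unserved.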

Proof verification is the most nascent layer, but it also benefits from strong network effects. Aggregation scales with volume: the more proofs you have to aggregate, the lower the costs and the latency (because, ceteris paribus, assuming a fixed cost per proof, you complete batches faster).

Settlement also scales with volume: if proofs are all settled in one place, that layer can become a canonical source of truth for proofs, and thus a trustless interoperability layer.

Rising and sinking in the stack

We're expecting the value chain to verticalise, and we're already seeing companies increasing their feature offering to that effect, starting at one layer and looking to integrate up and down the stack.


RISC Zero started out as a zkVM, letting developers generate proofs for Rust and C++ code. They then built Bonsai, a prover market to generate proofs on behalf of their users. It's a great synergy: they own the 'proof order flow' and can direct it to their own marketplace. Compiling is not particularly differentiated or defensible, but as we've seen, the prover market is a compelling value-accrual layer.

Succinct Labs made the same move in the opposite direction. They started out as a prover market and went up the stack by creating SP-1, an open-source zkVM to compete with RISC Zero. I'm not privy to their decision-making, but it makes sense to commoditise the zkVM layer to ensure that no single actor like RISC Zero owns the proof order flow and circumvents your marketplace. They are also going down the stack with a recently added aggregation layer, to further benefit from their 'proof order flow'.

Destination chains like Polygon with a lot of unused blockspace are also looking to go higher up the stack, starting with aggregation.

Fin

In conclusion, the zero-knowledge proof value chain is emerging and already verticalising, with companies expanding their offerings across the chain to provide more comprehensive solutions and potentially capture more value.

Prover markets and aggregation/settlement seem to be the layers with the strongest network effects and long-term defensibility. This is where we're seeing the most competition at the moment.

But it's still early and important questions like how value transfer mechanisms will work are still open.

If you're thinking about these topics, write me.

Using a decision journal for more thoughtful decisions

Insight with hindsight

I am trying out using a decision journal.

Good decisions are the result of a good process, and a good process is an attempt at accurately representing our current state of knowledge and information.

Here's a simple process: for any meaningful decision, I write down my reasoning for making this decision.

The goal is to become more intentional about why and how I do things.

My current structure is the following:

My decision-making template in Obsidian

It's a forcing function to express the core assumptions I make, and scrutinise them.

Turning them into predictions is a good way to test them, and revisit the quality of my decisions later on.

Having that written down lets you assess whether your process for making the decision was sound. The outcome may not be good, but the process should be.

The feeling part is also important because it gives room for intuition to have a say. It might make me revisit my decision.

I apply this to all decisions: investments, how to spend an evening, purchases, trading, etc.

Open systems create emergent behaviours

A thesis for open social graphs

My goal in this piece is to develop a view on web3 social that justifies, beyond ideology, why a decentralised open graph can be superior to a constellation of closed graphs, and what reasons there are to believe there's a credible path to its adoption.

My thesis is that web3 social networks can enable better experiences and stronger network effects than existing social products. Their core advantage is incentivising an ecosystem of developers to build experiences that benefit users and the third-party applications building on top. Open systems attract developers and, in consequence, create emergent behaviours.

Open systems that support emergent behavior are way more likely to become platforms and we are excited by the possibilities of new consumer facing web platforms.

Fred Wilson, Twitter investment thesis in 2007

It's not a new idea - it was part of the investment thesis for Twitter. And it was spot on: their mobile client, a big driver of growth in the mobile age, was developed by a third-party developer. The word 'tweet' and the bird logo came from another third-party client that was so popular it became the network's brand. Key behaviours in the app, like threads, came from the experiments of hobbyists and tinkering users.

The promise with web3 is that the social protocols we'll use won't be able to close off the graph and API like all previous social networks eventually did. Twitterrific, the client that gave us the bird and the verb, lost its access to Twitter when the firm made an unannounced and undocumented policy change. The second promise is that value will flow fairly from the protocol to third-party application developers, supercharging the cycle that encourages more of them to tap into the user base and build companies.

This being said, Facebook's early traction is sobering for web3 social: within a few months it had reached millions of users. We can't say the same of web3 social. It's hard to point to unique and differentiating features with a massive pull on users. If we believe that an open ecosystem can create those unique experiences, we need to explore how social products built on open graphs can compete.


I see three vectors of significant user experience improvement, and some early signs of what they can look like:

  1. Open ecosystems of apps:

Guaranteeing platform access to developers increases the number of apps being built and the rate of innovation.

Social networks have often served as platforms for developers to build third-party apps (and sometimes clients), and this has often contributed to their success (Zynga representing 12+% of FB revenue in 2011; Twitter's mobile client).

Open data graphs guarantee third-party developers access to the data, which lowers the barriers to entry for new developers because a) they skip the cold-start problem of having no data/content/users, b) it reduces the friction for users to adopt the new app, since both users' profiles and social following are pre-loaded, and c) the apps are natively social and tailored to users' preferences.
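The cold-start advantage can be made concrete with a sketch: a brand-new app reads profile and follow data straight from a shared open graph instead of building its own user base. The graph structure and field names here are made up for illustration; real open graphs (e.g. Farcaster's) define their own schemas.

```python
# Toy open graph shared by all apps. Fields are hypothetical.
OPEN_GRAPH = {
    "alice": {"display_name": "Alice", "follows": ["bob", "carol"]},
    "bob": {"display_name": "Bob", "follows": ["alice"]},
}

def onboard(handle: str, graph: dict) -> dict:
    """A new app boots a user with profile and follow list pre-loaded,
    skipping the cold-start problem of an empty user database."""
    profile = graph[handle]
    return {
        "handle": handle,
        "display_name": profile["display_name"],
        "feed_sources": list(profile["follows"]),  # day-one personalised feed
    }
```

On day one, the app already knows who the user is and whose content to show them; a closed-graph competitor starts from zero on both counts.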

There are some early signs of this happening in the Farcaster ecosystem with:

While it is too early to call them hits, they show that a number of high caliber builders are committing months of full-time work to building the experience that might in turn attract more users. 

  2. Cross-context network effects:

The internet's 'one-app, one-identity' architecture leads to a fragmentation of data across apps, with little to no integration among them, and user lock-in.

An open graph creates the native possibility to opt-in to share data across contexts. In terms of UX, it could look like: 

  • Starting on a new app with a pre-loaded profile and follow list.

  • Seeing actions taken by the people you follow in a third-party app (e.g. on an exchange, or their restaurant ratings on Google Maps).

This increases the potential strength of the network effect of sharing a backend for a social graph: the number of nodes in the network scales much faster than in a setting with closed and split user databases.

We can expect a couple of large scale successes to exert a strong pull for other platforms to build on top of the same open graph. That would be a process similar to how Uniswap significantly improved the UX of using Ethereum, attracting more users, thus making it more attractive for developers to build on the platform.

  3. Granular curation:

The ad-based business model of social media dictates that engagement be the variable to optimise, often at the expense of quality content.

With the content and user graph in open storage, there's a proliferation of options to tailor content recommendation with a high level of granularity.

There are some early experiments in the Farcaster ecosystem that either focus on certain types of content (Alphacaster for DAO-related content) or give users the flexibility to build and share custom feeds, like Jam or Discove.

Broader possibilities could include: restricting content to verified humans, surfacing niche content shared within a subpart of the graph, and creating new types of content (from the graph's metadata).

These vectors open the possibility for users and developers to build apps and experiences, or surface knowledge, in ways that closed platforms don't allow.


What exactly this 'social layer' will look like is hard to predict but here are some traits I think we can expect:

"The last handle you'll create":

  • In its steady state, it could be the canonical online identity layer - the address that represents all your connections to other profiles and authorship over any content.

  • A good example of what that might look like is ENS names, which most crypto users already use as their global ID.

  • That profile controlled by a private key would serve as the primary identifier pointing to a number of alternative context-specific profiles.

  • Serving as a single sign-on would immediately make apps aware of all that existing data (if users want to share it).

  • The big mental shift here is that rather than apps holding all the context, profiles are the ultimate container of data and bring it to the dApps.

  • Some UX possibilities to visualise: connecting to an app with tailored recommendations based on other apps you've used or your following; anonymous contributions with provable reputation (anonymous reply with proof of number of followers)


A web 2 to web 3 continuum

  • I find it unhelpful to oppose web 2 and web 3, especially in the context of social - it's a continuum based on how much is stored on decentralised infrastructure.

  • The experience is likely to be opt-in interoperability - users are presented with the option to share as much or as little context with the apps they use.

  • Some UX possibilities to visualise: connecting to GMaps and toggling "see ratings from my circle"; using a web3 key to sign and timestamp publications on web2 social platforms for authenticity; storing content on decentralised storage solutions.
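The "sign and timestamp for authenticity" idea can be sketched with a hash commitment: publish a digest of the content (e.g. on-chain) at time t, and anyone holding the content can later check it matches. This is a deliberate simplification; a real scheme would use an asymmetric signature from the user's key pair, but plain hashing keeps the sketch self-contained.

```python
# Toy authenticity commitment for a web2 post. A real implementation would
# sign with the user's private key; sha256 here is only a stand-in to show
# the publish-then-verify shape.
import hashlib

def commit(content: str, author: str, timestamp: int) -> str:
    """Digest a user could publish (e.g. on-chain) alongside their handle."""
    return hashlib.sha256(f"{author}|{timestamp}|{content}".encode()).hexdigest()

def check(content: str, author: str, timestamp: int, commitment: str) -> bool:
    """Anyone can verify the content matches the published commitment."""
    return commit(content, author, timestamp) == commitment
```

The web2 platform never has to cooperate: the commitment lives elsewhere, so the claim "this author posted this content by this time" survives even if the platform edits or deletes the post.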


Immutable APIs

  • Developers' incentive to build on top of social networks is that, out of the box, you get a) rich data about users for personalisation, b) an existing user base, c) a network of existing integrations across platforms, d) ready-made social features like follows and comments, and e) a monetisation model.

  • That explains why building on top of Facebook or any other social network was great, until it wasn't.

  • At some point, it became in the economic interest of platforms to close access, and so they did.

  • What's changed is that blockchains can be viewed as computers that can make commitments, commitments of the sort: "here is irrevocable access to the social graph."
