Original author: @Web3 Mario
Introduction: EigenLayer AVSs have been live for a while now. Beyond the officially promoted use cases such as EigenDA and Layer 2 integrations, I noticed an interesting phenomenon: EigenLayer AVS seems to hold particular appeal for projects in the privacy computing track. Of the nine AVSs launched so far, three belong to this track: the ZK coprocessor projects Brevis and Lagrange, and the trusted execution environment project Automata. I therefore decided to investigate in detail and explore what EigenLayer AVS means for these products and where the trend is heading.
The appeal of “cheap security” is the key to the success or failure of the EigenLayer AVS ecosystem
With TVL officially exceeding $15 billion, EigenLayer has had a remarkable start. Of course, I suspect most of these funds are chasing potential airdrop income, but this undoubtedly lays a solid foundation for EigenLayer's next stage. That stage hinges on the success or failure of the AVS ecosystem, because the scale of AVS fee income determines when EigenLayer can transition from the subsidy period to the mature period.
There are already many articles covering EigenLayer's technical details, so I will not repeat them here. Simply put, EigenLayer creates a cheap consensus-layer protocol by reusing the consensus capability of Ethereum PoS, a mechanism known as restaking. First, I would like to discuss EigenLayer's core value, which in my opinion has three aspects:
* Decoupling the consensus layer from the execution layer, so it can better handle large-scale or high-cost data processing and consensus : Mainstream blockchain protocols are generally considered to have high execution costs and low execution efficiency. The high cost comes from competition for block space: blockchain execution environments typically use a market mechanism to allocate node computing resources, so whoever bids higher gets executed first, and users are in competition with one another. When demand rises, the clearing price rises, and execution costs inevitably rise with it. The low efficiency stems from the fact that blockchains were originally designed as electronic-cash settlement systems, where transaction processing is time-sensitive; the execution layer therefore had to be designed serially, which makes it inefficient for scenarios that are not time-sensitive, such as social networks or AI training.
Decoupling the consensus layer from the execution layer allows application developers to design a dedicated execution environment, often referred to as an application chain or Layer 3. Its users then no longer compete with the users of other applications, which reduces usage costs; meanwhile, developers can build an execution layer tailored to their application scenario and improve execution efficiency.
* Consensus as a service, productizing consensus to fully explore latent market demand : Anyone who lived through the "hundred schools of thought" era of Layer 1s will sigh in agreement: bootstrapping a consensus layer is usually expensive and difficult. To maintain their own security guarantees, whether via hashpower or staked funds, protocols must subsidize participants before generating sufficient revenue, typically with token emissions from mining, and that subsidy is not cheap. Only a few successful protocols manage the transition to sustaining sufficient consensus capacity out of their own fee income, the transformation of Ethereum's economic model being one example. This high startup cost discourages many innovative applications: building an execution environment suited to their own needs, or an application chain of their own, is too expensive and too risky. This makes the Matthew effect in Web3 very pronounced, and the evolution of Web3 technical solutions has largely been absorbed into Ethereum's technical roadmap.
Turning consensus into a service or product gives innovative applications another option: purchasing consensus services on demand. For example, suppose an innovative application holds $1,000,000 of user funds in its early stage. As long as it purchases more than $1,000,000 worth of PoS consensus, the security of its execution environment is guaranteed, because an attacker would lose more stake than they could gain, making the attack unprofitable. As the application grows, consensus can be purchased flexibly and incrementally. This lowers the startup cost for innovative applications, reduces their risk, and helps unlock the market's full potential.
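This economic-security argument can be sketched as a back-of-the-envelope check (the numbers and function name are illustrative, not from any actual protocol):

```python
# Illustrative sketch of "purchased" economic security. Assumption: an attack
# is unprofitable when the stake that would be slashed exceeds the value an
# attacker could extract from the application.

def attack_is_unprofitable(purchased_stake_usd: float, value_at_risk_usd: float) -> bool:
    """True if slashing the purchased PoS stake costs more than the attack could gain."""
    return purchased_stake_usd > value_at_risk_usd

# An early-stage app holding $1,000,000 of user funds buys slightly more
# than $1,000,000 of restaked security:
print(attack_is_unprofitable(1_100_000, 1_000_000))  # True: an attacker nets a loss
# As the app's TVL grows, more security must be purchased incrementally:
print(attack_is_unprofitable(1_100_000, 2_000_000))  # False: time to buy more
```

The point is that security can now be scaled up in step with the value being protected, instead of being a large fixed upfront cost.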
* A cheap source of consensus : Finally, EigenLayer sources its consensus by reusing Ethereum's PoS stake, which means PoS stakers who could previously capture only one layer of yield can earn an extra layer by participating in EigenLayer. This cleverly turns EigenLayer's relationship with the industry leader Ethereum from competitive to symbiotic, lowering its cost of attracting consensus funds. It also gives EigenLayer a pricing advantage over other protocols, for example in the consensus purchase fees charged to AVS protocols, making it more attractive to innovative applications. A genuinely smart move.
These three points let EigenLayer offer Web3 applications a cheaper source of security than other Web3 execution environments, with lower execution costs, better scalability, and a more flexible business model. The key to an active EigenLayer AVS ecosystem, then, is whether this cheap security can win over Web3 applications and draw them into the ecosystem in large numbers.
The cost of use is the fundamental constraint on the Web3 privacy computing track
Having discussed EigenLayer's core value, let's look at the dilemma facing the Web3 privacy computing track. I am not an expert in the field, so I focused on the privacy-computing projects among the AVSs already launched: the so-called ZK coprocessors. I believe most cryptographic products built on zero-knowledge proof algorithms face the same dilemma: the high cost of use blocks the expansion of usage scenarios.
Where the term "ZK coprocessor" originated matters little. As the name suggests, products in this track use zero-knowledge proof algorithms to provide coprocessor services for mainstream blockchain systems, offloading complex and expensive computations off-chain while guaranteeing the correctness of the execution results with zero-knowledge proofs. The classic analogy for this modular idea is the relationship between the CPU and the GPU: by handing parallel workloads the CPU architecture is poor at, such as image processing and AI training, to an independent module, the GPU, overall execution efficiency improves.
The technical architecture of a classic ZK coprocessor project looks roughly as follows, taking the simplified architecture of Axiom, one of the leaders in this field. Simply put, when a user needs a complex calculation, Axiom's off-chain service computes the result and generates the corresponding ZK proof. Axiom then calls its on-chain verification contract with the result and proof as parameters. The contract checks the correctness of the result using three pieces of data: the execution result, the execution proof, and the key chain state that Axiom maintains on-chain, such as transaction Merkle roots (the process of maintaining this chain state is itself trustless). Once verification passes, the result is delivered to the target contract via a callback function to trigger subsequent operations.
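The verify-and-callback flow described above can be sketched schematically. This is a minimal Python sketch of the pattern; every name here (`ComputeResponse`, `VerifierContract`, `on_result`, the placeholder proof check) is my own illustrative assumption, not Axiom's actual interface:

```python
from dataclasses import dataclass

@dataclass
class ComputeResponse:
    result: bytes      # off-chain computation result
    zk_proof: bytes    # proof that `result` was computed correctly
    block_number: int  # block whose state the computation read from

class TargetContract:
    """Stand-in for the application contract that receives the callback."""
    def __init__(self):
        self.received = []
    def on_result(self, result: bytes):
        self.received.append(result)

class VerifierContract:
    def __init__(self, trusted_block_roots: dict):
        # Trustlessly maintained cache of historical block roots
        self.trusted_block_roots = trusted_block_roots

    def verify_and_callback(self, resp: ComputeResponse, target: TargetContract) -> bool:
        root = self.trusted_block_roots.get(resp.block_number)
        if root is None:
            return False  # no trusted root for that block
        if not self._verify_proof(resp.zk_proof, resp.result, root):
            return False  # proof rejected, no callback fires
        target.on_result(resp.result)  # notify the target contract
        return True

    def _verify_proof(self, proof: bytes, result: bytes, root: bytes) -> bool:
        # Placeholder for the gas-heavy on-chain SNARK verification step.
        return proof == b"proof:" + result + root

verifier = VerifierContract({19_000_000: b"root"})
target = TargetContract()
resp = ComputeResponse(result=b"42", zk_proof=b"proof:42root", block_number=19_000_000)
print(verifier.verify_and_callback(resp, target))  # True; target.received == [b"42"]
```

Note that the expensive step in the real system is `_verify_proof`: that single call is what the gas estimate in the next paragraph prices.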
It is generally accepted that proof generation is computationally intensive while proof verification is relatively light. From Axiom's documentation, we know that a single on-chain ZK proof verification costs approximately 420,000 gas. At a gas price of 10 Gwei, the user pays 0.0042 ETH for verification; assuming an ETH price of $3,000, that is about $12.6. Such a cost is still too high for ordinary retail users, which greatly limits the potential use scenarios of these products.
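The cost estimate above is simple unit arithmetic, reproduced here so the figures can be checked (gas figure per the article's reading of Axiom's documentation; gas price and ETH price are the article's assumptions):

```python
# Reproducing the article's verification-cost estimate.
GAS_PER_VERIFICATION = 420_000   # approximate on-chain ZK verification gas
GAS_PRICE_GWEI = 10              # assumed gas price
ETH_USD = 3_000                  # assumed ETH market price
GWEI_PER_ETH = 1_000_000_000

cost_eth = GAS_PER_VERIFICATION * GAS_PRICE_GWEI / GWEI_PER_ETH  # Gwei -> ETH
cost_usd = cost_eth * ETH_USD
print(round(cost_eth, 6), round(cost_usd, 2))  # 0.0042 12.6
```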
Take the Uniswap VIP program, a scenario often promoted by ZK coprocessor projects. Uniswap could use a ZK coprocessor to run a CEX-style loyalty program for its traders: when a trader's cumulative trading volume over a recent period reaches a certain level, the protocol rebates or reduces their fees. Since computing cumulative trading volume is a complex operation, Uniswap could use a ZK coprocessor to offload the calculation off-chain, cutting computation costs while avoiding large-scale changes to the on-chain protocol.
Let's do a simple calculation. Suppose Uniswap runs a VIP program in which anyone who can prove their cumulative trading volume over the past month exceeds $1,000,000 trades fee-free, and a trader uses Uniswap's 0.01% fee pool. For a single trade of $100,000, the fee is $10, but the verification cost is about $12.6. This undermines the user's motivation to participate and raises the threshold for joining the program; in the end, only whales benefit.
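The whale-only dynamic can be made concrete with a break-even check (the ~$12.6 verification cost comes from the article's estimate; the assumption that one proof covers one rebate claim is mine):

```python
# When does the on-chain verification cost wipe out the fee rebate?
FEE_RATE = 0.0001        # Uniswap 0.01% fee tier
VERIFY_COST_USD = 12.6   # on-chain ZK proof verification (article's estimate)

def rebate_worth_claiming(trade_volume_usd: float) -> bool:
    """True only if the rebated fee exceeds the cost of proving eligibility."""
    fee = trade_volume_usd * FEE_RATE
    return fee > VERIFY_COST_USD

print(rebate_worth_claiming(100_000))  # False: $10 fee < ~$12.6 verification
print(rebate_worth_claiming(200_000))  # True: $20 fee > ~$12.6 verification
```

Under these assumptions the break-even trade size is $126,000 per proof, so only large traders come out ahead.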
Similar cases are not hard to find among other pure-ZK products. The use cases and technical architectures are excellent, but in my view the cost of use is the core constraint preventing these products from expanding their usage scenarios.
Brevis's transformation shows the siphon effect of EigenLayer's cheap security on related products
So let's look at how Brevis, one of the first AVSs, was affected by EigenLayer. I hope this illustrates that EigenLayer's "cheap security" holds obvious appeal for related cryptographic products.
Brevis itself is positioned as a ZK coprocessor. When it first launched in early 2023, it was billed as a "full-chain data computing and verification platform", which is essentially no different from a ZK coprocessor, just a cooler-sounding name. For a long time, Brevis operated on the pure-ZK approach described above, which made it hard to expand its use scenarios. Then, in a blog post on April 11, it announced a collaboration with EigenLayer and a new cryptoeconomics + ZK proof solution, Brevis coChain. In this solution, the verification layer is moved from the Ethereum mainnet to a coChain maintained by an AVS.
When a user has a computing need, the client circuit computes the result, generates the corresponding ZK proof, and sends a computing request to Brevis coChain through an on-chain smart contract. Upon observing the request, the AVS verifies the correctness of the calculation; once it passes, the relevant data is packaged, compressed, and sent to the Ethereum mainnet, where the correctness of the result is asserted optimistically. As in other optimistic verification schemes, a challenge period follows, during which a challenger can submit a ZK fraud proof to dispute a result and have the wrongdoer's stake slashed. After the challenge period, the AVS invokes the target contract's callback through the on-chain contract to complete subsequent operations. Given that most privacy-computing discussions revolve around achieving trust through mathematics, I would call this scheme "optimistic trustlessness".
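The assert/challenge/finalize lifecycle described above can be sketched as follows. This is a schematic sketch only; the class, method names, and challenge-period parameter are my illustrative assumptions, not Brevis's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Assertion:
    result: bytes
    asserted_at_block: int
    challenged: bool = False

class OptimisticCoprocessor:
    def __init__(self, challenge_period_blocks: int):
        self.challenge_period = challenge_period_blocks
        self.assertions = {}  # request_id -> Assertion

    def assert_result(self, request_id: int, result: bytes, block: int) -> None:
        """AVS posts a result optimistically; no on-chain ZK verification here."""
        self.assertions[request_id] = Assertion(result, block)

    def challenge(self, request_id: int, fraud_proof_is_valid: bool) -> None:
        """During the challenge period, anyone may submit a ZK fraud proof.
        A valid proof voids the assertion (and would slash the asserter's stake)."""
        if fraud_proof_is_valid:
            self.assertions[request_id].challenged = True

    def finalize(self, request_id: int, current_block: int) -> Optional[bytes]:
        """After an unchallenged challenge period, the result is final and the
        target contract's callback can be invoked with it."""
        a = self.assertions[request_id]
        if a.challenged or current_block - a.asserted_at_block < self.challenge_period:
            return None
        return a.result

cochain = OptimisticCoprocessor(challenge_period_blocks=50)
cochain.assert_result(1, b"result", block=100)
print(cochain.finalize(1, current_block=120))  # None: still in the challenge window
print(cochain.finalize(1, current_block=151))  # b'result': finalized unchallenged
```

The cost saving is visible in the happy path: `finalize` involves no proof verification at all; the expensive ZK machinery is only ever invoked inside `challenge`, in the rare dispute case.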
Lagrange and Automata presumably went through the same journey before likewise launching optimistic trustless solutions on AVS. The advantage of this approach is that it drastically reduces verification costs: obtaining a correct result no longer requires expensive on-chain proof verification; instead, one optimistically trusts the results processed by EigenLayer's consensus layer, backed by the security of ZK fraud proofs. Of course, shifting from trust in mathematics toward trust in economic incentives will face some pushback in Web3, but I think it is an acceptable trade for the practicality it brings. Moreover, this solution effectively removes verification cost as a constraint on expanding usage scenarios, and I expect many more interesting products to launch soon.
This solution also sets an example for other privacy-computing products. Since this track is still a blue ocean, the new paradigm should spread more easily here than in the fiercely competitive rollup track, and I believe the AVS ecosystem's first breakout will come from privacy computing. As I am not a cryptography expert, omissions are inevitable in this writing, and I welcome corrections from those who are.