State of Crypto: Trends, data, more
Sonal: Welcome to web3 with a16z crypto. I’m your host and Editor in Chief here at a16z crypto, Sonal Chokshi, but today I invited Robert Hackett, Features Editor and Head of Special Projects on our team, to guest host this episode, which is all about our latest State of Crypto Report. This year we also introduce a new interactive tool, the State of Crypto Index, which helps visualize web3 and tech progress towards building the next internet, which is the theme of the show. You can find the report, the index, and the accompanying posts and resources at a16zcrypto.com/stateofcrypto or a16zcrypto.com/soc. The report was coauthored and produced by Daren Matsuoka, Eddy Lazzarin, Robert Hackett, and Stephanie Zinn.
Before we begin, none of the following is investment, business, legal or tax advice. See a16z.com/disclosures for more important information, including a link to a list of our investments. Also please note that any charts, data, or projections discussed are subject to change without notice, may differ from opinions expressed by others and are for informational purposes only. They should not be relied upon when making any investment decision. The content also speaks only as of the date indicated and a16z has not independently verified third party sources nor makes representations about the enduring accuracy of the information.
Okay, with that, now here’s Robert to introduce the guests and the episode.
Robert: Thanks, Sonal. Hi, all. Thanks for joining us. I’m here with Eddy Lazzarin, chief technology officer at a16z crypto; Daren Matsuoka, lead data scientist; and Chris Dixon, founding general partner of a16z crypto. The first half of the conversation digs into the findings and also touches on themes such as infrastructure, NFTs, gaming, the creator economy, energy, zero knowledge, and more. That’s from our recent Twitter Spaces. The second half of the episode was recorded separately so we could go deeper into the methodology, the index, and the bigger picture. The first voice you’ll hear after mine is Chris, followed by Eddy, then Daren.
Robert: Welcome, everybody.
Eddy: Hey.
Chris: Hello.
Robert: So, we’ve seen lots of people tweeting various insights and takeaways from the report but I want to know what your top takeaways are. What is the single most interesting or surprising statistic or trend you came across this year? Chris, you wanna take this one?
Chris: I think the key message to take away is a lot of really interesting work and progress made on the infrastructure side…particularly around the Ethereum ecosystem and Layer 2s [L2s]. A lot of the basic framework came from work we did a couple of years ago where we tried to separate out the fundamentals. The way we kinda think about it is that all technology markets progress in different ways, and there’s a lot of focus on the financial markets. That’s true of crypto and non-crypto, and of other kinds of tech too. And then there’s the fundamental progress of the technology, of the user-application infrastructure.
And so, the core idea with this report – which we’ve been doing now for a couple of years – is to really look at the fundamentals and to take what appears to be a chaotic process and try to understand it and see the overall logic. And so, you know, things like the number of developers, the number of applications, the progress of infrastructure. In a lot of computing waves…a key metric is price-performance. There’s Moore’s Law in semiconductors and an equivalent in the blockchain space would be kind of the costs, gas costs or blockspace costs for secure, high-quality transactions.
There’s a lot of data in there. I won’t go through it all. I’ll let Daren and Eddy talk about some of the details…which I think show nice progress on a lot of dimensions.
Robert: That’s great. So, you’re seeing a lot of improvement when it comes to price and performance in web3, particularly with respect to how blockchains perform. Eddy, it looks like you wanna jump in.
Eddy: There’s a ton of things I could pick. Obviously, I’m a little biased. I like a lot of the statistics in the report. I could definitely choose many trends. Maybe one to highlight is NFT buyers. One that’s really neat is that the number of NFT buyers has decreased since the recent highs in the last year, but they’re down a lot less than I think you’d expect and down much less than volume, which indicates that maybe there are some new interesting patterns that we’ve started to see unfolding in terms of how people buy them and what exactly they do with them. We’ve seen a little bit of a trend…this is a little more anecdotal, but around more patronage-style models where people are purchasing NFTs in primary sales and using them as a way to support a creator directly with basically minimal to zero intermediaries.
I think it’s a really cool pattern. It’s a little subtle in the data. Another one I’d point out, if I can do two, is that ZK [zero knowledge] slide we have, where we show incredible advancements in proving speeds, proof sizes, and verification speeds. These all make it a lot easier to experiment with and incorporate two key benefits of ZK technology: scaling through succinctness, and privacy. It’ll just open up a lot of space for people to start to experiment, since these techniques are finally becoming economical and fast.
Robert: We got a lot of feedback on that zero-knowledge slide so we’re definitely gonna spend some time on that. But I want to linger on something you raised at the outset about the number of NFT buyers, which has decreased since the highs in early 2022 but has jumped up in recent months and appears to be starting something of a rebound. I wanna tease that apart a little bit. Eddy, what’s going on there? Why this difference all of a sudden?
Eddy: Yeah, well, the key point…and this is a theme of the whole report, right, is that if you zoom out, despite the volatility, as Chris was alluding to, there seems to be an underlying order in the product cycle, but also an underlying trend, maybe a smoother growth trend. If you squint a little bit: when these technologies come in, they pop, there’s chaos, and then adoption comes slowly after. The NFT buyers slide demonstrates this really clearly. There’s basically no activity in 2020 and it only begins in early 2021. Where we’ve stabilized now is still radically above where we were. So that’s kind of the big-picture trend.
Where the excitement is coming in…more recently, Manifold open editions and a variety of other projects have been making it really easy for creators to experiment, not just with new features and new ways that NFTs can be incorporated into projects, but also with the process of minting them and distributing them. We’ve seen a lot of contention on the royalty side and it’s a fascinating other topic that we should probably unpack some other time. But despite the royalty fees, primary sales and direct relationships with consumers are being experimented with. And that’s great. That’s exactly what we want to see.
Robert: Just for people who don’t know, you referenced open editions. Maybe you could just elaborate on that.
Eddy: I’ll let Daren say a little bit more on open editions but it’s a way to create mints that often end up being, like, a little cheaper, a little more in the moment and related directly to current efforts by a creator. Daren, do you have any thoughts about open editions?
Daren: Yeah, I’d say the big point to make is that it creates a much more accessible experience for people compared to limited editions which often…you know, the best ones just increase in price to a point where it’s pricing out 99% of potential buyers. And so, it’s a cool way to bring more people into the space and drive those organic use cases as opposed to the speculative ones.
Robert: That’s interesting, yeah. So, it sounds like maybe instead of the big-ticket item, the high price, one-off sale NFT, it’s now giving way to something a little bit more open and accessible to broader audiences?
Daren: Yeah. And look, I’d also tie this in with what’s happening in L2s because it’s not a coincidence. If transaction fees are very high, which was absolutely the norm throughout a lot of the late 2021, early 2022 period, you’re not gonna be able to do much with very low-priced NFTs or things that experiment with all kinds of new features that maybe mutate state on-chain, which is expensive. If gas fees come down and there’s more high-quality blockspace at a low price, then you can have cheaper NFTs. You don’t have to justify the expense of a transaction by arguing that it’s an investment or some speculative behavior. Instead, you can do all the cool kinds of things, all the computational behaviors where NFTs are their own programs and have interesting interactions with other kinds of online objects. That’s a really cool behavior and it requires very low transaction fees, which require scaling, which we’re seeing in L2s.
So, it all kinda comes together in my mind.
Robert: L2s is a huge, meaty topic that we’re going to also get into. But before we do that, I gotta say, one thing that really stuck out to me in this report is the fact that active addresses are now at an all-time high. There are more than 15 million monthly addresses making on-chain transactions. That’s a pretty surprising statistic, especially when you couple it with a decline in mobile wallet users. It’s very odd that on the one hand, you have, you know, active addresses hitting an all-time high and yet, at the same time, this big decrease in mobile wallet users over the months from the peaks that we saw in early 2022. What’s going on there? A lot of people were curious about this. Why are we seeing that odd trend in the data?
Eddy: Yeah, that’s a great one. I’ll let Daren weigh in on this too. I admit, we’d have to go another level deeper and probably find additional sources for underlying metrics to really refine the story about why this is, but personally, I use a desktop wallet and I don’t use a mobile wallet in my day to day. That might change soon. I’ve been seeing some really great products launch, like Uniswap’s wallet very recently. That’s exciting to see. But it’s hard to say. I don’t know. I can speculate more about how app metrics can be a little imprecise, the improvements in desktop wallet software, and the norm that has been set in crypto for some time about self-custodying, taking your wallet experience very seriously, and keeping it somewhere more secure…I’d have to speculate. Do you have any thoughts, Daren?
Daren: Yeah, I guess the only thing I would add is we’re seeing more applications that are embedding the wallet experience into the app itself, which means they’re not requiring users to bring their own mobile wallet but instead are facilitating the transactions on behalf of users in an attempt to really make it a better user experience that can attract a more mainstream audience. That’s certainly one trend that we’re looking at.
Eddy mentioned the Uniswap wallet. I’m also very curious next month to see what the numbers look like with that new one included because that’s something we can certainly do. We’ll continue to update the data. We’ll continue to look at new players. It’s clear that the mobile wallet experience isn’t fully solved yet. And so, we’re constantly looking for new entrants and builders that are trying to solve some of these hard problems. But to Eddy’s point, we will have to do a deeper dive on some of the specifics there.
Robert: That’s a great point. Just to continue to tease that apart, another statistic in the report is that web3 games generate 23 times more on-chain transactions than DeFi [decentralized finance]. And there have been a lot of web3 games that have come out in the past year. I mean, more than 700. So, I wonder how that trend is affecting what’s going on in the industry right now. It seems like we’re seeing this big uptick in gaming, in on-chain gaming. What’s going on there?
Eddy: Yeah. First…the interaction model of a game is so different from a financial interaction model, right? With your bank or your other financial services, I don’t know that people use them hundreds of times, thousands of times in a session, so to speak, right? You probably issue a couple of transactions a day to credit cards and other things like that. Some days are a little busier than others. But in a game, if you’re mutating a lot of state, you might do thousands of things. So, it makes sense that games would generally be more computationally intensive and engaging in their use. Of course, that means we need better scaling. That means we need lower transaction fees.
And it’s worth pointing out that a lot of these game transactions, when you go underneath the hood, they’re happening on their own network. They’re happening on the scaling solutions. We’ve been seeing many more games in the last year pop up on L2s and move between them. So that’s no surprise at all and maybe to round it out…in the history of technology, gaming tends to push the envelope in all dimensions, right? Demands for better hardware drove the development of GPUs. All kinds of other hardware advancements like laser mice and stuff like that were pushed to some extent by games. And that’s also true with software. You know, broadband with low latency internet connections, all these types of things. Gamers have intense demands and are very open to experimentation as long as it creates great and novel game experiences.
So, I’m a big gamer and I’m pretty excited to see how it unfolds there. It’s just not a huge surprise to me given the amount of interesting economic and composability opportunities there are when you bring crypto into a gaming environment.
Robert: Yeah, what you said there…it called to mind that meme of that guy with a domino and he’s got the one small domino and the big domino on the end. When it comes to the way that…
Eddy: Totally.
Robert: The small domino would be, like, Nvidia’s GPUs helping gamers and then at the other end you have, like, general artificial intelligence.
Eddy: Yeah, yeah, exactly. At the bottom is, like, John Carmack fooling around with making a 2D game look 3D with Doom and then at the top you have, like, John Carmack making the AGI.
Robert: A fun way to look at the timeline, for sure. Eddy, you said a phrase in there while you were talking about on-chain games. You said “mutating state”. While that’s fine for the computer scientists in the room, I feel like for the more consumer-oriented gamers out there…what does that look like in on-chain gaming? First of all, what does that mean? Like, what can you do in an on-chain game that you can’t do in another type of game?
Eddy: Yeah, it’s a great question…so by mutating state, I just mean changing around things, like, changing around data, moving around data, manipulating data. Of course, every time someone plays a game, there’s a lot of data shuffling around. There’s graphics data, there’s network data. If you’re playing a multiplayer game, someone has to run a server and that server is managing the mutations of a lot of data that has to do with the state of the game and what’s happening in the game. An interesting concept. And we actually have a podcast on this with Arianna and me and Sonal where we talk a little bit about web3 games, on-chain games. So, I encourage you to listen to that if this topic is interesting to you.
But the basic idea is that when you run a multiplayer game server now, you have to have that run by somebody. And there’s a little bit…of course, it’s not a real big deal in some senses, but you have to trust them to run that server. And that’s no problem. You know, game companies want people to have a great time playing games. But what’s interesting is that if you can have that state managed in an autonomous way, like, by no one in particular or maybe just governed by specific rules that are set in place at the beginning of that game or the genesis of that game universe, that becomes really interesting because then you can have all kinds of different people build things on top of that game world or that game state and extend it and remix it and recombine it and take the objects and have them compose with each other in ways that make modding and different kinds of game changes a first class part of the game.
And you don’t have to rely on the game creator to moderate it. Instead, you can have the whole game governed by rules that are set in place at the beginning which means anybody can change the game but they can’t make it unfair or they can’t cheat or they can’t, you know, shut down and blow up the game for everybody else. So to sum it up in brief, there are interesting things made possible when no one in particular controls the game server that you couldn’t do if only a specific person is the one responsible for running it.
Chris: Well, to that point, Eddy, I think to me the most exciting part of that architecture is the composability it enables. Once you have a trustless server, you have an incentive for third party developers to build on top of it and you have this sort of bottom-up creativity where people can build on…like the way people build a city or something where somebody builds a core and then people build around it. You can imagine something that starts off as a small game evolving into a much larger world and eventually into a set of worlds. To me, it all starts with the trust model of having nobody running the back end of the game.
Eddy: Yep.
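[Editor’s note: for readers who want a concrete picture of what Eddy and Chris mean by game state that’s “governed by rules set in place at the beginning,” here is a minimal, purely illustrative Python sketch. It is not on-chain code and not drawn from any project mentioned in this episode; the rule names and numbers are hypothetical. The point is only that once the rules are fixed at genesis, anyone can submit moves or build on top of the state, but no operator can bend the rules afterward.]

    # Toy sketch: a game world whose rules are fixed at "genesis."
    # Anyone can mutate the state, but only in ways the rules allow.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Rules:
        """Immutable rules agreed when the game world is created."""
        max_move_distance: int = 1
        board_size: int = 10

    @dataclass
    class GameState:
        rules: Rules
        positions: dict = field(default_factory=dict)  # player -> (x, y)

    def apply_move(state: GameState, player: str, dx: int, dy: int) -> None:
        """Mutate state only if the move respects the genesis rules."""
        if abs(dx) > state.rules.max_move_distance or abs(dy) > state.rules.max_move_distance:
            raise ValueError("move violates the rules set at genesis")
        x, y = state.positions.get(player, (0, 0))
        state.positions[player] = ((x + dx) % state.rules.board_size,
                                   (y + dy) % state.rules.board_size)

    world = GameState(rules=Rules())      # no one can change Rules after this point
    apply_move(world, "alice", 1, 0)
    apply_move(world, "bob", 0, 1)
    print(world.positions)                # {'alice': (1, 0), 'bob': (0, 1)}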
Robert: That’s a really good point. And actually, it segues nicely into a question we received. Do you feel there’s enough community involvement in the steering and development of the industry? Given what you said about turning more network participants into owners and operators of this stuff, is there enough community involvement in web3 from what we’re seeing?
Chris: Kinda interesting question. I admit I’m not 100% sure how to parse it because, maybe I’m being a little silly. I feel like crypto is obsessed with the community, uniquely, in a fascinating way. Sometimes in a way that’s silly but often in a way that’s incredibly earnest and serious and material because one of the underlying core ideas to me is that you can hand control of a project or software or a network or system to a community and, like, authentically hand them control in a way that can’t be done in a web2 world or… without legal agreements. So, are they involved enough? I mean, I think I’d have to go project by project to say but I think crypto rightly is incredibly preoccupied with the way that communities can become involved and incentivized and owners and collaborators all with each other to develop the future protocols of the internet.
Daren: Maybe just one data point that we can add from our report is if you look at participation in DAO governance as measured by the number of monthly active voters on Snapshot, which is the most popular platform for DAO voting and proposals, that number has pretty consistently grown even during the bear market, which I was surprised to see. You know, 13 million total votes have been cast. Almost two million unique voters, 78,000 proposals. So, the stats around community-driven governance with these DAOs have been growing nicely.
Robert: Yeah, that’s great to point out. It is pretty interesting to see that going up. It is really interesting to see some of these maintain an upward slope generally, despite whatever prices and the financial markets are doing. Eddy, at the top you mentioned scaling blockchains and that’s obviously a huge area. And in particular it has to do with what we were just talking about: increasing community participation. Because if you can scale blockchains, more people can get involved at lower price points, and there are fewer obstacles to participation. What did we find there?
Eddy: In terms of scale?
Robert: Yeah. How is the challenge of scaling blockchains coming along?
Eddy: Yeah. It’s a great one. This is one of my favorite graphs. It shows the increase in blockspace on Ethereum L1 being consumed by L2s. And the graph shows that 7% of Ethereum fees are now paid by L2 rollups, and there’s a whole bunch in that category. We published this last year and updated it again this year. That has been a pretty steady and healthy trend upward. That to me is the strongest direct evidence that Ethereum is scaling, and we expect that number to keep going up, right, because as L2 technology improves, as the marginal cost of transactions goes down, there are more things you can do that are worthwhile, more ways that you can experiment.
There’s a variety of technological choices represented in that set. We’ve got optimistic rollups, we have ZK rollups. It’s a variety of different approaches. So, I think it’s going really well. I think this year is gonna be a year of a lot of discussion around L2s, the tradeoffs between them, developer tools around using them, user experience improvements so that users can abstract away or reason about what exactly they’re using. It’s headed in the right direction. Of course, after the recent upgrade, it sounds like the next big Ethereum technological upgrade is likely to focus at least in part on tech to help scale these L2s even further. So, I’m excited to see that. I’m personally in the camp that radically decreasing the cost of high-quality blockspace is a key ingredient in allowing for the experimentation that people need to do to discover the new types of products and things that can really go mainstream. So, I’m excited for that.
Robert: I would like to repeat that stat you mentioned because it is kind of mind-blowing. Two years ago…we didn’t have a report two years ago, but if you look back retroactively at 2021, the fees being paid by L2s on Ethereum were effectively 0%. Then last year, the first time we put out this report, it was about 1.5%. And now it’s up to 7%. So L2s are paying 7% of all Ethereum fees right now. That’s a big jump. And a lot of that actually happened in the past month. It was 4.5% a month ago and now it’s 7%. How are we expecting these numbers to change moving forward through the year?
Eddy: They’re gonna be bouncing all around, up and down. One of the repeated motifs of our conversation and the report is that there are no graphs that are straight lines. So, this is not gonna be a straight line up or a straight line sideways or down or anything. It’s gonna have its ebbs and flows. I think it is funny that the more activity that goes onto L2s, the cheaper it is to transact on L1. So, there are some countervailing forces where, you know, if all the activity moves to L2s, then L1 could get a lot cheaper, which could bring in more transactions. I don’t mean to get confusing, but the point is I think it’ll remain a little bit volatile, but in the long term this is likely to equilibrate much higher than 7%.
Robert: Eddy, you mentioned an upgrade that’s in the works on Ethereum that would help scale blockchains more. It would be a nice fit for L2 progress. Maybe share a little bit more insight about that.
Eddy: Yeah, that upgrade many call “proto-danksharding”. It’s known as EIP-4844. And it’s an upgrade that sets aside a special pocket for data called blob data that’s needed for the operation of optimistic rollups, but also to some degree zero knowledge rollups. And this separate, little, temporary area of data storage serves the needs of these rollups without having to permanently store the data on the underlying L1 in Ethereum, which is what happens now. So, it’s just a different type of mechanism that means we don’t have to store permanently what doesn’t need to be stored permanently. And that takes a ton of pressure off the storage costs because you need to store less.
Robert: And so what’s the outcome of that?
Eddy: Yeah. It’s always hard to say exactly and I actually think that, like…I’m gonna say something and I’m gonna contradict myself a little bit, but…some think it could drop L2 transaction fees by more than an order of magnitude. Maybe even two orders of magnitude. And there’s a lot of reasons to believe that that’s true and that it makes sense. The part where I disagree slightly is that I think where Ethereum transaction costs or any blockchain’s transaction costs end up is a function of their demand. And as the price goes down, there become more and more things that are worthwhile. Like, just to give a kind of obvious example, if it really consistently costs less than a penny to send money on Ethereum or near Ethereum, then payments would all of a sudden become a viable thing again which was often talked about in early Ethereum, at least a primitive form of payments.
If that became feasible, then that would increase demand, which would mean that the cost would equilibrate at the marginal value of submitting a transaction. So, all that’s to say the cost could go down very significantly. I don’t think it’ll go down, like, to an outrageously low amount because then it would be too easy to submit transactions and everyone would be doing them all the time. So, I think the net effect, though, would be much, much, much cheaper, very high quality, very safe blockspace, which we’ve never really had in Ethereum.
Robert: What you’re describing reminds me of that principle in economics where you might think if you add another highway it’s gonna reduce traffic, but actually people end up just using the highway more. It invites more people to come and use that infrastructure and travel across it.
Eddy: Yep, that’s exactly right. Another similar idea is Jevons Paradox. That’s J-E-V-O-N-S Paradox that people can look up, where technological progress can make a valuable resource more abundant but actually increases its demand because now there’s more of it that you can use for that valuable end. So, it’s not to say that Ethereum’s gonna become more expensive because of upgrades like EIP-4844. It just means that they may not drop linearly. Like, if we get 10 times or 100 times the throughput, that doesn’t mean the cost is gonna drop 100 times.
Chris: Yeah, I would expect…I think they call it induced demand in economics. I would expect that. And that’s basically the pattern that happens with every computing resource. So, you know, CPU, memory, bandwidth, most recently GPUs with AI. You basically have this sort of back and forth, right, between the computing resource and the applications. And so the computing resource gets more efficient – sort of Moore’s law or some other price-to-performance curve – that then kinda acquires users and unlocks application use cases. Those things come along and they use up that new resource. They drive on the new highway. But that in turn then creates this sort of back and forth and that process takes, you know…sometimes can go in fits and starts. That’s some of the things we’re trying to highlight in the report, to try to see the long-term pattern because it’s not instantaneous when new infrastructure comes along.
So, for example, famously in the early 2000s, there was all this talk about “dark fiber”. A lot of smart people thought we had overbuilt long haul internet fiber during the ’90s and that it would never get used. It got overbuilt. Of course, then what happened is you had this last mile broadband catch up. You had applications like video, like YouTube in 2005. And then very quickly we used up all of that dark fiber and much more, and we keep laying cables to this day. So, it’s not always a completely smooth process, but I would expect the same thing will happen with blockspace. As 4844 gets implemented, L2s get more sophisticated, and other blockchains and all sorts of various kinds of scaling solutions come along – that unlocks new categories. Eddy was talking about gaming earlier, which is a category that really requires much lower transaction fees than we have today. That in turn will probably use up that resource and create incentives to create even more of those resources, and I think we’ll continue to see that back and forth over the next decade plus.
We’re projecting out the future but, I mean, this pattern has just happened so many times in the last 70 years in the history of computing that I think we can now look back and observe and learn from that.
Robert: So yeah, we’re gonna be going from dark fiber up to dark blockspace, I suppose. And by the way, at the mention of Jevons Paradox, I saw Scott [Duke] Kominers, resident economist on team a16z crypto. He was pretty excited about that and threw a 100 emoji up there. So, you got a nod from Scott. Since we’ve been talking about technical upgrades, there was one that actually just went off quite recently, the Shapella upgrade. And of course, there was The Merge, the upgrade to POS [“proof-of-stake”]. And this sort of encapsulates the whole transition from proof-of-work to proof-of-stake. We captured the start of that in the report talking about the impact of The Merge. Daren, tell us a little bit about that. What were the energy consumption implications of that whole long process that culminated in The Merge?
Daren: Yeah, sure. So, as you mentioned, in September of last year, which feels like a long time ago now, Ethereum transitioned to a new consensus mechanism, which resulted in a drastic reduction in energy consumption. Energy consumption was reduced by more than 99.9%. And if you compare that to other well-known products and industries, you can see…and we do this exactly in our report. You can see that proof-of-stake Ethereum is very, very low on that comparison, and when you look at something like YouTube, Ethereum now consumes about 0.001% of the energy that YouTube consumes annually. We have different third-party sources that we pulled from, like the Ethereum Foundation. You can compare it against things like global data centers, which consume 78,000 times more energy than proof-of-stake Ethereum. Gaming in the USA is 13,000 times the energy consumption of proof-of-stake Ethereum. It’s just remarkable to see this all happen in the way that it did.
It was truly an incredible upgrade that I think will go down in history as one of the most significant upgrades in the history of open-source software development. You can check out all these comparisons and numbers in our report. But it’s really just a cool experience to see unfold. And as of a couple of days ago, we now have withdrawals enabled which I think completes that story with proof of stake and it’s just awesome to see the Ethereum community deliver on these upgrades that have been years in the making. We’re really excited to see that happen.
Robert: We actually got some questions on this slide. So, I figure I’ll just ask one. Daren, some people were inquiring about YouTube having a higher energy consumption per year than global data centers and why that would be given that YouTube presumably runs on data centers.
Daren: Yeah. Maybe first I’ll say we pulled these from third-party sources. Of course, this stuff is difficult to measure and we recognize that, and we show these figures really just to illustrate a point. Now with respect to YouTube specifically, I will say that maybe one thing that we could’ve made clearer is that the YouTube number actually reflects the end user devices that are being powered in order to watch these YouTube videos. So that would explain why it’s greater than the global data centers figure, because it accounts for the laptop or the phone or whatever device is used by the end user in order to watch these videos.
Now I always make this caveat because I think the world is so dynamic and it’s hard to measure these different things, but the energy consumption that’s spent to watch a video on YouTube could have otherwise maybe been spent to go get in your car and drive to a movie theater, which I think just further underscores this point that it’s hard to measure. And I think that the real win when it comes to proof-of-stake Ethereum is that it eliminated what was really by definition wasteful energy associated with the proof-of-work mechanism. And so, eliminating what is true wasted energy for a system that is now run much more efficiently I think is the big win. But maybe to answer your question specifically, it’s the YouTube end user devices that are also accounted for in that number.
Eddy: Yeah, and I wanna throw in, Robert, like, I really enjoy seeing this complaint from crypto skeptics because we used to hear so many complaints about Ethereum’s energy consumption. That was a big story a year ago. We were hearing about it nonstop in a bunch of different avenues. And that problem has been solved. It has just been totally and completely solved. And now the best complaint has to do with the application of an illustrative statistic. Like, we’ve made a lot of progress. You know, the exact YouTube number may or may not be right. It’s very difficult to measure. But the decrease in Ethereum’s energy consumption is indisputable, and the point is the sheer size of that decrease, and it puts Ethereum in contrast with a bunch of other means of consuming energy that are maybe less controversial or less disputed out in the zeitgeist.
So, I’d say, like, the key point sort of stands and is exciting. An incredibly positive development in my opinion.
Robert: You know, in addition to all that, Eddy, I think there’s also a meta or macro point to be made about the ability to measure this stuff at all. The numbers are super clear on Ethereum because all the data is public. And that kinda goes into the whole purpose of the report, highlighting all these public data sources and showing all the various metrics.
Eddy: Yeah. That’s exactly right. And I’d even…like Daren was saying, I’m not even sure how we would quantify how much energy is consumed by people driving to movie theaters. That’s never been, like, a metric that’s debated. I’m not saying that we need to, or that maybe there is some problem or something, but I think crypto maybe gets punished to some degree by the fact that it is so transparent and measurable, which means it is easy to scrutinize in public. That’s what we’re doing with our report obviously. I think that’s one of its virtues. But it’s important to take into context which things aren’t easily scrutinized, which things are opaque.
Robert: Yeah, I totally agree. It’s a great point. And a lot of these private companies, they don’t have this data readily available whereas in web3 a lot of these things you can look them up. This sort of segues nicely into privacy. We could talk about zero knowledge cryptography which was another slide that a ton of people really latched onto, really loved and were sharing. Let’s just quickly dive into what we saw there, what’s been going on in the field of zero knowledge. Daren, maybe you wanna take this one. Or Eddy, I see you unmuted yourself.
Eddy: Oh, either, either.
Daren: Whoever.
Eddy: Go, Daren.
Daren: I can say in short that the zero-knowledge field is gaining incredible momentum. This is happening across a number of different data dimensions that we track. One we look at is academic publications, and we built out a data pipeline that allows us to extract this data, tie in the metadata, and pull out ZK-related publications. That trend has been very much up and to the right over the last few years as the space has really moved from theory to practice. I think more academics are interested in the potential applications for crypto.
We also looked at GitHub stars across key ZK repositories. That trend has also been very much up and to the right. And so have the daily transactions verifying ZK proofs on Ethereum. That’s kind of a measure of how ZK proofs are being used in production today. And all these numbers just show really big momentum in the trend, particularly over the last few years. We also worked with our research team to pull together some specific benchmarks that show that the tech itself is improving at an incredible pace when you look at dimensions like prover time, proof size, and verifier time, each of which is important to a ZK scheme in different ways. Each has seen dramatic improvement even just in the last couple of years, which speaks to some of the stuff that we were talking about earlier. Those are the data trends. I don’t know if Eddy has anything to add about the development of the space itself.
Eddy: Yeah, that’s all exactly right. I’d say if you look at the advancements in the underlying technical capabilities like proof size, verifier time, proving time, advancements in hardware. We have some really cool posts on this. Elena Burger from our team has some great writeups documenting different aspects of how zero knowledge proofs and zero knowledge systems have advanced over the last year or two. It’s accelerating very quickly and that’s because there’s this interesting thing where specific fields of cryptography have been revitalized by interest in crypto. You can see it in the academic data, academic slide or academic chart that we have in the report. There’s been this big pop and increase in attention that academics pay. Our research team is an example of that.
And there’s all kinds of interesting things that can be done. We’re seeing privacy systems built out and experimenting using this tech. We’ve seen all the scaling we’ve seen in L2. And it’s kinda the beginning.
Chris: Just to add to that. I was chatting the other day just casually with two cryptographers on our team, Dan Boneh and Justin Thaler, and we were talking about the history of cryptography. It’s an interesting field, because from a theoretical side, you basically had very little happen up until public key encryption in the ’60s and ’70s. You had, of course, significant engineering improvements. Look at things like the codes used in World War II. They were very sophisticated, but they were really just more sophisticated versions of the old Caesar ciphers, replacing one letter with another. The key feature was you still had to get together ahead of time with the person you’re communicating with and trade the codes.
And so, the big breakthrough with public key encryption was that I can now send information to somebody else who I’ve never met before and never traded codes with, just by using their public key, which of course at the time seemed interesting but probably not that applicable. That turned out to be the foundation of SSL and all the encryption we use on the internet. Imagine if every time you go to a website you had to have pre-gotten together with the website and traded codes. Like, it wouldn’t work. So, it turned out to be incredibly important. And a very important feature of public key encryption is that it has two use cases. One is encryption, so privacy-preserving features. And the other is authentication – proving that this document was actually mine and I signed it. And of course, the authentication use case is really the primary use case in crypto.
This also creates confusion. You watch…I was watching a TV show the other day and they talked about crypto being private, which of course is actually a difficult, only partially solved problem. Anyway, so I think of zero knowledge proofs as, in my mind, the other big major theoretical breakthrough in the long term history of cryptography, really since public key encryption. But because it was so obscure and the method so complex, it was not very practical. And Eddy and Daren can feel free to correct me on the details here if I’m getting it wrong but essentially it was just so computationally intensive.
One of the nice side effects of the crypto blockchain movement is that there’s been just a lot more investment in this area and, as a result, you’ve seen dramatic improvements in these algorithms and their performance. And also, by the way, the other really nice symmetry here is that just as public key encryption has these dual use cases, you know, encryption and authentication, zero knowledge also has two dual use cases. There’s the ability to prove something to somebody without disclosing information you don’t want to disclose. That’s kind of where the term zero knowledge comes from.
But then there’s also this ability to prove that you did a computation correctly, which is where all the scaling benefits come from. So, I can compute something on my computer, send the results along with a proof that I did that computation, and then someone else can verify that I did it correctly without having to rerun the computation, which would be much more computationally intensive. So, it has all these really interesting properties, and it’s been around since the ’80s but has only become practical recently, and it’s getting more so very rapidly. And I would expect this to have many other spillover benefits to other areas in technology outside of the space we work in.
So, I think it’s a very broadly important and probably underestimated trend in cryptography and in general in computer science.
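[Editor’s note: as a concrete, hedged illustration of the first use case Chris describes — proving you know something without disclosing it — below is a classic Schnorr-style zero-knowledge proof of knowledge of a discrete logarithm, made non-interactive with a Fiat-Shamir hash. The parameters are deliberately tiny toy values chosen for readability and are not secure; production systems use large groups or elliptic curves, and this sketch is not the construction used by any particular rollup or project discussed here.]

    # Toy Schnorr proof (Fiat-Shamir): prove knowledge of x where y = g^x mod p,
    # without revealing x. Tiny, insecure parameters; for illustration only.
    import hashlib
    import secrets

    q = 1019            # prime order of the subgroup
    p = 2 * q + 1       # safe prime (2039)
    g = 4               # generator of the order-q subgroup

    def challenge(*values) -> int:
        data = ",".join(str(v) for v in values).encode()
        return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

    def prove(x: int):
        """Prover knows secret x; publishes y = g^x and a proof (t, s)."""
        y = pow(g, x, p)
        r = secrets.randbelow(q)        # one-time random nonce
        t = pow(g, r, p)                # commitment
        c = challenge(g, y, t)          # hash stands in for the verifier's challenge
        s = (r + c * x) % q             # response; reveals nothing about x on its own
        return y, (t, s)

    def verify(y: int, proof) -> bool:
        """Verifier checks the proof without ever learning x."""
        t, s = proof
        c = challenge(g, y, t)
        return pow(g, s, p) == (t * pow(y, c, p)) % p

    secret_x = secrets.randbelow(q)
    public_y, pi = prove(secret_x)
    print(verify(public_y, pi))         # True, yet secret_x is never disclosed

The second use case — succinctly proving that a large computation was done correctly — relies on much heavier machinery (SNARKs, STARKs, and related proof systems), which is where the scaling benefits Chris mentions come from.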
Robert: Yeah, that’s a really great point. You mentioned that this technology is likely to, or it might spill out into other areas. If we’re gonna talk about one area that’s just gotten so much attention lately given the advances going on, it’s AI. And Eddy, you mentioned a piece from one of our deal partners, Elena Burger. She wrote about the potential implications of zero knowledge technology in machine learning. That might be interesting for people to hear about here.
Eddy: Oh, man. I feel like we could spend an hour on it. So, I don’t know how much we should unpack it, but it’s a fascinating one. I mean, in short, I think… There’s actually no short way to put it. But maybe the ultra-shortest way. Just two key parts for me. One is AIs are sophisticated computer programs. And one of the core ideas in crypto is, Who controls those programs? As people interconnect and interweave these types of programs with important services on the internet and throughout the world, we should ask, How are these controlled? How do we trust them? How do we verify them? How do we know what they’re doing and how the economic systems built around them work? That’s what crypto’s all about. So that’s a very crypto topic to me.
And the second, maybe more specific piece is that when a model makes an inference or produces some output for you to use or for another program to use, there are a lot of reasons why you want to be able to trust that output. And right now, if you trust that output, you’re trusting the company and the API that serves it. But at the limit, you can imagine extreme and fascinating cases where people tweak or modify these complex programs to produce outputs that serve their interests and not just yours. Kind of like how, you know, a large social media network might modify the feed to serve their interests. There’s no reason why someone couldn’t modify an AI model to serve their interests. You probably want to be able to know that the model is executing exactly what you gave it and working exactly the way that you want. That is verifiable computation, and that has been one of the core stories in crypto from the beginning.
So, I think that AI and crypto really overlap and zero knowledge technology is one of the ways that we will probably come about being able to verify what AI models do and the economics around them. Just can’t wait to see how it unfolds.
Daren: And if you look at some of the data around the internet consolidation that we’ve seen, three companies control a third of all global web traffic. Five companies represent 50% of the Nasdaq-100’s total market cap. That’s up from 25% a decade ago. It’s very clear that the internet is consolidating power into the hands of a few giant tech corporations. And AI is only going to make that problem worse. Now, the credible counterbalancing force of web3 and a decentralized network for compute makes me at least very interested to see how this all unfolds and what role web3 and blockchains can play in the current trend.
***
Robert: Let’s talk about how we can quantify the activity that’s going on. So, the keystone of this year’s state of crypto report is this accompanying tool that we’ve created, the State of Crypto Index which tries to consolidate and combine all these different metrics to give you a picture of the health of the state of crypto. Eddy, talk us through what the State of Crypto Index is, what it represents.
Eddy: The key thesis of the index, to me, is that there are product cycles and there are financial cycles. And it’s very easy to see a price, measure the price action of different assets, and mistake that for a synoptic view into the health of an ecosystem. Prices can change. That doesn’t mean that the builders and the underlying tech are changing. And vice versa: during periods of very exciting development, it’s not the case that those things will be reflected in the price in the short or long term. It may actually have very little to do with the prices that you see.
So how do you tease these things apart and get a sense of how crypto’s actually doing? One way that we think about it is that on one hand there’s innovation, and on the other there’s adoption. And we think that there are key underlying indicators that you can find from public sources that reflect those things. So, if you look at the charts that we’ve shared, these are just some of the key indicators we think are most interesting and may be most representative of what’s happening on the innovation side, the supply side, and the adoption side, the demand side. They’re not meant to be perfect. “Index” is obviously a little bit of a metaphor. It’s not like the S&P 500 or some sort of rigorously defined measure. Far from it. These are just indicators that we think are really interesting, that we look at when we try to get a macro view of how the space is unfolding.
And to that end, the way that we blend and calculate them to create the aggregated innovation side and adoption side, we invite you to think of different blends. On our site, you will see a page that allows you to look at each of these metrics individually and to get a sense of how they accumulate together into our larger measures. You can adjust the weights and you can even drop some out to zero because maybe you disagree with how we’ve blended them together or how we measure them. That’s totally no problem. We’re not trying to be authoritative here. We’re just trying to show public measures that we think are interesting that point at the way we see the product and financial cycles developing somewhat independently.
Robert: So, markets are this interplay between supply and demand. And when you look at a price chart, you know, you go to CoinMarketCap or CoinGecko or something, supposedly you’re seeing what people are willing to sell a token at and what price people are willing to buy it at sort of netting out…creating the price for that token on the open market. But that’s really focusing, like you said, on the financial side of things when also there are these tech markets, these product markets of, like, what are builders actually building, what applications are they putting together, how many smart contracts are they deploying and who’s actually using this stuff, who’s transacting with it.
And so that’s why you get that supply side correlating to innovation, actually what’s being built, and then the demand side, being adoption, who’s actually using this stuff. I think it’s a really interesting way to look at a market in a nonfinancial way because it still involves supply and demand, just at a different angle than people are usually accustomed to.
Eddy: Yeah. I think that’s exactly right. And we hope over the coming months and years to improve and adapt this. We’d love people’s feedback, thoughts about interesting types of measures. Of course, we prefer public measures that everybody can see and everybody can understand together but we’d love to update it, we’d love to refresh it. We hope to add new categories as they unfold. It’s just meant to be a wide view that we think is a little bit more legible and a little bit more indicative of what’s actually developing than just prices.
Robert: Daren, walk us through some of the data points that went into here.
Daren: Sure. So, you mentioned we split it out between innovation, which is the supply side, and adoption, which is the demand side. Maybe just to list out the specific metrics that are included in each of those categories. On the innovation side, we looked at something we call active developers, which is the number of unique GitHub users who have committed to or forked a public crypto repository during the month. All of these metrics, by the way, are aggregated monthly. We can also expand that set into a category that we call interested developers, which includes any unique GitHub user that has committed to, forked, or starred a crypto repository. So that is the second category in the innovation bucket.
The third category is the number of contract deployers, so developers who are deploying smart contracts on one of the tracked blockchains that we look at. We can also look at something called verified smart contracts which we think is an indicator for product launches because once you deploy your smart contract, the first thing that a developer usually does is get that smart contract verified. So, it’s another indicator of developer activity.
We also look at developer library downloads, specifically web3.js and ethers.js and how that trend looks month over month. We look at academic publications. So, we have a tool that allows us to extract all of the crypto related publications from the academic world and track that metric over time. And then we also look at job search trends. So, people who are searching the web for crypto related jobs. And those seven metrics make up the innovation indicator bucket. Again, we track those metrics over a monthly aggregation period and roll those into the index along with the adoption indicators.
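[Editor’s note: to make the “active developers” idea tangible, here is a minimal, hypothetical Python sketch of counting unique GitHub accounts that committed to a small sample of public repositories in a given month, using GitHub’s public commits API. The repository list, date range, and lack of authentication are illustrative simplifications; this is not the report’s actual data pipeline or repository set.]

    # Toy sketch (not the report's pipeline): unique GitHub committers to a few
    # public crypto repositories during one month, via the public commits API.
    # Note: unauthenticated requests are heavily rate limited.
    import requests

    REPOS = ["ethereum/go-ethereum", "bitcoin/bitcoin"]          # tiny illustrative sample
    SINCE, UNTIL = "2023-03-01T00:00:00Z", "2023-04-01T00:00:00Z"

    def monthly_committers(repos, since, until):
        users = set()
        for repo in repos:
            page = 1
            while True:
                resp = requests.get(
                    f"https://api.github.com/repos/{repo}/commits",
                    params={"since": since, "until": until, "per_page": 100, "page": page},
                )
                resp.raise_for_status()
                commits = resp.json()
                if not commits:
                    break
                for commit in commits:
                    author = commit.get("author")    # linked GitHub account; may be None
                    if author:
                        users.add(author["login"])
                page += 1
        return users

    print(len(monthly_committers(REPOS, SINCE, UNTIL)))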
Robert: And just by the way, one thing that’s interesting about the innovation indicators when you look at the charts generally is that a lot of them sustain. A lot of them have been increasing over time and just kinda keep going up. It just goes to show that despite the market seasonality, or whatever prices might appear to be doing, what people are actually building is totally separate from that. The building activity continues undaunted. Winters are for building, so they say.
Chris: That’s right.
Daren: Exactly. And that’s a key takeaway from the work we’ve done here with the State of Crypto Index. The innovation indicators are much more steady, a lot less volatile than what we’re seeing on the adoption side which maybe is a little bit more tied to the market cycles. And it’s a function of everything that we’ve been talking about, the fact that builders tend to stick around. Builders keep building in the bear market. All of that stuff I think is really quantified by that trend that we’re seeing, which is that the innovation or supply side is much more steady than the adoption or demand side.
Robert: It’s funny that we might be in a winter and yet in terms of the product cycle things are going pretty well. There’s a lot of really amazing activity happening right now. Is there a place for winter, then? Is winter a good thing?
Eddy: I think both sides are important, right. I think the thesis is that there is an upside to the hot summer, as crazy as it is. As much as financial excitement sometimes exceeds our current technological capabilities, it attracts new people who stick around and will continue building, as I think a lot of the metrics that we showed demonstrate. They’ll continue building even when prices cool down. So, the summer/winter framing is how to think about the new entrants, the revolving door of people who come into crypto, see what it’s capable of in the long term, and wanna stick around and help build that. Although honestly, I don’t know that the product cycle is as seasonal. It kinda reminds me of California. The weather’s always the same. There’s no winter, there’s no summer. I don’t think it’s a coincidence that it’s such a fit.
Robert: Nice. Springtime is breaking over here in New York. It’s quite lovely.
Daren: Part of the reason why I think we look so much at product cycles is because financial cycles are just very hard to predict, right? They fluctuate very unpredictably based on macroeconomic conditions whereas product cycles follow their own internal logic and it’s often based on consumer behavior and broader tech trends that we’ve seen play out over history. And if you look back since 2000, you can see that good companies have been created in every financial market cycle and I think that’s an important point to recognize here. For example, if you decided to ignore tech after the dotcom crash, you would’ve missed industry defining companies like Facebook and YouTube that were born in that era. Similarly, if you would’ve been spooked by the global financial crisis, you would’ve missed iconic companies like Instagram and WhatsApp and many others.
And so, we often look at these product cycles through our own internal perspective that we build by meeting with these entrepreneurs and technologists…the product cycle is something that we feel very excited about and it’s something that we’ve seen continue to progress despite the down market from a financial cycle point of view.
Chris: What you can kinda see if you zoom out…and this is in our slide deck, is that although it looks kinda chaotic on the shorter time cycles, when you zoom out, you see a real predictable pattern and steady growth, steady compounding growth overall.
Daren: And maybe to get a little bit more specific with the data, what’s cool about the insights here is that you can actually align these very specific metrics on the same X axis, the same time axis, and you can see when the big swings happen. The key insight that comes out of this analysis is that the big swings in things like developer activity – which is measured as the activity that we’re seeing on public GitHub crypto repositories – tend to happen in the months shortly after the price swings, which implies that the price brings interest, and that interest turns into new developers who build new products. You can see on the time axis how the price comes first as a leading indicator: some of the developer activity, the startups, and the social media interest are often really triggered by the price itself, and you can visually see that with the data.
Just to complete the list so people can understand exactly what goes into this index, on the adoption or demand side, we look at active addresses. So unique on-chain addresses that initiate transactions during the monthly period across the tracked blockchains that we have. We look at the raw transaction counts over time. We look at transaction fees that are paid by users in aggregate over the period across the different blockchains. We look at mobile wallet users from Apptopia which is a metric that we look closely at. We look at volume on decentralized exchanges. We look at the number of buyers of NFTs each month and then the on-chain transaction volume for stablecoins. And those seven metrics make up the adoption indicators category and together we roll that into the State of Crypto Index which really is just a culmination of all this data, analytics, and market research that we do and we like to bring it all together in a way that is very flexible. We, as Eddy mentioned, give users the ability to update the weights and the thresholds themselves. It’s just a data tool that is designed to give people a point of view on how the nonfinancial oriented metrics are performing when it comes to the health of the crypto industry.
So, it’s something we’re really excited to be sharing.
Robert: That’s a lot of metrics. That’s a lot of data to combine, to roll up into one single output, one number that sums up the whole state of crypto. Eddy, maybe you could share a little bit about what that number represents.
Eddy: Yeah. It’s an interesting one. You should consider the net percent growth for that category. So of course, we have to pick a time to begin measuring a specific metric. Let me give you an example of, like, an absurd choice. If you decided to start measuring Facebook’s growth starting from the first user, they’ve grown to more than 1.3 billion, 1.4, I don’t know the latest metrics but billions of users. If you consider that percent growth, 1.3 billion percent growth, that’d be a little absurd. You need to pick a specific time period against which to measure growth. Typically, what data people do to avoid introducing weird biases as a result of how they start measuring one of these metrics is they’ll aggregate to a notional period like a month or a year and they’ll round it out that way and say, “All right. This year we had this number and next year we have this number and so on.” And they measure growth against the yearly aggregates.
In cases where we could – when there’s a long enough time period and it makes sense to aggregate to a nice level of granularity – we choose a starting period for each sub-metric and measure growth from there. On our website you’ll see that you can adjust that period, so if you disagree with how we used our best judgment to decide when to start measuring percent growth, you can change it. But the point is just to ask, for each of these metrics: how much did it grow from that starting point? That means the values can go negative, which is a big part of why this isn’t a real index in the strict, scientific sense of the term. But it should be an accurate measure of how much those categories, those measures, grew all together – whether that’s the adoption side or the innovation side.
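As a small illustration of that growth-from-a-chosen-starting-point idea, here is a sketch that computes net percent growth for one hypothetical metric against a baseline month, including how the value can go negative. The series and the baseline choice are invented for the example.

```python
import pandas as pd

# Sketch: net percent growth of one metric relative to a chosen baseline month.
# Picking the baseline is a judgment call; on the site you can move it yourself.
monthly = pd.Series(
    [120, 150, 200, 170, 90],
    index=pd.period_range("2022-01", periods=5, freq="M"),
    name="hypothetical_metric",
)

baseline = monthly["2022-01"]
pct_growth = (monthly - baseline) / baseline * 100

print(pct_growth)  # the final value is negative: the metric fell below its baseline
```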
Daren: Maybe just to summarize it: the actual value of the index is the weighted average monthly growth of all of the included metrics, under certain assumptions that Eddy has gone into in detail. It’s that weighted average monthly growth since 2016 for the included metrics, which we believe is a fair representation of how the industry as a whole is performing – especially given that we allow users to go in and adjust the weights and the thresholds based on their preferences. After spending many hours litigating the details of the methodology, we feel it’s a pretty fair representation of, quite simply, the weighted average growth of those metrics.
Robert: That’s a great, pithy summary of what it’s displaying. It’s like you’re throwing all these metrics in a blender and outputting how much the industry is growing along all these various dimensions. It’ll be interesting to see where we’re at a year from now. I don’t think we really know what it’s gonna look like. Does anybody have an expectation? Are we gonna be out of winter? Into springtime? Into summertime? Still in winter?
Daren: Such a brutal prediction.
Robert: Well, to make it easier…maybe not from a financial perspective, but strictly based on the kind of building activity you see going on, or the sort of things that the index is measuring.
Daren: That’s true. I mean, I feel very confident about the innovation side. I just…I cannot imagine what it would take for all the things that we’re seeing develop now to not play out. There are going to be more transactions. There are going to be more interesting types of projects that are possible and as a result I hope more developers. I see that going very well. The demand side is the crazy one. I just don’t know. That’s just tied to so many external factors. Who knows what’s happening with the money supply, the global economy, oil. I don’t know.
Robert: But we will be able to track it using the State of Crypto Index. Chris, what’s your sense of how things will shape up one year from now? Will there be more innovation, more adoption – or less?
Chris: Well, look, this is what I do full time and obviously I believe deeply in it and continue to. My experience, in general – formerly as an entrepreneur and then as an investor – is that maybe some people like Ray Dalio can predict macro cycles. I can’t and don’t try to. So I have no idea what will happen with the economy and a potential recession, inflation, all those things, including prices of any tech assets, whether they be crypto or non-crypto. We just really try to keep our heads down and focus on infrastructure, applications, technology, founders – working with our founders to make sure they have the resources they need to build what they wanna build. Ultimately, they’re the ones who build these great things. Given what I’ve seen there, I’m very excited over the next year for all the products that will launch. So that’s what I’m keeping my eye on and I think it’s very exciting.
Robert: How’s the U.S. shaping up against other countries in the world of crypto?
Daren: In terms of both developer activity and user activity, we’ve seen the U.S.’s market share and influence over the crypto industry decrease over the last few years. The share of crypto developers based in the United States – as measured by a great report produced by Electric Capital – has dropped from 40% down to under 30%, and it has decreased consistently.
Eddy: Yeah, and this is despite very strong retention and even overall growth in the number of active crypto developers. So it’s not a reflection of the space shrinking; it’s a reflection of a growing proportion of that interest moving overseas.
Daren: Exactly.
Robert: Chris, you’ve got this great conceit around this tug of war between two cultures within crypto.
Chris: In our world, there are at least two – maybe more – but two significantly different movements, or motivations, of people involved in this space. Our view, as we make very clear in this report and generally throughout the years, is that a blockchain is a new type of computer, and its particularly useful application is building new networks with a bunch of positive features. That’s the lens through which we view our investing: how can we further advance that mission and hopefully nudge the internet into a new era where we have networks that are owned and operated by communities, that are built through composability, and where the bulk of the money flows to the edges – to the creators and software developers who build them? That’s what excites us.
In our culture – the computer – the technology vision is primary. To the extent there are speculative markets, that’s a byproduct, as it is in real estate. That’s fine. I mean, obviously there have to be rules around these things, there needs to be strict enforcement of regulation, and obviously it’s not fine when there’s bad behavior, as there was with FTX. But I do think it’s important to call out this distinction, because in the public imagination the two have been conflated. When you see politicians saying, “You know, we should ban crypto” – which is becoming kind of a meme now – I believe they’re referring to the casino aspects of it. I think that if they understood the deeper technological vision, they would actually be quite supportive. So that’s why I think the two-cultures distinction is important, and I think it’s not as widely understood as it should be.
Robert: Excellent. Well, thank you all for joining. Tons of great insights.
Chris: All right. Thanks, everyone. Great to see you.
Daren: Thank you.
Eddy: Thanks for having us.
Sonal: Thank you for listening to web3 with a16z. You can find show notes with links to resources, books or papers discussed, transcripts and more at a16zcrypto.com. This episode was technically edited by our audio editors, Seven Morris and Justin Golden. Credit also to Moonshot Design for the art and all thanks to support from a16z crypto. To follow more of our work and get updates, resources from us and from others, be sure to subscribe to our web3 weekly newsletter. You can find it on our website at a16zcrypto.com. Thank you for listening and for subscribing. Let’s <BLEEP> go.
***
Robert: How ideally would you like to see people using this tool out in the wild?
Eddy: Crypto’s an incredible thing. We have all this public data. It’s all visible. You can see how different projects are playing out live in front of you. Otherwise, we’d have to beg companies and startups to share what little information they’re willing to. I’d like to see people using that information more richly. We do see a lot of that – to be fair, I’m not saying it’s not happening. We see tons of it on social media. People love sharing the latest on their pet projects, their favorite projects. But I’d like to see these other types of measures take up a larger share of the mindshare in how people measure crypto. The price is one thing, but progress is another, and I’d like people to measure that.
Daren: And if anyone has ideas on how to make this better or feedback that they can share, just come talk to us. This is the stuff that we love talking about every day. So, I would encourage people to try it out. Obviously, it’s just a data tool but if you have ideas or feedback or things you wanna chat more about, we’re definitely open to talking. So, keep that in mind.
Eddy: Definitely.
Robert: I would love to see people create their own views on this, manipulate the data and then share some of it. Take a screenshot, tweet at us. Also tell us what metrics you might like us to add or change, or why you disagree with us on the default view. I think it’s cool that this thing is pretty much open source and people can do with it what they want. I don’t really know how people are gonna use it, but hopefully they find it valuable.
Eddy: Yep.
Chris: The way I think about it is, it’s always important to have more tools and more data sources – things that help people get situational awareness of what’s going on. But I also think what Daren and Eddy created here is important because, going back to this computer-versus-casino idea, there are all sorts of numbers that track prices. That’s not what we think about. We think about the products. People pay attention to metrics, and the more metrics we have that track the fundamental progress – the important progress, the product progress, the technology progress – the better. People will pay attention to them, and when you measure something, you can optimize it as a community.