Early Days of Ethereum

Preserving the history and stories of the people who built Ethereum.

Ethereum DEVCON0: Ethereum 2.0 and Beyond

A presentation on Ethereum 2.0 and the future vision for the Ethereum platform, delivered at DEVCON0 in Berlin.

Transcript

[00:00] SPEAKER_00: This is the last presentation for the day, for the week. In Python, that's index negative one. Unfortunately, I don't think C supports that particular feature. One of the many reasons why it's inferior.

So basically here I'm going to give, in part, my own technical, economic, and philosophical overview of where I see the future of, in part, Ethereum, and in part other blockchain-related and decentralized-computing-related technologies going in the next two years, five years, twenty years.

So the kind of perspective that I generally have on this whole space is that I see a lot of interest, a lot of interesting ideas; I see definitely very cool math, very cool economics and cryptography and so forth. But the challenge is trying to figure out exactly how a lot of this stuff can end up actually being useful. And so a lot of the ideas that I have are around trying to figure out what the fundamental thing is, what the fundamental set of parameters is that this technology fills in a way that's better than any other technology that's available.

And so those areas where crypto does well and where other technologies don't work at all are, in theory, going to be the areas where you can look for practical applications that are stable long term. And by stable long term, I mean stable in the sense that people will use them not just because, oh hey cool, it's Bitcoin, or hey cool, it's decentralized, yay, I get to keep my data, but because they are in some meaningful sense better at something people actually care about. They'll have properties that people like, and the weaknesses will be so small that people won't care about them.

So an important point is that everything we're doing here might end up being completely useless. There's a large probability of that. When you're in a startup you kind of have to accept that; well, when you're doing any kind of technology, you have to accept that. How many failed designs have there been for fusion reactors now? But the way you do development is you assume that your path is going to succeed, and you try to go down that path, and if you see a fork, you pick the branch that's more likely to succeed. You imagine things are going well, and if it all fails, then okay, the bottom threshold you can't go below is basically the amount of time you've wasted. That's fine, but hopefully you'll discover something that's actually great and useful.

So I've got a couple of secrets in this presentation, secrets in the Peter Thielian sense of the word. The first one is: the history of software development is a history of human beings deliberately employing progressively less and less efficient software paradigms because we like them for other reasons.

So, 1980s: assembly, obviously the best and only true way to write code. It beats some of these by a factor of several hundred. Machine code in hex: even better. Actually, even better to just make an ASIC. So yeah, 1920s ASICs, forgot that one.

Then we invented this thing called C, which adds a whole bunch of overhead. It makes things several times slower. And yet, for some strange reason, it won. Then we invented Python and JavaScript, which are, oh my God, even worse. But people like them because, in this case, they make development easier.

And so there's an established history of people sacrificing efficiency, in some cases by over two orders of magnitude, because it provides ancillary benefits. And as technology improved, it turned out that for most use cases the amount of computation we actually do is pretty trivial, so slowing it down even by a factor of 10,000 actually isn't all that bad.

So the theory, which may be correct, that I'm working on here is: what if decentralized consensus protocols are an element of this trend? Not the next element of the trend, because that trend is about making things easier and this one is about making things trust-free, but something following in a similar direction.

So first off, one thing that people get mixed up about decentralization is that there are two kinds of decentralization. Well, there are several kinds, but here are two. In general, if something is decentralized, what that means is that nobody controls it. That fact by itself mainly has political benefits; I would also add in the benefit of reliability. But in general, the fact that nobody controls it is to a large extent the reason why things should be decentralized as a category in itself.

Now, within decentralized there's also this concept of things being distributed. Decentralized and distributed are quite often used as synonyms; in this case I'm making a distinction, and chances are there are going to be people who make different distinctions, because all these words are not too clearly defined. But here, distributed means that the work is actually split up between many parties. And when decentralization is the distributed kind of decentralization, you can actually see unambiguous technical benefits from making things distributed.

So it can provide real efficiency gains, and those technical benefits are so large that even centralized entities are willing to adopt the paradigm. Best example: World of Warcraft. World of Warcraft actually seeds updates for its platform using torrents, and a whole bunch of other software does that. Ubuntu distributes Linux using torrents. A whole bunch of people distribute this stuff using torrents.

So that's one kind of decentralization. The other kind is consensus. Consensus is somewhat different: nothing is being split; work is just being massively replicated. And when work is massively replicated, that's always going to be less efficient than just doing everything on one server. So it has to be done for other reasons.

So in general, how do you think about consensus? The basic properties it has are that it's reliable, guaranteed to keep working. In the long term, I think you'll definitely be able to count on it being much more reliable than any one particular service, because any one particular service is in a lot of cases likely to shut down. Consensus systems are also transparent, globally accessible, and not controlled by anyone. So it's not going to shut down, and it's particularly not going to become evil: it's not going to start installing DRM, it's not going to change its API, it's not going to throw captchas into its API because it wants remarketing revenue.

But for all this, at least in Ethereum 1.0 and Bitcoin 1.0 and every other crypto 1.0, the price you pay is that it's 10,000 times slower. So if it's a C implementation, that's 10,000 times slower than C and maybe a million times slower than assembly. And if it's a Python implementation, then it's 100 million times slower than assembly.
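
As a back-of-the-envelope check of those numbers, here is a minimal sketch; the specific overhead factors are illustrative order-of-magnitude assumptions, not measurements.

```python
# Rough, illustrative overhead factors (assumptions, not measurements).
CONSENSUS_OVERHEAD = 10_000   # every node replays every transaction
C_VS_ASM = 100                # C vs. hand-tuned assembly, order of magnitude
PY_VS_ASM = 10_000            # Python vs. assembly, order of magnitude

print(CONSENSUS_OVERHEAD * C_VS_ASM)    # ~1e6: consensus in C vs. one assembly server
print(CONSENSUS_OVERHEAD * PY_VS_ASM)   # ~1e8: consensus in Python vs. assembly
```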

So it's less efficient in theory, but the idea is that it's more efficient in practice, due to the lack of monopoly rent. Why is Bitcoin theoretically inferior, from a purely technical standpoint, to just running everything on a centralized server? The reason is pretty obvious: in Bitcoin, you send a transaction and it has to get replicated by a few thousand nodes, plus you've got all this wasteful proof of work being done. With PayPal: one server, send, done. PayPal might do some replication internally for technical redundancy purposes, but even still, that's going to be much more efficient than Bitcoin, which provides both technical redundancy and political redundancy.

But in practice, for some reason, Bitcoin charges $0.05 and PayPal charges $0.29. So there's an interesting paradox: less efficient in theory, more efficient in practice.

Now, this idea of monopoly rent is something that can appear in many ways. You could have monopoly rent in prices: PayPal charges 3% plus $0.29, plus 1% for foreign transactions, plus 2.5% for currency conversions. And also in intangibles: things like privacy and freedom; whoever creates the software might take some extra effort to impose their own ideological vision, and so forth.

So another way of thinking about it: decentralization is a commitment strategy that an application developer can use to commit to not being a jerk, forever. So that's roughly what the advantages of consensus are.

And the way we can make this kind of architecture actually useful is: okay, we know what the advantages are; now let's see if we can shrink the disadvantages to the point where they don't count anymore. So what are the two disadvantages?

One of them is scalability. The fundamental problem in all these architectures is that every full node has to process every transaction. So the idea is: if we want to improve on this, let's split things up so that's not the case anymore, while maintaining shared security.

As Dominic Williams, the founder of Pebble, put it: we need to scale out, not scale up. If you read the Bitcoin Foundation scalability roadmaps, they all talk about scaling up; they all talk about how things are going to be fine if it all gets a thousand times bigger, because look at this data, we'll just have full nodes on really powerful computers and so forth. Whereas I think if we want things to really remain decentralized, a more horizontal approach, where you actually give up the property that every full node must process every transaction, might be more appropriate.

So a 10,000x slowdown, at five cents a transaction, is probably too much to make consensus dapps practical. In a lot of cases it's fine for money right now, because people are used to paying money in order to transfer money. It's not fine in most other cases. For something like an Internet forum, a name registry, timestamping a document, whatever else, people are used to that stuff being free. And so going from free to five cents is going to be a bit hard for people to accept.

You might be able to remove the psychological part of it to a large extent by making the payment happen in the background and just replacing the dollar sign with a bar that goes from 100 to 0; then you just have to pay a dollar once in a while to fill it up again. I'm actually just reading Predictably Irrational by Dan Ariely, and he points out how you can remove much of the psychological association with money by creating a separate unit of measurement that's one step removed from money itself.

But even still, you're not going to get rid of the fact that people are used to paying nothing on the backend; it'll be hard to get them to pay something. But if you knock it down to, say, a 200x slowdown, then, well, people already take a 200x slowdown voluntarily when they program in Python instead of assembly. And if people are willing to do that for ease of development, maybe they'll be willing to do it because they like the benefits of things being decentralized.

Actually, another one of those Peter Thielian secrets about decentralization that I missed is that I think, to a large extent, application developers will want to use decentralized paradigms basically because they're too lazy to manage their own servers. That's actually a serious issue that I've had: I've tried to maintain a multisig Bitcoin wallet at multisig.info, and it just kept crashing so many times that I had to keep babysitting it and restarting Node.js. So just because of that alone, I am willing to turn it into a dapp.

So at 200x, consensus becomes viable for a large number of things. And of course, you don't need to take a 200x slowdown for everything. As Gav pointed out in one of his recent presentations, what you really want to do is put only the business logic into consensus, and everything that's not connected to business logic you put on the user interface layer: inside the browser, or inside a Whisper/Swarm-type protocol, a distributed hash table, and so forth.

So how do you do it? We kind of talked about this, but just to go through it quickly again. Solution 1: sharding. Split the state into substates. The idea is that block makers build on an edge; when you build on an edge, you process things in both vertices along that edge, and you also move messages along. If you need to send a message from one vertex to a distant vertex, the message stays in outboxes and eventually, as miners mine edges, it makes its way across.

You can think of it as being kind of like the IP protocol, except it's obviously on a chain. Tree chains is not quite the same; it's more of a currency-specific paradigm that tries to actually split up debits and credits. Hypercube chains is basically this.
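
To make the routing intuition concrete, here is a minimal sketch of message forwarding on a hypercube of shards, assuming vertices are numbered so that an edge connects two vertices differing in exactly one bit. This is my illustration of the idea, not the actual Ethereum 2.0 design.

```python
def next_hop(current: int, destination: int) -> int:
    """Route one step across the hypercube: flip the lowest bit
    on which current and destination still differ."""
    diff = current ^ destination
    lowest_bit = diff & -diff          # isolate lowest set bit
    return current ^ lowest_bit

# A message from vertex 0b000 to 0b101 crosses one edge at a time,
# sitting in an outbox until a block is made on that edge:
v, dest = 0b000, 0b101
path = [v]
while v != dest:
    v = next_hop(v, dest)              # message moves when this edge is mined
    path.append(v)
print([bin(x) for x in path])          # ['0b0', '0b1', '0b101']
```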

And the other key point it relies on is this idea of jury selection: for any particular edge, the verification set for that edge is taken from the entire validator pool of the entire system, so that everyone in the system is statistically protecting each individual block, even though only 200 validators are actually verifying each individual block.
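
Here is a minimal sketch of that jury-selection idea, assuming a seeded pseudorandom sample of 200 validators per edge from the global pool; the sampling details are my assumption for illustration.

```python
import hashlib
import random

def select_jury(validators: list[str], edge_id: str, block_height: int,
                jury_size: int = 200) -> list[str]:
    """Pseudorandomly sample a verification set for one edge from the
    global validator pool, seeded by (edge, height) so it is deterministic."""
    seed = hashlib.sha256(f"{edge_id}:{block_height}".encode()).digest()
    rng = random.Random(seed)
    return rng.sample(validators, jury_size)

pool = [f"validator_{i}" for i in range(10_000)]
jury = select_jury(pool, edge_id="v000-v001", block_height=42)
print(len(jury), jury[:3])   # 200 jurors, drawn from the whole pool
```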

So that's one approach. What would this look like in practice? Ethereum 2.0 becomes a hypercube. Target transaction fee: maybe knock it down from $0.05 to maybe $0.01. Ideally, much greater use of on-chain mechanisms.

So that's one approach. The other approach is this multi-chain idea: multi-chain, many blockchains. There are going to be some blockchains that have many dapps. Sometimes it makes sense for your dapp to be on the exact same execution environment as all the other dapps, because you're interacting really heavily; sometimes you just want your dapp to be by itself, because it's cheaper.

Chains can interact either by explicitly interconnecting with each other, or by the sort of Tier Nolan-style decentralized exchange. And you get common security via the consensus-as-a-service paradigm: you have one chain that specializes in voting on data availability, and all the other chains have a selection rule that at each block height, the only valid block is the first valid block at that height. Valid is defined by, number one, the availability chain voted that the data is available, and number two, there aren't any proofs of invalidity against it. And first is determined by the timestamp that this consensus-as-a-service chain specified.
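
Here is a minimal sketch of that selection rule, assuming the availability chain exposes a vote and a timestamp per candidate block and that invalidity proofs arrive as a set; the data shapes are my assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Candidate:
    block_hash: str
    service_timestamp: int      # timestamp assigned by the availability chain
    data_available: bool        # did the availability chain vote yes?

def select_block(candidates: list[Candidate],
                 invalidity_proofs: set[str]) -> Optional[Candidate]:
    """Pick the first (earliest-timestamped) valid block at this height:
    valid = data voted available AND no proof of invalidity."""
    valid = [c for c in candidates
             if c.data_available and c.block_hash not in invalidity_proofs]
    return min(valid, key=lambda c: c.service_timestamp) if valid else None

height_7 = [Candidate("0xaa", 105, True),
            Candidate("0xbb", 100, True),     # earliest, but proven invalid
            Candidate("0xcc", 102, False)]    # data never voted available
print(select_block(height_7, invalidity_proofs={"0xbb"}))  # -> 0xaa
```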

So that's the other approach you can take. You need to standardize very little. Maybe for light clients: if you want one light client across the entire system, you might need to standardize more. But this is more of a sliding scale.

So what does it look like? The Ethereum 2.0 blockchain equals the Ethereum 1.1 blockchain; exactly the same thing. 2.0 would then be a set of tools for spinning up new blockchains, and probably a new high-level language building on Solidity. Or, if we were building on Serpent, I would probably call it Hydra, just because hydras have multiple heads, and if you cut one off, more of them grow, and so forth.

So the idea is you would have this high-level language that you could easily use to compile to any kind of cryptoeconomic structure: with one keyword, you could compile to code on a blockchain, compile to an independent blockchain, or compile to an off-chain auditable computation protocol, and so forth. Ideally it would be one language that makes it easy for people to pull in any of these different approaches. So that's the other approach for 2.0.

So, for 1.1, consensus is an issue. Proof of work has problems, as we discussed: centralization, both mining centralization and mining pool centralization, and waste. Given that quite a large portion of our user base does consist of generally idealistic, environmentalist types of people, I think if the Bitcoin community, or the crypto community in general, sticks with proof of work, then once they realize it, we are going to alienate a large portion of that crowd. So for that purpose alone, it might be worth abandoning proof of work. And obviously, we ourselves don't like resources being wasted.

It also creates conflicting interest categories. That's actually something that we missed: with proof of stake, you just have stakeholders; well, you have stakeholders and users who don't hold stake. But here we have three categories: stakeholders, miners, and people who hold neither. And the more categories you have with interests that conflict to some extent, the more complicated the analysis becomes.

So, proof of stake. The nothing-at-stake problem is basically, I would say, solved, in the sense that we've discovered an upper bound which is equal to a lower bound, which is this weak subjectivity criterion. So hopefully there's no need to either argue that it's not possible, not possible even with weak subjectivity, or try to come up with clever ways to avoid it. The science has, I think, been settled to a rather substantial degree; the challenge is solidifying the details.

So what would Ethereum 1.1 look like? Before 2.0, the more moderate thing, 1.1. Proof of stake, maybe some variant of the Slasher 2.0 idea. It always needs some proof of work for anti-DDoS.
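
Here is a minimal sketch of the core Slasher idea: a validator's security deposit gets destroyed if anyone can present two signed blocks by that validator at the same height. The data shapes are my assumptions for illustration, not the Slasher 2.0 spec.

```python
deposits = {"validator_a": 1000, "validator_b": 1000}  # security deposits
seen_signatures = {}  # (validator, height) -> block_hash

def on_signature(validator: str, height: int, block_hash: str) -> None:
    """Record a block signature; slash the deposit on double-signing."""
    key = (validator, height)
    if key in seen_signatures and seen_signatures[key] != block_hash:
        deposits[validator] = 0            # deposit destroyed
        print(f"slashed {validator} for double-signing at height {height}")
    else:
        seen_signatures[key] = block_hash

on_signature("validator_a", 10, "0xaaa")
on_signature("validator_a", 10, "0xbbb")   # equivocation -> slashed
print(deposits)
```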

Event trees. The idea with event trees is: if you want a contract that does something that would only actually be processed at some point in the future (imagine a contract that says: after 30 days, do this), then you want that to be an event. And in order for that event to actually be processed at a later point in time, it has to be stored as part of the state in the meantime. So that would be an event tree.
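
Here is a minimal sketch of that scheduling idea, storing pending events in state keyed by their trigger block and firing them as blocks are processed; the structure is my illustration, not the proposed tree format.

```python
from collections import defaultdict

pending = defaultdict(list)   # trigger_block -> list of event callbacks

def schedule(current_block: int, delay_blocks: int, action) -> None:
    """Store an event in state so it fires delay_blocks in the future."""
    pending[current_block + delay_blocks].append(action)

def process_block(block_number: int) -> None:
    """When a block is processed, fire any events whose time has come."""
    for action in pending.pop(block_number, []):
        action()

BLOCKS_PER_DAY = 5760   # assumes ~15-second blocks (illustrative rate)
schedule(current_block=100, delay_blocks=30 * BLOCKS_PER_DAY,
         action=lambda: print("contract payout released"))
process_block(100 + 30 * BLOCKS_PER_DAY)   # -> contract payout released
```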

Minor improvements. Making the Patricia tree binary instead of hex is probably better. On-chain data structures: treaps, for example. Actually, treaps might be a bit too complicated; maybe even just plain old heaps, because they're useful for markets. The problem right now is that if you do a market order book on-chain, you actually have log-cubed overhead: you have a heap, which is log n, on top of a tree, which is log n, on top of LevelDB, which is log n. And we can knock it down to log squared. That's already a substantial improvement, just from making some first-class data structures.
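
Here is a minimal sketch of why a heap is the natural first-class structure for an order book: the best price is always at the root, and inserts and removals each cost O(log n). A toy matching loop, purely illustrative.

```python
import heapq

asks = []   # min-heap of (price, order_id): lowest ask at the root

def place_ask(price: float, order_id: str) -> None:
    heapq.heappush(asks, (price, order_id))          # O(log n)

def market_buy(budget: float) -> list[str]:
    """Fill against the cheapest asks until the budget runs out."""
    filled = []
    while asks and asks[0][0] <= budget:
        price, order_id = heapq.heappop(asks)        # O(log n)
        budget -= price
        filled.append(order_id)
    return filled

place_ask(5.0, "a"); place_ask(3.0, "b"); place_ask(4.0, "c")
print(market_buy(8.0))   # ['b', 'c'], cheapest orders filled first
```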

Support for EC Schnorr, a better signature scheme that supports implicit multisig without actually being multisig. And yeah, event trees, already mentioned, and alarms.
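
Here is a minimal sketch of why Schnorr gives "implicit multisig": keys and partial signatures simply add, so the verifier sees one key and one ordinary signature. This is a toy over integers mod a prime with deliberately insecure parameters, and it ignores rogue-key defenses that real schemes need; it only demonstrates the algebra.

```python
import hashlib, secrets

# TOY PARAMETERS AND PROTOCOL: insecure demo of the algebra only.
p = 2**127 - 1        # a Mersenne prime; fine for an arithmetic demo
g = 3

def H(*parts) -> int:
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def keygen():
    x = secrets.randbelow(p - 2) + 1
    return x, pow(g, x, p)

# Two signers; the verifier sees only ONE aggregate key and ONE signature.
x1, y1 = keygen()
x2, y2 = keygen()
y_agg = (y1 * y2) % p                    # implicit multisig: keys just multiply

msg = "move 10 ETH from the shared account"
k1, k2 = secrets.randbelow(p) + 1, secrets.randbelow(p) + 1
R = (pow(g, k1, p) * pow(g, k2, p)) % p  # combined nonce
e = H(R, msg)
s = (k1 + e * x1) + (k2 + e * x2)        # partial signatures just add

# Standard single-key Schnorr verification against the aggregate key:
assert pow(g, s, p) == (R * pow(y_agg, e, p)) % p
print("aggregate signature verifies against one key")
```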

Yeah, so privacy. That's actually the other major disadvantage that consensus computing has compared to centralized servers. With a centralized server, okay, fine: if you provide your data, Facebook has your data, and they're going to do whatever they want, and whatever the NSA wants, with your data. If you put your data on a blockchain, then everyone's going to do what they want with your data.

Now, we do advocate dapps, and we do advocate privacy as a use case for dapps, because we know we can use the blockchain only for very specific things, and we can design dapps in a model where things are stored encrypted and things are not done in consensus by default. So by default, it's a system where things are stored in this Swarm cloud, encrypted, and your private key directly controls things. So that's fine.

But the problem is, there are going to be many situations where we simultaneously want privacy and consensus. For currency, that's actually easier, because we can do things like merge avoidance, where your wallet pretends, behind the scenes, that you have a hundred separate accounts. You can do CoinJoin, which is decentralized mixing; centralized mixing; or blind mixing with Open Transactions.
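
Here is a minimal sketch of the merge-avoidance idea: the wallet holds many small pseudonymous accounts and pays from a subset of them, never merging everything under one identifiable key. A toy greedy selection, purely illustrative.

```python
# Wallet balance is split across many unlinked accounts.
accounts = {"addr_01": 0.30, "addr_02": 0.75, "addr_03": 0.20,
            "addr_04": 0.50, "addr_05": 1.10}

def pay(amount: float) -> list[tuple[str, float]]:
    """Cover the payment from separate accounts, so an observer never
    sees all funds linked together under a single key."""
    spends, remaining = [], amount
    for addr, balance in sorted(accounts.items(), key=lambda kv: -kv[1]):
        if remaining <= 0:
            break
        spend = min(balance, remaining)
        spends.append((addr, spend))
        accounts[addr] -= spend
        remaining -= spend
    if remaining > 1e-9:
        raise ValueError("insufficient total balance")
    return spends

print(pay(1.5))   # e.g. [('addr_05', 1.1), ('addr_02', 0.4)]
```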

More complex dapps generally require a more complex state, and that state would intrinsically have to have a concept of accounts; then you can't use these tricks. So in that case, if you're doing things on-chain, you're basically giving it all away.

So the solution is basically secret sharing. We were calling these secret-sharing DAOs. Nick Szabo actually invented this concept back in 1997, I believe; he called it the God protocol. So that kind of tells you this is actually damn powerful stuff, potentially.

So instead of doing decentralized consensus computing by replication, you do decentralized consensus computing with secure multi-party computation. The way that works is the state is stored in secret-shared form, and there are ways you can do computation on secret-shared data such that the result is secret-shared, but at the same time no individual ever learns anything about the intermediate state or the end state. So: blockchain-like security with server-like privacy is the idea.
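
Here is a minimal sketch of the building block, using simple additive secret sharing: each party holds a random-looking share, and additions are purely local. Illustrative only; real MPC protocols use Shamir sharing, Beaver triples, and so on, and the multiplication cost discussed next is where the network messages come in.

```python
import secrets

P = 2**61 - 1   # all arithmetic is done modulo a prime

def share(value: int, n_parties: int = 3) -> list[int]:
    """Split a secret into n additive shares; any n-1 shares look random."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % P

def add_shared(xs: list[int], ys: list[int]) -> list[int]:
    """Addition needs NO communication: each party adds its own shares."""
    return [(x + y) % P for x, y in zip(xs, ys)]

a, b = share(1000), share(234)
assert reconstruct(add_shared(a, b)) == 1234
print("sum computed without any single party seeing 1000 or 234")
# Multiplying shared values is the expensive step: it requires a round
# of network messages between the parties (e.g. via Beaver triples).
```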

What's the cost? You pay one network message per multiplication. So it's even worse than blockchains, but you can parallelize it. This is not something that we should advocate running everything on, but it might be useful for some applications. And with business logic, a 200x slowdown times another 100x slowdown really isn't all that bad, because you're just doing like 20 computations. This is something that might be worth building on top of Ethereum 1.0 or 2.0.

An interesting thing about this mechanism, actually, is that the privacy part is vulnerable to a 51% attack. So it's not quite perfect, but I guess some people might actually like the fact that the privacy is limited, because perfect privacy has some dangerous properties; you can do scary stuff with it. Here you have this necessarily built-in property that if there is a supermajority consensus that something deserves to be revealed, then that thing will get revealed, potentially even after the fact. So: interesting.

So, the next idea, the one we all know, we all love, but nobody understands: DACs, DAs, DAOs. That's another one of those areas where there's a whole bunch of different ideas and nobody's exactly sure what they are. And I've been sitting there trying to figure out exactly in what way these DA-stars are going to be useful. What niche do they fill that existing institutions or existing mechanisms don't satisfy already?

So one answer is: corporations, and you could actually put in governments too, exist to quickly form and stabilize complex equilibria. Simple equilibria we all know. For example, if you have a tribe in a forest, they can have an equilibrium, a social norm, which is a kind of equilibrium, where nobody kills each other: if one person kills some other person, everyone else will ostracize or attack them, and if you refuse to ostracize someone who has been ostracized, then you join the group of people that get ostracized, and you get ostracized yourself. That's actually the way a whole bunch of societies work, and you could argue our society has some component of that. And that's relatively simple.

The problem is that the modern world requires us to be able to build equilibrium social norms that are more complicated. So here's one analysis of a company. Let's say you have a set of customers, C. Each of these customers pays A dollars to a set of researchers, R, and they each pay B dollars to a set of manufacturers, M, and so forth.

Now, the problem is that these transfers include payment for public goods. The researchers in this case are not producing a product for individual customers; they're producing a public good, which the manufacturers and the customers pay for. So this whole thing is not workable piecemeal: you can't have a pure market mechanism, based on a simple market, in which R and M would both get appropriately subsidized. You would end up in a case where R generally gets underfunded, and the manufacturers manufacture crap relative to the optimum.

So what a company is, is a thing that generates a combined transfer profile. It says: okay, the combined transfer profile is that the customers pay $12 per product, the researchers get $2 per product, the manufacturers get $5 per product, and so forth. And then it solidifies that particular profile via a network effect. The existence of the network effect is what prevents the equilibrium from sliding down into the customer realizing: well, what if I skip out on the component of the payment that goes to the researchers? Because the whole thing is mashed together into one bundle, and it all works together.

So, DA-stars. Well, there's this chart that actually made it, thanks to Dave Babbitt, onto Wikipedia. The idea is this notion of an autonomous organization. And the idea is that, so far, most research in automation has been automation at the edges. Assembly-line robots are one example; general tools are all partial automation; laptops are partial automation.

But the other paradigm that we've been underexploring is this idea of automation at the center and humans at the edges. So humans still perform; there are a lot of tasks that have to be done by people. People have a lot more creativity than machines do, at least in most cases. Well, maybe not in art, because, to be fair, fractals are rather pretty. But more creativity, more intelligence, the ability to understand the real world, and so forth.

But then the idea is that you either partially or completely replace the management component at the center, and you replace it with some kind of system that generates equilibria by itself.

One issue is that there are generally two types of complexity. Human organizations are good at generating equilibria that have a subjective type of complexity. If you think about the legal system as one big social norm: what constitutes fraud? Does lying by omission count? Even, what constitutes violence? What's the exact bright line between pollution and just dumping a bunch of sludge onto my house?

But blockchains are better at computational complexity: highly complex state transition functions, having a Turing-complete programming language, creating all these really complicated apps with a whole bunch of lines of code. So maybe the new opportunity is creating equilibria, creating social norms, that are subjectively and computationally complex at the same time. It's just a theory; it might end up not being a particularly useful avenue, once again. But it is an interesting way of formalizing the idea.

Dapps. So, we as developers like dapps, in part for political reasons: we like privacy, we like decentralization, we like the idea of nobody being in control, and we wrote the dapps. We appreciate the cool math that goes on behind them. Normal people like dapps if they do something useful.

So which dapps could become actually useful? In this case, useful doesn't necessarily mean providing value in some puritanical sense of the word "use". It just means useful in the sense that they attract people who will be willing to use them, and particularly for reasons other than those two.

Realistically, it's going to be games at first: easy to write, accessible and understandable by anyone. Probably going to be the majority use case of Ethereum, I think, at least initially, just because they're fun, you can write them quickly, and people can play them. Yeah, you can do Battleships on the blockchain now. And the reason you do Battleships on the blockchain is, first of all, you have these games with crypto tokens involved, and the decentralization part helps to make sure that the game is fair and that the server doesn't have any kind of hidden advantage and so forth.
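
Here is a minimal sketch of the standard trick that makes a blockchain Battleships game provably fair: each player commits up front to a salted hash of their board, so neither side can move ships after the fact. Purely illustrative.

```python
import hashlib, secrets

def commit(board: str) -> tuple[str, str]:
    """Publish the hash on-chain at game start; keep the salt private."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + board).encode()).hexdigest()
    return digest, salt

def verify(commitment: str, board: str, salt: str) -> bool:
    """At game end, the revealed board must match the opening commitment."""
    return hashlib.sha256((salt + board).encode()).hexdigest() == commitment

board = "A1,A2,A3;C5,C6;E1,E2,E3,E4"      # ship placements
commitment, salt = commit(board)
# ... the game is played against the on-chain commitment ...
assert verify(commitment, board, salt)                             # honest reveal
assert not verify(commitment, "B1,B2,B3;C5,C6;E1,E2,E3,E4", salt)  # moved ships
print("board commitment checks out")
```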

Computational resource markets. So, mesh networking, or the general category of a market for bandwidth, a market for personal bandwidth, is interesting. For the long-term vision, I've made several speeches where I talk about this idea where you uproot the entire ISP system and replace it with a big, global, incentivized mesh network. You'd have the ability to form a company whose sole purpose is to maintain one wire going from Vancouver, Canada to Melbourne, Australia, and then there's a bunch of these single-wire companies, and maybe some multi-wire companies, around the world. And every time you connect to the network, you use Dijkstra's algorithm automatically to find the shortest and cheapest path to whatever you want to connect to, and whoever ends up serving along that path, you just use them. So: completely decentralized everything.
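
Here is a minimal sketch of that routing step: Dijkstra's algorithm over a graph whose edge weights are each link operator's advertised price. The nodes and prices are made up for illustration.

```python
import heapq

# Link operator prices, cents per MB (made-up numbers).
links = {
    "Vancouver": [("Tokyo", 5), ("SanFrancisco", 1)],
    "SanFrancisco": [("Tokyo", 3), ("Sydney", 6)],
    "Tokyo": [("Sydney", 2)],
    "Sydney": [("Melbourne", 1)],
    "Melbourne": [],
}

def cheapest_path(src: str, dst: str):
    """Standard Dijkstra: you pay whoever operates the links on the best path."""
    frontier = [(0, src, [src])]
    done = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == dst:
            return cost, path
        if node in done:
            continue
        done.add(node)
        for neighbor, price in links[node]:
            if neighbor not in done:
                heapq.heappush(frontier, (cost + price, neighbor, path + [neighbor]))
    return None

print(cheapest_path("Vancouver", "Melbourne"))
# (7, ['Vancouver', 'SanFrancisco', 'Tokyo', 'Sydney', 'Melbourne'])
```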

The problem is you can't change the world all at once; you need some near-term minimum viable products. One of them is sharing a WiFi connection. Here's a problem I've had many times: I go into a new country, I open up my phone, I try to connect to some roaming network. No network works, not even the roaming networks. So I go onto the streets and look for a WiFi connection. Ooh, free WiFi. Free WiFi. Free WiFi. Okay, free WiFi, yay. Landing page: enter your mobile number.

So, two cases. Case one: I have a phone and it's connected to a network. If it's connected to a network, then that means I probably also have data, and then I don't need your stupid WiFi. Case two: my phone is not connected to a network, and this thing's bloody useless.

So what do I do instead? Well, if I have one of these mesh networking apps, I can open up a little app and go into some crowded place, and hey, look, here are 100 people, and at least some of them are already connected to some of these wireless networks, some of which are locked, some of which are free and so forth. And I can just pay somebody, whatever, one cent a megabyte, to basically act as a proxy for me. This is something that people could benefit from by the millions, right today, if it were available.

Another near-term MVP: decentralized VPN. I noticed when I was in China for a few weeks that, because a lot of the VPNs are centralized, they're actually just blocking access to the VPN sites. And even when I try to log on to Google with a VPN now, Google actually detects that you're logging on through a VPN and doesn't let you log in. So decentralized VPNs could actually make a substantial difference.

And it's not going to work as a volunteer service, I think; you need to incentivize high quality, and the way you incentivize high quality is with incentivization. The reason you don't want to use Tor is that Tor is just way too powerful for most normal people. Normal people don't need three hops; they just need one hop. They're just interested in protecting their privacy to some minimal extent, getting around some restriction. They don't necessarily need NSA-proof, military-grade or Silk Road-grade protection or whatever; they're just trying to have some minimal level of privacy that they're fine with. So this might actually work substantially.

File storage, another one. The near-term MVP is the content-addressable web. I won't get into that, because I'm sure Juan talked about it.

Supply. So, a nice use case of this: one of the things that was pointed out about proof of stake is that you lose the ability for anyone, in theory, to get some quantity of coins by mining them. And the substitute you can have is a decentralized file storage market, where instead of mining by actually mining, you're mining by renting out your hard drive space and storing files for other people. That's a quick way to earn whatever number of finney a day you're going to get.

Cloud computing: projects on that were presented this week. So those are just a number of possibilities.

The nice thing about computational resource markets is that, in computational land, you can cryptographically prove that certain things happened. For file storage, you use Merkle trees: you can prove that you're storing a file. For cloud computing, you can do various sorts of auditable off-chain protocols; you could even do crazy Merkle-tree stack traces, hashing, to very efficiently prove that you computed something in a valid way. Or eventually you'll just use SNARKs.
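
Here is a minimal sketch of the file storage case: the file's chunks are hashed into a Merkle tree, the root is published, and the host proves possession of a randomly challenged chunk with a log-sized branch. Illustrative, not any particular protocol.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])          # duplicate last node if odd
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_branch(leaves: list[bytes], index: int) -> list[bytes]:
    """Sibling hashes from the challenged leaf up to the root."""
    level = [h(leaf) for leaf in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        branch.append(level[index ^ 1])      # sibling at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify(root: bytes, leaf: bytes, index: int, branch: list[bytes]) -> bool:
    node = h(leaf)
    for sibling in branch:
        node = h(sibling + node) if index % 2 else h(node + sibling)
        index //= 2
    return node == root

chunks = [f"chunk {i}".encode() for i in range(8)]   # the stored file
root = merkle_root(chunks)                           # published on-chain
challenge = 5                                        # random audit index
proof = merkle_branch(chunks, challenge)
assert verify(root, chunks[challenge], challenge, proof)
print("host proved it still stores chunk", challenge)
```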

So the idea is that for these cryptographic protocols, the blockchain can run verifiers, and you can make events directly conditional on verification. Security deposits are king; that's another point. Thank you, Vlad, not for coming up with that particular quote, but for bringing to my attention the fact that security deposits are rather important. They are.

So, stablecoins, another interesting use case. The problem is that right now cryptocurrencies are volatile. The reason they're volatile is that for currencies, price is basically demand divided by supply. Now, that's not a principle that holds generally; in general you can only say that price depends on demand and supply. But for currencies, if there are twice as many units of a currency, the natural thing is for all prices to go up by a factor of two, and then everything's in the exact same equilibrium.

So the current model we have in most crypto environments is: one coin, stable supply, while demand is always volatile. And volatile demand against a fixed supply leads to volatile price. All Bitcoin users must participate in speculation as part of their Bitcoin use. That's another one of those secrets.

And no, going mainstream will not solve the problem. For the people who think it will naturally become extremely stable when it becomes mainstream: this is gold. It's as mainstream as you can get. Top over bottom is about a factor of 4.6. Meanwhile, if you look at the ratio between two fiat currencies, top over bottom is usually about 1.5. So: probably not good enough. And once again, I think this might be one of those cases where people are willing to regress slightly for ideological reasons, but not this much.

So one idea is to create stable assets with a self-adjusting monetary policy. There are two ways of doing it: exogenous and endogenous. The exogenous approach is that you try to use decentralized consensus on real-world data to perfectly track the US dollar, CNY, and so forth.

The endogenous approach is that you don't try to stick to any particular fiat currency; you just try to maintain some concept of stable value. You build estimators based on things like the mining hash rate and transaction fees. You can also try to build in other markets: you can build in a file storage market and then have an estimator listen to the price on that market.
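
Here is a minimal sketch of that estimator idea: blend several on-chain signals into one index and adjust supply toward a target. The signals, weights, baselines, and adjustment rule are all made-up assumptions for illustration.

```python
def value_index(hashrate: float, tx_fees: float, storage_price: float) -> float:
    """Blend on-chain signals into a crude estimate of the coin's real value,
    each normalized against an (assumed) calibration baseline."""
    weights = {"hashrate": 0.4, "fees": 0.3, "storage": 0.3}      # assumed
    baselines = {"hashrate": 1e6, "fees": 50.0, "storage": 2.0}   # assumed
    return (weights["hashrate"] * hashrate / baselines["hashrate"]
            + weights["fees"] * tx_fees / baselines["fees"]
            + weights["storage"] * storage_price / baselines["storage"])

def supply_adjustment(index: float, target: float = 1.0, k: float = 0.1) -> float:
    """Self-adjusting monetary policy: expand supply when the coin's value
    is above target, contract it when below."""
    return k * (index - target)   # fraction of supply to add (+) or remove (-)

idx = value_index(hashrate=1.3e6, tx_fees=60.0, storage_price=1.8)
print(f"index={idx:.3f}, supply change={supply_adjustment(idx):+.1%}")
```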

So you create some stable assets. And the theory is that you have two coins, one volatile coin and one stable coin, and that's how networks would work: the sort of "innovation without speculation". I put that in somewhat ironically, because it's Blockstream's slogan in their quest for Bitcoin maximalism, making Bitcoin the one currency over all networks. And I argue that Bitcoin is also speculative, just as speculative as some of these crazy alts.

And so, if we really want to decouple the volatility part from the cryptocurrency part: the volatility is always going to remain, because these are assets that have no intrinsic value; it's unavoidable. The thing we really want to do is try to specialize the volatility away. We want to create a mechanism where people who want to gamble can gamble, but the grandma who just wants to keep her retirement savings safe can also fairly reliably do that.

So those are just some of the ideas that I think we'll be seeing as far as dapps go, as far as scalability, consensus, crypto, the future of the space. It could be the future, or it could be a bubble. You decide.