Bitcoin Classic – more of the same?

The block size debate takes a new turn, possibly a 360º one, to end up back where it started. Yesterday Bitcoin Classic was launched, a client that would trigger a hard fork. Developed by a team that includes former Bitcoin Core lead maintainer Gavin Andresen (one of the proponents of the now-irrelevant Bitcoin XT) and Bloq CEO Jeff Garzik (who had also proposed a straight 2MB increase, BIP102, as an emergency fallback), Bitcoin Classic updates the current and standard Bitcoin Core protocol with a 2MB block size limit (vs the current 1MB).

Since this involves a hard fork (transactions that are accepted in the new version would be rejected by the old version, resulting in two different chains), tension is high and opinions are divided. On the one hand, you have those that believe that a bigger block size is urgently needed, and a hard fork will not be that disruptive. On the other, you have those who fear the uncertainty a hard fork will unleash, who would like to find a less abrupt change or who don’t think the block size should be increased at all.

Public approval has been notable, with many of the large bitcoin companies (such as Coinbase, itBit, Xapo, OKCoin…) expressing support. Just over the past two days, the number of nodes running Classic jumped from under 500 to over 700, while about 4,100 are still on Core.


Yet it is worth remembering that Bitcoin XT attracted almost 900 nodes soon after launch in August of last year, before fizzling out. And today a group called The Bitcoin Roundtable published their rejection of the new protocol. This group allegedly represents 90% of bitcoin’s hash power, although that figure has been questioned as some of the signers work for firms who have publicly backed Classic. Confusing.

For the software to officially “activate” and become the main bitcoin protocol, a mining threshold has to be met: of the last 1,000 mined blocks, at least 751 (or, to put it another way, just over 75%) need to have been processed with Bitcoin Classic.
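As a toy illustration (not the actual client code), the activation check amounts to a simple count over the most recent blocks:

```python
def classic_activated(recent_block_versions, threshold=751):
    """Toy version of the Classic activation rule: of the last 1,000
    mined blocks, at least 751 must have been mined with Classic."""
    classic_count = sum(1 for v in recent_block_versions if v == "classic")
    return classic_count >= threshold

# 750 signalling blocks falls just short of the threshold...
print(classic_activated(["classic"] * 750 + ["core"] * 250))  # False
# ...while 751 tips it over.
print(classic_activated(["classic"] * 751 + ["core"] * 249))  # True
```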

As of today, none of the last 1,000 blocks mined used Bitcoin Classic.


So, this may all end with another shrug of the shoulders as the debate continues unresolved. Or the need for a consensus-based change may become more pressing with the public pressure and scrutiny. It would be great to get this resolved, as it will set the tone for bitcoin development going forward: not just on the transaction capacity issue, but on the governance issue, which is arguably even more important.

Hard fork vs soft fork

The most interesting thing going on in the Bitcoin world this week hasn’t been the alleged unmasking of Satoshi Nakamoto (and I stress “alleged” because, no, I don’t think so). It’s the gathering of the Bitcoin influencers at the Scaling Bitcoin conference in Hong Kong. A follow-up to the Scaling Bitcoin conference held in Montreal earlier this year, the Hong Kong edition aimed to get miners and developers together to discuss a possible solution to the block size debate.

image via Coindesk

Up until now, everyone has been assuming that the choice is between leaving things as they are, or launching a hard fork that will increase the block size and give Bitcoin more scalability. There are pros and cons for each option, and firm, uncompromising beliefs on both sides. Yet some interesting ideas have emerged.

To appreciate what’s at stake here, it is important to understand what a “hard fork” is. A hard fork is a change to the current Bitcoin Core protocol that renders older versions invalid. The Bitcoin Core protocol defines how Bitcoin works. It is the core program that nodes use to validate blocks, and dictates such parameters as the block size, the difficulty of the cryptographic puzzle that needs to be solved, limits to additional information that can be added, etc. A change to any of these rules that would cause blocks to be accepted by the new protocol but rejected by older versions, would lead to serious problems on the blockchain.

Let’s say that the protocol is changed in a relatively fundamental way that relaxes the rules or broadens the code’s scope. If this happens, mining nodes running new versions would produce validated blocks that will not be accepted by nodes running an older version. For instance, if the block size limit is increased from 1MB to 4MB, a 2MB block will be accepted by nodes running the new version, but rejected by nodes running the older version. Let’s say that this 2MB block is validated by an updated node, and added on to the blockchain. But what if the next block is validated by a node running an older version of the protocol? It will try to add its block to the blockchain, but it will detect that the latest block is not valid. So, it will ignore that block and attach its new validation to the previous one. Suddenly you have two blockchains, one with both older and newer version blocks, and another with only older version blocks. Which chain grows faster will depend on which nodes get the next blocks validated, and there could end up being additional splits. It is feasible that the two (or more) chains could grow in parallel indefinitely.
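The split comes down to two validity rules disagreeing about the same block. A minimal sketch in Python (sizes in MB, everything else about validation omitted):

```python
OLD_LIMIT_MB = 1  # limit enforced by nodes on the old rules
NEW_LIMIT_MB = 4  # limit enforced by upgraded nodes

def old_node_accepts(block_size_mb):
    return block_size_mb <= OLD_LIMIT_MB

def new_node_accepts(block_size_mb):
    return block_size_mb <= NEW_LIMIT_MB

# A 2MB block is valid to upgraded nodes but invalid to old ones.
# This disagreement is what splits the chain in a hard fork.
print(new_node_accepts(2))  # True
print(old_node_accepts(2))  # False
```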


This is a hard fork, and it’s messy. It’s also risky, as it’s possible that bitcoins spent in a new-version block could then be spent again on the old chain (since merchants, wallets and users running the previous code would not detect the spending on the new chain, whose blocks they deem invalid). The only solution is for one branch to be abandoned in favour of the other, which involves some miners losing out (the transactions themselves would not be lost, they’d just be re-allocated). Or all nodes switch to the newer version at the same time, which unfortunately is almost impossible to achieve in a decentralized, widely spread system. Or Bitcoin splits, which would damage its usefulness and scalability. With a hard fork, since new-version blocks are only accepted by upgraded nodes, it is essential that all nodes upgrade as soon as possible. This is very hard to achieve.

In March 2013, an accidental hard fork – brought on by an update which led to a database glitch – split the blockchain. The chain mined by updated nodes was longer than the chain containing only older nodes, so it would have been more efficient for the shorter chain transactions to pass to the longer chain. But that would have required a massive forced upgrade, which would have been logistically complicated, so the community decided to abandon the update and go back to the previous version.

For examples of changes that would require a hard fork, see the “hardfork wishlist”.

If, however, the protocol is changed in a way that tightens the rules, that implements a cosmetic change or that adds a function that does not affect the structure in any way, then new version blocks will be accepted by old version nodes. Not the other way around, though: the newer, “tighter” version would reject old version blocks. Old-version miners would realize that their blocks were being pushed off (“orphaned”), and would upgrade. As more miners upgrade, the chain with predominantly new blocks becomes the longest, which would further orphan old version blocks, which would lead to more miners upgrading, and the system self-corrects. Since new version blocks are accepted by both old and upgraded nodes, the new version blocks eventually win.

For instance, say the community decided to reduce the block size to 0.5MB from the current limit of 1MB. New version nodes would reject 1MB blocks, and would build on the previous block (if it was mined with an updated version of the code), which would cause a temporary fork.

This is a soft fork, and it’s already happened several times. Initially, Bitcoin didn’t have a block size limit. Introducing the limit of 1MB was done through a soft fork, since the new rule was “stricter” than the old one. The pay-to-script-hash function, which enhances the code without changing the structure (more on this later), was successfully added through a soft fork. This type of amendment requires only the majority of miners to upgrade, which makes it more feasible and less disruptive.

Soft forks do not carry the double-spend risk that plagues hard forks, since merchants and users running old nodes will read both new and old version blocks.

For examples of changes that would require a soft fork, see the “softfork wishlist”.

One interesting development to come out of the Hong Kong talks is Pieter Wuille’s “segregated witness” proposal, which would enable Bitcoin to increase the number of possible transactions in a block without a hard fork (more details later). This has the Bitcoin community quite excited, as it would enable a greater level of growth, while avoiding the risks and the controversy. The drama is far from over, though. And the next time you find yourself setting the dinner table, think about using spoons instead.




How Bitcoin works

(This is the first in a series of articles about the basics of Bitcoin, which I will also include in a separate section on the web called Bitcoin Basics.)


Understanding how bitcoin works is not necessary to be able to see its potential and genius. Most of us use the Internet without understanding transfer protocols and IPs, right? We will, as a society, end up being quite comfortable with the underlying technology without thinking about hash functions and state outputs. We probably won’t even notice that we’re using bitcoin, the blockchain or whatever it will be called by then. Much like how now, when we click, we don’t think about data packets.

But if you’re like me, understanding how it works is fun. And for many, it’s a question of trust: understanding is believing. So, without going into too much cryptographic detail, here goes:

(As a convention, when I introduce a technical-ish term for the first time, I’ll put it in quotation marks, but after that, it gets treated as a normal word. And Bitcoin = the system and the concept, while bitcoin = the currency.)

Simple version:

If I want to send one of my bitcoins to you, I “publish” my intention and the entire bitcoin network validates that I have the bitcoin that I want to send, and that I haven’t already sent it to someone else. Once that information is validated, my transaction gets included in a “block” which gets attached to the previous block. Hence the term “blockchain”. Transactions can’t be undone or tampered with, because the blockchain can’t be tampered with. A bit like Lego and superglue.

Getting a bit more complicated:

I keep my bitcoins in my “bitcoin address”, which is a long string of 34 letters and numbers. This address is derived from my “public key” (strictly speaking it’s a shortened hash of the public key, but the two work as a pair). I don’t mind that the whole world can see this sequence. Each address/public key has a corresponding “private key” of 64 letters and numbers. This is private, it’s crucial that I keep it secret, and that I don’t lose it. The two keys are related, but there’s no way that you can figure out my private key from my public key.

That’s important, because any transaction I issue from my bitcoin address needs to be “signed” with my private key. To do that, I put both my private key and the transaction details (that I want to send you 1 bitcoin, for example) into the bitcoin software on my computer or phone. With this information, the program spits out a digital “signature”. I send this out to the network for validation.

This transaction can be validated – that is, it can be confirmed that I own the bitcoin that I am transferring to you, and that I haven’t already sent it to someone else – by plugging the signature and my public key (which everyone knows) into the bitcoin program. This is one of the genius parts of Bitcoin: if the signature was made with the private key that corresponds to that public key, the program will validate the transaction, without knowing what the private key is. Very clever.

The network then confirms that I haven’t previously spent the bitcoin by running through my address history, which it can do because it knows my address (= my public key), and because all transactions are public on the bitcoin ledger.

Even more complicated:

Once my transaction has been validated, it gets included into a “block”, along with a bunch of other transactions.

A brief detour to discuss what a “hash” is, because it’s important for the next paragraph: a hash is produced by a “hash function”, a complex mathematical process that reduces any amount of text or data to a 64-character string. It’s not random – every time you put that particular data set through the hash function, you’ll get the same 64-character string. But if you change so much as a comma, you’ll get a completely different 64-character string. This whole article could be reduced to a hash, and unless I change, remove or add anything to the text, the same hash can be produced again and again. This is a very effective way to tell if something has been changed, and is how Bitcoin can confirm that a transaction has not been tampered with.
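You can see both properties, determinism and that drastic change from a tiny edit, with a few lines of Python and the standard hashlib library (Bitcoin actually applies SHA-256 twice, but a single round shows the idea):

```python
import hashlib

def digest(text):
    # SHA-256 maps any input to a fixed 64-character hex string.
    return hashlib.sha256(text.encode()).hexdigest()

# Same input, same output, every single time:
print(digest("hello") == digest("hello"))   # True
# Change one character and the output is unrecognizably different:
print(digest("hello"))
print(digest("hello,"))
# However long the input, the output length never changes:
print(len(digest("hello")))                 # 64
```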

Back to our blocks: each block includes, as part of its data, a hash of the previous block. That’s what makes it part of a chain, hence the term “blockchain”. So if one small part of the previous block was tampered with, the current block’s hash would have to change (remember that one tiny change in the input of the hash function changes the output). So if you want to change something in the previous block, you also have to change something (= the hash) in the current block, because the one that is currently included is no longer correct. That’s very hard to do, especially since by the time you’ve reached half way, there’s probably another block on top of the current one. Since the current hash needs to be changed, the hash of the current block included in the next one would also need to be changed. And so on.
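That chaining can be modelled in a few lines: each block’s hash covers the previous block’s hash, so editing an old block breaks every link after it (a toy model that ignores mining entirely):

```python
import hashlib

def block_hash(prev_hash, transactions):
    # A block's hash covers the previous block's hash, linking them.
    data = prev_hash + "|".join(transactions)
    return hashlib.sha256(data.encode()).hexdigest()

h1 = block_hash("0" * 64, ["alice->bob:1"])
h2 = block_hash(h1, ["bob->carol:1"])

# Tamper with block 1 and its hash changes...
h1_tampered = block_hash("0" * 64, ["alice->mallory:1"])
print(h1_tampered == h1)  # False
# ...so block 2 would need a new hash too, and so on up the chain.
print(block_hash(h1_tampered, ["bob->carol:1"]) == h2)  # False
```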

This is what makes Bitcoin virtually tamper-proof. I say virtually because it’s not impossible, just very very very very very difficult and therefore unlikely.

How are bitcoins created?

This part is actually simpler than it seems (thank goodness).

Bitcoins are created as a reward for creating blocks of validated transactions and including them in the blockchain.

Backtracking a bit, let’s talk about “nodes”. A node is a powerful computer that runs the bitcoin software. Anyone can run a node: you just buy the right hardware (pretty expensive if you want to be a “mining node”) and download the Bitcoin software (free). Nodes spread bitcoin transactions around the network. One node will send information to a few nodes that it knows, which will relay the information to nodes that they know, and so on. That way it ends up getting around the whole network pretty quickly.

Not all nodes are mining nodes. Some just help to keep Bitcoin running by participating in the relay of information. Mining nodes actually create blocks and add them to the blockchain. How do they do this? By solving a complex mathematical puzzle that is part of the Bitcoin program, and including the answer in the block. The puzzle that needs solving is to find a number that, when combined with the data in the block and passed through a hash function, produces a result that is within a certain range.

How do they find this number? By guessing at random. The hash function makes it impossible to predict what the output will be. Changing a text or data set by just a little bit could change the resulting hash value by a lot, or by a little: there’s no way of knowing ahead of time. So, miners will start guessing what the mystery number could be, and applying the established hash function to the combination of that number and the data in the block. The first mining node to get a resulting hash within the desired range announces its victory to the rest of the network. All the other mining nodes immediately stop work on that block and start trying to figure out the mystery number for the next one. As a reward for its lucky guess, the victorious mining node gets to send itself some new bitcoins.
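A toy miner makes this concrete. Here the “desired range” is simplified to “the hash starts with four zeros”, and the mystery number is called a nonce (a sketch of the idea, not the real algorithm, which double-hashes a structured block header):

```python
import hashlib

def mine(block_data, prefix="0000"):
    """Guess nonces until the hash of (nonce + block data) lands in
    the target range, simplified here to 'starts with four zeros'."""
    nonce = 0
    while True:
        h = hashlib.sha256(f"{nonce}{block_data}".encode()).hexdigest()
        if h.startswith(prefix):
            return nonce, h
        nonce += 1

nonce, winning_hash = mine("alice->bob:1|prev:abc123")
# Finding the nonce takes thousands of guesses; checking it takes one.
print(winning_hash.startswith("0000"))  # True
```

Note the asymmetry: the search is pure trial and error, but any node can verify the answer with a single hash. That is what lets the rest of the network instantly confirm the winner’s claim.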

At the time of writing, the reward is 25 bitcoins, which at $270/BTC is worth about $6,750. Not bad for 10 minutes’ work.

Although it’s not nearly as cushy a deal as it sounds. There are a lot of mining nodes competing for that reward, and it is a question of luck. Also, the costs of being a mining node are considerable, not only because of the powerful hardware needed (if you have a slower processor than your competitors, it’s unlikely that you’ll find the correct number before they do), but also because of the large amounts of electricity that running these processors consumes. And the number of bitcoins awarded as a reward for solving the puzzle halves approximately every four years. It’s 25 now, but should go down to 12.5 sometime in 2016, then 6.25 around 2020, etc. It’s likely that the value of bitcoin relative to the dollar will go up over the next few years to partially compensate for this reduction, but it’s not certain.

Why 10 minutes? That is the average block interval the Bitcoin design targets, to keep the entry of new bitcoins to a trickle rather than a flood. The figure is arbitrary, and it is maintained via the difficulty of the block puzzle: the protocol automatically adjusts the difficulty every 2,016 blocks (roughly every two weeks), reducing or expanding the range of acceptable answers, so that blocks keep arriving about 10 minutes apart on average.

Why try to keep the introduction of new bitcoins down? Because the limit on the eventual supply of bitcoins is one of the factors that gives it value (you can’t have a currency that’s in unlimited supply, right?). To enforce this control, the Bitcoin program stipulates that there can never be more than 21 million bitcoins in existence. Given the halving of the reward, plus the timed entry of new bitcoins, we should reach that level in 2140.
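The 21 million cap falls straight out of the halving schedule: 50 new bitcoins per block for the first 210,000 blocks, then 25, then 12.5, and so on. A quick approximation (the real protocol counts whole satoshis and rounds down, so the true total is a whisker under 21 million):

```python
def total_supply():
    reward = 50.0             # initial block reward, in BTC
    blocks_per_era = 210_000  # blocks between halvings (~4 years)
    total = 0.0
    while reward >= 1e-8:     # rewards below 1 satoshi round to zero
        total += reward * blocks_per_era
        reward /= 2
    return total

print(total_supply())  # just under 21,000,000
```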

If you’ve made it this far, then congratulations! There is still so much more to explain about the system, but at least now you have an idea of the broad outline of the genius of the programming and the concept. For the first time we have a system that allows for convenient digital transfers in a decentralized, trust-free and tamper-proof way. The repercussions, the applications and the potential of this will be huge.

(For more on how Bitcoin works, see Bitcoin Basics.)

Where’s the knife? The possible repercussions of a hard fork

A hard fork in Bitcoin is nothing new, and is generally not a problem. But the launch of BitcoinXT and the controversy surrounding the block size increase has given the phrase “hard fork” a sinister tone, with some claiming it portends the end of Bitcoin as we know it. Should we be worried?

photo by Alejandro Escamilla for Unsplash

First, some background. A hard fork is a change to the current Bitcoin Core protocol, the “program” that defines how Bitcoin is run by the mining nodes. The change is such that nodes running the previous protocol will reject new blocks created on the new protocol as invalid. So, some nodes will not add a block onto the chain, while other nodes will. Chaotic, confusing, and risky, as it’s possible that bitcoins spent in a new block could then be spent again on an old block (since the nodes running the previous code would not detect the spending on the new code, which they deem invalid).

Hard forks happen when a bug has been found: the programmers fix it, and inform the nodes that they need to switch to the new version. The old version is vulnerable to the bug, so a change to the new version is imperative. “New” blocks will not be valid on the “old” chain. Hard forks can also happen when a policy change needs to be made. When a hard fork happens, everyone switches as soon as possible to avoid the chaos described above.

So, no problem. However, a hard fork can come with controversy attached, as with the block size increase, which is an important component of the protocol. The current block size is 1MB. While bitcoin transactions are currently taking up less than half that, the use is growing. Some developers strongly believe that a block size increase is urgently needed, so urgently that they are urging a hard fork which will increase the limit by a factor of 8. Others think that a block size increase is a bad idea, that we should let the block sizes fill up and allow economics to regulate supply and demand. Bitcoin transaction fees would rise, as only the “profitable” ones will be allocated the scarce space. Others have no problem with a block size increase, but argue that a factor of 8 is either too much or too little.

Most developers seem to recognize that an increase is necessary, but that consensus is vital. There seems to be a consensus that we need a consensus (sorry, couldn’t resist). Why is consensus so important? Because without it, we could end up having two or more Bitcoin protocols. Coins could be spent and mined on one of two or more chains. This could lead to not only confusion and a collapse of the bitcoin price, but also to double-spending and fraud. If I spend my bitcoins on one chain, and it’s verified, I could also spend the same bitcoins on another chain, which will not have the information from the first chain.

The resulting lack of trust, combined with the fall in price, could well kill Bitcoin’s future.

For the first time since Bitcoin’s launch, a split Bitcoin community looks possible. It’s not because of a looming hard fork. Whatever size increase is agreed upon, a hard fork is necessary. The hard fork itself is not the problem. The main worry is the acceptable definition of “consensus”.

In August, Mike Hearn and core developer Gavin Andresen launched BitcoinXT, which will implement an increase to 8MB on the 6th of January 2016, if 75% of recently mined blocks signal support. The problem here is the threshold of 75%. Until now, the threshold necessary for implementation of a new Bitcoin protocol has been 95%, since everyone can agree that that is more or less unanimous. Yet even if 75% agree, the 25% choosing to stick with the current Bitcoin Core is still enough to maintain a viable chain of its own and thus cause a split. Two “current” bitcoin chains would not be good.

But don’t miners have to add blocks to the longest chain? Wouldn’t “old-school” miners have to add to the longer, newer chain? And wouldn’t that sort things out automatically? No, “old-school” miners have the obligation to add blocks onto the longest valid chain. And for them, the new chains are not valid. So they would continue adding blocks to the old chain. And 25% of miners doing so is enough to make that chain a viable alternative to the newer one.

This is a huge issue, which goes much deeper than the actual technicality of the amount of transactions that can fit in any one block. This issue speaks to the governance of Bitcoin. Bitcoin theoretically doesn’t have any governance, it is a decentralized network run by the network’s many operators. A team of core developers maintains and updates the code, relying on consensus to push forward. Decisions have been made with the well-being of Bitcoin in mind. Up until now, pretty much everyone has been able to agree on what that requires.

The move by BitcoinXT’s promoters is motivated by the conviction that if their plan is not adopted, then Bitcoin will fail. Hence the need for an urgent hard fork, with a lower consensus threshold.

It’s risky.

But it might happen. So far BitcoinXT has only received explicit support from about 10% of the mining community. However, the underlying premise of BitcoinXT seems to be popular, with most of the “big players” in the sector behind it (Xapo, Circle, BitPay, Bitnet and others). That means that mining support could well increase significantly over the next few months. If that happens, more miners will sign on, “just in case”, which will gather momentum towards the 75% threshold. And if that is passed, two weeks later the new block size limit of 8MB is in place, and Bitcoin has a hard fork.

It is very likely that consensus will be reached, simply because everyone has so much to lose if trust in Bitcoin fails. Yet the debate does highlight the difficulties in running a growing yet decentralized network. Personally, I believe that Bitcoin will remain Bitcoin, that some of the dissenting core developers will splinter off into altcoins or sidechain development, and that tempers will cool down. Until the next fundamental disagreement arises. But hopefully by then, Bitcoin’s use will have spread to the extent that its structure and market penetration are different from today. That in itself will give rise to a new decision-making organization, probably still decentralized, but with a clearer consensus-generating communication and with more to lose if it fails.

It’s not the size that matters, it’s what you do with it

When Clay Shirky said that “collaborative production is simple: no one person can take credit for what gets created, and the project could not come into being without the participation of many,” he could have been talking about Bitcoin. The digital currency was created anonymously, and is currently run by a handful of core programmers and a broad network of “miners” whose computers do the actual work of validating and publishing transactions, and an even broader network of “nodes” who help to maintain the public ledger of who sent what to whom. This is crowdsourcing almost at its purest. No one person decides what happens to the program, and no one person can unilaterally make a change to how it works.

While that no doubt has already raised some sceptical eyebrows, the system has worked pretty well since its initial roll-out in 2009. There have been differences of opinion, and accidents that needed fixing, but so far decisions have been made with the common good in mind. Because one of the beauties of this system is that the common good has, so far, equated to the individual good. If Bitcoin were to fail, all players would lose out.

Image by Jens Lelie for Unsplash

The scene has changed, though. Now Bitcoin is coming up against a fundamental problem that has generated heated debate, with its share of prognoses of doom. The various sides are digging in, because each seems to be convinced that Bitcoin will fail if its adaptation isn’t universally adopted. This weekend core Bitcoin developers are gathering – apparently for the first time – in Montreal, Canada for the first (and sold-out) “bitcoin scalability workshop”, to start the attempt to reach a consensus.

The upcoming problem is this: the volume of transactions is growing. It’s a problem because transactions are grouped into blocks for processing, and the original and current protocol stipulates a block size limit of 1MB. At first that wasn’t a problem, and even now, it’s not really an issue, as the average block size hovers around 425K. But if Bitcoin keeps growing, which it will, the limit is expected to be hit sometime next year. Smaller transactions will start to get pushed aside until a block comes along with space. The current confirmation time lag of about 10 minutes (the time between each block confirmation) could end up being hours or even days, which would seriously reduce its utility. We could end up in a bidding war, with miners giving preference to transactions with higher fees attached. Fees would climb to a level that would price out smaller transactions, which would further limit Bitcoin’s usefulness.
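The bidding war can be seen in miniature: a miner with limited block space simply takes the waiting transactions that pay best (a toy sketch with made-up numbers; real selection policies weigh fee per byte, among other things):

```python
def fill_block(mempool, capacity_bytes=1_000_000):
    """Greedily pack the highest-fee transactions into limited space."""
    chosen, used = [], 0
    for tx in sorted(mempool, key=lambda t: t["fee"], reverse=True):
        if used + tx["size"] <= capacity_bytes:
            chosen.append(tx["id"])
            used += tx["size"]
    return chosen

mempool = [
    {"id": "a", "size": 600_000, "fee": 50_000},  # fees in satoshis
    {"id": "b", "size": 500_000, "fee": 40_000},
    {"id": "c", "size": 300_000, "fee": 10_000},
]
# "a" gets in first, "b" no longer fits, "c" squeezes into the gap;
# "b" has to wait for a later block despite its healthy fee.
print(fill_block(mempool))  # ['a', 'c']
```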

The debate is centred on the issue of a block size limit increase. Should there be one? And if so, to what? And when? It’s not a simple matter, as bigger block sizes can slow down the system and require even more bandwidth and energy to process. The main complication, though, comes from the democratic process. Opinion is divided. So who decides? Which option is best for the system as a whole?

The jury is still out on that, and will be at least until the end of the year. That’s when, effectively, the votes will be counted on the various proposals put forward by the Bitcoin factions (yes, I know, “votes” is an over simplification… I’ll talk more about this process another time). Let’s go over them:

  • Stay as we are. This group doesn’t see any reason to touch the block size. Miners like the idea of higher transaction fees, as it means more profits for them. Most, however, do realize that if Bitcoin transactions slow down, its growth will be limited, which will eventually hurt their income. Another possibility is that Bitcoin continues to grow, but with more and more transactions being done on sidechains (more on these later), so the actual blockchain capacity won’t need to increase. Sidechains tend to not enjoy the same level of decentralization as Bitcoin. But larger blocks could also lead to an erosion of the fundamental decentralizing principle.
  • BIP100. (BIP stands for Bitcoin Improvement Proposal). This proposal was authored by core developer Jeff Garzik, and stipulates a gradual increase of the block size, with thorough testing at each stage. This option is popular, largely due to the gradual increases proposed, and the voting power given to the miners. Every 90 days or so, miners can state what block size limit they would like to see, and the winning target becomes the new block size. This means that the block size can go down as well as up. The mining pools like this idea, but other major stakeholders (wallets, exchanges, etc.) worry about the concentration of influence.
  • BIP101 – by core developer Gavin Andresen, this proposal also suggests that the block size be increased gradually over time, starting from 8MB on the 11th of January 2016 and doubling every two years until it reaches 8,192MB in January 2036. This proposal has received support from big players such as Xapo, BitPay, and Circle, among others.
  • BIP102 – another proposal by core developer Jeff Garzik (confused yet?), who proposes here an increase of the limit to 2MB. Jeff suggests this as an “emergency fallback” if consensus is not achieved on the other proposals.
  • BIP103 (actually, for some reason this BIP doesn’t technically have a number, but people seem to be referring to it as BIP103) – Bitcoin core developer Pieter Wuille thinks that we should increase the block size by 17.7% a year, starting in January 2017. Why 17.7%? Apparently that’s the estimated amount needed to stay in line with technological growth.
  • BIP105 – this is similar to BIP100 in that it lets the miners vote on block size increases. The difference is that BIP105 stipulates a cost to the miners if they vote for an increase. Miners would pay for an increase if it is profitable for them, but it would become costly to do so just to gain a competitive or political advantage. Although it sounds counterintuitive, adding a cost to a proposed increase would increase efficiency, as miners would not want to waste their resources on a vote unless they were reasonably sure that other miners would vote the same. BIP100 stipulates an upper limit of 32MB. BIP105 lowers that to 8MB.
  • BIP106 calls for a dynamically adjusted block size limit, according to the previous block size, with the possibility of including the previous transaction fees in the calculation. If the average block size is almost full, double the size. If it’s not even half full, halve the size. If it’s sort of in the middle, the block size stays the same.
  • Reassess. Adam Back’s idea is similar to BIP102 (Jeff Garzik’s suggestion that the block size increases to 2MB in the short term if no other solution has been agreed on) in that the increases only contemplate the short term, since no-one really knows what the Bitcoin world will look like in the future. Back proposes a 2MB increase as soon as possible, eventually going up to 8MB after four years. Once we get there, then we can reassess, and see if further increases are necessary.
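The dynamic approach in BIP106 can be sketched roughly as follows (a loose illustration of the double/halve idea, not the proposal’s exact formula):

```python
def next_block_size_limit(current_limit_mb, avg_block_size_mb):
    """Illustrative dynamic rule: double the limit when recent blocks
    are nearly full, halve it when they are under half full."""
    usage = avg_block_size_mb / current_limit_mb
    if usage > 0.9:
        return current_limit_mb * 2
    if usage < 0.5:
        return current_limit_mb / 2
    return current_limit_mb

print(next_block_size_limit(1, 0.95))   # almost full -> 2
print(next_block_size_limit(1, 0.425))  # under half full -> 0.5
print(next_block_size_limit(1, 0.7))    # in the middle -> 1
```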

There are actually a ton of other interesting ideas out there to solve the scalability issue, but these seem (to me) to be the main ones. How important is the question of size? That itself is open to debate. Satoshi himself said back in 2010:

It would be nice to keep the [block chain] files small as long as we can.

The eventual solution will be to not care how big it gets.

But it is worth bearing in mind that Satoshi started the ball rolling, and then left the developer community to continue with the experimentation, tweaking and adjustments. As with just about any invention, no creator can foresee all future bottlenecks, bugs and case scenarios.

More than the future of Bitcoin is at stake here. Bitcoin is more than an efficient payment system. It’s more than an entirely new way to transfer value and verified information. It’s a social experiment. Can we, as a group, manage a concept as potentially powerful as Bitcoin, without a clear chain of command? Are we capable of that level of teamwork?

Looking further ahead, a more interesting question than “Can we?”, is “What if we can?”.