The only reason Bitcoin has grown the way it has, and in fact the only
reason we're all even here on this mailing list talking about this, is
that Bitcoin is "better money than other money". One of the key
characteristics behind that is Bitcoin being inexpensive to transact with.
If that characteristic is no longer true, then Bitcoin isn't going to
grow, and in fact Bitcoin itself will be replaced by better money that is
less expensive to transfer.

So the importance of this issue cannot be overstated -- it's compete or die
for Bitcoin. People want to transact with global consensus at high volume,
and because the technology exists to service that want, the demand is going
to be met; these are the basic rules of supply and demand. I don't
necessarily disagree with your position of supporting only uncontroversial
changes, but I think it's important to get consensus on the criticality of
the block size issue: do you agree, disagree, or not take a side, and why?


On Tue, Aug 11, 2015 at 2:51 PM, Pieter Wuille <pieter.wui...@gmail.com>
wrote:

> On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev <
> bitcoin-dev@lists.linuxfoundation.org> wrote:
>
>> Hitting the limit in and of itself is not necessarily a bad thing. The
>> question at hand is whether we should constrain that limit below what
>> technology is capable of delivering. I'm arguing not only that we should
>> not, but that we could not even if we wanted to, since competition will
>> deliver capacity for global consensus, whether in Bitcoin or in some
>> other product / fork.
>>
>
> The question is not what the technology can deliver. The question is what
> price we're willing to pay for that. It is not a boolean "at this size,
> things break, and below it, they work". A small constant-factor increase is
> unlikely to break anything in the short term, but it will come with higher
> centralization pressure of various forms. There is discussion about whether
> these centralization pressures are significant, but claiming that capacity
> is artificially constrained by the limit is IMHO a misrepresentation. It is
> constrained to aim for a certain balance between utility and risk, and
> neither extreme is interesting, while possibly still "working".
>
> Consensus rules are what keeps the system together. You can't simply
> switch to new rules on your own, because the rest of the system will end up
> ignoring you. These rules are there for a reason. You and I may agree about
> whether the 21M limit is necessary, and disagree about whether we need a
> block size limit, but we should be extremely careful with change. My
> position as Bitcoin Core developer is that we should merge consensus
> changes only when they are uncontroversial. Even when you believe a more
> invasive change is worth it, others may disagree, and the risk from
> disagreement is likely larger than the effect of a small block size
> increase by itself: the risk that suddenly every transaction can be spent
> twice (once on each side of the fork), the very thing that the block chain
> was designed to prevent.
>
> My personal opinion is that we should aim to do a block size increase for
> the right reasons. I don't think fear of rising fees or unreliability
> should be an issue: if fees are being paid, it means someone is willing to
> pay them. If people are doing transactions despite them being unreliable,
> there must be a use for them. That may mean that some use cases don't fit
> anymore, but that is already the case.
>
> --
> Pieter
>
>
