Re: [Bitcoin-development] comments on BIP 100

2015-06-15 Thread Eric Lombrozo
OK. O() notation normally refers to computational complexity, but ... I
still don't get it - the vast majority of users don't run relaying nodes
that take part in gossiping. They run web or SPV wallets. And the nodes
that do take part don't connect to every other node.

It's a little scary, IMO, that the fact that the majority of nodes don't
relay, and perform only the most rudimentary level of validation if any, is
considered an acceptable feature of the protocol.

- Eric Lombrozo


Re: [Bitcoin-development] comments on BIP 100

2015-06-15 Thread Adam Back
I think he's talking more about something like extension blocks; however,
those are actually even soft-forkable (and obviously keep the 21M coins).

See
https://www.mail-archive.com/bitcoin-development%40lists.sourceforge.net/msg07937.html

and Tier Nolan's technical detail:
https://www.mail-archive.com/bitcoin-development@lists.sourceforge.net/msg07927.html

Discussion / claimed properties on

https://www.reddit.com/r/Bitcoin/comments/39kqzs/how_about_a_softfork_optin_blocksize_increase/

Adam

On 15 June 2015 at 19:53, Raystonn . rayst...@hotmail.com wrote:
 The solution is to hard-fork and merge-mine. This way, both can live, and
 mining power is not divided.

 No, this would essentially be blessing an increase to 42M bitcoins, half on
 each chain.  You could expect a severe market price correction if this were
 to happen.

 From: Rebroad (sourceforge)
 Sent: Monday, June 15, 2015 4:16 AM
 Cc: Bitcoin Dev
 Subject: Re: [Bitcoin-development] comments on BIP 100

 My understanding of this debate is that there are some people who want to
 keep Bitcoin at 1MB block limit, and there are some who want to increase it.

 I for one am curious to see how 1MB limited bitcoin evolves, and I believe
 we can all have a chance to see this AND hard-fork bitcoin to remove the
 block size restriction.

 To remove the 1MB limit requires a hard fork. This is not disputed. It's
 what we do with the original (1MB limit) bitcoin that concerns me (and
 others, I am sure).

 What I would like to see is both being allowed to live. Harry Potter and
 Voldemort! (Except neither is evil!)

 The solution is to hard-fork and merge-mine. This way, both can live, and
 mining power is not divided.

 Dogecoin recently hardforked and this hardfork also involved switching to
 merge-mining, so it's been done successfully.

 So, simply, bitcoin as it is doesn't need to actually fork; instead, at
 a certain block size, a forked bitcoin with the block size limit removed will
 start to be merge-mined alongside bitcoin. Miners can be ready for this.
 Wallets can be ready for this - in fact, for most wallets and merchants they
 will possibly want to default to the bigger blocksize fork since this caters
 for more transactions per block.
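
For concreteness, here is a minimal Python sketch of the AuxPoW-style check
that merge-mining rests on, in the spirit of what Namecoin and Dogecoin do.
All names are illustrative, and real-world details (chain IDs, the exact
coinbase commitment encoding) are omitted.

    import hashlib

    def dhash(b: bytes) -> bytes:
        # Bitcoin-style double SHA-256.
        return hashlib.sha256(hashlib.sha256(b).digest()).digest()

    def merkle_root_from_branch(leaf: bytes, branch: list, index: int) -> bytes:
        # Fold a leaf up a merkle branch to recover the block's merkle root.
        h = leaf
        for sibling in branch:
            h = dhash(sibling + h) if index & 1 else dhash(h + sibling)
            index >>= 1
        return h

    def check_auxpow(aux_hash: bytes, aux_target: int, parent_header: bytes,
                     coinbase_tx: bytes, branch: list, index: int) -> bool:
        # 1. The parent (Bitcoin) header must meet the merge-mined chain's
        #    own target, so one round of hashing secures both chains.
        if int.from_bytes(dhash(parent_header), "little") > aux_target:
            return False
        # 2. The parent coinbase must commit to the merge-mined block's hash.
        if aux_hash not in coinbase_tx:
            return False
        # 3. That coinbase must really be in the parent block: its branch
        #    must hash up to the merkle root at bytes 36..68 of the header.
        root = merkle_root_from_branch(dhash(coinbase_tx), branch, index)
        return root == parent_header[36:68]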

 We still don't know how removing the block limit will pan out and what other
 problems with scalability will arise in the future, so by preserving the
 original bitcoin, we keep diversity, and people won't feel their investments
 in bitcoin are being unnecessarily put at risk (as their investments will
 stay in both the new and the old bitcoin).

 The bitcoin core developers can implement a patch like the one recently used
 for dogecoin, to allow the chain to fork at a set point, at which point
 bitcoins will be split into the new and the old. Branding will be an
 important issue here I think, so that there is as little confusion as
 possible. I think this is a small price to pay in return for not killing the
 original bitcoin (even if it's true that Satoshi did intend to remove the
 1MB limit at some point).

 If I'm missing something obvious please let me know.

 On Mon, Jun 15, 2015 at 1:50 PM, Mike Hearn m...@plan99.net wrote:

 The fact that using a centralized service is easier isn't a good reason
 IMHO. It disregards the long-term, and introduces systemic risk.


 Well sure, that's easy for you to say, but you have a salary :) Other
 developers may find the incremental benefits of decentralisation low vs
 adding additional features, for instance, and who is to say they are wrong?


 But in cases where using a decentralized approach doesn't *add* anything,
 I cannot reasonably promote it, and that's why I was against getutxos in the
 P2P protocol.


 It does add something though! It means, amongst other things, I can switch
 off all my servers, walk away for good, discard this Mike Hearn pseudonym I
 invented for Bitcoin, and the app will still work :) Surely that is an
 important part of being decentralised?

 It also means that as the underlying protocol evolves over time, getutxos
 can evolve alongside it. P2P protocol gets encrypted/authenticated? Great,
 one more additional bit of security. If one day miners commit to UTXO sets,
 great, one more additional bit of security. When we start including input
 values in the signature hash, great ... one more step up in security.
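
For reference, the getutxos message under discussion was written up as
BIP 64. Below is a rough Python sketch of the request payload per my reading
of that BIP, a mempool flag followed by a vector of outpoints; the values
here are hypothetical.

    import struct

    def compact_size(n: int) -> bytes:
        # Bitcoin P2P CompactSize length prefix (small values only).
        if n < 0xfd:
            return bytes([n])
        if n <= 0xffff:
            return b"\xfd" + struct.pack("<H", n)
        return b"\xfe" + struct.pack("<I", n)

    def getutxos_payload(check_mempool: bool, outpoints) -> bytes:
        # Payload: 1-byte mempool flag, then a vector of outpoints,
        # each serialized as a 32-byte txid plus a 4-byte output index.
        out = bytes([check_mempool]) + compact_size(len(outpoints))
        for txid_hex, vout in outpoints:
            # Displayed txids are byte-reversed relative to wire order.
            out += bytes.fromhex(txid_hex)[::-1] + struct.pack("<I", vout)
        return out

    payload = getutxos_payload(True, [("aa" * 32, 0)])  # hypothetical outpoint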

 Anyway, I didn't really want to reopen this debate. I just point out that
 third party services are quite happy to provide whatever developers need to
 build great apps, even if doing so fails to meet some kind of perfect
 cryptographic ideal. And that's why they're winning devs.

 Now back to your regularly scheduled block size debates ...



Re: [Bitcoin-development] comments on BIP 100

2015-06-14 Thread Jeff Garzik
Adding, in re pay-to-FOO: these schemes are inherently short-term, such
that it is near-impossible for the market to plan for what happens in 12+
months.

On Sun, Jun 14, 2015 at 10:28 PM, Jeff Garzik jgar...@bitpay.com wrote:

 On Sun, Jun 14, 2015 at 5:23 PM, Adam Back a...@cypherspace.org wrote:

 Hi

 I made these comments elsewhere, but I think really we should be
 having these kind of conversations here rather than scattered around.

 These are about Jeff Garzik's outline draft of BIP 100. I guess this is
 the latest draft (one good thing about getting off SF would be
 JGarzik's emails finally not getting blocked!):

 http://gtf.org/garzik/bitcoin/BIP100-blocksizechangeproposal.pdf

 It may have changed since the original. [1]

 On the original proposal:

 1. There should be a hard cap, not an indefinitely growing one.


 In the latest draft there is an explicit 32MB ceiling now.

 Users will need to opt into growth beyond 32MB via a 2nd hard fork.



 2. There should be a growth limiter (no more than X%/year).


 As a general principle, this is an area of market disagreement, and should
 not be our call.  Encoding this into software veers into personal opinion
 about what economic policy should be.

 That said -- BIP 100, as a compromise, includes a growth limiter.  Abrupt
 change (1MB -> 32MB!) is awful on markets.  Good policies include a
 measured pace of transition from policy A to policy B.  It gives the
 community time to assess system effectiveness - while also allowing free
 market input.

 In the long run I hope the cap is removed (see below), and the intention
 is to -slowly- and -transparently- move from the tightly controlled limit
 to something the free market and users are choosing.
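
A minimal Python sketch of what a vote-plus-growth-limiter rule can look
like. The 20th-percentile rule, the 2x step bound, and the constants are
illustrative assumptions for this sketch, not the normative BIP 100 text.

    def next_size_limit(votes, current_limit,
                        hard_floor=1_000_000, hard_ceiling=32_000_000,
                        max_step=2.0):
        # Miners vote for a preferred size (e.g. in their coinbase).
        votes = sorted(votes)
        # Take a low percentile, so a large majority must vote at or
        # above a value before it becomes the new limit (the 20th
        # percentile here is an illustrative choice).
        candidate = votes[len(votes) // 5]
        # Growth limiter: at most a max_step-times move per period.
        candidate = min(candidate, int(current_limit * max_step))
        candidate = max(candidate, int(current_limit / max_step))
        # Hard bounds: 1MB floor and the 32MB ceiling mentioned above.
        return max(hard_floor, min(hard_ceiling, candidate))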




 3. I think the miners should not be given a vote that has no costs to
 cast, because their interests are not necessarily aligned with users
 or businesses.

 I think Greg Maxwell's difficulty adjustment [2] is better here for that
 reason.  It imposes a quadratic cost, via higher difficulty, on miners
 voting to increase the block size, which miners can profitably pay if there
 are transactions with fees available to justify it. There is also a
 growth limiter as part of Greg's proposal. [3]
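
As a toy illustration of the quadratic-cost idea, a Python sketch follows;
the cost function and the constant k are assumptions for illustration, not
the parameters of Greg's actual proposal.

    def required_difficulty(base_difficulty, base_size, actual_size, k=4.0):
        # A miner may exceed the base size, but must meet a difficulty
        # that rises quadratically with the overshoot, so extra space
        # only pays when fees cover the lost block-finding probability.
        if actual_size <= base_size:
            return base_difficulty
        overshoot = (actual_size - base_size) / base_size
        return base_difficulty * (1.0 + k * overshoot ** 2)

    # A block 25% over the base must meet 1 + 4*(0.25)^2 = 1.25x difficulty:
    print(required_difficulty(1.0, 1_000_000, 1_250_000))  # -> 1.25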


 paying with difficulty has severe negative elements that will likely
 cause it never to be used:
 - complex and difficult for miners to reason about
 - fails the opportunity-cost test - dollar cost of losing the block race
 versus value gained by increasing block size
 - inherently unpredictable in the short term - the user experience is that
 it is difficult to see a gain in utility versus the revenue you are
 giving up
 - REQUIRES informal miner collusion - probably less transparent than BIP
 100 - in order to solve the who-goes-first problem
 - net result: tough sell

 Paying bitcoins to future miners makes a lot more sense.  Initially I was
 a fan of pay-with-diff, but freezing bitcoins (CLTV) or timelock'd
 anyone-can-spend has much clearer incentives, if you want to go down
 that road.

 Problems with pay-to-increase-block-size:
 - how much to pay?  You are inherently setting your growth policy on top
 of bitcoin by choosing a price here.
 - another who-goes-first problem

 Anyway, there is a natural equilibrium block size that the free market and
 user choice will seek.

 Related:  There is a lot of naive miner -> max income -> max block size
 reasoning going on with regards to fees.  This is defining the bounds of
 an economically scarce resource.  There are many reasons why a miner will
 today, in the real world, limit their block size. WRT fee income, if block
 size is too large, fee competition in the overall market is low to zero and
 fee income rapidly collapses.  Then factor in price and demand elasticity
 on top of that.

 Quite frankly, there seems to be a natural block size equilibrium ceiling,
 and I worry about miners squeezing the market by maximizing their fee
 income through constrained block sizes and competition at the low end.
 This is of course already possible today - miners may openly or covertly
 collude to keep the block size low.
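
To make the elasticity point concrete, here is a toy constant-elasticity
demand model in Python (all parameters made up); it shows that whether a
constrained block size raises or collapses fee income turns entirely on
demand elasticity.

    def clearing_feerate(limit, D, e):
        # Constant-elasticity toy demand: bytes demanded at feerate p is
        # q(p) = D * p**(-e); when the limit binds, the clearing feerate
        # is the p at which q(p) equals the limit.
        return (D / limit) ** (1.0 / e)

    def miner_fee_revenue(limit, D, e):
        # Fees per block = space sold * clearing feerate.
        return limit * clearing_feerate(limit, D, e)

    # Inelastic demand (e=0.5): shrinking the limit *raises* fee income,
    # the squeeze described above.  Elastic demand (e=2.0) rewards bigger
    # blocks instead.
    for e in (0.5, 2.0):
        print(e, [round(miner_fee_revenue(L, 1.0, e), 3) for L in (0.5, 1.0, 2.0)])
    # -> 0.5 [2.0, 1.0, 0.5]
    # -> 2.0 [0.707, 1.0, 1.414]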

 I think bitcoin will have to involve layering models that uplift
 security to higher layers, but preserve security assurances, and
 even smart contracts, with protocols that improve the algorithmic
 complexity beyond O(n^2) in users, like lightning; there are
 multiple other candidates with useful tradeoffs for various use-cases.
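
As a back-of-the-envelope sketch of that asymptotic point (all counts
hypothetical): on a broadcast network every validating node checks every
payment, while with payment channels only channel opens and closes are
broadcast.

    def validation_work(users, payments_per_user, nodes, channels_per_user=2):
        # Broadcast network: every node validates every payment.
        broadcast = users * payments_per_user * nodes
        # Channels: only opens/closes are broadcast; each payment is
        # verified by its two parties alone.
        layered = (users * channels_per_user * nodes
                   + users * payments_per_user * 2)
        return broadcast, layered

    # With nodes growing in proportion to users, broadcast work grows
    # O(n^2) in users while per-payment work stays off-chain:
    print(validation_work(1_000_000, 100, 10_000))
    # -> (1000000000000, 20200000000)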

 One thing that is concerning is that few in industry seem inclined to
 take any development initiatives or even integrate a library.  I
 suppose eventually that problem would self-correct as new startups
 would make a more scalable wallet and services that are layer2 aware
 and eat the lunch of the laggards.  But it would be helpful if we
 exposed companies to the back-pressure actually implied by an O(n^2)
 scaling protocol and didn't just immediately increase the block size to
 levels that are dangerous for