
Increasing the minimum block size

The minimum median block size is currently 20 KB, meaning that without the dynamic block size kicking in, any block can be up to 20 kilobytes without any miner penalty. With the block time increasing to two minutes, it was initially proposed to increase the minimum median size to 40 KB, which retains the same base block capacity per unit time.

It was later proposed to increase this to 60 KB, based on the observation that larger mix sizes will create more split transactions. Split transactions are inconvenient for merchants, whose systems are often set up to accept a single payment, and payments sometimes need to be processed manually due to various problems with split payments. 60 KB is observed to allow most routine payments to be handled without splitting.

So the current plan is to increase the minimum median block size to 60 KB. As a rate this is 300 KB per 10 minutes, or 30% of Bitcoin's maximum rate (1 MB per 10 minutes), though of course Monero allows the block size to increase further based on transaction demand. A larger minimum size reduces resistance to spam attacks and increases the potential for more rapid blockchain growth relative to transaction demand.
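As a back-of-the-envelope check, the capacity arithmetic above can be sketched as follows. This is a hedged sketch: the figures are taken from the discussion, and "KB" is treated as 1000 bytes.

```python
# Comparing base block-capacity rates under the proposed change.
# Figures come from the discussion above; dynamic block growth is ignored.

OLD_MIN_MEDIAN_KB = 20   # current minimum median block size
NEW_MIN_MEDIAN_KB = 60   # proposed minimum median block size
OLD_BLOCK_TIME_MIN = 1   # current block time (minutes)
NEW_BLOCK_TIME_MIN = 2   # post-fork block time (minutes)

def rate_per_10_min(size_kb, block_time_min):
    """Base capacity in KB per 10 minutes, ignoring dynamic growth."""
    return size_kb * (10 / block_time_min)

old_rate = rate_per_10_min(OLD_MIN_MEDIAN_KB, OLD_BLOCK_TIME_MIN)  # 200 KB / 10 min
new_rate = rate_per_10_min(NEW_MIN_MEDIAN_KB, NEW_BLOCK_TIME_MIN)  # 300 KB / 10 min

BITCOIN_RATE_KB = 1000  # Bitcoin: 1 MB per 10 minutes
print(new_rate / BITCOIN_RATE_KB)  # 0.3, i.e. 30% of Bitcoin's maximum rate
```

Note that 40 KB at a 2-minute block time gives the same 200 KB per 10 minutes as the current parameters, which is why 40 KB is called the "natural" increase later in the thread.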

It is not necessary to increase the minimum block size at all. In AEON when the block time was increased from 1 to 4 minutes, no increase in the minimum block size was made, and everything still works, with the block size dynamically increasing during high usage. However, this does not provide any relief from the need for many split transactions, and further AEON does not impose a hard minimum on mix factors the way Monero will (post-fork).

Open for discussion or feedback on the proposal to increase the minimum median block size to 60 KB.

Replies: 19
nioc posted 8 years ago Weight: -110 | Link [ - ]

It looks like the 60 KB was just added to GitHub

papa_lazzarou posted 8 years ago Replies: 1 | Weight: -117 | Link [ - ]


Here is a list of all transactions above 20KB for the past 100k blocks.

Reply to: papa_lazzarou
luigi1111 edited 8 years ago Replies: 1 | Weight: -117 | Link [ - ]

Thanks. I don't think this can tell us too much about splitting, but it does look like miners typically accept transactions up to ~20% higher than the median (assuming median has been 20kB?). Do you have a way to group by payment ID within X blocks (5 or so I'd guess)? This would illustrate part of the privacy problem with unencrypted payment IDs (orthogonal, but anyway), but would likely show splitting tendencies better, at least on the merchant side.
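The grouping luigi1111 asks for could be sketched roughly like this. This is a hypothetical illustration, not a real chain query: the record format, field names, and sample data are all invented, and a real analysis would read transactions from a blockchain export.

```python
# Sketch: cluster transactions sharing a payment ID within a window of X blocks,
# to surface merchant-side splitting. All data here is made up.
from collections import defaultdict

WINDOW = 5  # group transactions whose heights fall within 5 blocks of each other

def group_by_payment_id(txs, window=WINDOW):
    """txs: iterable of (height, payment_id, size_bytes) tuples.
    Returns (payment_id, [(height, size), ...]) clusters with more than one
    transaction, i.e. likely split payments."""
    by_pid = defaultdict(list)
    for height, pid, size in txs:
        if pid is not None:  # skip transactions without a payment ID
            by_pid[pid].append((height, size))
    clusters = []
    for pid, items in by_pid.items():
        items.sort()
        current = [items[0]]
        for h, s in items[1:]:
            if h - current[-1][0] <= window:
                current.append((h, s))
            else:
                clusters.append((pid, current))
                current = [(h, s)]
        clusters.append((pid, current))
    return [c for c in clusters if len(c[1]) > 1]

# Example with made-up data: one payment split across two transactions.
sample = [(100, "ab12", 19_000), (101, "ab12", 7_000), (250, "cd34", 4_000)]
print(group_by_payment_id(sample))  # [('ab12', [(100, 19000), (101, 7000)])]
```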

Reply to: luigi1111 papa_lazzarou
papa_lazzarou posted 8 years ago Weight: -110 | Link [ - ]

Yeah. I don't know what I am doing here. I just found out I have an ISFP or INFP personality. Which, on the face of it, makes sense, really. :P

> group by payment ID within X blocks

I think I can do that. Guess what? I'll not even be thinking about it. I'll just start writing the query now and see where it goes. I guess I'll get it OKish after 400 iterations...

Oh well

EhVedadoOAnonimato edited 8 years ago Replies: 1 | Weight: -118 | Link [ - ]

> In increasing the block time to two minutes

When this was first raised I asked if there was a true motivation for it here:

I still don't see a good motivation for changing a critical parameter like that, especially for something that makes usability less interesting. Anyway, I don't want to bring that discussion here in order to keep this on topic, but I'd appreciate some input there.

> It is not necessary to increase the minimum block size at all.

Then don't. Really, why change an important parameter of a system in production, involving money, if there's no major opportunity for improvement from it?

Plus, keeping this minimum limit low allows the automatic adjustment system to start operating sooner. More opportunities to test whether it works well while the coin is still in its infancy. This minimum limit should be just a bit more than enough to accommodate any coinbase transaction, IMHO. (No, I'm not suggesting reducing it either! :))

Concerning the merchant systems: they should adapt to receiving multiple payments, and to having them confirm in different blocks. Monero works with stealth addresses, and soon there will be merge avoidance (important for privacy, as outlined by MRL-0004). So multiple technical transactions to fulfill a single business transaction will be the norm, and they might have different confirmation heights even if the median block size is already much larger than that. So merchants must take that into account. Better that they start doing it from the beginning instead of having false expectations.

Reply to: EhVedadoOAnonimato
smooth edited 8 years ago Replies: 1 | Weight: -117 | Link [ - ]

"changing an important parameter of a system in production, involving money"

This is the sort of pretentious overheated rhetoric that might be common in the Bitcoin community but we don't need it here. This coin hasn't become "money" (something generally accepted for goods and services). It is an experimental system which we are trying to pragmatically build into something that is appealing to as many ecosystem participants as possible in order to in turn grow the network to the point where it is more objectively useful than experimental.

That includes considering factors such as the integration and support costs of merchants, who can alternatively tell us to get lost. I don't see how that helps this project at all.

The important consideration here is the trade-off between somewhat less spam resistance from increasing the minimum vs. greater support and integration costs from not increasing it. Rhetoric about important parameters involving money is a waste of everyone's time at this juncture. Please refrain.

Reply to: smooth EhVedadoOAnonimato
EhVedadoOAnonimato posted 8 years ago Replies: 1 | Weight: -99 | Link [ - ]

Ok then. Please let me know when Monero stops being an experimental toy used by geeks and becomes an actual usable system, in production. I mistakenly thought that was already the case, since people are already paying actual money for XMR.

Sarcasm aside, and talking seriously now, my points above remain. I'd still like to see a rebuttal to them, and more importantly, to the comment I've made in the other thread, where people argue about increasing the time interval between blocks. I like the 1 min interval and I'm not convinced that needs to change. Thanks.

Reply to: EhVedadoOAnonimato smooth EhVedadoOAnonimato
luigi1111 posted 8 years ago Weight: -99 | Link [ - ]

If we launched Monero today, we definitely wouldn't choose 1 minute for the blocktime. Why should we be stuck with it now? Because "we don't change important parameters in a working system"? I reject that wholly presently (for non-"social contract" stuff). If Monero grows, changing stuff will definitely get harder (rightly and reasonably so). We are nowhere near there yet. While we have the freedom to change stuff like this for the better, I think we should definitely go for it.

For the minimum median size, it's basically a bandaid, sure. But this bandaid will help out payment processors and users significantly in these early days of low, low usage, with very little (or no) obvious ill effects likely to arise. If transaction use rises to moderate use, it'll become almost entirely irrelevant anyway. There's even a possibility the blocksize adjustment algorithm could be changed to something significantly different (but better!) later.

Back to blocktime: this was another "genius" parameter chosen by TFT that basically no one agreed with. It was, and is, subpar. Fortunately, unlike the emission, nothing prevents us from changing it to something better (anything longer within an OOM or so being better). 1 minute confirms aren't "fast", nor are they secure since reorgs happen very frequently. I think you mentioned something about reorgs being fine because transactions will probably be confirmed on both chains. This is irrelevant because honest transactions you trust to "be there in both chains" you can trust even before they're in any chain. 1 minute blocktime is also a significant pool centralization pressure (much, much more than Bitcoin's configuration, and we know how much people talk/worry about that); it's almost like purposely encouraging it.

I do agree that "less coinbase transactions" will be mostly meaningless if usage takes off, but that doesn't mean it isn't beneficial now, when usage is low. It's kinda like this Satoshi quote: "It would be nice to keep the [blockchain] files small as long as we can. The eventual solution will be to not care how big it gets." Fewer blocks/coinbase transactions will certainly be beneficial now, allowing faster syncing and a better experience for new users, which may even be a contributing factor to increasing Monero's usage to the level where it's irrelevant!

ArticMine edited 8 years ago Weight: -120 | Link [ - ]

I am in favour of going to 60 KB or possibly even higher because of the split transaction issue. Has anyone contacted Poloniex on this? An exchange creates a situation where many withdrawal transactions could have a large number of small inputs.

fluffypony posted 8 years ago Replies: 1 | Weight: -122 | Link [ - ]

In terms of determining the average transaction size I pulled the last 20 payments and cold wallet transfers from MoneroDice and the Forum Funding System (always sent with mixin 3). The majority of them are under 15kb and don't need to be split, it was just 5 of them that hit that limit and spilled over. Of those 5 only 1 spilled over to 3 transactions, the rest were split over 2 transactions. This means that bumping the median to 40kb (the natural increase) would cover all but that one case, and 60kb would cover all of them.

I think this is fairly representative of the merchant side of things, where you may have quite a few outputs selected for a withdrawal / payment, but if there are any other merchant services that can comment that would be great.

Reply to: fluffypony
luigi1111 posted 8 years ago Weight: -117 | Link [ - ]

Interestingly enough, that 1 case actually perfectly fits Myagui's and my (admittedly arbitrary) 95th percentile, somewhat supporting 40kB. Of course 20 transactions is not very many datapoints.
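The nearest-rank percentile idea behind "cover 95% of cases" can be illustrated with a short sketch. The sample sizes below are invented for illustration, since the actual 20-transaction sample is not reproduced in the thread.

```python
# Sketch: nearest-rank percentile over a (made-up) sample of transaction sizes.
def percentile(values, p):
    """Nearest-rank percentile: the smallest value >= p% of the sample."""
    s = sorted(values)
    k = max(0, -(-len(s) * p // 100) - 1)  # ceil(n*p/100) - 1, clamped at 0
    return s[int(k)]

# Hypothetical sample of 20 transaction sizes in KB (not fluffypony's data).
sizes_kb = [3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 18, 22, 25, 28, 35, 55]
print(percentile(sizes_kb, 95))  # 35: the size a 95%-coverage minimum should admit
```

With a nearest-rank definition, 1 outlier out of 20 samples lands exactly above the 95th percentile, which matches luigi1111's observation that the single triple-split transaction fits the 95%-coverage argument for 40 kB.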

Myagui posted 8 years ago Replies: 2 | Weight: -123 | Link [ - ]

I certainly agree with the principle of setting a minimum block size balanced with a sort-of-baseline transaction size.

It would be great to see some sample values to better understand what exactly is the typical transaction size for an average transaction using an average mixin count.

I think that the change is worthwhile, if we see that even a single ordinary transaction with average mixin would require a split (when block size is at its minimum).

Reply to: Myagui
smooth edited 8 years ago Weight: -123 | Link [ - ]

It is certainly not the case that every "ordinary" transaction requires a split. Even with 3-5 mix you can still have transactions that are only 2 KB or maybe even less; it depends on how many inputs are used. The issue seems to be that transactions that do require splitting are not very unusual with the current size limit, and that creates problems.

Maybe fluffypony or others who run services can comment more specifically on how often problems occur and how the proposed increase would help.

Reply to: Myagui
luigi1111 posted 8 years ago Replies: 1 | Weight: -123 | Link [ - ]

I don't think covering the "typical" case counts for much, if anything. I think we should shoot to cover 80-95% cases (or even more?).

Well actually your last sentence basically agrees with my sentiment, but seems at odds with your second sentence.

I can post up some sample transaction sizes, but don't have the experience to say how often these are actually occurring.

Reply to: luigi1111 Myagui
luigi1111 posted 8 years ago Replies: 2 | Weight: -123 | Link [ - ]

Transaction 1 (idealized case): 1 input (mix 4), 6 outputs (3 significant digits to receiver + change): ~648 Bytes

Transaction 2 (made up average case): 10 inputs (mix 4), 10 outputs: ~4,247 Bytes

Transaction 3 (pool miner): 200 inputs (mix 4), 10 outputs: ~74,637 Bytes

Transaction 4 (well used wallet with small change): 100 inputs (mix 4), 10 outputs: ~37,537 Bytes

In general I'd expect transactions to be slightly to moderately smaller than what I have listed, due to my (IMO generous) choice of varint lengths. I chose mixin 4 because that's what the client software defaults to now (AFAIK), and it should be the typical, supported case. IMO we don't need to support sky high mixins via the minimum median size.

Note that outputs don't have a very large effect on transaction sizes, so changing those from 6-10 to something else will not change much. Pools that cater to a large number of participants shouldn't have any trouble splitting their payout transactions anyway.
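The examples above can be turned into a rough split-count estimator. This is a hedged sketch: the ~370 bytes-per-input figure is back-derived from luigi1111's mix-4 examples, while the per-output and overhead constants are loose assumptions, not measured values.

```python
# Rough estimate of transaction size and split count under a block-size limit.
# Constants are approximations fitted to the four examples above (mix 4,
# i.e. ring size 5), not protocol-exact values.

INPUT_BYTES = 370     # approx. per input at mix 4 (key image + ring signature)
OUTPUT_BYTES = 66     # rough per-output cost (output key + varint amount)
OVERHEAD_BYTES = 100  # rough fixed header/extra cost

def estimated_tx_size(n_inputs, n_outputs):
    """Approximate serialized size in bytes."""
    return OVERHEAD_BYTES + n_inputs * INPUT_BYTES + n_outputs * OUTPUT_BYTES

def splits_needed(n_inputs, n_outputs, limit_bytes):
    """How many transactions a payment must be split into so each stays
    under limit_bytes, splitting by inputs (outputs repeated per split)."""
    per_tx_budget = limit_bytes - OVERHEAD_BYTES - n_outputs * OUTPUT_BYTES
    max_inputs = per_tx_budget // INPUT_BYTES
    return -(-n_inputs // max_inputs)  # ceiling division

# The "well used wallet" case: 100 inputs, 10 outputs.
print(estimated_tx_size(100, 10))      # 37760 bytes, near the ~37,537 above
print(splits_needed(100, 10, 20_000))  # 2 splits under the current 20 KB limit
print(splits_needed(100, 10, 60_000))  # 1 transaction under the proposed 60 KB
```

Under these assumed constants, the 100-input case needs two transactions at the 20 KB minimum but fits in one at 60 KB, which is the practical effect being argued for in this thread.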

Reply to: luigi1111 luigi1111 Myagui
smooth edited 8 years ago Replies: 1 | Weight: -123 | Link [ - ]

> Transaction 4 (well used wallet with small change): 100 inputs (mix 4), 10 outputs: ~37,537 Bytes

I guess it depends what you mean by well used. If a bunch of participants are all transacting with each other and they all have well used wallets then the average transaction will have the same number of inputs as outputs. If one participant has more inputs than outputs, then another participant will have more outputs than inputs.

The current network is probably more affected by small mining payouts (which then get aggregated into other transactions) than a more mature network would be, leading to the current situation of more inputs being common (which in turn leads to large transaction sizes and more splitting).

Reply to: smooth luigi1111 luigi1111 Myagui
luigi1111 posted 8 years ago Replies: 1 | Weight: -117 | Link [ - ]

I agree that on average inputs = outputs. I wasn't trying to show an average transaction, but rather what you might expect to see "commonly".

Disregarding mining payouts, I'd expect the average number of outputs to remain largely consistent (to how many significant non-zero digits commonly exist in payments), but number of inputs to fluctuate wildly. On average they'll remain similar of course, but that doesn't seem very useful for determining a "good" minimum median.

Reply to: luigi1111 smooth luigi1111 luigi1111 Myagui
smooth posted 8 years ago Weight: -117 | Link [ - ]

Yeah, I agree. It does seem that "many" inputs would be more common than "many" outputs. The latter would only happen for some form of bulk payout (it could be something other than pools though; for example, exchanges could greatly reduce the cost of their withdrawals if they aggregated them).

There may be transactions with more outputs than inputs where people are spending from one big payment so the transaction has one input and several outputs, but this would be irrelevant to the "large" transaction case.

Reply to: luigi1111 luigi1111 Myagui
Myagui posted 8 years ago Weight: -123 | Link [ - ]

Thank you luigi1111, yes, I meant covering the majority of cases (95% or something along those lines), but I failed to phrase it properly :(

Going by your examples, I would be inclined to conclude that 40 KB is sufficient (so keeping the original capacity, only adjusted for the new block spacing).

It would help greatly if we had more specifics from the services or people that (frequently?) run into the need to split their transfers. Speak now or forever ... :)