At which stagger size is optimization no longer needed?



  • The size of your available system RAM determines how high a stagger size you can have. If I understand this correctly, the higher the stagger, the fewer seeks are required to read nonces each round, because the plot is arranged in larger parts. This is the purpose of optimization as well. So, at which stagger size will the plots be arranged so well that optimization is unnecessary?



  • To create an already optimized file you need to create a plot with stagger size=number of nonces.
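
    If I have the layout right (each nonce takes 256 KiB on disk, and the miner reads one scoop from each block of `stagger` consecutive nonces per round), a rough sketch of why stagger = nonces means no optimizing is needed:

    ```python
    import math

    NONCE_BYTES = 256 * 1024  # assumed size of one nonce on disk (256 KiB)

    def seeks_per_round(total_nonces, stagger):
        """Rough estimate: one seek per stagger-sized group of nonces,
        since the scoop data inside a group is stored contiguously."""
        return math.ceil(total_nonces / stagger)

    print(seeks_per_round(4194304, 4096))     # 1 TB plot, stagger 4096 -> ~1024 seeks
    print(seeks_per_round(4194304, 4194304))  # stagger = nonces -> 1 seek, already optimized
    ```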


  • Banned

    I get what he is saying: at what point will it not make that much of a difference between optimized and not?



  • Wait a minute: if I optimize a plot on a PC with 8 GB, then take the plot file and use it on a machine with 16 GB of RAM, I would need to optimize it again to take advantage of the memory increase.

    Unfortunately, that would mean that when I take a plot file optimized on a machine running 16 GB of RAM back to a machine with 4 GB of RAM, my plot will throw corruption errors, correct?



  • @Focus said in At which stagger size is optimization no longer needed?:

    I get what he is saying: at what point will it not make that much of a difference between optimized and not?

    Yes, that was what I was wondering about. Interesting question above this comment as well.



  • @Burstde said in At which stagger size is optimization no longer needed?:

    Wait a minute: if I optimize a plot on a PC with 8 GB, then take the plot file and use it on a machine with 16 GB of RAM, I would need to optimize it again to take advantage of the memory increase.

    Unfortunately, that would mean that when I take a plot file optimized on a machine running 16 GB of RAM back to a machine with 4 GB of RAM, my plot will throw corruption errors, correct?

    Hi,

    I'm not sure, but I think memory only matters when plotting. I mean, the larger the memory, the larger each "part" of the plot will be.
    When mining, that results in fewer (but bigger) reads, which is faster.

    Anyone, please correct me if I'm wrong, but optimizing a plot just means re-organising the parts within the plot.
    I have never had any issues when transferring drives/plots between different computers.

    Ben


  • Banned

    Correct, memory only matters when plotting.



  • At no point. Optimising plots is better for the health of your hard drive and your read speed.
    If you can create a plot with a stagger size of about 1/4 of the plot, it might not be necessary to optimise.
    That means if you have a 1 TB hard drive, you need roughly 250 GB of memory. 4 TB = 1 TB of memory.

    Edit: If someone recommends dividing the plot up into parts, that is not going to work.
    Having 2 optimised plots is the same as having 1 plot with a stagger size of half the total.
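
    Roughly where those numbers come from (a sketch; it assumes the plotter needs about stagger × 256 KiB of RAM to hold one stagger group):

    ```python
    NONCE_BYTES = 256 * 1024  # assumed size of one nonce (256 KiB)

    def ram_for_stagger(stagger_nonces):
        """Approximate RAM needed to plot with a given stagger."""
        return stagger_nonces * NONCE_BYTES

    nonces_1tb = 4194304               # nonces in a 1 TB plot
    quarter_stagger = nonces_1tb // 4  # stagger of 1/4 of the plot
    print(ram_for_stagger(quarter_stagger) / 1024**3)  # ~256 GiB, i.e. the ~250 GB above
    ```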



  • So why do we have to guess how to fill a drive all the time? Someone write down what numbers to use.

    For a 2 TB drive, use

    address_0_7208960_4096



  • @Burstde 2 TB uses 8388608 nonces.

    You can find the number by using this formula in WolframAlpha: http://www.wolframalpha.com/input/?i=TB+%3D+x*256%2F(1024^3)
    Put the number of terabytes where TB is. 100 GB = 0.1 TB.
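
    The same formula in code, if that is easier than WolframAlpha (the function name is just illustrative; it inverts TB = x*256/(1024^3) for x):

    ```python
    NONCE_KB = 256  # one nonce is 256 KiB

    def nonces_for_tb(tb):
        """Number of nonces that fit in `tb` terabytes (1024-based units)."""
        return int(tb * 1024**3 / NONCE_KB)

    print(nonces_for_tb(2))    # 8388608, matching the 2 TB figure above
    print(nonces_for_tb(0.1))  # ~419430 nonces for 100 GB
    ```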


  • admin

    @Burstde Take the drive size in bytes, divide by 262144, divide by the stagger size, round down to an integer (e.g. 383.262 becomes 383), then multiply by the stagger. That's the number of nonces required to fill the free space.
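
    The same procedure as a small sketch (names are just illustrative):

    ```python
    NONCE_BYTES = 262144  # 256 KiB per nonce

    def nonces_to_fill(free_bytes, stagger):
        """Largest multiple of `stagger` worth of nonces that fits in the free space."""
        return (free_bytes // NONCE_BYTES // stagger) * stagger

    print(nonces_to_fill(2 * 1024**4, 4096))  # 2 TB free, stagger 4096 -> 8388608
    ```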



  • @FrilledShark Nice thanks



  • @Burstde Like @Haitch just said, make sure the number of nonces is a multiple of the stagger.

    Estimated nonces, divided by stagger -> round down to an integer -> multiply by stagger.


  • admin

    @FrilledShark You are right, optimized is best! ... However, I think there is still a huge difference between a stagger of 4096 (optimization needed) and one above 200000 ... so if someone has maybe 64 GB/128 GB of memory in their PC and ends up with fewer than 10 drive seeks, that plot doesn't need to be optimized, I guess, even if it is not perfect.