I reformatted and changed the stagger size. Results tomorrow after the plot finishes!



  • Hey. I asked around a bit and did some research, and what I gathered (and people tend to agree on) is that stagger size can affect your mining read time, something I've been having issues with. I have a 3 TB drive that was originally plotted with a 2048 stagger. It would take 50 seconds to read my plots every round at 14 MB/s over USB 3.0, and nobody I asked could come up with a solution, though everyone agreed that speed seemed slow. So I decided to experiment and changed my stagger from 2048 to 49,152. My rig has 16 GB of RAM total, and with Windows and a bit of wiggle room I settled on 12 GB of RAM for my stagger size number (I'm not an expert, so I hope this is right, though Windows Task Manager shows the program using 13.16 GB of RAM?!), since a 4096 stagger is 1 GB of RAM, I believe. I'm hoping this will help me in the long run as I get more drives, and I will also try optimizing later on as well. Plotting has 14 hours left... 13.2k nonces per minute... it was 16k, so I guess a larger stagger affects plot speed? Thanks, and I appreciate any info since I'm new to all this and, for all I know, I might have just wasted my time! But if it helps now, it might save me down the road, and with your input I'll learn more!
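
    (A note on the arithmetic here: each Burst nonce is 256 KiB on disk (4096 scoops x 64 bytes), so a plotter's stagger buffer needs roughly stagger x 256 KiB of RAM. That is where the "4096 stagger is 1 GB" rule of thumb comes from, and it puts a 49,152 stagger at exactly 12 GiB; the extra ~1 GB that Task Manager reports is presumably the plotter's own overhead on top of the buffer. A quick sanity check in Python:

        NONCE_SIZE_BYTES = 4096 * 64          # 262,144 bytes = 256 KiB per nonce

        def stagger_ram_gib(stagger: int) -> float:
            """Approximate stagger-buffer size in GiB for a given stagger."""
            return stagger * NONCE_SIZE_BYTES / 2**30

        print(stagger_ram_gib(4096))     # 1.0  -> the "4096 stagger is 1 GB" rule
        print(stagger_ram_gib(49152))    # 12.0 -> the 12 GB buffer described above
    )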



  • Larger stagger should read faster, if only because there are fewer seek operations on the disk. Ideally you could run an optimizer program (the dcct tools, etc.) on the plot to do that for you, but you'll need enough free space for both the original and the optimized plot.
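
    (Rough numbers on why the stagger matters so much: a POC1 plot stores nonces in stagger-sized groups, and the miner reads one scoop per round, so each round needs on the order of nonces / stagger seeks into the file. A small sketch of that, assuming 256 KiB per nonce and treating the 3 TB drive as one plot file:

        NONCE_SIZE = 4096 * 64                           # bytes per nonce (256 KiB)
        plot_nonces = 3_000_000_000_000 // NONCE_SIZE    # ~11.4M nonces in ~3 TB

        def seeks_per_round(nonces: int, stagger: int) -> int:
            """Rough seek count to read one scoop from a POC1 plot file:
            the scoop's data is contiguous within each stagger group."""
            return -(-nonces // stagger)                 # ceiling division

        print(seeks_per_round(plot_nonces, 2048))         # ~5,600 seeks per round
        print(seeks_per_round(plot_nonces, 49152))        # ~230 seeks per round
        print(seeks_per_round(plot_nonces, plot_nonces))  # 1 (fully optimized)

    Seek time isn't the whole story, but it lines up with the big read-time drop reported later in the thread.)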

    If you allocate too much RAM to the plotting or optimization process, you can run out and force the computer to use swap space instead, which can slow things down a LOT.

    Just something to keep in mind.

    My process, if I had one 3 TB drive and a random amount of free space on another disk:

    1. Plot as much as you can fit to the spare space.
    2. Optimize the plot onto the 3 TB drive.
    3. Repeat until the 3 TB drive is full.

    If the spare drive only has 100 GB free, that's 30 files on the 3 TB, which might not be any better than leaving it unoptimized. But if you have 500 GB or 1 TB free elsewhere, you'd be down to 6 or 3 files, and that should be pretty speedy!

    Ideally you'd want one fully optimized file per disk, where the stagger size equals the number of nonces, but you do what you can with what you have.
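
    (To put numbers on that trade-off: the count of optimized files you end up with is roughly the plot size divided by the staging space you can optimize through at a time, rounded up. A quick sketch with the drive sizes mentioned above:

        import math

        PLOT_GB = 3000   # the ~3 TB drive being filled

        def optimized_file_count(staging_free_gb: int) -> int:
            """Files you end up with if each plot-then-optimize pass is
            limited to staging_free_gb of free space on the other disk."""
            return math.ceil(PLOT_GB / staging_free_gb)

        for free_gb in (100, 500, 1000, 3000):
            print(free_gb, "GB free ->", optimized_file_count(free_gb), "files")
        # 100 GB -> 30 files, 500 GB -> 6, 1000 GB -> 3, 3000 GB -> 1
    )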



  • Thanks! Yeah, good info there. I like creating 1 or 2 large files per drive, which people say can be risky: if one fails or gets corrupted, redoing it can be time-consuming, but I've had no issues yet with my HDD. When I get my second drive I will optimize and then post the results here. Still about 4 hours until my plot is done. Excited to see how much time it knocks off the 50 seconds. My stagger was only 2048 before and now it's 24x larger 🙂 I will be disappointed if it only knocks a few seconds off hahaha.



  • @kevmachine If you intend to replot, I would not mess about with any intermediate stagger sizes; just replot with XPlotter and create a fully optimised file. You will then know that you have the best plot possible.

    Rich


  • admin

    @kevmachine What @RichBC said - use XPlotter to create an optimized plot.



  • @haitch OK, I will keep that in mind, thanks for the info. Also, is it too late to get seriously into Burstcoin? I read that most of the coin has already been mined and soon it will be finished.



  • The results are in, and I am very happy with them! My previous stagger size was 2048; after reformatting and plotting with a new stagger size of 49,152, I now get a read time of 14.2 seconds for 3 TB. That's wayyyyy better than my previous 50 seconds, and it's not even optimized yet.



  • @kevmachine Thanks for posting this.
    How has the change to a stagger of 49,152 worked out for you over time? Is the miner stable with such a high stagger? Which miner are you using?

    I see that you were maxing out your system RAM (16 GB) with this value. I have 32 GB of RAM and was thinking of attempting a 64,000 stagger.
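
    (Same back-of-the-envelope arithmetic as above, in case it helps: at 256 KiB per nonce, a 64,000 stagger implies roughly a 15.6 GiB buffer, which would leave headroom in 32 GB for Windows, assuming the plotter only needs the stagger buffer plus some overhead:

        NONCE_SIZE_BYTES = 4096 * 64                  # 256 KiB per nonce
        print(64_000 * NONCE_SIZE_BYTES / 2**30)      # ~15.6 GiB for a 64,000 stagger
    )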