Mining Grin with OCTOMINER X9 PRO & Integrated Intel 3855u CPU

I have seen references to low-end CPUs bottlenecking in grin. Do you guys think I’ll run into issues mining grin with an OCTOMINER X9 Pro with its integrated Intel Celeron 3855u CPU?
(Here’s a link to the original 8-card OCTOMINER, which is out of stock, although the 9-slot variant is available.)

That 3855u has only 2 cores at 1.6 GHz. Are 8 or 9 graphics cards going to bottleneck this CPU when mining grin?

What will be the difference for, say, a 1060 vs a 1080ti in terms of CPU requirements for GPU mining grin?

What about an OCTOMINER rig with 8 or 9 Sapphire Radeon RX 570 16GB grin-oriented mining cards, or other large-VRAM cards suited to GPU mining the ASIC-friendly algorithm? Will these increase CPU requirements?

I was considering the riserless OCTOMINER platform for its easy setup and simplicity, as well as great customer support, but the low-end integrated CPU has me second-guessing this board. Any advice is greatly appreciated!

Hello @UeliSteck,
As for NVIDIA, I can’t say from experience, but I do know most miners have an option to use or not use the CPU to assist in the mining itself… so you are probably good to go (NOT ideal, though!).
As for AMD, I was mining with 13 Sapphire RX 580 8GB cards on a board with a Pentium G4400. I had 3 rigs like that with no issues; I was using MinerBabe’s KBMiner and pulling ~0.2 GPS in C31…
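If that ~0.2 GPS figure is per card (the wording is ambiguous; scale down accordingly if it was per rig), the rig-level throughput works out like this:

```python
# Quick sanity check of the throughput described above.
# ASSUMPTION: ~0.2 graphs/second (GPS) per RX 580 on Cuckatoo31 (C31),
# per card; actual rates vary with miner software and clocks.
cards_per_rig = 13
gps_per_card = 0.2
rigs = 3

rig_gps = cards_per_rig * gps_per_card
total_gps = rig_gps * rigs

print(f"Per rig: {rig_gps:.1f} GPS, fleet total: {total_gps:.1f} GPS")
```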

So my guess is, you ARE going to be able to mine, but not the ideal setup!

If you go ahead and do it, tell us your results :wink:

I guess the crux of my post is:

  1. Do higher end graphics cards increase CPU load / minimum requirements? (obviously yes)

  2. Does having more memory capacity on a graphics card increase CPU load / minimum requirements compared to the same card with less memory? (probably yes?)

  3. How does this all apply to grin? My understanding is that the computations in grin’s mining algorithms are more memory-intensive than those of pretty much any other blockchain: make the miners do extra work now, and get better cryptography and blockchain-size scaling at the expense of live transaction volume.

If both 1) and 2) are true, then 3) suggests that the low-end CPUs other cryptocurrency miners have traditionally used would need upgrading to mine grin efficiently.
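For what it’s worth, grin’s proofs-of-work (Cuckaroo29 and Cuckatoo31+) are Cuckoo Cycle variants: the miner generates a huge pseudorandom bipartite graph and searches it for a 42-cycle, which is why VRAM capacity and bandwidth, not raw compute, dominate. A toy sketch of the edge-generation step, with Python’s blake2b standing in for the siphash-2-4 keyed hash the real algorithm uses, and the edge count scaled way down (real C31 uses 2^31 edges):

```python
import hashlib

# Toy illustration of Cuckoo Cycle edge generation (NOT real mining).
# ASSUMPTIONS: blake2b replaces siphash-2-4, and EDGE_BITS is tiny;
# the point is only to show the memory-heavy access pattern.
EDGE_BITS = 12                 # real Cuckatoo31 uses 31
NUM_EDGES = 1 << EDGE_BITS
NODE_MASK = NUM_EDGES - 1

def node(header: bytes, edge: int, uorv: int) -> int:
    """Pseudorandom endpoint of an edge in the U (0) or V (1) partition."""
    h = hashlib.blake2b(header + edge.to_bytes(4, "little") + bytes([uorv]),
                        digest_size=8).digest()
    return int.from_bytes(h, "little") & NODE_MASK

header = b"example-block-header"
# Materializing the full edge list is what eats memory: at 2^31 edges,
# even a few bytes per edge runs into many gigabytes.
edges = [(node(header, e, 0), node(header, e, 1)) for e in range(NUM_EDGES)]
print(f"generated {len(edges)} edges; first: {edges[0]}")
```

The cycle-finding phase then makes scattered, random-access passes over that edge set, so the GPU does memory traffic while the CPU mostly just dispatches work and submits shares.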

Specifically, I am worried that 2 cores at 1.6 GHz are not enough threads and clock speed to be useful for mining grin at all. Or maybe it will be fine for a rig of nine 1060 6GB cards, but I won’t be able to upgrade the same rig to, say, 1080 Tis because of a CPU bottleneck. However, if someone can explain exactly why that’s not the case, or give me some anecdotal evidence that it worked for them, maybe I would still go with OCTOMINER. At this point, I’m thinking I’ll have to go with a traditional riser-style mobo where I can pick a newer quad-core CPU and still be able to upgrade the CPU if necessary in the future. Bad move for these riserless board makers to go with such weak integrated CPUs.

Well, you still have options like this one:

where you could just populate the RAM and CPU as you like.