Grin Mining Hardware

Hi everyone. Excited about the Grin launch!

Haven’t been mining anything since 2013, so it would be a fun hobby to start mining GRIN on January 15th.

What are everyone’s hardware recommendations for Grin mining?

Would you like to share the specs of your rigs?

Happy mining, hope you have a great one


CPU: Core i3-8100 or any 4-core CPU, with 4–8 GB of RAM.
GPU: the GTX 1080 Ti is the best choice right now.

Grin is the only coin that requires CPU+GPU for mining; others just use the GPU.

I will happily accept PRs that do cycle finding as efficiently on the GPU side…
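For readers wondering what the CPU is actually doing here: after the GPU trims away most edges, the CPU searches the surviving edges for a 42-cycle, which is the Cuckatoo31 solution. Below is a toy sketch of that kind of search, not Grin's actual solver: edges are inserted into a structure of parent pointers, and an edge whose endpoints already share a root closes a cycle.

```python
def path(parent, u):
    """Follow parent pointers from u to the root; return the node list."""
    p = [u]
    while u in parent:
        u = parent[u]
        p.append(u)
    return p

def insert_edges(edges):
    """Insert edges one by one; yield the length of each cycle closed.

    The two endpoint sets must use disjoint labels (the graph is
    bipartite, as in Cuckatoo): here, evens on one side, odds on the other.
    """
    parent = {}
    for u, v in edges:
        pu, pv = path(parent, u), path(parent, v)
        if pu[-1] == pv[-1]:
            # Same root: this edge closes a cycle. Strip the common
            # tail of the two paths to find where they join.
            while pu and pv and pu[-1] == pv[-1]:
                pu.pop()
                pv.pop()
            yield len(pu) + len(pv) + 1  # cycle length in edges
        else:
            # No cycle: re-root u's tree at u, then hang it off v.
            for i in range(len(pu) - 1, 0, -1):
                parent[pu[i]] = pu[i - 1]
            parent[u] = v

# A 4-cycle among nodes {0, 2} (one side) and {1, 3} (the other):
edges = [(0, 1), (2, 1), (2, 3), (0, 3)]
print(list(insert_edges(edges)))  # -> [4]
```

This path-following is pointer-chasing with unpredictable memory access, which is why it suits CPUs and is hard to port efficiently to GPUs.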

Ah OK, maybe someone smart can figure out how to offload it to the GPU.
As of now, mega farms are out, because they are usually running 8 GPUs on Celeron CPUs.

They can also increase ntrims, the number of trimming rounds, to vastly reduce the cpu workload…
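For illustration only: in grin-miner, plugin settings live in the TOML config, roughly as below. Whether your particular plugin build exposes an `ntrims` knob, and its sensible range, depends on the plugin version, so check its own documentation; the plugin name and values here are placeholders.

```toml
[[mining.miner_plugin_config]]
# plugin name depends on your build (lean/mean, CUDA/OpenCL, C29/C31)
plugin_name = "cuckatoo_lean_cuda_31"

[mining.miner_plugin_config.parameters]
device = 0
# more trimming rounds = more GPU work, fewer surviving edges,
# and therefore less CPU-side cycle finding (value is illustrative)
ntrims = 128
```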

Is it true that we need a 4-core CPU as a minimum, or can a dual-core do the job? I have an Intel G4400 @ 3.3 GHz. Will it work with C31, or do I need to buy another CPU? Thanks… getting everything ready for testing.

Do I need to change my CPU if I want to mine C31 with 1080 Tis? I have a dual-core Intel @ 3.3 GHz.

Please help me. Thanks

Your CPU should be able to support a couple of GPUs. Why don’t you try it on floonet?

So a Celeron G3900 seems to be fine with up to three 1080 Tis, or four 1080s/1070s, on the Windows miner. Beyond that it starts to throttle at 100% CPU load. If you’re running just one GPU, any CPU will work.
The Linux miner has a config option to reduce CPU usage even further.

I have 12 x 1080 Ti.
So to run them all without hitting 100% CPU usage, I have to buy a CPU with more cores, right? The maximum for the Intel 1151 socket is 8 cores, so will a 6- or 8-core do for 12 GPUs? Thank you very much for the responses, everybody.

On Linux, sure, but do you want to reduce the CPU usage? Doesn’t it reduce graphs/s as well?

My six 1080 Tis and a Core i3-8100 sit at 65% CPU usage. I’m not sure it scales linearly to 12 GPUs, so you’re probably fine with a higher-clocked 4-core like a Core i5-7500, or a Core i7-7700 if you’re paranoid enough.
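If the load did scale linearly with GPU count (which, as noted above, is not guaranteed), a back-of-the-envelope estimate from that 6-GPU measurement:

```python
# Measured above: six 1080 Tis load an i3-8100 to 65%.
gpus_measured, load_measured = 6, 0.65
per_gpu = load_measured / gpus_measured  # fraction of an i3-8100 per GPU

# Linear extrapolation (an assumption, not a measurement):
for gpus in (6, 12):
    print(f"{gpus} GPUs -> ~{per_gpu * gpus:.0%} of an i3-8100")
```

By this estimate 12 GPUs would need roughly 130% of an i3-8100, which is why a faster or higher-core-count CPU comes up for 12-GPU rigs.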

The thing is, soon some professional CUDA coder will figure out how to offload the CPU task to the GPU efficiently, and then higher-end CPUs will be redundant.

Tromp, such an optimization is possible, isn’t it?

I expect so.
And I hope to see some GPU cycle chasing code before long.
The current hybrid solving method is admittedly somewhat ugly…

So from this thread, am I to assume that the OCTOMINER, with its integrated Intel Celeron 3855U, will bottleneck at the CPU if I try to mine Grin with all the graphics card slots filled?