Best GPU for mining Grin

Been playing with one myself - seems great!

Hey @Alpha_Miner, any updates on your 4GB RX 570 mining rig now that CNY is over? Would love to see what magic sauce you have to get 6-9 GPS. Post your mining logs.

While you guys are thoroughly engaged in discussion about the high-end video card market, you fail to take into account one simple reality: most people cannot afford those cards. As a result, you neglect a massive chunk of usable hash rate that comes from older-generation GPUs still deployed at scale in mining farms and cloud facilities.

If Grin wants to remain truly secure and decentralized, a lot of focus needs to go into long-term support and compatibility for the low-end GPU market; this market comprises more than half of the world’s GPU hash power.

Most of the hash rate comes from Chinese mining farms that are still on RX 470/480/P104/P106 cards with 4-6GB of RAM, from global farms that prioritize economy, and from cloud mining services scattered all around the world. By creating memory requirements that are simply out of reach for many, you are effectively blocking out a huge chunk of that decentralized validation power.

In a way, Grin is coming off as elitist and selective about who can participate in validating its blockchain. It kind of defeats the purpose of decentralization if only wealthy investors can partake in mining…

While most people with low-end GPUs are begging Grin developers to implement proper support for 4-6GB cards… you are talking about the RTX 2080 Ti and cards with 16-24GB of RAM that absolutely NO ONE* can afford! It goes to show where the priorities are in this project.

This post is written for Grin and mining software developers to read.

*Footnote: When I say “no one”, I’m referring to large portions of the world’s population. It’s figurative and not literal, but it gets the point across.

Perfectly said.
Change Roo29 to Roo27 and let everyone mine. ASICs will not get created, and decentralization and network security will be higher.

1 Like

The Sapphire RX 570 16GB model offers the lowest-cost large-buffer card for C31 mining, with ePIC Boost designed specifically for the card, averaging 0.47 GPS on C31 at a total system power draw of about 220 watts.

How is that profitable?
0.47 GPS ≈ 0.097 Grin per day at 220 watts.
One 1080 Ti earns 0.2 or more Grin per day at the same power consumption.
You can now buy cheap second-hand 1080 Ti cards.
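
To make the comparison concrete, here is a minimal back-of-the-envelope sketch in Python of the per-watt arithmetic, using only the figures quoted above; the electricity price is my own placeholder assumption, not a number from this thread:

```python
# Rough earnings-per-energy comparison of the two setups quoted above.
# Everything except the Grin/day and wattage figures is an assumption.

ELECTRICITY_USD_PER_KWH = 0.10  # assumed price, adjust to your own

def daily_summary(name, grin_per_day, watts):
    kwh_per_day = watts * 24 / 1000.0
    grin_per_kwh = grin_per_day / kwh_per_day
    power_cost = kwh_per_day * ELECTRICITY_USD_PER_KWH
    print(f"{name}: {grin_per_day:.3f} Grin/day, {kwh_per_day:.2f} kWh/day, "
          f"{grin_per_kwh:.4f} Grin/kWh, ${power_cost:.2f}/day in electricity")

daily_summary("Sapphire RX 570 16GB (0.47 GPS on C31)", 0.097, 220)
daily_summary("GTX 1080 Ti", 0.200, 220)
```

On those numbers the 1080 Ti earns roughly twice as much Grin per kWh at the same power draw.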

Can somebody please help me understand how an 11GB card does better than the Sapphire card that has 16GB? @Sapphire_Ed

  • Once you have enough memory to solve the problem, additional memory does not help you much.
  • NVIDIA cards can compute SipHash faster than they can read it from memory, so you don’t need the whole 16GB. It’s likely the opposite: the less memory you use, the faster you can feed those compute units.
  • The Sapphire card has 224GB/s of memory throughput vs. 616GB/s for the 2080 Ti. If the miner plans to rely on memory while running at 36% of the bandwidth, then it can deliver at most 36% of the hashrate (see the sketch below).
  • I’m also guessing that the AMD version of the miner is not that optimized yet. The original miner was implemented in CUDA, which makes it useless on AMD silicon. The ePIC miner has to re-implement the whole thing, and it doesn’t seem that mature yet. I expect it to improve over time.
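
To put the bandwidth point in numbers, here is a tiny Python sketch. It only uses the throughput figures quoted above and assumes the solver is purely memory-bandwidth-bound, which is a simplification:

```python
# Point 3 above in numbers: if the solver is limited by memory bandwidth,
# the RX 570 can deliver at most bandwidth_ratio of the 2080 Ti's graph rate.

RX570_BW_GBPS = 224.0      # Sapphire RX 570 16GB (from the post above)
RTX2080TI_BW_GBPS = 616.0  # RTX 2080 Ti (from the post above)

bandwidth_ratio = RX570_BW_GBPS / RTX2080TI_BW_GBPS
print(f"RX 570 bandwidth is {bandwidth_ratio:.1%} of the 2080 Ti's,")
print(f"so a purely bandwidth-bound miner tops out at ~{bandwidth_ratio:.0%} "
      f"of the 2080 Ti's graph rate.")
```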
3 Likes

Since this thread is called “Best GPU for mining Grin”, let me give my thoughts on this. The opinions are my own, based on my prototyping of a solver. The question I’m going to discuss is: “If you had the capacity to write ideal software for any GPU, which GPU would you write it for?”

Why this question? One of the drawbacks of the Cuckatoo/Cuckaroo PoW seems to be that optimizing for a particular piece of hardware or memory size is extremely important, much more so than with other PoWs. You would approach the problem differently with 11GB vs. 9GB vs. 8GB, you need to take memory throughput into consideration, and so on. I don’t like this aspect of the PoW at all. Many people are left out simply because they happen to own hardware for which no optimal software has been implemented.

For Cuckaroo29:

  • 5GB (or more) of RAM and maximize memory throughput

For Cuckatoo31:
There are two feasible approaches:

  • Memory-heavy approach: 18GB (or more) of RAM and maximize memory throughput
  • Balanced approach: 9GB (or more) of RAM and balanced compute vs. memory throughput. “Balanced” is not precisely specified here, but the 2080 Ti seems to have a good ratio between compute and memory throughput, and I assume many other GPUs do too.

In addition, it seems beneficial to maximize the shared memory size on the GPU multiprocessor. In this regard, NVIDIA RTX cards with 64kB are superior to many GTX cards with 48kB.

One final note: if a card does not have the optimal memory size but has at least 82% of it (or 100*(1-1/(2*e))% to be precise), then I believe it can be made to mine at around 90% efficiency. I think 8GB cards could then be used for C31 mining.
You would do this by doing the first round of pruning twice, each time with half of the edges.
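
For anyone who wants to check that 82% figure and what it implies for the two C31 approaches above, here is a minimal Python sketch; the ~90% efficiency claim is my estimate from above, not something this snippet proves:

```python
import math

# Threshold from the note above: a card with at least 1 - 1/(2e) of the
# optimal memory size can still mine at around 90% efficiency by running
# the first round of pruning twice, each time with half of the edges.
threshold = 1.0 - 1.0 / (2.0 * math.e)
print(f"threshold = {threshold:.4f} (~{threshold:.0%})")

# Applied to the two C31 approaches listed above:
for name, optimal_gb in [("memory-heavy", 18.0), ("balanced", 9.0)]:
    print(f"{name}: >= {optimal_gb * threshold:.2f} GB instead of {optimal_gb:.0f} GB")

# The balanced approach then fits in about 7.35 GB, which is why
# 8 GB cards could plausibly be used for C31 mining.
```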

If I wanted to implement an ideal C31 solver, I would prefer the balanced approach with the 82% trick, and I would do it for the RTX 2070/2080. Too bad I already have some 2080 Tis…
Radeon VII looks promising for C32 with its high-throughput HBM2 memory. One TB per second FTW.

For C32 I would love to have a GPU with 128kB of shared memory, but I don’t see that happening any time soon.

2 Likes

Dzetkulic, the ePIC Boost team is working hard on optimizations and has even begun preparations for the upcoming C32 branch.

2 Likes

I own 2 RTX TITAN cards with an NVLink bridge. If anyone would like to know how something works on them, like hashrate or whatever, I have plenty of free time. My internet is 900/45 cable 3.1.
My system is: Intel 9800X, 64GB DDR4, platinum X299.

1 Like

That would be the ultimate killer on C32!

I have 8 GTX 1060 3GB cards and I’m mining Beam with them. A high memory requirement is simply a way to keep the algorithm competitive; do you think Grin would be profitable if you introduced all that hashing power into the network? Ethereum and Ethereum Classic can’t be mined with 3GB cards, Monero gets higher hashrates on cards with more memory, Equihash algos prefer NVIDIA, and CryptoNight algos prefer AMD. Do you consider that elitism as well?

And another thing: when Ethereum was at its peak, people were mining it with 6GB+ cards whose extra memory was not being used, and this was at a time when memory prices were insane. An algorithm that actually uses that memory was long overdue. It’s also ridiculous to consider mining farms “the little guy”, because guess what: most people can’t afford 1000 RX 480s. Farms are the main source of centralization, and making an algorithm that requires something different is exactly how you decentralize.

6GB 1060s and 1070s are not high end. In fact, for the longest time the P104-100, and currently the P102-100, were/are out of reach for the average person; that’s elitism, that mining cards were impossible for the average miner to get for so long. If you have 4GB cards, you will have to use Linux or Windows 7. Complaining that your 4GB cards can’t mine Grin on Windows 10 is a pretty first-world problem, dude, if you can even call it a problem.

Like many others, I started mining knowing nothing. I had to learn how to flash a BIOS, run mining software, which cards to get, which risers to get, how to optimize the cards by undervolting, etc. Most miners can adapt, and it’s great that the memory we paid for (because 4GB RX cards were out of stock) finally has a purpose. I don’t care about Chinese mining farms; I care about small miners who put a lot of time, money, and work into their rigs, and now that we have options I’m not going to complain that one of them requires more memory. Unlike the huge farm owners, I am not wealthy; I invested in better cards over time. God forbid I have to tweak the settings to mine newer, more profitable projects; it’s not like I’ve done that so many times I’ve lost track.

A large memory buffer no longer correlates with higher-end cards, but it’s nice to know that there is now value in it.

Hi Tromp!

I’m pretty new to mining. Right now I’m mining Grin with an RTX 2080 Ti using the GrinGoldMiner software. The thing is that only 1% of the GPU’s capacity is used, so I have an average secondary hashrate of 5.62 (primary hashrate is zero).

I’m running one instance of GrinGoldMiner.

How do I gear up this setup?

Please can you give me some tips?

Thanks a lot!

PS: of course, any help from anyone reading this is very much appreciated!

Use Bminer:
https://www.bminer.me/releases/

1 Like

Hi, please tell me more about the V100. Is it any good or worth it on Cuckatoo31? And in your opinion, which coin or algorithm suits its specs best? Thanks in advance.

1 Like


Hello, is an NVLink bridge preferable, or is it better to use two separate cards?

(I have a system with 2 RTX 2080 ti cards)

What mining software do you use? And are you running on C31?

thx

1 Like

If you have both cards mounted in one system (on one motherboard), there is no need for an NVLink bridge. I would suggest Gminer; you can run either C29 or C31.

I have a rig with 4x 2080 Ti, assembled especially for GRIN when its price was >$10…
I use bminer. You can mine the C29 or C31 (preferable IMHO) algos with it. You can also use an SSL connection to the mining pool.

This rig with 12x RTX 2080 running the CentOS operating system mines 48-56 GRIN per day.