Bounty - 1 BTC + 100 Grin for a MacOS M1 C32 Open Source Miner with ≥0.5 Gps

This is an unserious offer. Not an official bounty, not from any dev or community fund, and not guaranteed by anyone including me.

I am casually, temporarily, offering 100 grin for an open source C32 grin miner that can run on an M1 Mac at anything greater than 0.02 graphs per second (Gps).

No expectations of fulfillment, just throwing it out there for fun in case anyone is capable and bored.

Here’s a more serious offer:

I am offering 1 BTC for an open source Mac Studio (M1 Ultra) Cuckatoo32 miner achieving 0.5 Gps.

What is the guarantee that I will be paid? :thinking:

As I said, this is a casual bounty, no guarantees :stuck_out_tongue:

But I assume you’re asking about Tromp’s offer… I’ll let him speak for himself.

What more do you want than my word?

Tromp is one of the most important people in the Grin community, he’s not a random stranger. You’ll be rewarded 1 BTC for an open source Mac Studio (M1 Ultra) Cuckatoo32 miner achieving 0.5 Gps according to him.

Would there even be any benefit to setting up a MacOS miner? They aren’t powerful machines to begin with.

I would like to have a low-powered way to keep testnet running when no one else is mining it.

@Trinitron, maybe you can change the title to

Bounty - 1 BTC + 100 Grin for a MacOS M1 C32 Open Source Miner with ≥0.5 Gps

Now that Apple silicon is well established, I wonder if anyone has even attempted this. Could grin-miner run without any modification? If so, what would the graph rate be? I’m just aware that the maximum GPU core count and unified RAM have increased with the M2 (although you need to be well-off to afford one!).

In terms of this challenge, would it require a complete rewrite, or can grin-miner be modified?

grin-miner only has an efficient CUDA miner, but it requires a very high-end GPU with more than 16 GB of memory.
There’s also Lolliedieb’s slean miner at https://github.com/Lolliedieb/cuckoo that could be adapted to run on Apple silicon.
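
Whatever the backend, any Cuckatoo miner first has to generate edge endpoints with SipHash-2-4. Here is a minimal Rust sketch of that step, following my reading of siphash.h in tromp/cuckoo (which keys SipHash with four 64-bit words derived from a blake2b hash of the block header); the function names are mine, and the round structure and masks should be verified against that reference:

```rust
// SipHash-2-4 endpoint generation for Cuckatoo-32, per my reading of
// siphash.h in tromp/cuckoo. Verify against the reference before use.

fn sipround(v: &mut [u64; 4]) {
    v[0] = v[0].wrapping_add(v[1]);
    v[1] = v[1].rotate_left(13) ^ v[0];
    v[0] = v[0].rotate_left(32);
    v[2] = v[2].wrapping_add(v[3]);
    v[3] = v[3].rotate_left(16) ^ v[2];
    v[0] = v[0].wrapping_add(v[3]);
    v[3] = v[3].rotate_left(21) ^ v[0];
    v[2] = v[2].wrapping_add(v[1]);
    v[1] = v[1].rotate_left(17) ^ v[2];
    v[2] = v[2].rotate_left(32);
}

// SipHash-2-4 with a 256-bit key (the cuckoo variant seeds all four
// state words directly from the header-derived key).
fn siphash24(key: &[u64; 4], nonce: u64) -> u64 {
    let mut v = [key[0], key[1], key[2], key[3] ^ nonce];
    sipround(&mut v);
    sipround(&mut v);
    v[0] ^= nonce;
    v[2] ^= 0xff;
    for _ in 0..4 {
        sipround(&mut v);
    }
    v[0] ^ v[1] ^ v[2] ^ v[3]
}

// Endpoint of edge `e` on side `uorv` (0 = U, 1 = V) of the C32 graph.
fn sipnode(key: &[u64; 4], e: u64, uorv: u64) -> u64 {
    const EDGE_MASK: u64 = (1u64 << 32) - 1; // 2^32 edges, 2^32 nodes per side
    siphash24(key, 2 * e + uorv) & EDGE_MASK
}

fn main() {
    let key = [1u64, 2, 3, 4]; // placeholder key, not a real header hash
    println!("edge 0: U={} V={}", sipnode(&key, 0, 0), sipnode(&key, 0, 1));
}
```

Trimming then consists of streaming all 2^32 of these endpoints through memory repeatedly, which is why the discussion below keeps coming back to memory bandwidth.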

@tromp
What would the minimum hardware requirements be?
An Apple M1? A 16 GB GPU? Maybe an example of an existing Apple device would be nice.

The M1/M2 Ultra are the fastest chips for this and should give the best results, because of their 800 GB/s memory bandwidth.
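
To make the bandwidth argument concrete, here is a toy upper-bound calculation. The 40 bytes of traffic per edge is a pure assumption for illustration; real trimmers move an implementation-dependent amount, and real miners land well below any such bound:

```rust
// Toy upper bound: graphs per second if memory bandwidth were the only
// limit. Every number below is an assumption chosen for illustration.
fn main() {
    let bandwidth_gb_s = 800.0; // M1 Ultra spec-sheet figure
    let edges = (1u64 << 32) as f64; // C32: 2^32 edges per graph
    let bytes_per_edge = 40.0; // assumed total traffic per edge across all trimming rounds
    let gb_per_graph = edges * bytes_per_edge / 1e9;
    println!(
        "~{:.0} GB moved per graph -> at most {:.1} Gps",
        gb_per_graph,
        bandwidth_gb_s / gb_per_graph
    );
}
```

With these made-up numbers the 800 GB/s chip would cap out near 5 Gps, so the bounty’s 0.5 Gps target is mostly a question of implementation efficiency rather than raw bandwidth.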

@tromp mentioned the Mac Studio, but it can work on a MacBook too; the question is temperature :slight_smile:

Okay, thanks for the response. I don’t know Mac devices very well, but I saw the M1/M2 Ultra is quite expensive, around $3-5k :face_with_diagonal_mouth:

I would cut the edges with Rust. But for a PoC I won’t invest that much and would use my own hardware.

Not as expensive as the upcoming H200 GPU from Nvidia… it will have 4.8 TB/s of memory bandwidth and should give great results on Cuckatoo; the H100 with 3.35 TB/s costs $25-40k :crazy_face:

Insane, but I see that without an ASIC it takes a lot of power (heat) to find a graph. Anyway, a good edge trimmer is not impossible.

And the research work is well paid on success :slight_smile:

Hardware setup for the PoC:

  • AMD Threadripper 2950X, water cooled
  • 32 GB RAM at 2666 MHz
  • GTX 1060

Road Map:

  • Review existing work
  • Graph theory / algorithm design
  • Rust implementation
  • Stratum support for connecting to testnet (see the sketch below)
  • Evaluation / optimization
  • Hopefully enough time alongside a full-time job
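
For the Stratum item above: Grin nodes speak a JSON-RPC-style stratum protocol. Below is a sketch of building a solution submission; the field names follow my reading of the Grin stratum documentation and should be verified against a live node, and serde_json is assumed as a Cargo.toml dependency:

```rust
// Sketch of a Cuckatoo-32 solution submission over Grin's stratum
// protocol (JSON-RPC over TCP). Field names per my reading of the Grin
// stratum docs; verify against a real node before relying on this.
use serde_json::json;

fn submit_message(height: u64, job_id: u64, nonce: u64, pow: &[u64; 42]) -> String {
    json!({
        "id": "0",
        "jsonrpc": "2.0",
        "method": "submit",
        "params": {
            "edge_bits": 32,     // C32
            "height": height,    // from the job the node handed out
            "job_id": job_id,
            "nonce": nonce,      // nonce that produced this cycle
            "pow": pow.to_vec()  // the 42 edge indices of the cycle
        }
    })
    .to_string()
}

fn main() {
    let pow = [0u64; 42]; // placeholder cycle, obviously not a valid proof
    println!("{}", submit_message(1_000_000, 0, 42, &pow));
}
```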

I don’t think Apple has released anything substantially more powerful than the M1 Ultra, so any MacBook or Mac Studio could be used.

@tromp For cutting edges I would use the following algorithm (a code sketch follows below):

  • Identify all nodes with a degree of 1 (on either the U or V side); a node on a 42-cycle must have degree ≥ 2
  • Remove all edges of these nodes, and the nodes with them
  • Repeat this on the newly created graph while the edge count is ≥ 42 and edges are still being removed
  • Search the remaining edges for a 42-cycle: submit the result, or poll a new nonce
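
A deliberately naive Rust sketch of that loop. The HashMap counting is far too slow for a real miner (real trimmers use flat per-side counter arrays), and the `trim` function and tuple edge representation are illustrative choices of mine:

```rust
// Naive degree-based trimming of a bipartite Cuckatoo-style graph.
// Edges are (u, v) node pairs; a real miner would enumerate them with
// sipnode() instead of taking a ready-made Vec.
use std::collections::HashMap;

const CYCLE_LEN: usize = 42;

fn trim(edges: &mut Vec<(u64, u64)>) {
    loop {
        // Count node degrees on each side over the edges still alive.
        let mut deg_u: HashMap<u64, u32> = HashMap::new();
        let mut deg_v: HashMap<u64, u32> = HashMap::new();
        for &(u, v) in edges.iter() {
            *deg_u.entry(u).or_insert(0) += 1;
            *deg_v.entry(v).or_insert(0) += 1;
        }
        let before = edges.len();
        // Keep an edge only if both endpoints could still lie on a cycle.
        edges.retain(|&(u, v)| deg_u[&u] >= 2 && deg_v[&v] >= 2);
        // Stop at a fixed point, or when a 42-cycle is no longer possible.
        if edges.len() == before || edges.len() < CYCLE_LEN {
            break;
        }
    }
}

fn main() {
    // Toy input: four edges forming a closed square, plus one dangling edge.
    let mut edges = vec![(0, 0), (0, 1), (1, 0), (1, 1), (2, 7)];
    trim(&mut edges);
    println!("{} edges survive trimming", edges.len());
}
```

The surviving edges then go to a cycle finder; only if a 42-cycle is actually present does the miner submit, otherwise it moves on to the next nonce.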