I see the CPU and GPU bounties, but where are the ASIC design bounties? Aren't those more important, even if a bit harder to judge?
I think the ASIC bounties will be available upon launch, payable in Grin:-)
Seriously, I think we can at best have FPGA bounties.
First we’d need to answer the following question:
How much SRAM do semi-affordable FPGAs have available?
That would determine the graph size for which they can compete
on bounties. I don’t think they have enough to run on 2^29 edges
(that would take 64-128MB).
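A back-of-envelope check of that 64-128MB figure (my arithmetic, under the assumption of one bit per edge for a live-edge bitmap, doubled if a second bitmap of the same size is kept for degree counting):

```python
# Rough SRAM estimate for trimming a 2^29-edge graph.
# Assumption (mine, not from the thread): one bit per edge for the
# live-edge bitmap; a second same-size bitmap doubles the total.
EDGE_BITS = 29
edges = 1 << EDGE_BITS               # 2^29 = 536,870,912 edges
edge_bitmap_mb = edges / 8 / 2**20   # one bit per edge, in MB
print(edge_bitmap_mb)                # 64.0 MB for one bitmap
print(2 * edge_bitmap_mb)            # 128.0 MB with a second bitmap
```

That is well beyond the few tens of MB of on-chip block RAM on mid-range FPGAs, which is the point of the question above.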
I’m thinking a rather different approach is possible, one that you’re evidently ignoring or have ruled out.
hmmm is the vague promise of a bounty in a currency that doesn’t exist yet good enough to code up an experiment?
What would be a good size of graph to run statistical shit on if I need to keep track of false negatives, and don’t want to have to generate 10,000 pows at 30 seconds each?
For 2^15 edges (cuckoo16 or cuckatoo15), you can run a million graphs in under an hour.
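Quick sanity check on that throughput claim (my arithmetic, not from the thread):

```python
# A million 2^15-edge graphs in under an hour works out to
# ~3.6 ms per graph, which small solvers can comfortably hit.
graphs = 1_000_000
seconds_per_hour = 3600
ms_per_graph = seconds_per_hour / graphs * 1000
print(ms_per_graph)  # 3.6
```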
What do you mean by false negatives?
I’m thinking ASICs that use heuristic-based pre-processing to cut edges could then pass off a smaller graph to a GPU.
Such a heuristic could produce false negatives if it cuts an edge that ends up being used by the true answer, but if the speedup is higher than the false negative rate, it’s worth its salt.
But shhhhh don’t tell the cuckoo devs they may steal my idea
Oh, in that sense most solvers have false negatives.
E.g. the mean solvers distribute edges over buckets,
and in the (rare) case that a bucket is filled up, additional
edges simply get dropped.
The same holds for all Equihash solvers that I know of.
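A toy sketch of the bucket-overflow behavior described above (my own illustrative code, not any actual solver; bucket sizes are tiny on purpose):

```python
# Edges are distributed into fixed-size buckets; edges landing in a
# full bucket are silently dropped. A dropped edge that belonged to
# the solution cycle is exactly a false negative.
NUM_BUCKETS = 16   # tiny numbers for illustration only
BUCKET_CAP = 10    # fixed capacity per bucket

def distribute(edges):
    buckets = [[] for _ in range(NUM_BUCKETS)]
    dropped = 0
    for e in edges:
        b = e % NUM_BUCKETS          # stand-in for the real bucketing hash
        if len(buckets[b]) < BUCKET_CAP:
            buckets[b].append(e)
        else:
            dropped += 1             # potential solution edge lost
    return buckets, dropped

_, dropped = distribute(range(200))
print(dropped)  # 40: 200 edges into 16 buckets of 10 overflow by 40
```

In real solvers the buckets are sized so that overflow is rare, which keeps the false negative rate negligible compared to the memory savings.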