Hi,
I did some research and I'm sure there is a mistake somewhere, but I have no idea where.
I found on 2miners.com that the network hash rate is 6K G/s.
I found that an RTX 1070 has a hash rate of 6 G/s, i.e. 1/1000 of the network's hashing power.
So it means that with a single RTX 1070 I could mine 60 × 60 × 24 × 365 × (1/1000), or about 30K grin a year!
At a price of half a dollar, that should mean about $15K a year…
Where is my mistake?
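The back-of-envelope arithmetic above can be written out as a short sketch. The network and miner graph rates are the figures from the post, not verified values; the emission constants are Grin's actual fixed schedule (60 grin per block, one block per minute).

```python
# Rough miner-revenue estimate. Grin's emission is fixed at 60 grin per
# block with one block per minute, so 60 * 60 * 24 = 86,400 grin/day.
GRIN_PER_BLOCK = 60
BLOCKS_PER_DAY = 60 * 24
DAYS_PER_YEAR = 365

def grin_per_year(miner_gps: float, network_gps: float) -> float:
    """Expected grin mined per year for a given share of network graph rate."""
    share = miner_gps / network_gps
    return GRIN_PER_BLOCK * BLOCKS_PER_DAY * DAYS_PER_YEAR * share

# Figures from the post: network ~6,000 G/s, one RTX 1070 at ~6 G/s (1/1000).
print(grin_per_year(6, 6000))  # ~31,536 grin/year, i.e. roughly "30K a year"
```

So the arithmetic itself checks out given those inputs; the question is whether the 6 G/s figure for a single GPU is really on the same algorithm and scale as the network number.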

OK, so I understand now.
Second question then: I found the G1 miner rated at 42 G/s and 2800 W, at a price of $25,000.
That means 0.08% of the hash rate (if the network doesn't change), so about 25K grin a year… before the cost of electricity…
Case 1: I'm still wrong.
Case 2: it will not be profitable at the prices of the last few months.
Case 3: the price will reach break-even, so at least $1 or $1.20.
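For the G1 case, here is the same kind of sketch with electricity included. The 42 G/s, 2800 W, $25,000 and ~25K grin/year figures come from the post (note that 25K grin/year matches the stated 0.08% share, which implies a network graph rate near 52.5 KG/s rather than the 6 KG/s quoted earlier); the electricity price is a placeholder assumption.

```python
# Break-even sketch for the G1 figures in the post.
POWER_W = 2800
HOURS_PER_YEAR = 24 * 365
ELECTRICITY_USD_PER_KWH = 0.10   # assumed; varies a lot by region
GRIN_PER_YEAR = 25_000           # figure from the post (0.08% share)

energy_kwh = POWER_W / 1000 * HOURS_PER_YEAR
electricity_usd = energy_kwh * ELECTRICITY_USD_PER_KWH

def yearly_profit(grin_price_usd: float) -> float:
    """Mining revenue minus electricity, ignoring the $25,000 hardware cost."""
    return GRIN_PER_YEAR * grin_price_usd - electricity_usd

breakeven_price = electricity_usd / GRIN_PER_YEAR
print(f"electricity: ${electricity_usd:.0f}/year")           # ~$2,453/year
print(f"price to cover electricity alone: ${breakeven_price:.3f}/grin")
```

Electricity alone only needs a grin price around $0.10 under these assumptions; the $1+ break-even in case 3 would have to come from amortizing the $25,000 hardware as well.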

I think you should assume there are enough miners joining to increase the hash rate. Also, it is very hard to get a G1 miner. You could instead buy the G1 mini, which has a graph rate of 1.4 G/s. The economics are the same for the G1 and G1 mini: the price per graph rate is approximately the same, and so is the power consumption per chip.
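Taking the "same price per graph rate" claim above at face value, one can back out what a G1 mini would cost relative to the thread's G1 figures. This is only an implication of that claim, not a quoted price.

```python
# Implied G1 mini price if price per graph rate matches the G1
# ($25,000 for 42 G/s, figures from the thread).
g1_price_usd, g1_gps = 25_000, 42
mini_gps = 1.4

usd_per_gps = g1_price_usd / g1_gps          # ~$595 per G/s
implied_mini_price = usd_per_gps * mini_gps  # ~$833, an implication only
print(f"implied G1 mini price: ~${implied_mini_price:.0f}")
```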