Parties that send or receive large numbers of transactions may choose to merge and cut-through transactions, which would help obscure the transaction graph.
The wallet could support and encourage this by allowing outgoing transactions to be delayed and merged according to policies set by the wallet owner.
An example policy might be “when an outgoing transaction is received, hold on to it for one hour and during that time aggregate it with all other outgoing transactions, before finally broadcasting the merged transaction to the network”.
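A minimal sketch of such a hold-and-merge policy, assuming hypothetical names throughout (`Tx`, `HoldAndMerge`, `submit`, `flush_if_due` are illustrative, not grin wallet APIs); real Mimblewimble aggregation also sums kernel offsets and applies cut-through, which is omitted here:

```rust
use std::time::{Duration, Instant};

// Hypothetical simplified transaction: real Grin transactions carry
// inputs, outputs, kernels and a kernel offset; only the parts needed
// to illustrate merging are modelled, as opaque integers.
#[derive(Clone, Debug, PartialEq)]
struct Tx {
    inputs: Vec<u64>,   // stand-ins for input commitments
    outputs: Vec<u64>,  // stand-ins for output commitments
    kernels: Vec<u64>,  // stand-ins for kernel excesses
}

struct HoldAndMerge {
    hold_for: Duration,
    pending: Vec<(Instant, Tx)>,
}

impl HoldAndMerge {
    fn new(hold_for: Duration) -> Self {
        HoldAndMerge { hold_for, pending: Vec::new() }
    }

    // Queue an outgoing transaction instead of broadcasting immediately.
    fn submit(&mut self, tx: Tx) {
        self.pending.push((Instant::now(), tx));
    }

    // Once the oldest queued tx has waited out the hold window, merge
    // everything queued so far into one transaction and return it for
    // broadcast. Cut-through (cancelling outputs later spent as inputs)
    // is omitted for brevity.
    fn flush_if_due(&mut self, now: Instant) -> Option<Tx> {
        let oldest = self.pending.first()?.0;
        if now.duration_since(oldest) < self.hold_for {
            return None;
        }
        let mut merged = Tx { inputs: vec![], outputs: vec![], kernels: vec![] };
        for (_, tx) in self.pending.drain(..) {
            merged.inputs.extend(tx.inputs);
            merged.outputs.extend(tx.outputs);
            merged.kernels.extend(tx.kernels);
        }
        Some(merged)
    }
}
```

The “one hour” policy above would just be `HoldAndMerge::new(Duration::from_secs(3600))`, with `flush_if_due` called on a timer.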
I’m sure someone else has thought of this, but I wanted to post it just in case.
Edit: Also!
Some parties might receive funds from many different people, some might send funds to many different people, and others might do both. If so, it might be worth allowing wallets to negotiate who will finalize the transaction, in order to maximize merges.
At a protocol level, a user can signal this with the kernel lock_height.
In your use cases you should also add that some mixers could just be transaction facilitators, like grinbox, and won’t be parties at all, which makes it awkward for the wallet.
Practically I’m not convinced this fits in grin. The wallet isn’t necessarily a long-running daemon and therefore can’t monitor when it should broadcast an aggregated transaction (“why wasn’t my transaction ever sent!?”). Making it the users’ job wouldn’t be great UX. And as I just mentioned, the wallet isn’t even necessarily a party.
The transaction pool, on the other hand, is in the business of holding finished transactions. But holding non-mineable transactions for free opens up an easy memory-flooding attack. It could treat transactions sent to it directly (from the wallet or the API) differently from the ones coming from the network, but that’s a bit ugly.
So ultimately I think it fits better in the service integrating the wallet. Save the transactions to a file somewhere. Cron job to pick them up and send them away (perhaps with a slightly extended Dandelion patience). Does that make sense?
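The file-plus-cron approach could be as simple as the following sketch (all names assumed for illustration; the payload stands in for a serialized transaction, and the actual broadcast step is left out):

```rust
use std::fs;
use std::io;
use std::path::Path;

// Hypothetical broadcast queue for a service integrating the wallet:
// finalized transactions are dropped into a directory as files, and a
// periodic job (e.g. cron) picks them all up for aggregation and
// clears the queue. The real payload would be a serialized Grin
// transaction; here it is treated as opaque bytes.
fn drain_outbox(dir: &Path) -> io::Result<Vec<Vec<u8>>> {
    let mut batch = Vec::new();
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_file() {
            batch.push(fs::read(&path)?);
            fs::remove_file(&path)?; // consumed: don't rebroadcast
        }
    }
    Ok(batch)
}
```

The cron job would call `drain_outbox`, aggregate whatever it finds, and hand the result to the node, perhaps with that slightly extended Dandelion patience.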
Assuming that the attacker is the NSA and will flat-out buy enough nodes that Tor and Dandelion have a failure rate, they will be able to recover the “true chain”, or a passable equivalent.
I believe that with enough white-market activity on grin, and omnipresent spyware on everything, you begin to create NP-hard but still semi-solvable logic problems. It may not be perfect knowledge, but let’s say it’s 10 potential deanonymized chains and they manpower it down from there; it’s still a potential attack vector.
Add in the fact that the NSA is willing to throw a supercomputer at taking down a drug market and then “parallel construct” a bullshit legal and technical theory of how they hunted down the host, and, well, the revolution doesn’t necessarily happen.
Taking an atomic example: Alice, Bob and Charlie are all compromised by spyware, the Five Eyes, and big tech. Alice sends Dan money for work, and files the proper tax paperwork for it; Dan buys groceries from Bob on a system that tracks and saves everything it can; the money is sent to a new change address, which the protocol calls secure, but Bob’s system knows that doesn’t do anything fancy; Dan then buys drugs from Charlie, who slipped up, got his computer files added to the database, and didn’t necessarily do everything right. Isn’t Dan, for all intents and purposes, known to have bought drugs? Maybe not enough for a court, but enough to get harassment and extra spying, in the current world we live in.
I’m unsure where the tipping point is (you’d probably have to run simulations for that), but I don’t trust other people’s computer security, and if only one source of anonymous coins touches a long chain of events, it can reasonably be assumed to have one owner, meaning I have to assume the same if coin-mixing-like systems are not on the table.
It’s like supercooled water: you need a seed before it can become ice. A transaction that is anonymous under good conditions is a great tool, but it’s not enough, since we don’t have good conditions; there needs to be a seed for the anonymity to crystallize outward, so that normies get good conditions at random and join the anonymous side of things unintentionally.
I see your point about how it might be awkward to integrate into the wallet. That said, I think it’s good to be pragmatic about such things. If it might dramatically increase mixing and privacy for everyone, it might be worth some awkward UX.
One way that it could be supported without putting extra demands on the wallet would be to let the wallet tell the node to put it into its stempool as if it had just broadcast it, but with a long timeout. This wouldn’t be much of a change on the node side, but would have the desired effect, since new transactions added to the stempool before the timeout would be aggregated.
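As a rough sketch of that node-side change (all names hypothetical, not the actual grin stempool code), each stem entry could carry its own fluff deadline, with the wallet allowed to request a long one:

```rust
use std::time::{Duration, Instant};

// Hypothetical stempool entry: each stem transaction carries its own
// fluff deadline. A wallet-submitted tx could request a long patience;
// txs relayed from peers would get the normal Dandelion patience.
struct StemEntry {
    tx_id: u64,        // stand-in for the transaction itself
    fluff_at: Instant, // latest moment this entry must be broadcast
}

struct Stempool {
    entries: Vec<StemEntry>,
}

impl Stempool {
    fn new() -> Self {
        Stempool { entries: Vec::new() }
    }

    // Add a tx with a caller-chosen patience, as the wallet could
    // request via a node API.
    fn add(&mut self, tx_id: u64, patience: Duration, now: Instant) {
        self.entries.push(StemEntry { tx_id, fluff_at: now + patience });
    }

    // On each tick, if any entry has hit its deadline, fluff the whole
    // pool as one aggregate: everything that arrived before the first
    // deadline gets merged into that broadcast.
    fn fluff_due(&mut self, now: Instant) -> Option<Vec<u64>> {
        if self.entries.iter().any(|e| now >= e.fluff_at) {
            Some(self.entries.drain(..).map(|e| e.tx_id).collect())
        } else {
            None
        }
    }
}
```

Any transaction stemmed before the long-timeout entry expires gets swept into the same broadcast, which is exactly the aggregation effect described above.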
Indeed, and it’s something we think would be really useful in the context of wallet713 and grinbox as igno mentioned. Right now, I envisage this as some sort of separate aggregator service which handles all your tx broadcasts in exchange for a (small) fee of sorts. Many different parties can point to the same aggregator, and you basically do aggregation before anything hits the pool. This way your originating IP also does not get exposed to the network itself.
I also am not sure this should be part of the core protocol layer. Instead this might be much better facilitated via services, where healthy competition ensues to provide the smartest/safest/most private service and companies and individuals can use the approach that fits their needs the best.
I think this would be a small reduction in privacy vs. one of the parties to the transaction doing it, since the aggregator and the originator would have seen the un-aggregated transactions.
So, just on the point of whether or not it belongs in the default distribution: I suspect it might be a bit like transaction batching on the Bitcoin network. My understanding is that it took a long time for major services to start batching transactions, which led to sub-optimal use of block space. If the Bitcoin Core wallet and daemon included the functionality to batch transactions, it’s likely that many more people would batch them, simply because it would be easier and wouldn’t require custom software.
By analogy, if it requires using an additional 3rd party service or software, far fewer merchants might mix Grin transactions than otherwise would if it just required a bit of configuration of the software they already have.
How about starting with a github issue on this? I think it’s a worthwhile discussion, to explore a little more what can be done within grin at least. Maybe the conclusion will end up being the same (no good fit for grin), but maybe someone else will get a better idea (perhaps something around Dandelion patience config).
The reason I’m extremely excited about this project is that potentially you don’t need user consent to give them extra privacy.
Imagine a store that asks the user to upload the signed transaction to them and them alone, then does bookkeeping on it before passing it on to the main network. Assuming an online marketplace with a natural daily shipping cycle, such a thing could likely be implemented with slightly more customer trust than BTC but still less than credit cards, and without miner network traffic, so it could maybe be done at a discount.
That’s the direction I would suggest: reward off-chain pruning and let the merchants try to figure out when they can trust someone a little extra to get a discount.