CIP-0002 is obsolete since native tokens were launched #232
Comments
I agree with those changes. This is a problem that needs to be taken care of as soon as possible.
At a recent CIP editors' meeting we agreed that issues on the CIP repository should be reviewed sequentially, just as we review PRs. I think this issue is actually the one I used as an example in that discussion. It hasn't been forgotten about, so please stay tuned as we work out a process for reviewing these issues.
p.s. I've just reviewed the details of our discussion about "issues" and I think this is one of those that's best turned into a PR with the changes above, especially with so many people showing approval of the OP with no reservations expressed. @nicolasayotte would you be willing to submit a PR for this, modifying CIP-0002 itself, and adding yourself as an author, with this issue as a reference?
@rphair Absolutely! Will do that this weekend without fail 👍
Dear @nicolasayotte

Many thanks for creating this issue! As the author of CIP-2, and as a co-author of the multi-asset UTxO selection algorithm currently used in cardano-wallet, I'd like to respond to the points you've raised.

Regarding the creation of new CIPs

As you may be aware, CIP-2 was originally intended to be an informational CIP, whose purpose was to describe the algorithm that cardano-wallet used in the single-asset era. Though as you rightly point out, CIP-2 is indeed obsolete in the context of a multi-asset UTxO blockchain. Therefore, I strongly agree that we need CIPs that describe effective selection algorithms for multi-asset UTxO blockchains.

However, my personal inclination would be to create a completely new CIP specifically for this purpose. I believe that CIP-2 has some value as a historical record of how we performed selection in the single-asset era. It might even be useful as a reference for those wishing to implement UTxO selection on UTxO-based blockchains other than Cardano.

I think it's also true that there are many potential solutions to the problem of UTxO selection for multi-asset blockchains. If we were to modify CIP-2 instead of creating a new CIP, this might create the impression that CIP-2 is the "preferred" way to perform multi-asset selection for Cardano, when in reality there may be multiple competing solutions with different advantages and tradeoffs.

So my personal preference would be to leave CIP-2 as a historical CIP, and create new CIPs as necessary. As the author of CIP-2, I would be very happy for people to reuse any part of the text of CIP-2 in a new CIP. (The original CIP was licensed under https://creativecommons.org/licenses/by/4.0/legalcode.)

With that said, I'd also like to address some of the issues mentioned:

Regarding the problems of over-selection and clustering
The multi-asset UTxO selection algorithm used in cardano-wallet already attempts to address both of these problems. In fact, since the beginning of the multi-asset era, we've gone to great lengths to minimize the problem of over-selection (selecting a greater quantity of a given asset than is "optimal"). We currently use two strategies in an attempt to minimize this problem:
Priority ordering

When selecting for a particular asset a, the algorithm selects randomly, but from prioritized subsets of the UTxO set: first from UTxOs that contain asset a and no other asset ("singletons"), then from UTxOs that contain asset a and one other asset ("pairs"), and only then from UTxOs that contain asset a together with any number of additional assets ("multiples").
In other words, the algorithm prefers UTxOs that carry as few assets other than a as possible, and only widens its search to asset-heavier UTxOs when the narrower subsets have been exhausted.
So for any given asset a, the algorithm avoids pulling in unrelated assets unless it has no other choice.

For reference, see: https://github.com/input-output-hk/cardano-wallet/blob/a854531d4e86d5053c3442721ecbecf22d57b023/lib/core/src/Cardano/Wallet/CoinSelection/Internal/Balance.hs#L1228

Round-robin processing

Another strategy we use to minimize the problem of over-selection is round-robin processing. With round-robin processing, the algorithm considers each required asset in turn, selecting at most one UTxO for that asset before moving on to the next, and removing an asset from the rotation once its required quantity has been covered.
See here for reference: https://github.com/input-output-hk/cardano-wallet/blob/a854531d4e86d5053c3442721ecbecf22d57b023/lib/core/src/Cardano/Wallet/CoinSelection/Internal/Balance.hs#L1168
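For concreteness, here is a rough TypeScript sketch of how priority ordering and round-robin processing could fit together. It is illustrative only and does not reproduce the cardano-wallet implementation linked above: the types and function names are invented for this example, and ada targets, fees and change generation are omitted.

```typescript
type AssetId = string;

interface UTxO {
  id: string;
  lovelace: bigint;
  assets: Map<AssetId, bigint>;
}

// Partition the available UTxOs into the three priority subsets:
// "singletons" (only asset a), "pairs" (asset a plus one other asset),
// and "multiples" (asset a plus any number of other assets).
function prioritySubsets(utxos: UTxO[], asset: AssetId): UTxO[][] {
  const singletons = utxos.filter((u) => u.assets.has(asset) && u.assets.size === 1);
  const pairs = utxos.filter((u) => u.assets.has(asset) && u.assets.size === 2);
  const multiples = utxos.filter((u) => u.assets.has(asset) && u.assets.size > 2);
  return [singletons, pairs, multiples];
}

// Select one UTxO at random for the given asset, preferring the narrowest
// non-empty subset. Returns undefined when no candidate remains.
function selectOneFor(asset: AssetId, available: Set<UTxO>): UTxO | undefined {
  for (const subset of prioritySubsets([...available], asset)) {
    if (subset.length > 0) {
      const pick = subset[Math.floor(Math.random() * subset.length)];
      available.delete(pick);
      return pick;
    }
  }
  return undefined;
}

// Round-robin processing: visit each required asset in turn, selecting at
// most one UTxO per asset per pass, and drop an asset from the rotation
// once its target quantity has been covered.
function roundRobinSelect(targets: Map<AssetId, bigint>, utxos: UTxO[]): UTxO[] {
  const available = new Set(utxos);
  const selected: UTxO[] = [];
  const covered = new Map<AssetId, bigint>();
  for (const a of targets.keys()) covered.set(a, 0n);

  let pending = [...targets.keys()];
  while (pending.length > 0) {
    let progressed = false;
    for (const asset of pending) {
      if ((covered.get(asset) ?? 0n) >= (targets.get(asset) ?? 0n)) continue;
      const pick = selectOneFor(asset, available);
      if (pick === undefined) continue; // no candidates left for this asset
      progressed = true;
      selected.push(pick);
      // Credit every tracked asset carried by the picked UTxO,
      // not just the one we were selecting for.
      for (const [a, q] of pick.assets) {
        if (covered.has(a)) covered.set(a, (covered.get(a) ?? 0n) + q);
      }
    }
    pending = pending.filter((a) => (covered.get(a) ?? 0n) < (targets.get(a) ?? 0n));
    if (!progressed) break; // ran out of candidate UTxOs
  }
  return selected;
}
```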
Regarding the use of fallback strategies

When selecting for ada, the multi-asset UTxO selection algorithm used in cardano-wallet still selects ada quantities randomly. Regarding Largest-First, I think we have to be slightly careful.
I do agree with you, though, that Largest-First could be considered as a fallback strategy in situations where Random-Improve has failed. However, for the algorithm used in cardano-wallet, we currently fall back in a different way: the algorithm has an "optimal" selection strategy, used by default, and a "minimal" selection strategy, used as a fallback. These two strategies differ by how much of each asset the algorithm attempts to select: the optimal strategy aims to select more than the strictly required quantity of each asset, whereas the minimal strategy aims to select just enough to cover the requirement.
The minimal strategy will tend to generate fewer and smaller change outputs than the optimal strategy, and therefore smaller transactions, making it more likely to succeed in creating a transaction that is within the transaction size limit. However, we suspect that the minimal strategy, if used repeatedly over time, is likely to cause an increase in the number of very small quantities within the UTxO set. Therefore, we currently use this strategy only as a fallback.

See here for reference: https://github.com/input-output-hk/cardano-wallet/blob/a854531d4e86d5053c3442721ecbecf22d57b023/lib/core/src/Cardano/Wallet/CoinSelection/Internal/Balance.hs#L290
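As an illustration of the fallback behaviour described above, the wiring could look roughly like this. It builds on the types and the roundRobinSelect sketch from the earlier example; the doubling factor used for the "optimal" strategy and the transaction size estimate are assumptions made purely for this sketch, not cardano-wallet's actual parameters.

```typescript
// "optimal" aims to select more than strictly required (here, double the
// target, purely as an illustrative assumption), while "minimal" aims to
// cover the target exactly.
function targetsFor(
  strategy: "optimal" | "minimal",
  required: Map<AssetId, bigint>
): Map<AssetId, bigint> {
  const factor = strategy === "optimal" ? 2n : 1n;
  const scaled = new Map<AssetId, bigint>();
  for (const [a, q] of required) scaled.set(a, q * factor);
  return scaled;
}

const MAX_TX_SIZE_BYTES = 16 * 1024; // protocol transaction size limit

// Crude, illustrative size estimate: in practice the candidate transaction
// (inputs, outputs and change) would be serialized and measured.
const estimateTxSize = (inputs: UTxO[]): number => 200 + inputs.length * 180;

function selectWithFallback(required: Map<AssetId, bigint>, utxos: UTxO[]): UTxO[] {
  // Try the more generous strategy first: it produces healthier change,
  // but may yield a transaction that exceeds the size limit.
  const optimal = roundRobinSelect(targetsFor("optimal", required), utxos);
  if (estimateTxSize(optimal) <= MAX_TX_SIZE_BYTES) return optimal;

  // Fall back to selecting just enough when the optimal result is too big.
  return roundRobinSelect(targetsFor("minimal", required), utxos);
}
```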
Limiting the sizes of created change outputs

The multi-asset UTxO selection algorithm used in cardano-wallet already has some support for this. For example, we already have support for limiting the sizes of outputs (when serialized) and partitioning them into smaller outputs if necessary. This support was introduced specifically to avoid creating change outputs that exceed the maximum size limit imposed by the protocol.

However, this support is not currently configurable in any way for users of the Cardano Wallet API. We could (in theory) make this configurable: for example, by allowing callers of the algorithm to limit the number of different assets in change outputs to a particular value, or to limit the quantity of a particular asset in any single output.

See here for reference: https://github.com/input-output-hk/cardano-wallet/blob/a854531d4e86d5053c3442721ecbecf22d57b023/lib/core/src/Cardano/Wallet/CoinSelection/Internal/Balance.hs#L2112

Libraries

I would like to share one last thing: within the Adrestia team, we currently have plans to take the multi-asset UTxO selection algorithm currently used in cardano-wallet and publish it as a standalone, reusable library.

We intend to release this library in a Haskell version at first, and then as a JavaScript library later on. The JavaScript version would be derived from the Haskell codebase, and share the same test suite. I personally also hope that after the release of this library, we could make a new informational CIP that documents our current UTxO selection algorithm in full.

If you've reached the end of this long comment, many thanks for reading through! I would be very happy to clarify any of the points raised within.
Well now, it will take me some time to address that reply. 🤣 But thanks a lot, @jonathanknowles.
Regarding the creation of new CIPs

People in the ecosystem have been referring to this CIP as gospel truth and implementing its algorithm as the only way to do coin selection in popular libraries. So if I were to improve CIP-2, I would definitely leave all the current content in, as it is still a good coin selection algorithm for UTxOs that hold only ada. Then I would add a section saying: if you are selecting among UTxOs with multiple assets, here is a good strategy.

Regarding the problems of over-selection and clustering

That priority ordering is not correct and will result in clustering. I will propose my current flow and then back it up with an example. When selecting for a particular asset a, while we do select randomly, we select from the following UTxO subsets in priority order (a code sketch of this flow appears further down in this comment):

Set 1 ("singletons"): the subset of UTxOs that contain asset a and no other asset.
Set 2 ("pairs"): the subset of UTxOs that contain asset a and one other asset.
Set 3 ("multiples"): the subset of UTxOs that contain asset a and any number of additional assets.
Set 4 ("ada only"): the subset of UTxOs that contain only ada.
Set 5 ("everything"): all UTxOs not picked yet.

You start with Set 1, then 2, then 3, but you need to jump out of Sets 1, 2 and 3 the moment you have enough of asset a, and go to Set 4 to complete the amount of ada necessary for the transaction (if you can), to avoid involuntary clustering of assets. Only if Set 4 is unable to pay for the ada requirement do you start plundering Set 5.

Example: I have:
I want to send 40 ada and 100 of asset a.
The selection algorithm as you currently describe it will give me 5 UTxOs:
The coin selection algorithm I described would send 2 UTxOs for:

Now imagine you have very many NFTs (like me, say 1000+): your algorithm, by NOT skipping to pure-ada UTxOs once it is done accumulating the total for asset a, just grabs my whole wallet at random trying to put together a little ada, while I have pure-ada UTxOs just waiting to go that would result in no additional clustering.

Regarding the use of fallback strategies

The Random-Improve article only applies to pure-ada transactions. The assumptions of the whole article do not work at all with native assets that require a carry amount in ada. This is not to say the benefits of the Random-Improve strategy are not super interesting; I read the article and approve of the conclusions. It's simply not directly applicable in a wallet with multiple assets, and it requires the use of proper subsets to maintain its quality.

If you were to run the simulation from the article with input transactions sometimes carrying native assets and a carry amount, you would see catastrophic aggregation crop up quickly. We have seen it happen to every user of Daedalus or Yoroi or any wallet other than Eternl (which implemented Token Fragmentation), where wallets become unusable because their UTxOs take more than 16KB just to write the change transaction (the --tx-out command line is close to 16KB).

Limiting the sizes of created change outputs

Well, if that is configurable and already in, it would be fantastic to offer it in the API, so that transactions created through cardano-wallet have a predisposition towards orderly UTxO outputs and maximize the potential for independent transactions (transactions using completely independent UTxOs).
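Here is a rough TypeScript sketch of the selection flow proposed above. It reuses the UTxO type from the earlier sketch; fees and the protocol's min-ada requirement are ignored for brevity, and all names are illustrative.

```typescript
// Fill the asset-a target from Sets 1-3, then jump to ada-only UTxOs
// (Set 4) for the remaining ada, and only touch other token-carrying
// UTxOs (Set 5) as a last resort.

function pickRandom(utxos: UTxO[], exclude: Set<string>): UTxO | undefined {
  const candidates = utxos.filter((u) => !exclude.has(u.id));
  if (candidates.length === 0) return undefined;
  return candidates[Math.floor(Math.random() * candidates.length)];
}

function selectProposed(
  utxos: UTxO[],
  asset: AssetId,
  assetTarget: bigint,
  adaTarget: bigint
): UTxO[] {
  const picked: UTxO[] = [];
  const pickedIds = new Set<string>();
  let assetTotal = 0n;
  let adaTotal = 0n;

  const take = (u: UTxO) => {
    picked.push(u);
    pickedIds.add(u.id);
    assetTotal += u.assets.get(asset) ?? 0n;
    adaTotal += u.lovelace;
  };

  // Sets 1-3: singletons, pairs, multiples -- but stop the moment the
  // asset target is met, even if more ada is still needed.
  const sets = [
    utxos.filter((u) => u.assets.has(asset) && u.assets.size === 1),
    utxos.filter((u) => u.assets.has(asset) && u.assets.size === 2),
    utxos.filter((u) => u.assets.has(asset) && u.assets.size > 2),
  ];
  for (const set of sets) {
    while (assetTotal < assetTarget) {
      const u = pickRandom(set, pickedIds);
      if (u === undefined) break;
      take(u);
    }
    if (assetTotal >= assetTarget) break;
  }

  // Set 4: ada-only UTxOs, to cover the remaining ada without dragging
  // in unrelated tokens.
  const adaOnly = utxos.filter((u) => u.assets.size === 0);
  while (adaTotal < adaTarget) {
    const u = pickRandom(adaOnly, pickedIds);
    if (u === undefined) break;
    take(u);
  }

  // Set 5: everything not picked yet, only if Set 4 could not cover ada.
  while (adaTotal < adaTarget) {
    const u = pickRandom(utxos, pickedIds);
    if (u === undefined) break; // the wallet cannot cover the payment
    take(u);
  }

  return picked;
}
```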
Libraries

That is very cool and interesting. Although the current cardano-wallet algorithms might address my current issue, having everyone use a high-quality Haskell library and a very accessible JavaScript library is probably the best way to standardize transaction creation on Cardano.

As a final point

All of us vending machine providers have been building our machines using cardano-cli directly, reading the UTxOs through cardano-db-sync, picking them using our own algorithms to sidestep those issues, and building and balancing transactions through our own code. When we (a lot of the people liking this issue are top-tier vending machine devs) say that this is an issue, it is because we have been working with transaction building for a year, have seen wallets fail catastrophically for the last 11 months, and have generally been working at improving UTxO selection and output balancing for a year.

Adam Dean and I came up with the Token Fragmentation that was implemented in the ccvault.io / ccwallet.io / eternl.io wallet, which fixed the NFT enthusiasts' wallets that had been clustered by every other wallet software out there, including Daedalus.

Of course I read all your comments, sir. And thank you for putting in the time,

Nick AKA fencemaker
And just to clarify, the point of the output formatting is not really to avoid hitting the 16KB limit, but to create UTxOs that are more easily consumable later. For example: if you need to send me 25,000 ada and 50 NFTs, then to increase the chances of me using those UTxOs efficiently you could send me 9 outputs:

Doing that requires a slightly larger transaction to write out, but creates 9 UTxOs which have a very high probability of being spendable independently of each other.

Let me show you a live example of me breaking up a clustered UTxO by sending it to myself in the eternl.io wallet. Here you can see the UTxO becomes 12 outputs that are all very likely to be usable independently.

The "Maximum Ada before splitting" value should be customizable, and the "Maximum amount of different tokens in a UTxO before splitting" should also be customizable. Both could have defaults.

Right now, when I split my UTxOs that way using the eternl.io Token Fragmentation, the current coin selection algorithm used in most dApps will cluster all my stuff back together with every transaction I make...
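A rough sketch of this kind of splitting rule follows, reusing the AssetId alias from the earlier sketches. The parameter names and default thresholds are illustrative only and are not Eternl's actual settings.

```typescript
// Split a value bundle (ada plus tokens) into multiple outputs so that no
// single output carries more than maxTokensPerOutput distinct tokens or
// more than maxLovelacePerOutput ada.
// NOTE: the protocol's min-ada-per-UTxO requirement is ignored for brevity.

interface OutputValue {
  lovelace: bigint;
  assets: Map<AssetId, bigint>;
}

function fragmentOutput(
  value: OutputValue,
  maxTokensPerOutput = 10,
  maxLovelacePerOutput = 500_000_000n // 500 ada, illustrative default
): OutputValue[] {
  const outputs: OutputValue[] = [];

  // First, bucket the tokens into groups of at most maxTokensPerOutput.
  const entries = [...value.assets];
  for (let i = 0; i < entries.length; i += maxTokensPerOutput) {
    outputs.push({
      lovelace: 0n,
      assets: new Map(entries.slice(i, i + maxTokensPerOutput)),
    });
  }

  // Then spread the ada: give each token output a share, and split any
  // remainder into pure-ada outputs no larger than maxLovelacePerOutput.
  let remaining = value.lovelace;
  const perTokenOutput =
    outputs.length > 0 ? remaining / BigInt(outputs.length + 1) : 0n;
  for (const out of outputs) {
    const share =
      perTokenOutput < maxLovelacePerOutput ? perTokenOutput : maxLovelacePerOutput;
    out.lovelace = share;
    remaining -= share;
  }
  while (remaining > 0n) {
    const chunk =
      remaining < maxLovelacePerOutput ? remaining : maxLovelacePerOutput;
    outputs.push({ lovelace: chunk, assets: new Map() });
    remaining -= chunk;
  }

  return outputs;
}
```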
@jonathanknowles I finally got around to discussing each point. As a scientist, I am really challenging all of this to help us get to a stronger solution.
@jonathanknowles I am glad you took the time to give me a thorough answer. I am kind of waiting on your feedback on my response before going forward, though.
I've published a package for UTxO selection: https://www.npmjs.com/package/cardano-utxo-wasm
The 2018 article is obsolete when UTxOs contain NFTs or fungible tokens. The random selection algorithm will simply pick and cluster all your NFTs together over time, resulting in a "change" UTxO that becomes so large that simply moving it around breaks the transaction size limit. This will keep happening over and over until your wallet is simply stuck.
I propose the following changes:
Also, it is recommended to build output transactions with the following two parameters (user settings in a wallet or algorithm):
Token Fragmentation allows change transactions or big NFT transfers to land in the destination wallet in a format that is easier to wield and also allows a more natural way of creating independent transactions, which should be a goal in this eUTXO model.
Make Me Some Change is simply a way to allow the creation of concurrent independent transactions by using UTxOs that were not previously used in a transaction.
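As a rough illustration of the "Make Me Some Change" idea (this is assumed behaviour based on the one-line description above, not a specification; the UTxO shape reuses the one from the earlier sketches):

```typescript
// Prefer UTxOs that are not referenced as inputs by any pending
// (unconfirmed) transaction, so that several transactions can be built
// concurrently from disjoint sets of inputs.

interface PendingTx {
  inputIds: Set<string>; // ids of UTxOs already committed as inputs
}

function availableForNewTx(utxos: UTxO[], pending: PendingTx[]): UTxO[] {
  const locked = new Set<string>();
  for (const tx of pending) {
    for (const id of tx.inputIds) locked.add(id);
  }
  // Only UTxOs not locked by a pending transaction remain selectable, so
  // the next transaction is independent of everything still in flight.
  return utxos.filter((u) => !locked.has(u.id));
}
```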