Summary or problem description
The Oracle contract merged in #1738 requires the caller to specify the amount of GAS to be used for response transaction transmission and execution. This can also be seen in the demo contract from neo-project/neo-devpack-dotnet#370.
The problem is: how should a contract developer determine the amount of GAS to pass into Oracle.Request()? It depends entirely on the oracle response, so in theory the developer should specify a value big enough to cover all possible cases (and waste some GAS doing so whenever the majority of cases need less than the maximum). But that's hard to precalculate even for a regular contract execution flow, and what if the callback makes another oracle request, or calls another contract that can do whatever it wants depending on its input parameters? And since the value is set by the developer, some kind of RPC-driven test invocation (like invokefunction) is not applicable either.
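For reference, a minimal sketch of the caller side using the current N3 devpack API (the exact signature at the time of #1738 / devpack #370 may have differed; the contract name, URL, filter and the 10 GAS figure here are purely illustrative, not taken from the demo contract):

```csharp
using Neo.SmartContract.Framework;
using Neo.SmartContract.Framework.Native;
using Neo.SmartContract.Framework.Services;

public class OracleUser : SmartContract
{
    public static void DoRequest()
    {
        // gasForResponse has to be chosen here, before the contract can know
        // what the response (and therefore the callback execution) will cost.
        // 10_00000000 datoshi = 10 GAS: an arbitrary "big enough" guess.
        Oracle.Request("https://example.com/api", "$.value", "callback", null, 10_00000000);
    }

    // Invoked by the native Oracle contract with the response.
    public static void Callback(string url, object userData, OracleResponseCode code, string result)
    {
        if (Runtime.CallingScriptHash != Oracle.Hash)
            throw new System.Exception("unauthorized");
        // Whatever happens here (more oracle requests, calls into other
        // contracts) is what the prepaid gasForResponse has to cover.
    }
}
```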
Do you have any solution you want to propose?
I don't. But I expect this question to be raised by smart contract developers and I think we need some answer for it.
Neo Version
Neo 3
Where in the software does this update apply to?
Oracles
That's a good mitigation, at least eliminating some waste, but the contract developer still needs to decide how much GAS to pay initially. And this decision affects the contract call cost: if the worst-case (in terms of GAS cost) callback-processing scenario costs 10 GAS, then even if that scenario happens once per 1000 calls, all the other calls still need to spend those 10 GAS first and only receive their refund after response processing.
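To make the overhead concrete (assuming, purely for illustration, that a typical callback costs 0.1 GAS against the 10 GAS worst case): every one of the 1000 calls must lock 10 GAS at request time, roughly 100x its own expected cost, and gets about 9.9 GAS back only after the response transaction is processed, so the up-front price of a call is dominated by the rare worst case rather than by what that call actually consumes.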