`ensure!` is better than `assert!`? #6322
Wherever we can return an error, we should probably use `ensure!`, since it allows the benchmarking pipeline to exit gracefully. For example, imagine you run benchmarks on the whole runtime: all of that data is held in memory before it is parsed and put into JSON / Handlebars templates. If the benchmark process panics, all of that data is simply thrown away. AFAIK, it should be possible to get lots of reasonable, usable data even when a benchmark returns an error. Some functions may not be able to return an error (like certain setup functions), in which case `assert!` is fine. But `ensure!` is definitely better on average, AFAIK. cc @ggwpez
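The difference can be sketched in plain Rust. The `ensure!` macro below is a simplified stand-in for the one provided by frameworks such as `frame_support` or `anyhow` (it is not the actual framework macro); the `verify_balance` function and its message are hypothetical. The point is that a failed check becomes an `Err` value the caller can report, rather than a panic that discards everything collected so far:

```rust
// Minimal stand-in for an `ensure!`-style macro: on a failed condition it
// returns an Err from the enclosing function instead of panicking.
macro_rules! ensure {
    ($cond:expr, $msg:expr) => {
        if !$cond {
            return Err($msg);
        }
    };
}

// Hypothetical verification step of a benchmark.
fn verify_balance(balance: u64, expected: u64) -> Result<(), &'static str> {
    // Unlike `assert!`, this hands the failure back as a value, so the
    // benchmarking pipeline can log it and keep the data gathered so far.
    ensure!(balance == expected, "balance mismatch");
    Ok(())
}

fn main() {
    // Passing case: the check succeeds and returns Ok(()).
    assert!(verify_balance(100, 100).is_ok());

    // Failing case: the error is recoverable, not a process abort.
    match verify_balance(100, 99) {
        Ok(()) => println!("verified"),
        Err(e) => println!("benchmark verification failed: {e}"),
    }
}
```

With `assert!(balance == expected)` in the same spot, the failing case would panic and unwind instead of producing a reportable error value.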
The benchmarking CLI handles panics and errors in the same way; since it spawns a new wasm instance, these panics never propagate upwards anyway.
This is pretty much mitigated since we run them per-pallet (and also continuously in CI to check that they are not broken).
Yeah, it could be improved to still write partial weight files or output the intermediary JSON for recovery.
No, this would be too much time and review effort wasted on something minuscule. For new code we can keep it in mind, but I would not dig up old code just for that change.
Originally posted by @shawntabrizi in #6310 (comment)
I noticed that the benchmark code throughout this repository heavily utilizes `assert!`. Should we consider replacing all instances of it?