It looks like commit c809c0c (intended to fix #1028) introduced an issue where numeric values lose precision prior to being returned as BigDecimal instances. (Feature USE_BIG_DECIMAL_FOR_FLOATS enables this; given that it's disabled for performance by default, the reason to enable it is when you specifically want to avoid precision loss.)
This issue is still present in 2.8.1 (which is where I encountered it).
Some initial discussion took place as comments on the commit. I don't know enough about the contracts involved in the parsing state, but some things that caught my eye:
Retrieving the parser value as a Double is where the precision loss occurs.
The fix was intended to deal with returning NaN/Inf, which BigDecimal doesn't support. However, these aren't legal values in JSON, and there's an ALLOW_NON_NUMERIC_NUMBERS feature to allow them as an extension. (Maybe the feature-check is somewhere else? I haven't looked.)
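The precision-loss mechanism described above can be illustrated without Jackson at all. This is a stdlib-only sketch (the method names `viaDouble` and `direct` are made up for illustration, not Jackson APIs): routing the textual value through a `double` discards digits that a direct `BigDecimal` construction from the same text would keep.

```java
import java.math.BigDecimal;

public class PrecisionLossDemo {

    // Simulates the buggy path: text -> double -> BigDecimal.
    // The double cannot represent all the digits, so they are lost
    // before the BigDecimal is ever created.
    static BigDecimal viaDouble(String text) {
        return BigDecimal.valueOf(Double.parseDouble(text));
    }

    // Simulates the intended path: text -> BigDecimal directly,
    // which preserves every digit of the source.
    static BigDecimal direct(String text) {
        return new BigDecimal(text);
    }

    public static void main(String[] args) {
        String text = "1.0000000000000000000001";
        System.out.println(viaDouble(text)); // prints 1.0 — trailing digits gone
        System.out.println(direct(text));    // prints 1.0000000000000000000001
    }
}
```

This is exactly why USE_BIG_DECIMAL_FOR_FLOATS exists: the only reason to pay its cost is to get the `direct` behavior, so any intermediate `double` defeats the feature's purpose.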
Handling is a bit sub-optimal, in that I think decoding from textual form to numeric may need to be done twice now; ideally we should be able to see whether a NaN value was encountered without such parsing.
I may file an RFE for streaming parser to indicate this state to avoid parsing twice.
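One way the streaming parser could avoid the double decode is a cheap pre-check on the token text before committing to a numeric parse. The sketch below is purely hypothetical (not Jackson's actual internals or API): since JSON number tokens can only start with a digit, `-`, or (with lenient parsing) `+`, a leading letter is enough to flag NaN/Infinity without decoding anything.

```java
public class NonNumericCheck {

    // Hypothetical pre-check: detects NaN/Infinity-style tokens by their
    // leading letter, so the expensive numeric decode only runs once and
    // only for genuinely numeric text. Not Jackson's real implementation.
    static boolean looksNonNumeric(String token) {
        if (token.isEmpty()) {
            return false;
        }
        char first = token.charAt(0);
        // Skip an optional sign, e.g. "-Infinity".
        if ((first == '-' || first == '+') && token.length() > 1) {
            first = token.charAt(1);
        }
        // Legal JSON numbers never start with a letter.
        return Character.isLetter(first);
    }

    public static void main(String[] args) {
        System.out.println(looksNonNumeric("NaN"));       // true
        System.out.println(looksNonNumeric("-Infinity")); // true
        System.out.println(looksNonNumeric("123.45"));    // false
    }
}
```

An RFE along these lines would let the parser report "non-numeric number encountered" as state, instead of forcing callers to parse, catch the failure, and parse again.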