I am using formulas / RuleJS. As a specific example, if you enter `= 1.23 + 45.66` into a cell, it displays `46.88999999999999` as the value. Other values give the “correct” result, e.g. `= 1.00 + 45.89` displays `46.89`.
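For context, here is roughly my setup, reduced to a minimal sketch (the container id and the data are made up for illustration, and I am assuming the usual Handsontable wiring with the RuleJS-backed formulas plugin):

```js
// Minimal sketch of my setup, for illustration only: a bare grid with
// the formulas plugin enabled and NO cell type defined, so values are
// displayed as-is.
var container = document.getElementById('example'); // made-up container id
var hot = new Handsontable(container, {
  data: [
    ['= 1.23 + 45.66'],  // renders as 46.88999999999999
    ['= 1.00 + 45.89']   // renders as 46.89
  ],
  formulas: true // the RuleJS-backed formula support
});
```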
I am deliberately not defining the cell type as “numeric” here. I accept that if I give the cell `type: "numeric", format: "0.00"`, the first case will be displayed as `46.89`. That is not my point.
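(For concreteness, the workaround would look something like this, which is what I want to avoid:)

```js
// The same sketch with the workaround applied: the cell is declared
// numeric with a two-decimal display format, so the first case renders
// as 46.89. Presumably the format only rounds on display and the
// underlying value is unchanged.
var hot = new Handsontable(document.getElementById('example'), {
  data: [['= 1.23 + 45.66']],
  formulas: true,
  columns: [
    { type: 'numeric', format: '0.00' }
  ]
});
```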
I fully understand that floating-point numbers are stored and represented only approximately. But I have never seen a language produce such an inaccurate result for numbers like these, and I find it hard to believe that JavaScript itself is “off” on this particular calculation. Is this inaccuracy coming from RuleJS? Is numeral.js still involved even when the cell is not defined as numeric?

This issue means that, in practice, you can never dare use a formula anywhere without defining the cell as type numeric, with all the consequences that entails…
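To help pinpoint which layer produces the long string, these are the checks I would compare against what the grid actually renders (plain console, no grid involved; the last line assumes numeral.js is loaded):

```js
// Isolate the layers: raw engine arithmetic vs. explicit formatting.
var sum = 1.23 + 45.66;
console.log(sum);                          // raw engine arithmetic
console.log(sum.toFixed(2));               // "46.89" after explicit rounding
console.log(numeral(sum).format('0.00'));  // "46.89" via numeral.js
```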