Floating point accuracy of MHA parser variables
Posted: Thu Mar 19, 2020 1:37 am
I am trying to allow some of my algorithm's tuning parameters to be updated at runtime with the help of the MHA parser. However, when I inspect my self-developed plugin with 'analysemhaplugin my_plugin', the floating point numbers are not quite the same as the values I set them to. What is the cause of this?
My parser variable declarations:
Code:
// Member declarations in the plugin class
MHAParser::float_t param_1;
MHAParser::float_t param_2;
MHAParser::float_t param_3;

// In the constructor's initializer list:
// (description, default value, range)
param_1("param_1 description", "0.98", "[0.90, 0.99]"),
param_2("param_2 description", "0.98", "[0.90, 0.99]"),
param_3("param_3 description", "0.3", "[0.01, 0.5]"),

// In the constructor body
insert_item("param_1", &param_1);
insert_item("param_2", &param_2);
insert_item("param_3", &param_3);
Output of 'analysemhaplugin my_plugin':
Code:
# param_1 description
# float:[0.899999976,0.99000001] (writable)
param_1 = 0.980000019
# param_2 description
# float:[0.899999976,0.99000001] (writable)
param_2 = 0.980000019
# param_3 description
# float:[0.00999999978,0.5] (writable)
param_3 = 0.300000012