Computing large roots: BigDecimal / Java
I tried using the standard algorithms to calculate nth roots.
For example, (111^123)^(1/123).
The standard algorithm requires raising the base to a high power (111^123 in this case), which takes a lot of time. The algorithm is given here.
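For reference, a rough sketch of that slow path (my own reconstruction and naming, not necessarily the linked algorithm): raise the base to the full power with BigInteger, then recover the integer nth root by binary search.

```java
import java.math.BigInteger;

public class NaiveNthRoot {
    // Largest integer r with r^n <= x, found by binary search.
    static BigInteger nthRoot(BigInteger x, int n) {
        BigInteger lo = BigInteger.ONE;
        BigInteger hi = x;
        while (lo.compareTo(hi) < 0) {
            // mid = ceil((lo + hi) / 2) so the search always makes progress
            BigInteger mid = lo.add(hi).add(BigInteger.ONE).shiftRight(1);
            if (mid.pow(n).compareTo(x) <= 0) {
                lo = mid;                       // mid^n <= x, so the root is at least mid
            } else {
                hi = mid.subtract(BigInteger.ONE);
            }
        }
        return lo;
    }

    public static void main(String[] args) {
        BigInteger value = BigInteger.valueOf(111).pow(123); // the expensive 252-digit intermediate
        System.out.println(nthRoot(value, 123));             // prints 111, but takes noticeable time
    }
}
```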
However, I noticed that doing the same computation with double takes less than a millisecond, so it obviously uses some smart ideas. Any hints on this?
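For comparison, the double version is essentially just two hardware pow calls (111^123 is roughly 3.8e251, so it still fits in a double):

```java
double value = Math.pow(111, 123);          // ~3.8e251, still representable as a double
double root  = Math.pow(value, 1.0 / 123);  // ~111.0, finishes in well under a millisecond
```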
Not really. double simply has limited precision, so it basically only calculates the most significant 52 bits of the result and can leave out the rest of the computation. And of course, it also helps that it is implemented in hardware.
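To make that concrete, here is a small sketch (my addition, not from the original answer) comparing the exact BigInteger result with the double approximation; the double carries only about 15-16 significant decimal digits of the 252-digit exact value.

```java
import java.math.BigInteger;

public class DoublePrecisionDemo {
    public static void main(String[] args) {
        // Exact value: 252 decimal digits, computed with arbitrary precision.
        BigInteger exact = BigInteger.valueOf(111).pow(123);

        // double keeps only a 52-bit mantissa, i.e. about 15-16 significant decimal digits.
        double approx = Math.pow(111, 123);

        System.out.println("exact digits : " + exact.toString().length());
        System.out.println("exact prefix : " + exact.toString().substring(0, 17) + "...");
        System.out.println("double value : " + approx);
    }
}
```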