Hi people, sorry if this is a bit random, but I googled for an appropriate forum to ask this in and this was one of the results.

So I was just wondering if anyone could lend some advice regarding the following scenario...

* A game consists of 500 XML files.
* Within these 500 files there is data which defines how an object behaves.
* Let's just say, for example, it's an XML file for a vehicle within a first-person shooter.
* Within just this one file there are 100 cases of data not being rounded up or down, i.e. "0.19999999999999999" instead of "0.2" or just "0.19".

So, over the entire game we have 50,000 cases where the server/client has to work with values carried to far more decimal places than it needs to.
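For context, my understanding is that once these strings are parsed, the extra digits stop existing: the value ends up as an ordinary fixed-width float either way. A quick Python sketch of what I mean (just an illustration, assuming the engine's parser produces standard 64-bit IEEE 754 doubles, which most XML/float parsers do):

```python
import struct

a = float("0.2")
b = float("0.19999999999999999")

# Both parse into an 8-byte IEEE 754 double -- the extra decimal digits
# only exist in the text file, not in memory, so arithmetic on either
# value costs exactly the same at runtime.
assert len(struct.pack("d", a)) == 8
assert len(struct.pack("d", b)) == 8

# The two parsed values are essentially identical (within ~1e-17).
print(abs(a - b))
```

So if I've got that right, the long strings would only matter for file size and parse time, not for per-frame calculation. But I'd appreciate confirmation.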

The question is: will this have a noticeable effect on anything, be it lag, performance or load times? Any input appreciated.