
View Full Version : modulus operator is not working if I give a number with more than 24 digits



pradeep_s3
12-07-2009, 11:56 AM
If a number has more than 24 digits, the modulus operator does not give the correct output.
Here is a sample of the code:
[code]<script type="text/javascript">
var a=10000000000000000000000.0;
var b=10.0;
var c=a % b;
alert("c"+c);
</script>[/code]

Please tell me the solution.

Philip M
12-07-2009, 12:47 PM
The largest number that JavaScript can handle reliably without loss of precision is about 9e15, i.e. 9000000000000000. Any number greater than that is liable to return incorrect results from parseInt(), the % (modulus) operator, and so on.

See also:- http://jsfromhell.com/classes/bignumber
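
For anyone reading this now: modern JavaScript engines (ES2020+) have BigInt, which did not exist when this thread was written but handles exactly this case, and for a remainder mod 10 the last digit of the decimal string is all you need anyway. A minimal sketch, assuming a BigInt-capable engine; the 24-digit value is just illustrative:

<script type="text/javascript">
// Sketch only, assuming a BigInt-capable engine (ES2020+), which is far
// newer than this thread. BigInt keeps every digit, so % is exact.
var a = 100000000000000000000007n; // 24-digit integer as a BigInt literal
var b = 10n;
alert("c" + (a % b)); // c7 -- exact remainder, no precision loss

// Era-appropriate alternative: keep the number as a string. For % 10 the
// remainder is simply the last decimal digit, so no library is needed.
var s = "100000000000000000000007";
alert("c" + s.charAt(s.length - 1)); // c7
</script>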

All advice is supplied packaged by intellectual weight, and not by volume. Contents may settle slightly in transit.

oesxyl
12-07-2009, 01:04 PM
If a number has more than 24 digits, the modulus operator does not give the correct output.
Here is a sample of the code:
[code]<script type="text/javascript">
var a=10000000000000000000000.0;
var b=10.0;
var c=a % b;
alert("c"+c);
</script>[/code]

Please tell me the solution.

Your problem is that you use decimals. It works for me if I remove the decimal part.

best regards

Philip M
12-07-2009, 01:29 PM
Your problem is that you use decimals. It works for me if I remove the decimal part.


Not so! The real problem is the one I mentioned above. Integers greater than 9e15 may or may not be represented accurately, depending of course on whether they happen to fit the binary floating-point format. It is just as with fractions: you cannot write 1/3 exactly as a binary floating-point number (it resolves to 0.3333333333333333), but 1/4 is evaluated exactly as 0.25.



<script type="text/javascript">
var a=100000000000000000000000; // 1e23
var b=10;
var c=a % b;
alert("c"+c); // 2

var a=90000000000000000000000; // 9e22
var b=10;
var c=a % b;
alert("c"+c); // 6

var a=9000000000000000000000; // 9e21
var b=10;
var c=a % b;
alert("c"+c); // 0

var a = 9971992547409847;
document.write(a); // 9971992547409848

</script>
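
To see why that first example returns 2, you can inspect the integer value that is actually stored for the literal. A small follow-up sketch, again assuming a modern engine with BigInt (much newer than this thread):

<script type="text/javascript">
// The 24-digit literal is parsed as a double first; converting that double
// to a BigInt reveals the integer that is really stored.
var stored = BigInt(100000000000000000000000);
alert(stored); // 99999999999999991611392 -- the nearest double to 1e23
alert(100000000000000000000000 % 10); // 2 -- the last digit of the stored value
</script>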

Old Pedant
12-07-2009, 09:52 PM
As a further point of clarification:

JavaScript doesn't *REALLY* use integer arithmetic when ANY value involved exceeds 2147483647 (2^31 - 1). (And the spec doesn't require it to ever use integers, but I would strongly suspect that all modern implementations do so when they can.) That number is the largest positive integer that can be held in a standard 32-bit signed integer. (The smallest negative number is one greater in magnitude, -2147483648, thanks to the foibles of 2's-complement binary representation.)

So...instead, JS must use a double precision floating point number. And the IEEE/ANSI format for such numbers (used by all modern CPUs) gives only 53 bits for the "mantissa". And 2^53 is 9007199254740992, whence the number that Philip is citing as the maximum possible integer before you begin losing precision. (Strictly, the last integer guaranteed to be exact is 9007199254740991, one less than 2^53, because 2^53 + 1 is the first integer that cannot be represented and simply rounds back down to 2^53.)
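
A quick way to check that 53-bit boundary for yourself; note that Number.MAX_SAFE_INTEGER is a later (ES2015) addition, so treat the last line as a modern extra:

<script type="text/javascript">
var two53 = Math.pow(2, 53);            // 9007199254740992
alert(two53 === two53 + 1);             // true: 2^53 + 1 rounds back to 2^53
alert(two53 - 1 === 9007199254740991);  // true: the last integer that is always exact
alert(Number.MAX_SAFE_INTEGER);         // 9007199254740991 (ES2015+ engines only)
</script>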


