
View Full Version : The size of the internet??



bhav
12-24-2006, 07:26 PM
Is there actually a way to check how big the internet is, in MB or TB or whatever? lol!!

VIPStephan
12-24-2006, 08:40 PM
Haha, that's a good one. :)
Well, maybe you can try to download the internet (http://www.w3schools.com/downloadwww.htm)?

bhav
12-24-2006, 08:48 PM
Lol, my Local Disk could hold that any day :P

Seriously though, is there a way to find out? lol

oracleguy
12-24-2006, 10:38 PM
Not really, but you can assume it's at least in the hundreds of petabytes.

jkd
12-24-2006, 10:59 PM
The size of Google's cache might serve as a good lower bound. I'm sure that data is published somewhere.

Spookster
12-24-2006, 11:20 PM
Yeah, I think it's around 36 GBs (GoogleBytes).

bhav
12-25-2006, 03:44 PM
Lmao, that's loads. PS: what's a petabyte? 1,000 terabytes?

gsnedders
12-25-2006, 07:17 PM
Lmao, that's loads. PS: what's a petabyte? 1,000 terabytes?

Yeah - 1,000,000,000,000,000 bytes.

More recent estimates of Google's capacity put it at 4 PB of RAM, not HD storage.

oracleguy
12-25-2006, 07:40 PM
Yeah - 1,000,000,000,000,000 bytes.

Technically a petabyte is 1,125,899,906,842,624 bytes or 2^50 bytes.

The Wayback Machine has at least 2 PB of data on their site alone, according to their FAQ (http://www.archive.org/about/faqs.php#9).

gsnedders
12-25-2006, 07:55 PM
Technically a petabyte is 1,125,899,906,842,624 bytes or 2^50 bytes.
Surely that's what a pebibyte is?

marek_mar
12-25-2006, 11:54 PM
Indeed. But their FAQ doesn't explicitly state that. They actually state it's a petabyte, not a pebibyte.

bhav
12-26-2006, 12:30 AM
God bless the man who invented the internet! Anyone know who it was?

marek_mar
12-26-2006, 12:48 AM
Wikipedia knows.

oracleguy
12-26-2006, 01:06 AM
The Internet evolved from ARPANET which was developed by what is known today as DARPA (http://en.wikipedia.org/wiki/DARPA), which is an agency of the US DoD.

rpgfan3233
12-26-2006, 07:21 PM
A pebibyte is 1,024 Tebibytes = 1,048,576 Gibibytes = 1,073,741,824 Mebibytes = 1,099,511,627,776 Kibibytes = 1,125,899,906,842,624 bytes.

According to the way they were named, a petabyte is 1,000 Terabytes = 1,000,000 Gigabytes = 1,000,000,000 Megabytes = 1,000,000,000,000 Kilobytes = 1,000,000,000,000,000 bytes. In traditional computing usage, however, those decimal names have been used for the binary quantities above.

The problem: do you mean 1,000 × 1 byte = 1 Kilobyte, or do you actually mean 1,024 × 1 byte = 1 Kilobyte (a Kibibyte), as it was originally used?
The solution: use Kibibyte in place of Kilobyte, Mebibyte in place of Megabyte, and so forth. Unfortunately, the world isn't adopting the new prefixes anywhere near quickly enough for this solution to work.
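
To see how far apart the two conventions drift, here's a minimal Python sketch (the prefix values are the standard SI and IEC definitions):

# Decimal (SI) vs. binary (IEC) prefixes.
SI = {"kB": 10**3, "MB": 10**6, "GB": 10**9, "TB": 10**12, "PB": 10**15}
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30, "TiB": 2**40, "PiB": 2**50}

for (si, sv), (iec, iv) in zip(SI.items(), IEC.items()):
    # The binary unit is always the bigger one, and the gap grows
    # with each prefix step.
    print(f"{iec} / {si} = {iv / sv:.4f}")

# Prints 1.0240, 1.0486, 1.0737, 1.0995, 1.1259 down the list,
# so "petabyte" is ambiguous to the tune of nearly 13%.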

daniel_g
12-27-2006, 09:26 AM
Just multiply 100 million webpages by the average size of a webpage (15 KB). Then add the average amount of disk space used by internet users times the number of internet users (6,499,697,060 × 80 GB).
So:


= (100 million × 15 KB) + (6,499,697,060 × 80 GB) // 80 GB is my own estimate. Couldn't find that on Google.
= (100,000,000 × 15 × 9.53674316 × 10^-7 GB) + (5.2 × 10^11 GB)
= (~1,430 GB) + (5.2 × 10^11 GB)
≈ 5.2 × 10^11 GB
≈ 507,812,500 terabytes

Sources:
http://edition.cnn.com/2006/TECH/internet/11/01/100millionwebsites/
http://www2.sims.berkeley.edu/research/projects/how-much-info/internet.html
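
The same estimate as a short Python sketch, for anyone who wants to rerun the numbers (the 80 GB per user is, as noted above, only a guess):

# Rough size-of-the-internet estimate: indexed web + user disk space.
pages = 100_000_000                 # ~100 million sites (CNN, Nov 2006)
page_gib = 15 * 1024 / 2**30        # 15 KiB average page, in GiB
users = 6_499_697_060               # stand-in for "internet users"
per_user_gib = 80                   # own estimate, unsourced

web_gib = pages * page_gib          # ~1,430 GiB
user_gib = users * per_user_gib     # ~5.2e11 GiB
total_tib = (web_gib + user_gib) / 1024

# The per-user term dominates completely; the indexed web barely registers.
print(f"web: {web_gib:,.0f} GiB, total: {total_tib:,.0f} TiB")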

croatiankid
12-27-2006, 12:34 PM
Wow. Could you calculate this though: at the current rate of growth of the internet, would ANY connection speed be fast enough to download the whole thing?

gsnedders
12-27-2006, 01:02 PM
Wow. Could you calculate this though: at the current rate of growth of the internet, would ANY connection speed be fast enough to download the whole thing?

Nowhere near.

bhav
12-27-2006, 02:23 PM
You reckon a script could be made to monitor it?

Karen S. Garvin
12-27-2006, 03:38 PM
Mmmm... kibbles and bits! kibbles and bits!

I'm going to eat some kibbles and bits! :D Arf!

oracleguy
12-27-2006, 07:44 PM
You reckon a script could be made to monitor it?

No. There is no way to get anywhere near an accurate number.

bhav
12-28-2006, 12:08 AM
Oh, but you could make something that gives an approximation?

rpgfan3233
12-28-2006, 04:10 AM
Mmmm... kibbles and bits! kibbles and bits!

I'm going to eat some kibbles and bits! :D Arf!

I think you mean nybbles (a.k.a. "nibbles") and bits! :thumbsup:

_Aerospace_Eng_
12-28-2006, 08:56 AM
Maybe one day you can stumble onto the last page of the internet found here (http://www.w3schools.com/html/lastpage.htm).

bhav
12-29-2006, 01:42 PM
So what was the first page, then? ;)

Basscyst
01-04-2007, 11:29 PM
Some CGI script, it would seem. lol.

http://www.endoftheinternet.com/

liorean
01-05-2007, 01:46 AM
The size of the internet? Disk size isn't even the right unit for that. The internet is a network; the best estimate of its size would be counting the number of nodes in it. And behind a vast number of those nodes you can find entire networks of computers behind firewalls and NATs.

And if you really mean the web, then there are loads of other issues:
- Many servers contain heaps of files that you cannot find a link to anywhere, with no method of autodiscovery. How would you count those?
- Many servers build resources from a database or a templating system when requested. How do you deal with these?
- Many servers block spiders, serve content differently depending on UA sniffing, or serve different content depending on cookies or which user is logged in, and so on. How would you count those files?
- Many files are only available to logged-in users. How do you deal with these?
- How do you deal with data on non-default ports?


Rest assured that the size of the Wayback Machine, or the search engine indices for that matter, is only the tip of the iceberg. Email and BitTorrent already account for several times more internet traffic than the web. Add in FTP, Usenet, peer-to-peer networks, IMs and game traffic and you'll find they together dwarf the web. Adding the non-spiderable parts of the web, you'll find 2 PiB is a laughably small number even if you just want to measure the amount of fixed data and not traffic or nodes.

daniel_g
01-05-2007, 02:10 AM
I don't know about the first page of the internet (probably a message sent to someone else with the wording "success"). Anyway, according to Wikipedia, the very first page on the WWW looked like this:
http://www.w3.org/History/19921103-hypertext/hypertext/WWW/TheProject.html
And the guys at CERN ran the very first web server:
http://info.cern.ch/

Oh, btw, here's a link to a picture of the world's first web server:
http://en.wikipedia.org/wiki/Image:First_Web_Server.jpg

oracleguy
01-05-2007, 06:35 AM
The size of the internet? Disk size isn't even the right unit for that. The internet is a network; the best estimate of its size would be counting the number of nodes in it. And behind a vast number of those nodes you can find entire networks of computers behind firewalls and NATs.

And if you really mean the web, then there are loads of other issues:
- Many servers contain heaps of files that you cannot find a link to anywhere, with no method of autodiscovery. How would you count those?
- Many servers build resources from a database or a templating system when requested. How do you deal with these?
- Many servers block spiders, serve content differently depending on UA sniffing, or serve different content depending on cookies or which user is logged in, and so on. How would you count those files?
- Many files are only available to logged-in users. How do you deal with these?
- How do you deal with data on non-default ports?


Rest assured that the size of the Wayback Machine, or the search engine indices for that matter, is only the tip of the iceberg. Email and BitTorrent already account for several times more internet traffic than the web. Add in FTP, Usenet, peer-to-peer networks, IMs and game traffic and you'll find they together dwarf the web. Adding the non-spiderable parts of the web, you'll find 2 PiB is a laughably small number even if you just want to measure the amount of fixed data and not traffic or nodes.

That's basically why I said there was no way to get anywhere near an accurate number; I just didn't feel like typing that much. :)

Regarding BitTorrent, I heard a year or two ago that BT accounted for something like a third of all traffic on the internet.

Vapor
01-05-2007, 08:21 PM
LOL, this is the funniest thread ever.

karmona
10-23-2008, 03:36 PM
I have a nice round guesstimate average of 20 billion indexed pages.

... multiplying that by an average page size of 70 KB = ~1,300 TB = ~1.3 PB of known/indexed web, which might hide ~600 PB of deeper web…

See more details in this post: The Size of the Internet (http://blog.karmona.com/index.php/2007/09/26/the-size-of-the-internet/)
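
A quick check of that arithmetic in Python (binary prefixes assumed):

indexed_pages = 20_000_000_000      # the guesstimate above
page_kib = 70                       # average page size, in KiB

total_tib = indexed_pages * page_kib / 2**30    # KiB -> TiB
print(f"{total_tib:,.0f} TiB = {total_tib / 1024:.2f} PiB")
# ~1,304 TiB = ~1.27 PiB, matching the "~1,300 TB = ~1.3 PB" figure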

-- Karmona

liorean
10-23-2008, 04:22 PM
Karmona: That's wrong. Let me give you a reason: Ian Hickson, as an example of the distinction in spidering between useful content and just any content, pointed out that his site Da Mow Mow (http://damowmow.com/) contains an infinite number of pages. All of them have the same content, but they have distinct URLs, so they are different resources. Which means that on that single site alone, you could fill an infinite amount of hard drive space by downloading all those resources. There are many, many other sites out there that have infinitely many pages.

Yay
10-23-2008, 06:44 PM
The size of the internet is 1 googol bytes.

rafiki
10-23-2008, 06:50 PM
http://www.com is the start of the internet.
As for size, do you think they'd have made up a name for it if it were something as small as 1.2...?


There was actually a site on www.com when I last checked...

liorean
10-23-2008, 07:41 PM
Well, here you can find the real start of, well... not the internet but at least... the web: http://info.cern.ch/

karmona
10-24-2008, 11:04 AM
Karmona: That's wrong. Let me give you a reason: Ian Hickson, as an example of the distinction in spidering between useful content and just any content, pointed out that his site Da Mow Mow (http://damowmow.com/) contains an infinite number of pages. All of them have the same content, but they have distinct URLs, so they are different resources. Which means that on that single site alone, you could fill an infinite amount of hard drive space by downloading all those resources. There are many, many other sites out there that have infinitely many pages.

Liorean, this is a philosophical question:

"If there is an infinite number of pages that all have the same content, e.g. via simple redirection rules, does that mean the internet's size is the sum of all these virtual pages?"

I don't think this is an interesting question, and my answer is NO:

1. The data is stored in the same place, so I don't see the logic in counting it again and again for each virtual URL.
2. Search engines easily ignore such duplicates and don't index them twice.
3. I can't be wrong ;)

-- Karmona

abduraooft
10-24-2008, 01:45 PM
http://www.com
Off topic: how do you check the whois info of this domain?

Shoot2Kill
10-24-2008, 03:13 PM
When you use the URL in a whois lookup, it ignores the www and sees it as com.com, so you have to add the address to the end of the lookup URL manually,
like so:

http://whois.domaintools.com/www.com

tagnu
10-25-2008, 08:07 AM
Cool tweak!

sybil6
11-13-2008, 06:37 PM
I agree

Millenia
11-13-2008, 07:16 PM
Man, I've just realised that www.com (http://www.com) is a subdomain...
Am I a bit slow? Also, am I right? Because on their site you can get email addresses, but they are
something@www.com

rafiki
11-14-2008, 02:08 PM
The whole domain is www.www.com, accessible without the www, leaving just www.com,
like how http://codingforums.com/ has no www.

sybil6
11-16-2008, 12:10 AM
funny :cool:

bcarl314
11-16-2008, 04:01 AM
Perhaps a more pragmatic approach to the calculation.

I can GUARANTEE the maximum size of the "internet" is a number less than the total capacity of all storage media sold.

So simply searching for total sales volumes of hard drives, disks, USB sticks, etc. will give you a method to extrapolate a maximum number.

For example: if we knew there were 200 billion hard drives sold at an average of 250 GB each, then we'd have 2.0e+11 drives × 2.5e+11 bytes, or 5e+22 bytes.

This would at least give us an upper bound. Then you could estimate how many drives are defunct or not in use to tighten that number further.
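
A sketch of that extrapolation in Python (the sales and attrition figures are placeholders, as above):

# Upper bound: the internet can't exceed all storage media ever sold.
drives_sold = 200e9        # hypothetical: 200 billion drives sold
avg_bytes = 250e9          # hypothetical: 250 GB average capacity
defunct_fraction = 0.5     # hypothetical: half are dead or unplugged

upper_bound = drives_sold * avg_bytes            # 5e+22 bytes
tightened = upper_bound * (1 - defunct_fraction)

print(f"upper bound: {upper_bound:.1e} bytes ({upper_bound / 1e21:.0f} ZB)")
print(f"minus defunct drives: {tightened:.1e} bytes")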

tagnu
11-17-2008, 04:27 AM
Out of curiosity... do you teach math at university? :)

ljuwaidah
11-17-2008, 07:19 AM
I don't know how big it is, but I know where its center is: http://www.exactcenteroftheinternet.com/

rafiki: Ironically, their slogan is "the web starts here."

borntoslow
11-17-2008, 01:27 PM
Size I can't help you with, but if you want to see a map of it... here you go.


http://www.opte.org/maps/static/1069646562.LGL.2D.700x700.png

bcarl314
11-17-2008, 03:06 PM
Out of curiosity... do you teach math at university? :)

Nope, but I do coach a Junior High Math team ;)

effpeetee
11-21-2008, 09:40 AM
Technically a petabyte is 1,125,899,906,842,624 bytes or 2^50 bytes.

The Wayback Machine has at least 2 PB of data on their site alone, according to their FAQ (http://www.archive.org/about/faqs.php#9).
How many floppy disks is that?

Frank

oesxyl
11-21-2008, 03:16 PM
How many floppy disks is that?

Frank
1,491,308,089 floppy disks x 1.44 MB; on the last one only 1.28 MB is stored.
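
Worked in Python, reading the quoted 2 PB as 2 PiB and "1.44 MB" as 1.44 MiB (the same convention as the figure above):

import math

data_bytes = 2 * 2**50           # 2 PiB, per the Wayback Machine FAQ
floppy_bytes = 1.44 * 2**20      # a "1.44 MB" floppy, read as 1.44 MiB

disks = data_bytes / floppy_bytes
leftover = (disks - math.floor(disks)) * floppy_bytes

print(f"{math.ceil(disks):,} disks")                  # 1,491,308,089 disks
print(f"last disk holds {leftover / 2**20:.2f} MiB")  # ~1.28 MiB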

regards

effpeetee
11-21-2008, 05:00 PM
Show off!

I bet you used a calculator. That's cheating.

Frank

oesxyl
11-21-2008, 05:10 PM
Show off!

I bet you used a calculator. That's cheating.

Frank
Don't bet! I did use it :) ... I'm dependent on it :confused:

best regards

effpeetee
11-21-2008, 08:28 PM
Don't bet! I did use it :) ... I'm dependent on it :confused:

best regards

Join the club.

Happy coding.

Frank


