
Javascript generating list of hyperlinks



newscrash
03-03-2012, 04:07 AM
I'm new to programming. Here's my problem:

I'm using a for loop to generate a list of hyperlinks

for (var i = 100; i < 200; i++) {
    var str = "link";
    // String.prototype.link() wraps the string in an <a href="..."> element
    document.write(str.link('http://someWebsite.com/' + i + '.html'));
    document.write("<br />");
}


so the list that is generated looks like...

link
link
link
---------and so on until the loop is finished.

My question: is it possible for each link's text to depend on the target page's title?

for example, instead of the text "link", the program would follow the URL, see that the page's header contained the title "Bob's website" (or whatever the page's title was), and use that to generate the list.

Example:

bobs website
daves website
rons website
susies website
betsys website
______________________________
Any help with making each link's text depend on following its URL would be appreciated. Thank you!

sunfighter
03-03-2012, 04:44 PM
You will need to make an array of the site names and change 'str' in the write statement to the name of the array, as follows:

function listem(){
    var myArray = new Array();
    myArray[0] = "Bob's Website";
    myArray[1] = "Jim's Website";
    myArray[2] = "Jill's Website";

    for (var i = 0; i < myArray.length; i++) {
        document.write(myArray[i].link('http://someWebsite.com/' + i + '.html'));
        document.write("<br />");
    }
}

I did 3; looks like you need 100 (your loop runs from 100 to 199). :o

Philip M
03-03-2012, 05:48 PM
Try this:-



<html>
<head>
</head>
<body>

<script type = "text/javascript">

function listem(){
    var myArray = new Array(); // text of the links
    myArray[0] = "Bob's Website";
    myArray[1] = "Jim's Website";
    myArray[2] = "Jill's Website";
    myArray[3] = "Google";

    var links = new Array(); // corresponding urls
    links[0] = "http://www.bobswebsite.com";
    links[1] = "http://www.jimswebsite.com";
    links[2] = "http://www.jillswebsite.com";
    links[3] = "http://www.google.com";

    for (var i = 0; i < myArray.length; i++) {
        document.write(myArray[i].link(links[i]));
        document.write("<br>");
    }
}

listem(); // runs while the page is still loading, so document.write() is safe here

</script>

</body>
</html>

You should be aware that document.write() statements must (as here) be run before the page finishes loading. Any document.write() statement that runs after the page finishes loading will create a new page and overwrite all of the content of the current page (including the Javascript which called it). So document.write() is at best really only useful to write the original content of your page. It cannot be used to update the content of your page after that page has loaded.
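
For example, something as harmless-looking as this will blank the whole page the moment it has finished loading:

<script type = "text/javascript">
window.onload = function() {
    // Runs after the page has finished loading, so document.write()
    // implicitly opens a new document and wipes everything that was
    // on the page, including the script that called it.
    document.write("Hello - the rest of the page is gone!");
};
</script>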

"I know that you believe that you understand what you think I said, but, I am not sure you realise that what you heard is not what I meant". (Robert
McCloskey)

newscrash
03-04-2012, 02:17 AM
sorry for the confusion...

the program I'm using should create the URLs, but I can't know what the title of each page is until the list is generated. I need the links to TELL me what they point to (in depth) WITHOUT me having to follow hundreds of links to SEE what they are. I'm wondering if there's a way for the program to receive a cookie from each URL, or ask the given URL what its title is, THEN make that the anchor text for the links?

ex:

susies website
dans website
roberts website

***BUT I had no idea whose websites would appear in the list; the program found out by following the URLs given.

Is there a way to do this? any help is greatly appreciated! :)

webdev1958
03-04-2012, 02:28 AM
You should be aware that document.write() statements must (as here) be run before the page finishes loading. ...


Nowadays, document.write is largely considered antiquated and should not be used at all, except for writing to a new page from a parent page.

To write to the current page, best practice nowadays is to use the appropriate DOM methods like createElement(), appendChild(), etc.
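
For example, a DOM-based version of the loop from the first post might look something like this (it assumes an empty <div id="linkList"></div> somewhere in the page; the id is just made up for the example):

window.onload = function() {
    var list = document.getElementById("linkList"); // assumed placeholder element

    for (var i = 100; i < 200; i++) {
        var a = document.createElement("a");
        a.href = "http://someWebsite.com/" + i + ".html";
        a.appendChild(document.createTextNode("link"));
        list.appendChild(a);
        list.appendChild(document.createElement("br"));
    }
};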

webdev1958
03-04-2012, 02:35 AM
sorry for the confusion...

the program I'm using should create the URLs, but I can't know what the title of each page is until the list is generated...

Is there a way to do this? any help is greatly appreciated! :)

Yes, this can be done, but it should be done server-side using something like PHP. I doubt it can be done client-side.

You could open the remote page, parse the <title> tag and extract its contents. Then you can use the title contents however you like to make links or whatever.

More info and an example (http://php.net/manual/en/features.remote-files.php)
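
The linked page shows how to do it in PHP. The same idea sketched in server-side JavaScript (Node.js) instead, just to illustrate the technique, would be roughly:

// Fetch a page and pull out the contents of its <title> tag.
var http = require('http');

function fetchTitle(url, callback) {
    http.get(url, function(res) {
        var html = '';
        res.on('data', function(chunk) { html += chunk; });
        res.on('end', function() {
            var match = html.match(/<title[^>]*>([^<]*)<\/title>/i);
            callback(match ? match[1] : url); // fall back to the url itself
        });
    });
}

fetchTitle('http://someWebsite.com/100.html', function(title) {
    console.log('<a href="http://someWebsite.com/100.html">' + title + '</a>');
});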

TooCrooked
03-04-2012, 02:58 AM
It won't work without invoking some type of server-side logic. If you were to try to use JavaScript to request an external resource (such as the URLs of your links) with AJAX, it would fail for security reasons (the browser's same-origin policy).
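
For example, a plain XMLHttpRequest like the one below can read a page from your own domain and pull out its <title> (the path here is made up); point it at a page on another domain instead, and the browser's same-origin policy blocks the request:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/some-page-on-my-own-site.html', true); // same-origin: allowed
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var match = xhr.responseText.match(/<title[^>]*>([^<]*)<\/title>/i);
        alert(match ? match[1] : 'no title found');
    }
};
xhr.send();
// xhr.open('GET', 'http://someWebsite.com/100.html', true); // cross-origin: blocked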

newscrash
03-07-2012, 02:43 AM
Okay, so without server-side logic I can't make any requests.

So instead of making a list of hyperlinks, say I just want to filter some of the results of my generated links.

Now I'm wondering: is there a way I can load each page and save each URL's source HTML locally?

Then I could run something to search each saved HTML file to see if it contains the keywords "page not found", and if so delete the file or omit it from my list of hits.

So if I save 1000 pages, whose URLs are generated by the loop, I can easily see which are valid and which aren't.

???

rnd me
03-07-2012, 06:15 PM
Okay, so without server-side logic I can't make any requests.

So instead of making a list of hyperlinks, say I just want to filter some of the results of my generated links.

...

So if I save 1000 pages, whose URLs are generated by the loop, I can easily see which are valid and which aren't.

???

if you can get a page of links that point to the files you want, you can use the "DownThemAll" Firefox extension to grab all linked pages from that page.

It will tell you which ones worked and which ones didn't, and you can save the list or re-fetch from there...


Once you have the pages in a folder you control, you can use AJAX to search their contents.
If this is something that is served to the public, you will probably want to do your searching on the server-side so that 1000 files don't need to be downloaded and searched...
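
As a rough sketch of that AJAX search (assuming the saved copies sit next to the page in a folder called saved/, named 100.html to 199.html like in the original loop; the folder name is made up):

function check(url) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.onreadystatechange = function() {
        if (xhr.readyState === 4) {
            if (xhr.status === 200 && xhr.responseText.indexOf('page not found') === -1) {
                console.log(url + ' looks valid');
            } else {
                console.log(url + ' is missing or is a "page not found" page');
            }
        }
    };
    xhr.send();
}

for (var i = 100; i < 200; i++) {
    check('saved/' + i + '.html'); // hypothetical folder/file layout
}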

newscrash
03-11-2012, 08:24 AM
thank you! Any code for saving all the pages linked from a page to my hard drive for local use/filtering?


