  1. #1
    New to the CF scene

JavaScript generating a list of hyperlinks

I'm new to programming. Here's my problem:

I'm using a for loop to generate a list of hyperlinks:

for (var i = 100; i < 200; i++) {
    var str = "link";
    document.write(str.link('http://someWebsite.com/' + i + '.html'));
    document.write("<br />");
}


So the list that is generated looks like...

    link
    link
    link
    ---------and so on until the loop is finished.

My question: is it possible for each link's text to be dependent on the page's title?

For example, instead of the text "link", the program would follow the URL, see that the page's header gave the title "Bob's website" (or whatever the page's title was), and use that to generate the list.

    Example:

    bobs website
    daves website
    rons website
    susies website
    betsys website
    ______________________________
Any help with differentiating the link text by following the URL would be appreciated. Thank you!

  • #2
    Senior Coder
You will need to make an array of the site names and change 'str' in the write statement to the array element, as follows:
Code:
function listem() {
    var myArray = new Array();
    myArray[0] = "Bob's Website";
    myArray[1] = "Jim's Website";
    myArray[2] = "Jill's Website";

    for (var i = 0; i < 3; i++) {
        document.write(myArray[i].link('http://someWebsite.com/' + i + '.html'));
        document.write("<br />");
    }
}
I did 3; your loop runs from 100 through 199, so it looks like you need 100.

  • #3
Supreme Master coder! Philip M
    Try this:-

Code:
<html>
<head>
</head>
<body>

<script type = "text/javascript">

function listem() {
    var myArray = new Array();  // text of the links
    myArray[0] = "Bob's Website";
    myArray[1] = "Jim's Website";
    myArray[2] = "Jill's Website";
    myArray[3] = "Google";

    var links = new Array();    // corresponding urls
    links[0] = "http://www.bobswebsite.com";
    links[1] = "http://www.jimswebsite.com";
    links[2] = "http://www.jillswebsite.com";
    links[3] = "http://www.google.com";

    for (var i = 0; i < myArray.length; i++) {
        document.write(myArray[i].link(links[i]));
        document.write("<br>");
    }
}

listem();  // called while the page is still loading

</script>

</body>
</html>
    You should be aware that document.write() statements must (as here) be run before the page finishes loading. Any document.write() statement that runs after the page finishes loading will create a new page and overwrite all of the content of the current page (including the Javascript which called it). So document.write() is at best really only useful to write the original content of your page. It cannot be used to update the content of your page after that page has loaded.
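A minimal sketch of that behaviour (this snippet is only an illustration, not from any of the posts): a document.write() call made from an onload handler runs after the page has loaded, so it wipes whatever the page already contained.

Code:
// Runs only after the page has finished loading.
window.onload = function () {
    // This call implicitly does document.open(), which clears the
    // existing document before writing the new text.
    document.write("This text replaces the entire page.");
};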

    "I know that you believe that you understand what you think I said, but, I am not sure you realise that what you heard is not what I meant". (Robert
    McCloskey)
    Last edited by Philip M; 03-03-2012 at 04:54 PM.

    All the code given in this post has been tested and is intended to address the question asked.
    Unless stated otherwise it is not just a demonstration.

  • #4
    New to the CF scene
Sorry for the confusion...

The program I'm using should create the URLs, but I personally can't know what the title of each page is until the list is generated. I need the links to TELL me what they point to WITHOUT me having to follow hundreds of links to SEE what they are. I'm wondering if there's a way for the program to receive a cookie from each URL, or to ask the given URL what its title is, THEN make that the anchor text for the links?

    ex:

    susies website
    dans website
    roberts website

***BUT I had no idea whose websites would appear in the list; the program found out by using the URL given.

Is there a way to do this? Any help is greatly appreciated!

  • #5
    Banned
Quote Originally Posted by Philip M
You should be aware that document.write() statements must (as here) be run before the page finishes loading. ... It cannot be used to update the content of your page after that page has loaded.
Nowadays, document.write is largely considered antiquated and should not be used at all except for writing to a new page from a parent page.

To write to the current page, best practice nowadays is to use the appropriate DOM methods like createElement(), appendChild(), and so on.
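As a minimal sketch of that approach (the linkList id is made up for this illustration; the names and URLs reuse the example data from post #3): build the same list with createElement() and appendChild(), which is safe to run after the page has loaded.

Code:
// Assumes an empty <ul id="linkList"></ul> somewhere in the page.
function listem() {
    var names = ["Bob's Website", "Jim's Website", "Jill's Website"];
    var urls  = ["http://www.bobswebsite.com",
                 "http://www.jimswebsite.com",
                 "http://www.jillswebsite.com"];

    var list = document.getElementById("linkList");
    for (var i = 0; i < names.length; i++) {
        var a = document.createElement("a");                // the link itself
        a.href = urls[i];
        a.appendChild(document.createTextNode(names[i]));   // link text

        var li = document.createElement("li");
        li.appendChild(a);
        list.appendChild(li);                               // add to the page
    }
}
window.onload = listem;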


  • #6
    Banned
Quote Originally Posted by newscrash
Sorry for the confusion... I need the links to TELL me what they point to WITHOUT me having to follow hundreds of links to SEE what they are. ... Is there a way to do this? Any help is greatly appreciated!
Yes, this can be done, but it should be done server-side using something like PHP. I doubt it can be done client-side.

    You could open the remote page, parse the <title> tag and extract its contents. Then you can use the title contents however you like to make links or whatever.

    More info and an example
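The post suggests PHP; purely as a sketch of the same server-side idea in JavaScript running under Node.js (the fetchTitle name and the example URL are made up for this illustration), fetch a page and pull the text out of its <title> tag, falling back to the URL if no title is found.

Code:
var https = require("https");

// Fetch a page and pass its <title> text to the callback.
function fetchTitle(url, callback) {
    https.get(url, function (res) {
        var html = "";
        res.on("data", function (chunk) { html += chunk; });
        res.on("end", function () {
            var match = html.match(/<title[^>]*>([^<]*)<\/title>/i);
            callback(match ? match[1].trim() : url);   // no title: use the URL
        });
    }).on("error", function () {
        callback(url);                                 // network error: use the URL
    });
}

// Usage: label a link with the title of the page it points to.
fetchTitle("https://www.example.com/", function (title) {
    console.log('<a href="https://www.example.com/">' + title + "</a>");
});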

  • #7
    New Coder
It won't work without invoking some type of server-side logic. If you were to try to use JavaScript to request an external resource (such as the URLs of your links) with AJAX, it would fail for security reasons (the browser's same-origin policy).
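A minimal sketch of what that failure looks like (the URL is the one from post #1): a plain XMLHttpRequest to a different origin is refused by the browser unless the remote server explicitly allows it.

Code:
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://someWebsite.com/100.html", true);   // different origin
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
        // Without the remote server opting in (CORS headers), the browser
        // refuses to hand the response to the script: status is 0 and
        // responseText is empty.
        console.log(xhr.status, xhr.responseText);
    }
};
xhr.send();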

  • #8
    New to the CF scene
Okay, so without server-side logic I couldn't make any requests.

    So instead of making a list of hyperlinks, say I just want to filter some of the results of my generated links.

Now I'm wondering: is there a way I can load each page and then save each URL's source HTML locally?

Then I could run something to search each saved HTML file to see if it contains the keywords "page not found" and, if so, delete that HTML file or omit it from my list of hits.

So if I save 1000 pages, whose URLs are generated from the loop, I can easily see which are valid and which aren't.

    ???

  • #9
Senior Coder rnd me
Quote Originally Posted by newscrash
Okay, so without server-side logic I couldn't make any requests.

So instead of making a list of hyperlinks, say I just want to filter some of the results of my generated links.

...

So if I save 1000 pages, whose URLs are generated from the loop, I can easily see which are valid and which aren't.

???
If you can get a page of links that point to the files you want, you can use the "DownThemAll" Firefox extension to grab all linked pages from that page.

    It will tell you which ones worked and which ones didn't, and you can save the list or re-fetch from there...


Once you have the pages in a folder you control, you can use AJAX to search the contents.
If this is something that is served to the public, you will probably want to do your searching on the server side so that 1000 files don't need to be downloaded and searched...
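As a minimal sketch of that local search (assuming the saved pages sit in the same folder as this script, are served over HTTP, and are named 100.html through 199.html to match the loop in post #1): fetch each file with a same-origin XMLHttpRequest and flag the ones containing "page not found".

Code:
// Check locally saved pages (100.html ... 199.html) for "page not found".
function checkPage(n) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", n + ".html", true);         // same origin, so this is allowed
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            var bad = xhr.status !== 200 ||
                      /page not found/i.test(xhr.responseText);
            console.log(n + ".html: " + (bad ? "omit" : "keep"));
        }
    };
    xhr.send();
}

for (var i = 100; i < 200; i++) {
    checkPage(i);
}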

  • #10
    New to the CF scene
Thank you! Any code for saving all the links on the page to my hard drive for local use/filtering?

