Okay, so without server-side logic I can't make any requests. So instead of just making a list of hyperlinks, say I want to filter the results of my generated links.
Now I'm wondering: is there a way I can load each page and save each URL's source HTML locally?
Then I could run something that searches each saved HTML file to see if it contains the keyword "page not found" and, if so, deletes the file or omits it from my list of hits.
So if I save 1000 pages, whose URLs are generated from the loop, I can easily see which are valid and which aren't.
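Something like this Python sketch could do both steps: fetch each generated URL and save its HTML, then scan the saved files and keep only the ones that don't contain the error text. (The "page not found" keyword and the idea that the URLs come from your generating loop are assumptions; the site's actual error text may differ, and some servers return a proper 404 status instead, which `urlopen` raises as an exception.)

```python
import os
import urllib.request

def save_page(url, path):
    """Fetch url and write its HTML to path; return True on success.

    A 404 response raises HTTPError, which is caught here, so hard
    failures are skipped without saving a file at all.
    """
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception:
        return False
    with open(path, "w", encoding="utf-8") as f:
        f.write(html)
    return True

def filter_saved(directory, keyword="page not found"):
    """Return paths of saved .html files that do NOT contain keyword.

    Matching is case-insensitive; adjust keyword to the soft-404
    text the target site actually serves.
    """
    hits = []
    for name in sorted(os.listdir(directory)):
        if not name.endswith(".html"):
            continue
        path = os.path.join(directory, name)
        with open(path, encoding="utf-8") as f:
            text = f.read().lower()
        if keyword.lower() not in text:
            hits.append(path)
    return hits
```

Usage would be a loop like `save_page(url, "page_%04d.html" % i)` over your generated URLs, then `filter_saved(".")` to get the list of valid pages.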