Hey all,

I have a few sites and I'm looking to combine them into one. Basically, I'm thinking of making this "one" a mini search engine for my sites. Because my sites are optimized for search engines, my link URLs are pretty descriptive, so I figure I can just index the links and then search them for keywords whenever someone runs a search.
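Just to make it concrete, this is roughly what I have in mind for the search side once the links are indexed (only a rough sketch; the URLs and keyword are made-up examples):

PHP Code:
<?php
// Rough sketch of the keyword search I have in mind.
// $links would come from the link extractor; these URLs are made-up examples.
$links = array(
    'http://www.example.com/blue-widget-reviews.html',
    'http://www.example.com/cheap-red-widgets.html',
    'http://www.example.com/contact-us.html',
);

$keyword = 'widget'; // whatever the visitor types into the search box

// Keep every indexed link whose URL contains the keyword (case-insensitive)
$results = array();
foreach ($links as $link) {
    if (stripos($link, $keyword) !== false) {
        $results[] = $link;
    }
}

foreach ($results as $result) {
    echo "<br />" . $result . "\n";
}
?>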

While searching the big G for a link extractor, I found a pretty useful script, but for some reason I can't get it to work: it returns nothing. Can someone look it over and tell me what's wrong, or maybe point me to another similar one that will hopefully help? (I just need the links from a page.)

PHP Code:
<?php
// This script will extract all the hyperlinks from a given web page
// To use this script, you must provide a link back to www.WAY2WEB.net
// Thanks!
// (C) 2007 - Anthony Eden | www.WAY2WEB.net

function hyperlinkextract($s1, $s2, $s) {
    $myarray = array();
    $s1 = strtolower($s1);
    $s2 = strtolower($s2);
    $L1 = strlen($s1);
    $L2 = strlen($s2);
    $scheck = strtolower($s);

    do {
        $pos1 = strpos($scheck, $s1);
        if ($pos1 !== false) {
            $pos2 = strpos(substr($scheck, $pos1 + $L1), $s2);
            if ($pos2 !== false) {
                $myarray[] = substr($s, $pos1 + $L1, $pos2);
                $s = substr($s, $pos1 + $L1 + $pos2 + $L2);
                $scheck = strtolower($s);
            }
        }
    } while (($pos1 !== false) and ($pos2 !== false));

    return $myarray;
}

$content = @get_file_contents('http://www.way2web.net/');
$myarray = hyperlinkextract("href=\"", "\"", $content);

// Process all the links
foreach ($myarray as $key => $val) {
    echo "<br />" . $val . "\n";
}
?>
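
Just so it's clear what I need: even something really simple would do the trick, roughly along these lines (a rough, untested sketch using DOMDocument; example.com is just a placeholder for one of my sites):

PHP Code:
<?php
// Rough idea of the kind of extractor I'm after, using DOMDocument.
// example.com is just a placeholder for one of my sites.
$html = @file_get_contents('http://www.example.com/');

$links = array();
if ($html !== false) {
    $dom = new DOMDocument();
    @$dom->loadHTML($html); // suppress warnings from sloppy markup

    // Grab the href attribute of every <a> tag on the page
    foreach ($dom->getElementsByTagName('a') as $anchor) {
        $href = $anchor->getAttribute('href');
        if ($href !== '') {
            $links[] = $href;
        }
    }
}

foreach ($links as $link) {
    echo "<br />" . $link . "\n";
}
?>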