Somebody please help

08-01-2012, 06:54 AM
Hey. I'm not sure if this falls under PHP, but if it doesn't I will just move my topic.

So what I want to know is: can I create a website that will automatically make posts, or essentially duplicate them onto my website?

For example:
I create my website.
My website has a script that watches craigslist, and whenever they have a new post because somebody needs to sell something, it appears on my website too.

Or, if the above is not doable, can I
create a search engine on my website?
I search for, say, "wooden chairs" and get results from craigslist on that topic. Once somebody clicks a craigslist ad on my site, it takes them to the ad through my referral link.

Can this be done, or would I have to repost all the ads on my site manually?

Thanks for reading, and have a good day.

08-01-2012, 10:36 PM
Yes, this is completely doable.
First you need to verify that craigslist's TOS allows you to do this; otherwise they will likely just block access from your domain.
You simply use cURL to pull the content you need. I won't write cURL code for you since it's easy to abuse, but I'm sure you can find plenty of tutorials for it.

Alternatively, and a better option, would be to ask them if they have a web service for their search engine. Then you just consume it through a SOAP client.

08-01-2012, 10:52 PM
Craigslist will likely block you before you know what's going on. They have their own script jockeys, and are pretty quick on the trigger.

08-01-2012, 10:53 PM
So, assuming that their TOS does not state that I cannot use/advertise their posts on my website, I should be able to have my site automatically repost their posts, and when clicked they would link to the post on their site. Also, what do you mean by "easy to abuse"? Would this affect me in a negative manner?

08-01-2012, 11:01 PM
If their TOS doesn't state that you cannot use a bot to retrieve information in an automated fashion, then you are good to go. That doesn't mean they still won't block you, of course.
I mean that cURL is easy to abuse given what it is capable of doing. I provide very limited help on the use of cURL or remote-connection technology, except for SOAP.

08-01-2012, 11:04 PM
So I see a lot of you aren't too happy with using cURL. Is there another alternative that would work better?

EDIT: Or maybe a method that would ensure I don't get blocked, considering that getting blocked would hit my website quite hard.

08-02-2012, 05:29 AM
If they don't offer a remote interface such as RSS/XML feeds or SOAP services, then cURL is really your only option.
Assuming their TOS isn't broken, you won't get blocked as long as the requests are reasonable. Single searches are fine, but don't pull a pile of information every few minutes or they will shut your connectivity to them down. Remember, you are consuming their bandwidth in the process of doing this.
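One way to keep the requests reasonable, as a rough sketch: cache the feed locally and only hit the site again when the cached copy is older than some interval. The function name, cache path, and one-hour interval below are my own choices, not anything craigslist requires.

```php
<?php
// Sketch: serve a cached copy of the feed when it is fresh enough,
// so repeated page loads don't each fire a request at craigslist.
function fetch_feed_cached($feed_url, $cache_file, $max_age = 3600) {
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
        return file_get_contents($cache_file);   // fresh enough, no request made
    }
    $curl = curl_init($feed_url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);
    $data = curl_exec($curl);
    curl_close($curl);
    if ($data !== false) {
        file_put_contents($cache_file, $data);   // refresh the cache
    }
    return $data;
}
```

With something like this, even a busy page only generates one outbound request per hour.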

08-02-2012, 08:44 PM
Well, what I need to do is take their posts and put them on my site. I would assume they get maybe 100 new ones a day, so I would have to copy all the data from those 100 posts. Do you think I would get blocked, or should I be fine?

EDIT: And how much does a coder usually charge to set up a cURL job like this?

08-03-2012, 04:20 PM
Craigslist will block you.
The only valid and ethical way is to use their RSS (XML) feeds.

I'm not sure if that will work for you or not, but below is an example I've made that uses the XML feeds for displaying items on a website. Others might find this useful; for example, use a cron job to run the script every hour to see if someone lists a particular item you're looking for. Perhaps a PHP script could text-message you when it sees a new entry? That way, you can snipe it before anyone else buys it.

This is the test form:

<body onload="document.myform.target.focus();">
<form name="myform" method="post" action="craigslist.php">
Enter Search Term: <input type="text" name="target" value="" style="width:350px;"> <br />
<input type="checkbox" name="desc"> Include Description<br />
<input type="checkbox" name="narrow" checked="checked"> Multiple words searched literally.<br /><br />
<input type="submit" name="submit" value="Submit">
</form>
</body>

Below is the PHP file called "craigslist.php".
Note that I have my own city programmed in. You would put in your own.


<?php
// Get the target from the form
$search = isset($_POST['target']) ? trim($_POST['target']) : "";
$search = str_replace(" ", "+", $search);
$search = str_replace("&", "", $search);

// Put your own CL feed URL on the line below, so it matches your area.
// Also, note where the $search string is placed in the URL.
// If you do special searches, like max and min prices, those fields will also appear in your RSS feed URL.
// Use PHP variables for any of the CL search properties.

$feed_url = "http://minneapolis.craigslist.org/search/?areaID=19&catAbb=sss&query=".$search."&format=rss";

$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $feed_url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);
$data = curl_exec($curl);
curl_close($curl);

$doc = new SimpleXmlElement($data, LIBXML_NOCDATA);
parseCraigslist($doc);

function parseCraigslist($xml){
    echo "<strong>".$xml->title."</strong>";
    $list = array();
    $cnt = count($xml->item);
    for($i = 0; $i < $cnt; $i++){
        $title = $xml->item[$i]->title;
        $url   = $xml->item[$i]->link;
        $desc  = $xml->item[$i]->description;

        // The listing date lives in the Dublin Core namespace
        $dc   = $xml->item[$i]->children("http://purl.org/dc/elements/1.1/");
        $date = $dc->date;
        $part = explode("T", $date);

        // Include the description only if the form checkbox was ticked
        if(isset($_POST['desc'])){
            $list[] = $part[0]." | <a href='".$url."'>".$title."</a><br />".$desc."<br /><hr><br />";
        } else {
            $list[] = $part[0]." | <a href='".$url."'>".$title."</a><br />";
        }
    }

    if(count($list) > 0){
        foreach($list as $row){
            echo $row;
        }
    } else {
        echo "No items found.";
    }
}


If you actually go to CL and do a search for something in your city (nearby area), the search result page will have an RSS link. View that to see an example of what the feed URL looks like for your area.
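And for the "text-message me on new listings" idea mentioned above, a minimal sketch of the cron side: remember which item links you've already seen between runs, and only alert on the new ones. The function name and seen-file path are my own inventions; swap in mail() or whatever notification you like where the new links come back.

```php
<?php
// Sketch: given the item links from the current feed pull, return only
// the ones we haven't seen on a previous cron run, and record them.
function find_new_links($links, $seen_file) {
    $seen = file_exists($seen_file)
        ? file($seen_file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
        : array();
    $new = array_values(array_diff($links, $seen));
    if (count($new) > 0) {
        // Append the new links so the next run skips them
        file_put_contents($seen_file, implode("\n", array_merge($seen, $new))."\n");
    }
    return $new;
}
```

Run the feed script hourly from cron, feed the extracted links through this, and notify on anything it returns.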