There should be much more than that?
Apologies, I lost you on the second page; the phpinfo must have loaded in stages. I see a max execution time of 30s?
too long for a post on here, it seems
From the php manual:

Quote:
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. Both default to 300 seconds. See your web server documentation for specific details.

I found mine under Configuration - apache2handler as Timeouts, and it was Connection: 300 - Keep-Alive: 5.
If this is the problem you might need to do a partial download and repeat it a number of times until you get everything.
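Something along these lines, say (a rough sketch of the idea only; progress.txt, the batch size of 50 and the page count are numbers I've made up, not anything from your setup): each run processes the next block of pages and writes down where it got to, so no single run lives long enough to hit the timeout.

Code:
<?php
// Resumable batch runner: process the next block of pages, then record
// progress so the next invocation (manual or cron) carries on from there.
$progressFile = 'progress.txt'; // made-up name
$batchSize    = 50;             // assumed batch size
$totalPages   = 2500;           // assumed page count

$start = is_file($progressFile) ? (int) file_get_contents($progressFile) : 1;
$end   = min($start + $batchSize - 1, $totalPages);

for ($page = $start; $page <= $end; $page++) {
    // ...fetch and parse page $page here...
}

// Remember where to resume next time round.
file_put_contents($progressFile, (string) ($end + 1));

if ($end >= $totalPages) {
    echo "All pages done\n";
}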
gah, knocking this on the head for the night
Thank you for your input to the thread, I appreciate it as it's at least helped me understand the problem. I've emailed the Holiday Rentals feeds dept with a view to getting a single file for the world, or even better just France; heaven knows why they've chopped their data up into so many small pieces
People must have dealt with this OK though; there must be a way, and I'm assuming it'll be a glaringly obvious one once it's sorted
I'll post back with whatever the outcome is
i.e. am I back to my original ludicrous suggestion of a cron job to run a very long series of copies of the same script, each downloading the next 50 pages?
maybe 4 cron jobs, looping pages 1 to 20, then 21 to 40, etc.
at 20 pages a cron that's, ummm, getting on for 150 cron jobs to cover them all; I can't see the webhost being too happy with that
and surely that's ludicrous?
There must be a solution that lets a php script chug along for hours if need be while waiting for things to happen in their own time
I really am going home now, back tomorrow, thanks all
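For what it's worth, there is a fairly standard answer to the "chug along for hours" problem: have cron run the script through the PHP command-line binary rather than through Apache. The CLI version of PHP has no execution time limit by default, and Apache's Timeout directive never applies because Apache isn't involved. A minimal sketch (the paths and script name are made up, not yours):

Code:
<?php
// nightly_feed.php - run from cron with something like:
//   /usr/bin/php /home/you/nightly_feed.php
// (both paths above are placeholders)

// The CLI SAPI already defaults max_execution_time to 0 (unlimited),
// but set it explicitly in case a shared php.ini says otherwise.
set_time_limit(0);

// Keep running even if whatever launched the script disconnects.
ignore_user_abort(true);

// ...the long-running download-and-parse loop goes here...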
They do say at http://www.homeaway.co.uk/info/affil...ler-data-feeds that they have other feeds available; perhaps they have one, or would be willing to crank one out, specifically for France? Seems like a pretty easy task.
If this is a php problem, setting error_reporting to E_ALL (your display_errors is already on) may give you additional information about why the script is stopping (perhaps a memory issue).
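i.e. something like this at the very top of the script (the memory line is just a suggestion for testing the memory theory, not something you need):

Code:
<?php
// First lines of the script, before any downloading starts:
error_reporting(E_ALL);
ini_set('display_errors', '1');

// ...the existing download/parse code...

// If memory turns out to be the culprit, this shows the high-water mark:
echo 'Peak memory: ', memory_get_peak_usage(true), " bytes\n";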
I know I reported the original problem as being unable to open and parse up to 80 pages, but there are more than 2,500 pages in total to process; I just got stuck at 80
I have asked them for a tailored France feed, fingers crossed, although they were very slow to respond in my previous dealings with them
OK, let me get clear on this: are you parsing the entire files, or just extracting certain information from them? If it's the latter you do have other means available. Also, how fast is the process of downloading them? How long does it take to DL the eighty, without doing the parsing?
I noticed that your server is set up nearly identically to my host's and to my localhost machine, including safe mode OFF. So you have available something called SED, the stream editor, which can perform many perl-like tasks, and do them very fast.
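For instance (a rough sketch only; the file name and the <country> tag are my guesses, not taken from the actual feed): once a page is saved to disk, sed can pull out just the lines you care about far faster than a full parse, and PHP can drive it via shell_exec since safe mode is off.

Code:
<?php
// Hypothetical: extract only the <country> values from a saved feed page,
// so PHP only inspects a handful of short strings before deciding whether
// the page is worth a full parse. File and tag names are assumptions.
$file      = 'page_0001.xml';
$countries = shell_exec(
    "sed -n 's/.*<country>\\(.*\\)<\\/country>.*/\\1/p' " . escapeshellarg($file)
);
if ($countries !== null && strpos($countries, 'France') !== false) {
    // Only now pay the cost of parsing the whole document.
    $xml = simplexml_load_file($file);
    // ...extract the property details you want...
}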
There is a sequence of nearly 3,000 web pages, each with 20-odd properties. I have to check the country of each to see whether it's in France and, if it is, parse about a third of it for the info I want
The problem appears to be a timeout of some kind that occurs at about 80 seconds or 75 pages; the lost time appears to be an accumulation of the time the remote pages take to load/respond to the simplexml_load_file() line.
It seems to my inexpert eye that my options are either to find a way to get the whole xml file loaded in one go and parse it then (hopefully their techies can supply that), or to find a way to parse a page, or small sections of pages, at a time (a rough sketch of what I mean is below).
I need to run this operation automated once a night, so it doesn't have to be fast, although heaven knows I expected it to be going into this
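Sketching out that second option as I understand it so far (the URL pattern and file names below are placeholders of mine, not the real feed addresses): split the fetching from the parsing, give each download its own short timeout, and only then run simplexml over the local copies where nothing remote can stall it.

Code:
<?php
// Stage 1: download each page to disk with its own per-request timeout,
// so one slow remote response can't stall everything behind it.
for ($page = 1; $page <= 80; $page++) {
    $url = "http://example.com/feed/page-$page.xml"; // placeholder URL
    $ch  = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up on a dead host
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);        // and on a crawling page
    $body = curl_exec($ch);
    curl_close($ch);
    if ($body !== false) {
        file_put_contents("pages/page-$page.xml", $body);
    }
}

// Stage 2: parse the local copies; no network involved, so no remote stalls.
foreach (glob('pages/page-*.xml') as $file) {
    $xml = simplexml_load_file($file);
    // ...filter for France and extract the fields needed...
}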