Page 2 of 3 · Results 16 to 30 of 32
#16 · Senior Coder · joined Aug 2006 · 1,263 posts
Shouldn't there be much more than that?

#17 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Apologies, I lost you on the second page; the phpinfo must have loaded in stages. I see a max execution time of 30s?

http://www.englishspoken.info/phpinfo.php

It's too long for a post on here, it seems.

#18 · sunfighter · Senior Coder · Missouri · joined Jan 2011 · 4,175 posts
From the PHP manual:
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. Both default to 300 seconds. See your web server documentation for specific details.
I found mine under Configuration - apache2handler as Timeouts, and it was Connection: 300 - Keep-Alive: 5.

If this is the problem, you might need to do a partial download and repeat it a number of times until you get everything.
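For what it's worth, the limits PHP itself is aware of can be dumped with a few ini_get() calls; note that Apache's Timeout directive is separate server configuration and won't appear among them. A quick sketch:

    <?php
    // Show the timeout-related settings PHP itself knows about.
    // Apache's Timeout directive is server config, not a PHP ini value.
    echo 'max_execution_time: ', ini_get('max_execution_time'), " s\n";
    echo 'default_socket_timeout: ', ini_get('default_socket_timeout'), " s\n";
    echo 'memory_limit: ', ini_get('memory_limit'), "\n";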
    Evolution - The non-random survival of random variants.

    "If you leave hydrogen alone, for long enough, it begins to think about itself."

#19 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Gah, knocking this on the head for the night.

Thank you for your input to the thread; I appreciate it, as it's at least helped me understand the problem. I've emailed the Holiday Rentals feeds dept with a view to getting a single file for the world, or even better just France. Heaven knows why they've chopped their data up into so many small pieces.

People must have dealt with this OK, though; there must be a way, a glaringly obvious one I'm assuming, once it's sorted.

I'll post back with whatever the outcome is.

#20 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Quote Originally Posted by sunfighter View Post
From the PHP manual:

I found mine under Configuration - apache2handler as Timeouts, and it was Connection: 300 - Keep-Alive: 5.

If this is the problem, you might need to do a partial download and repeat it a number of times until you get everything.
I've tried, feebly, to work out how to do that, but wouldn't it have to be done by separate scripts, as any single script running longer than the timeout limit will, erm, time out?

i.e. am I back to my original ludicrous suggestion of a cron job to run a very long series of copies of the same script, each downloading the next 50 pages?

#21 · sunfighter · Senior Coder · Missouri · joined Jan 2011 · 4,175 posts
Maybe four cron jobs, looping 1 to 20, then 21 to 40, etc.

#22 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Quote Originally Posted by sunfighter View Post
Maybe four cron jobs, looping 1 to 20, then 21 to 40, etc.
There are thousands of pages: every property they list for rent in the entire world.

At 20 a cron, that's, ummm, about 200 or 300 cron jobs. I can't see the webhost being too happy with that.

And surely that's ludicrous?

There must be a solution that lets a PHP script chug along for hours if need be, while waiting for things to happen in their own time.
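For what it's worth, such a thing does exist: PHP run from the command line (which is how cron runs it) has no execution-time limit by default, and Apache's Timeout doesn't apply there at all. A minimal sketch of a long-running CLI job:

    <?php
    // Sketch of a long-running job meant for the command line, not the browser.
    // Under the CLI, max_execution_time already defaults to 0 (unlimited).
    set_time_limit(0);   // belt and braces in case the ini default was changed
    for ($page = 1; $page <= 3000; $page++) {
        // fetch and parse page $page here; no 30 s / 300 s ceiling applies
    }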

#23 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
I really am going home now; back tomorrow. Thanks, all.

#24 · Senior Coder · joined Aug 2006 · 1,263 posts
They do say at http://www.homeaway.co.uk/info/affil...ler-data-feeds that they have other feeds available; perhaps they have one specifically for France, or would be willing to crank one out? Seems like a pretty easy task.

#25 · sunfighter · Senior Coder · Missouri · joined Jan 2011 · 4,175 posts
Quote Originally Posted by Tynan View Post
There are thousands of pages: every property they list for rent in the entire world.

At 20 a cron, that's, ummm, about 200 or 300 cron jobs. I can't see the webhost being too happy with that.

And surely that's ludicrous?

There must be a solution that lets a PHP script chug along for hours if need be, while waiting for things to happen in their own time.
No Tynan, I didn't say 200 or 300 cron jobs. I said to call his PHP script with the loop running 1 to 20 (and maybe 1 to 40), then a little later call it again, but this time with the loop running 21 to 40 (or better, 41 to 80). That's two cron jobs, maybe four if small loops are needed, but I bet it's two. The crons are only needed if Tynan can't get them to just give him the France rentals.
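A minimal sketch of that arrangement, assuming the script is reworked to take its page range from the command line; the file name, URL pattern, and schedule below are made up for illustration:

    <?php
    // fetch_range.php - process one chunk of feed pages per invocation.
    // Example crontab entries, half an hour apart:
    //   0  2 * * * php /path/to/fetch_range.php 1 40
    //   30 2 * * * php /path/to/fetch_range.php 41 80
    $start = isset($argv[1]) ? (int)$argv[1] : 1;
    $end   = isset($argv[2]) ? (int)$argv[2] : $start + 39;

    for ($page = $start; $page <= $end; $page++) {
        // placeholder URL pattern; substitute the real feed address
        $xml = @simplexml_load_file("http://feed.example.com/listings?page={$page}");
        if ($xml === false) {
            continue;   // page failed to load; move on rather than abort the run
        }
        // ... extract the wanted fields from $xml and store them ...
    }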
    Last edited by sunfighter; 08-19-2013 at 10:57 PM.

#26 · CFMaBiSmAd · Senior Coder · Denver, Colorado USA · joined Oct 2006 · 3,037 posts
If this is a PHP problem, setting error_reporting to E_ALL (your display_errors is already on) may give you additional information about why the script is stopping (perhaps a memory issue).
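Concretely, that's a couple of lines at the very top of the script; display_errors is shown too, even though the phpinfo says it's already on:

    <?php
    // First lines of the script, before anything else runs:
    error_reporting(E_ALL);            // report everything, including notices
    ini_set('display_errors', '1');    // explicit, though already on per the phpinfo
    // A fatal "Allowed memory size ... exhausted" message would point to
    // memory_limit rather than a timeout.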
    If you are learning PHP, developing PHP code, or debugging PHP code, do yourself a favor and check your web server log for errors and/or turn on full PHP error reporting in php.ini or in a .htaccess file to get PHP to help you.

#27 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Quote Originally Posted by sunfighter View Post
No Tynan, I didn't say 200 or 300 cron jobs. I said to call his PHP script with the loop running 1 to 20 (and maybe 1 to 40), then a little later call it again, but this time with the loop running 21 to 40 (or better, 41 to 80). That's two cron jobs, maybe four if small loops are needed, but I bet it's two. The crons are only needed if Tynan can't get them to just give him the France rentals.
Morning.

I know I reported the original problem as being unable to open and parse up to 80 pages, but there are more than 2,500 pages in total to process; I just got stuck at 80.

I have asked them for a tailored France feed, fingers crossed, although they were very slow to respond in my previous dealings with them.

#28 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Quote Originally Posted by CFMaBiSmAd View Post
If this is a PHP problem, setting error_reporting to E_ALL (your display_errors is already on) may give you additional information about why the script is stopping (perhaps a memory issue).
Thanks, I'll investigate this this evening (GMT, for you far-away peoples).

#29 · DrDOS · Senior Coder · joined Sep 2010 · 1,974 posts
OK, let me get clear on this: are you parsing the entire files, or just extracting certain information from them? If it's the latter, you have other means available. Also, how fast is the download itself? How long does it take to download the eighty pages without doing the parsing?

I noticed that your server is set up nearly identically to my host's and to my localhost machine, including safe mode OFF. So you have available something called sed, the stream editor, which can perform many Perl-like tasks, and do them very fast.
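As a rough illustration (the <country> tag name and file path are guesses about the feed, not known values), a saved page could be pre-filtered with sed from PHP so that only pages mentioning France get a full XML parse:

    <?php
    // Sketch: cheap sed test for 'France' before the expensive parse.
    // Tag name and path are assumptions about the feed's layout.
    $file = '/tmp/page0001.xml';
    $cmd  = 'sed -n ' . escapeshellarg('/<country>France<\/country>/p') . ' ' . escapeshellarg($file);
    $hits = shell_exec($cmd);
    if ($hits !== null && trim($hits) !== '') {
        $xml = simplexml_load_file($file);   // only now do the real parsing
        // ... pull out the fields of interest ...
    }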
    Welcome to http://www.myphotowizard.net

    where you can edit images, make a photo calendar, add text to images, and do much more.


    When you know what you're doing it's called Engineering, when you don't know, it's called Research and Development. And you can always charge more for Research and Development.

#30 · Tynan · Regular Coder · London E4 UK · joined Oct 2004 · 320 posts
Quote Originally Posted by DrDOS View Post
OK, let me get clear on this: are you parsing the entire files, or just extracting certain information from them? If it's the latter, you have other means available. Also, how fast is the download itself? How long does it take to download the eighty pages without doing the parsing?

I noticed that your server is set up nearly identically to my host's and to my localhost machine, including safe mode OFF. So you have available something called sed, the stream editor, which can perform many Perl-like tasks, and do them very fast.
Hi

There is a sequence of nearly 3,000 web pages, each with 20-odd properties. I have to parse each property's country to see if it's in France, and if it is, parse about a third of the entry for the info I want.

The problem appears to be a timeout of some kind that occurs at about 80 seconds, or 75 pages; the lost time appears to be an accumulation of the time the remote pages take to load and respond to the simplexml_load_file() line.

It seems to my inexpert eye that my options are either to find a way to get the whole XML feed loaded in one go and parse it then (hopefully their techies can supply that), or to find a way to parse a page, or small sections of pages, at a time.

I need to run this operation automated once a night, so it doesn't have to be fast, although heaven knows I expected it to be, going into this.
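Failing a single-file feed from their end, one pattern that fits a nightly automated run is a resumable batch: keep a progress marker on disk and let a single cron entry re-invoke the script until all pages are done. A rough sketch; the paths, URL, page count, and batch size are assumptions:

    <?php
    // resume_feed.php - resumable crawl, invoked repeatedly by cron.
    set_time_limit(0);                         // no PHP limit under CLI anyway

    $progressFile = '/tmp/feed_progress.txt';  // assumed location for the marker
    $page  = is_file($progressFile) ? (int)file_get_contents($progressFile) : 1;
    $batch = 50;                               // pages per run, sized to stay under any timeout

    for ($done = 0; $done < $batch && $page <= 3000; $done++) {
        $xml = @simplexml_load_file("http://feed.example.com/listings?page={$page}");
        if ($xml === false) {
            break;                             // remote hiccup: stop, resume next run
        }
        // ... skip non-France properties, store the wanted fields ...
        $page++;
        file_put_contents($progressFile, (string)$page);  // next page to fetch
    }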


 