
View Full Version : Offline processing



Nomad
04-14-2004, 06:23 AM
I hope someone can help with my problem. It is not anything I have done before, so I don't even know where to start.

Certain actions on my website require a considerable amount of processing. Rather than make the person wait until the processing is complete, I would like to "fire" off an offline job to do the work. That way, they would get control back immediately and can do other things with their browser.

The programs have all been written in Perl.

Hoping someone can help. :confused:

Roy Sinclair
04-14-2004, 10:45 PM
If this is a "normal" CGI-type request and the rest of the processing doesn't send additional data back to the user, then there's no reason why you can't send the complete response page back to the user and then perform the rest of the processing in that same script.
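
In outline, something like this (a rough sketch; $finished_page and do_the_slow_part() are just placeholder names for your own page and your own processing):

print "Content-Type: text/html\n\n";
print $finished_page;    # the complete response, built before the slow work starts

do_the_slow_part();      # the long-running processing carries on here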

If there is additional information that needs to be relayed to the user, then you've got a case for using a popup (a valid use for a popup, and the purpose for which popups were made possible) to deliver the reply, leaving the user's main browser window free for their use.

Nomad
04-15-2004, 12:56 AM
To Roy Sinclair

Thanks for your reply.

Like I said, I have never done something like this before and I am still trying to get my head around it.

Firstly, the additional processing does not require anything to be sent back to the user, so that is not a concern.

As I understand it, you are saying that I can build an HTML page and send it to the user and then continue processing. I was under the impression that any HTML page built in the script would only be sent when the script finishes regardless of where in the flow it was built. Or am I misunderstanding how CGI works?

The main point I am not sure about is how I get control back after the page has been sent.

firepages
04-16-2004, 05:05 AM
In PHP you can use a combination of output buffering (to send data to the client even though the script is still running) and ignore_user_abort() to ensure the script completes even after the user has moved on. Now, whilst I am sure this is possible with Perl, I would not personally know how to tackle it (you should probably be posting this in the Perl forum). If you have PHP available, you can get it to run the Perl scripts using virtual() or similar, using the logic above.

The other option is to get your page to set a flag (in a text file or DB) saying that script $x needs running for user $y, then have a cron job that checks that file/DB every 10 minutes or so and runs the scripts for you. Whether that is viable depends on how soon after a request the scripts need to be run.
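
A minimal sketch of the flag-file flavour of that in Perl (untested; the file path, the tab-separated format and the schedule are all just placeholders):

# In the CGI script: note down what needs running, then return to the user.
# ($script and $user would come from your own form/session handling.)
open my $q, '>>', '/var/tmp/job_queue.txt' or die "queue: $!";
print {$q} "$script\t$user\n";
close $q;

# Separate worker script, run from cron, e.g.  */10 * * * * /path/to/run_queue.pl
use strict;
use warnings;

my $queue = '/var/tmp/job_queue.txt';
exit 0 unless -s $queue;                    # nothing queued, nothing to do
rename $queue, "$queue.working" or exit 0;  # claim the pending batch

open my $in, '<', "$queue.working" or die "open: $!";
while (my $line = <$in>) {
    chomp $line;
    my ($script, $user) = split /\t/, $line;
    system($script, $user);                 # run the heavy job for that user
}
close $in;
unlink "$queue.working";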

Roy Sinclair
04-16-2004, 08:01 PM
The "how" is partly dependant on what MOD you're using to write your hmtl code to the user. If you're just using the simple "print" verb I think all you need to do will be close the STDOUT file and your content will be sent to the user while the script continues to run and can go ahead and perform the long processing.

BTW, I'm far from a Perl expert, but I do maintain one really huge script and have previously worked on a number of small CGI scripts.

whackaxe
04-16-2004, 08:55 PM
OK, I'll tell you about something I did in PHP, which may or may not be relevant.

I wrote a script to launch a 3D renderer (Terragen), but PHP waits for the program to return, so it took ages. What I did was use this command:

exec("D:\\PROGRA~1\\TERRAGEN\\TERRAGEN.EXE> nul");

This is the equivalent of running it from the command line; the > nul sends the program's output nowhere, so the PHP script doesn't wait for it.
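
In Perl on a Unix-type server the same idea would look something like this (the path is a placeholder; on Windows you'd need something like start /B or the Win32::Process module instead of the trailing &):

system("/path/to/long_job.pl > /dev/null 2>&1 &");  # discard the output and background the job so the script doesn't wait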

hope that helped (a bit)


