
View Full Version : Saving data across script executions



jkd
06-23-2006, 02:49 AM
So I have an imaginary share.php. Say 10 users are submitting data to that script - every time they submit their own data, they get the most recent submissions of others as their response.

Now, this will happen very often, every couple of seconds at least. File IO is slow, so writing to a temporary file is terrible. Similarly, hitting a DB every time feels very inefficient (though I suppose if the DB is smart, it'll keep the row in memory so I just suffer a few API calls but no real File IO or processing costs).

I'm under the impression that mod_php keeps the same interpreter running for all script calls, so it seems reasonable that I should be able to modify the actual interpreter's state with data (or do the functional equivalent - perhaps some deliberately exposed "Global" object). I know that Java servlets are perfect for this, but I'm unfortunately limited to a PHP environment.

Ultimately, is there any way in PHP to save data directly to the interpreter's state, so subsequent script executions can pull the data directly from memory and alter it themselves instead of relying on some indirect data storage mechanism?

raf
06-23-2006, 12:23 PM
Ultimately, is there any way in PHP to save data directly to the interpreter's state, so subsequent script executions can pull the data directly from memory and alter it themselves instead of relying on some indirect data storage mechanism?
i don't think so, no.
the biggest disappointment for me, when i started with PHP, was that there isn't something like an application object. but to be honest: i never actually found myself unable to do something because this was missing.

i would try using a db, because it's simple to set up a small test script and db table, so you shouldn't waste too much time finding out if this solution is performant enough.
depending on the chosen db, you'll be able to use some optimisations. if you for instance use mysql, then you can use the recordset-buffering feature to get faster returns.

obviously, a db has more overhead than writing the data to and from a textfile, but using a db will give you more flexibility (for selects, sorting, aggregations, ...) and has a lower lock level than using files. so i would first try a db approach and only look for an alternative if the db really is a noticeable bottleneck.
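raf's suggestion can be sketched in a few lines. This is a hypothetical example, not code from the thread: the table name `submissions`, its columns, and the connection credentials are all made up for illustration, and it assumes PDO with the MySQL driver.

```php
<?php
// Sketch: each request stores its own submission, then reads back the
// latest submissions from everyone else as the response.
$db = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// store this user's data
$stmt = $db->prepare(
    'INSERT INTO submissions (user, payload, created) VALUES (?, ?, NOW())'
);
$stmt->execute(array($_POST['user'], $_POST['payload']));

// hand back the most recent entries from the other users
$stmt = $db->prepare(
    'SELECT user, payload FROM submissions
     WHERE user <> ? ORDER BY created DESC LIMIT 10'
);
$stmt->execute(array($_POST['user']));
foreach ($stmt as $row) {
    echo $row['user'] . ': ' . htmlspecialchars($row['payload']) . "\n";
}
?>
```

With prepared statements the server can cache the query plan across the frequent calls jkd describes, which helps offset some of the per-request overhead.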

fci
06-23-2006, 01:01 PM
mysql has a heap storage engine which could be useful...
http://dev.mysql.com/doc/refman/5.0/en/memory-storage-engine.html
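The MEMORY (formerly HEAP) engine keeps the whole table in RAM, which fits this use case since the data is short-lived anyway. A minimal sketch, reusing the same hypothetical `submissions` table; note that MEMORY tables don't support TEXT/BLOB columns and their contents vanish when the server restarts:

```php
<?php
// Sketch: create the shared table with the in-memory storage engine,
// so reads and writes never touch disk.
$db = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$db->exec('CREATE TABLE IF NOT EXISTS submissions (
               user    VARCHAR(32),
               payload VARCHAR(255),   -- MEMORY tables cannot hold TEXT/BLOB
               created DATETIME
           ) ENGINE=MEMORY');
?>
```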

firepages
06-24-2006, 02:34 AM
you could use shared memory.. though it's only practical for smaller amounts of data. look at shmop (http://www.php.net/shmop)

I have seen this used in conjunction with a small standalone socket server which you could load as a resource (if you were bored enough to write one :))
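A minimal sketch of the shmop approach, assuming the shmop extension is available. The key derivation, segment size, and serialize() framing are arbitrary choices for illustration; note there is no locking here, so concurrent writers would additionally need a semaphore (e.g. sem_get/sem_acquire):

```php
<?php
// Sketch: share a small payload between PHP requests via a SysV
// shared-memory segment, surviving across script executions.
$key  = ftok(__FILE__, 's'); // derive an IPC key from this file's path
$size = 1024;                // fixed size; shmop segments cannot grow

// "c" creates the segment if it doesn't exist yet, with read/write access.
$shm = shmop_open($key, 'c', 0644, $size);

// Read whatever the previous request left (null bytes if the segment is fresh).
$raw  = rtrim(shmop_read($shm, 0, $size), "\0");
$data = ($raw !== '') ? unserialize($raw) : array();

// Append this request's submission and write the blob back.
// (The serialized string must stay under $size bytes, or the write fails.)
$data[] = array('user' => 'jkd', 'time' => time());
shmop_write($shm, serialize($data), 0);

shmop_close($shm);
?>
```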

jkd
06-24-2006, 03:56 AM
Both the Heap/Memory MySQL feature and shmop (*excellent* name, btw) seem to be about as close as I'm going to get. Thanks! :)



EZ Archive Ads Plugin for vBulletin Copyright 2006 Computer Help Forum