View Full Version : lots of data lots of users..

08-01-2006, 05:35 AM
If I have around 5,000-10,000 pieces of data (updated daily) that I need to display in a grid, and I expect 1000+ visitors a day viewing this data, what is the best way to go about it? Would it be better to export all the data to a database and then import it into the grid, or just put the data into a text file or something and import it into the grid from that? I'm really new to all this and not really sure what I need to do. I want to do it the most efficient way and need your help... Thanks

08-01-2006, 09:04 AM
welcome here!

we can not advise you on that based on so little info.
databases are better if you only ever show one 'chunk' of your 'pieces of data' at a time to the visitors: like if you allow the visitors to make custom selections, or search, or if you use pagination etc.
if you would always be showing all data (or predefined subsets of your data) then dumping the content into text files and importing these could lead to better performance.
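the pagination case above boils down to fetching one page of rows per request. a minimal sketch, using python's stdlib sqlite3 driver for illustration (on mysql the same LIMIT/OFFSET query works); the table name `grid_data`, its columns, and the page size are made up:

```python
import sqlite3

PAGE_SIZE = 50  # rows per page; an arbitrary choice


def fetch_page(conn, page):
    """Return one page of the grid, counting pages from 1."""
    offset = (page - 1) * PAGE_SIZE
    return conn.execute(
        "SELECT name, value FROM grid_data ORDER BY name LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()
```

the point is that the db only ever reads one page's worth of rows per visitor, instead of all 5-10k.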

it also depends on how you update the data --> is it an incremental refresh (updating existing + inserting new) or do you completely replace the data, or are you just inserting new data?
if you want incremental refreshes, then a db is the way to go. you could then still store the data in a db, and then, after the refresh, generate static html or txt files (if you always show predefined data, see above)
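the "generate static files after the refresh" idea could look like the sketch below, using sqlite3 for illustration; the table name `grid_data`, its columns, and the output path are all made up:

```python
import sqlite3
from html import escape


def dump_static_grid(conn, out_path):
    # query the refreshed table and write a plain static HTML table;
    # the web server can then serve this file with no db hit per visitor
    rows = conn.execute("SELECT name, value FROM grid_data ORDER BY name")
    with open(out_path, "w") as out:
        out.write("<table>\n")
        for name, value in rows:
            out.write(f"<tr><td>{escape(name)}</td><td>{escape(value)}</td></tr>\n")
        out.write("</table>\n")
```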

and it also depends on how the data is delivered to you + how frequently it is updated + whether you still need to do some data-manipulation before you can display it to the user.

by the way: 5-10k of data is not 'lots'. it's peanuts for a db-server like mysql.

08-02-2006, 07:59 AM
Thank you for the reply...

To clarify a few things: the data will be completely replaced every day, and I will definitely use pagination whether I use a DB or not. I will also need to do some data-manipulation (still learning how) and then display everything in a grid.

I am not familiar with using txt files (maybe csv?) to do this. The way I see it, I can either completely replace all the data in my DB every day and then have it exported to a PHP grid on my website, or (correct me if I'm wrong) replace a txt/csv file on the server with all the data and then export that to a PHP grid (is there a name for this second method?).

I will need to do some data-manipulation and I should expect many surfers simultaneously viewing this data. I have no idea how to use txt files to do this but would definitely learn how if this is the most efficient method.

Thank you for all the help so far!

08-02-2006, 09:20 AM
i think a db will be the easiest solution to set up, so personally i would go for the db solution and then see if it's performant enough.

if you completely replace the data every day, then your daily db-update for a table would look like:
- drop all indexes
- truncate table
- load the csv's with a LOAD DATA INFILE statement
- recreate the indexes
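the four steps above could be sketched like this, using python's stdlib sqlite3 driver purely for illustration; on mysql step 3 would be a single LOAD DATA INFILE statement, and the table, index, and column names here are made up:

```python
import csv
import sqlite3


def daily_reload(conn, csv_path):
    cur = conn.cursor()
    cur.execute("DROP INDEX IF EXISTS idx_grid_name")   # 1. drop all indexes
    cur.execute("DELETE FROM grid_data")                # 2. truncate the table
    with open(csv_path, newline="") as f:               # 3. bulk-load the csv
        cur.executemany(
            "INSERT INTO grid_data (name, value) VALUES (?, ?)",
            csv.reader(f),
        )
    cur.execute("CREATE INDEX idx_grid_name ON grid_data (name)")  # 4. recreate indexes
    conn.commit()
```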

this means that you need to take that table offline for a few seconds...
an alternative could be to create an empty copy of your table, load the content into that + create the indexes, drop your original table, and rename the copy to the original table. downtime should be minimal then.
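the load-then-swap alternative could be sketched as below, again with sqlite3 for illustration; mysql can even do both renames atomically in one RENAME TABLE statement, and all names here are made up:

```python
import sqlite3


def swap_in_new_data(conn, rows):
    cur = conn.cursor()
    # build and index a fresh copy while readers still use the old table
    cur.execute("CREATE TABLE grid_data_new (name TEXT, value TEXT)")
    cur.executemany("INSERT INTO grid_data_new VALUES (?, ?)", rows)
    cur.execute("CREATE INDEX idx_grid_new ON grid_data_new (name)")
    # the swap itself: old table out, new table in
    cur.execute("ALTER TABLE grid_data RENAME TO grid_data_old")
    cur.execute("ALTER TABLE grid_data_new RENAME TO grid_data")
    cur.execute("DROP TABLE grid_data_old")
    conn.commit()
```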
if you can not afford any downtime at all, then you'll need to parse the csv files and insert/update the records one at a time. look into the REPLACE command, which might be useful for your situation.
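the row-at-a-time refresh with REPLACE could look like this; REPLACE INTO exists in both mysql and sqlite (it needs a primary or unique key to match rows against), and the table/column names here are made up:

```python
import csv
import sqlite3


def refresh_in_place(conn, csv_path):
    cur = conn.cursor()
    with open(csv_path, newline="") as f:
        for name, value in csv.reader(f):
            # REPLACE inserts the row, or overwrites an existing row
            # that has the same primary-key value
            cur.execute(
                "REPLACE INTO grid_data (name, value) VALUES (?, ?)",
                (name, value),
            )
    conn.commit()
```

note that rows which disappeared from the new csv are not deleted by this; you'd need an extra cleanup pass for those.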

08-03-2006, 06:55 AM
This is exactly what I needed to know! Thanks for all the help!

08-03-2006, 08:31 AM
you're welcome :thumbsup: