View Full Version : Breaking an 800-record .csv file into individual .csv files.

10-12-2009, 07:16 PM
I have a website that allows customers to see dynamic quotes for about a thousand equity securities.

The quotes are generated in a spreadsheet on my local machine, then copied to individual .csv files with the following information:


The .csv files are then FTP'd up to the server, where the PHP script on the web page pulls data from the .csv files and from the MySQL database for display to the customer.

The problem is that it takes about 5 minutes to move the individual files from my side to the server side, which isn't exactly dynamic. On the other hand, it takes less than a second to upload one .csv file with all the data.

What I'd like to find out is whether it's possible to use fgetcsv to turn each record of the large .csv file into an array, and then use fputcsv to copy each of those arrays into its own .csv file.

Also, if it is possible, where would I put the .php file with the script to accomplish that?
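Yes, that combination works: fgetcsv reads one record per call as an array, and fputcsv writes an array back out as a CSV line. Here's a minimal sketch of the idea — the file names (quotes.csv, the prices/ directory) and the assumption that column 1 holds the ticker symbol are placeholders, not anything from your setup. The sample data is written inline just so the sketch runs on its own.

```php
<?php
// Sketch only: quotes.csv, prices/, and the column layout are
// assumptions. A tiny sample file is created first so this runs
// standalone; in practice you'd use your real uploaded file.
file_put_contents('quotes.csv', "NSRGY,42.93,43.13,200,200\nKO,54.10,54.25,100,100\n");

if (!is_dir('prices')) {
    mkdir('prices');
}

$in = fopen('quotes.csv', 'r');

// fgetcsv() returns one record per call as an array of fields,
// or false at end of file.
while (($record = fgetcsv($in)) !== false) {
    $ticker = $record[0];                    // assumes column 1 is the symbol
    $out = fopen("prices/$ticker.csv", 'w'); // one .csv file per record
    fputcsv($out, $record);                  // writes the array back as a CSV line
    fclose($out);
}

fclose($in);
```

As for where the script lives: it just needs to run on the server with write access to the directory where the per-symbol files should end up, e.g. triggered right after the big file finishes uploading.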

10-12-2009, 08:05 PM
You could have an extra column in your big csv file that indicates what section/category the data belongs in. When parsing the big csv, just look at that column to determine where to save the data.

Why are you using CSVs and MySQL? Why not just MySQL?

10-12-2009, 08:26 PM
Thanks very much for the look!

I'm using both because a good bit of the data is static; that way I avoid having 30 or 40 fields per record in the spreadsheet, and I minimize what needs to be uploaded.

Do you mean putting the path, or just the file name, in the extra column?

For example: NSRGY,42.93,43.13,200,200,/PRICES/NSRGY.CSV ?

Sorry if I'm not following.

Thanks again.

10-14-2009, 04:24 PM
You could put the path in the extra column, but it doesn't really matter.
It just needs to be an identifier that your parsing script can check for.

It's up to you to decide.
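If you do go with a full path in the extra column, as in the NSRGY example above, the splitting loop barely changes: pop the path off the record before writing. A sketch, assuming the path is the last field and is treated as relative to a base directory you choose ($base below is a placeholder; sample data is created inline so it runs on its own):

```php
<?php
// Sketch: assumes the LAST field of each record is the target path,
// e.g. "NSRGY,42.93,43.13,200,200,/PRICES/NSRGY.CSV", and that the
// leading slash is relative to $base (a placeholder you would set).
file_put_contents('quotes.csv', "NSRGY,42.93,43.13,200,200,/PRICES/NSRGY.CSV\n");

$base = __DIR__;
if (!is_dir("$base/PRICES")) {
    mkdir("$base/PRICES");
}

$in = fopen('quotes.csv', 'r');

while (($record = fgetcsv($in)) !== false) {
    $path = array_pop($record);       // pull the path column off the record
    $out = fopen($base . $path, 'w'); // open the per-symbol target file
    fputcsv($out, $record);           // write the record without the path field
    fclose($out);
}

fclose($in);
```

The advantage of a plain identifier instead of a path is that the script stays in control of where files land; the advantage of a path is that the spreadsheet stays in control. Either works, as noted above.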