02-09-2014, 09:00 AM #1
Read large tab-delimited text file & put into MySQL database table
We already have a table set up, and we have a large txt file that we are FTPing to our server. The file has over one million rows of data, each with 27 tab-delimited fields. I have never processed such a large file and was wondering about the best method to buffer it so that it does not bog down my server. Could someone point me to a good tutorial or provide information on how best to do this?
02-09-2014, 05:09 PM #2
I wouldn't use PHP for this at all. If you're doing it web-based, it will definitely bog the server down.
I'd upload the file first, then use something like Perl to do the work.
Presuming you have a CSV (or tab-delimited) file, you could do it a part at a time though: a thousand entries at a time, then rest a couple of minutes, then a thousand more. I wouldn't execute it from a web-based environment, though; I'd do it directly on the server (which is why I suggest that Perl would probably be a better option for this task).
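If you do end up sticking with PHP, run it from the command line rather than through the web server, and the batching idea would look roughly like this. Completely untested sketch: the table name (mytable), file path, credentials, and the pause length are all placeholders, and you'd adjust the column count to match your table.
PHP Code:
<?php
// Sketch only: run from the CLI (php import.php), not through Apache.
// Table name, file path, and credentials below are made-up placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$batchSize = 1000;   // insert a thousand rows at a time
$pause     = 120;    // seconds to rest between batches
$cols      = 27;     // fields per row in the source file

$fh = fopen('/path/to/datafile.txt', 'r');
if ($fh === false) {
    die("Could not open data file\n");
}

// One "(?,?,...,?)" group per row, reused for every batch.
$placeholders = '(' . rtrim(str_repeat('?,', $cols), ',') . ')';
$rows = array();

while (($fields = fgetcsv($fh, 0, "\t")) !== false) {
    if (count($fields) !== $cols) {
        continue;    // skip blank or malformed lines
    }
    $rows[] = $fields;

    if (count($rows) >= $batchSize) {
        insertBatch($pdo, $rows, $placeholders);
        $rows = array();
        sleep($pause);   // let the server breathe between batches
    }
}
if ($rows) {
    insertBatch($pdo, $rows, $placeholders);   // leftover rows
}
fclose($fh);

function insertBatch(PDO $pdo, array $rows, $placeholders)
{
    // Build one multi-row INSERT for the whole batch and bind every value.
    $sql = 'INSERT INTO mytable VALUES '
         . implode(',', array_fill(0, count($rows), $placeholders));
    $params = array();
    foreach ($rows as $row) {
        foreach ($row as $value) {
            $params[] = $value;
        }
    }
    $pdo->prepare($sql)->execute($params);
}
Kick it off from an SSH session on the server so nothing goes through the web server at all; the sleep between batches is what keeps it from hammering MySQL while the rest of the site is running.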
Been gone for a few months, and haven't programmed in that long of a time. Meh, I'll wing it ;)
PHP Code:
header('HTTP/1.1 420 Enhance Your Calm');