  1. #1
    New to the CF scene
    Join Date: Feb 2014

    Read large tab-delimited text file & put into MySQL database table

    We already have a table set up, and we are FTPing a large text file to our server. The file has over one million rows, each with 27 tab-delimited fields. I have never processed a file this large and was wondering about the best way to buffer it so that it doesn't bog down the server. Could someone point me to a good tutorial or explain how best to do this?
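    For a one-off bulk load like this, MySQL's built-in LOAD DATA INFILE statement is usually far faster than inserting rows one at a time from a script. A minimal sketch follows; the file path, table name, and line endings are assumptions and need to be adjusted to match the actual upload (add IGNORE 1 LINES if the file has a header row):

    ```sql
    -- Assumed table name and path; the real file has 27 tab-delimited columns.
    LOAD DATA LOCAL INFILE '/path/to/upload.txt'
    INTO TABLE import_data
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n';
    ```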

  2. #2
    God Emperor Fou-Lu's Avatar
    Join Date
    Sep 2002
    Saskatoon, Saskatchewan
    Thanked 2,668 Times in 2,637 Posts
    I wouldn't use PHP for this at all. If you do it web-based, it will definitely bog the server down.
    I'd upload the file first, then use something like Perl to do the work.

    Presuming you have a CSV (or tab-delimited) file, you could process it a part at a time: a thousand entries at a time, then rest a couple of minutes, then a thousand more. I wouldn't execute it from a web-based environment, though; I'd do it directly on the server (which is why I suggest Perl is probably a better option for this task).
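    The chunked approach above can be sketched in a few lines. The poster suggests Perl; this is a Python stdlib sketch of the same idea, using the csv module to parse tab-delimited rows and grouping them into batches of a chosen size. The sample data and batch size of 2 are illustrative; in practice each batch would be passed to something like cursor.executemany(...) followed by a commit and a short sleep between batches:

    ```python
    import csv
    import io

    def batches(rows, size=1000):
        """Yield lists of up to `size` rows so inserts can be committed in chunks."""
        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) == size:
                yield batch
                batch = []
        if batch:
            yield batch

    # Tiny tab-delimited sample standing in for the million-row upload.
    sample = "a\t1\nb\t2\nc\t3\n"
    reader = csv.reader(io.StringIO(sample), delimiter="\t")
    chunks = list(batches(reader, size=2))
    print([len(c) for c in chunks])  # -> [2, 1]
    ```

    Reading through csv.reader keeps memory flat (one row at a time), so the batch size only controls how much work each database round-trip does, not how much of the file is held in memory.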
    PHP Code:
    header('HTTP/1.1 420 Enhance Your Calm'); 
    Been gone for a few months, and haven't programmed in that long of a time. Meh, I'll wing it ;)

