Page 1 of 2
Results 1 to 15 of 17
  1. #1
    New Coder
    Join Date
    Dec 2007
    Posts
    68
    Thanks
    25
    Thanked 2 Times in 2 Posts

    Parsing Data from a 50 meg text file.

    Hi, I have a problem pulling data from a very large file, about 50 MB.

    I use file_get_contents(), but the script stops working due to a memory problem.

    Is there any solution to this?

    I hope you can help. Thanks in advance.

  • #2
    Regular Coder mic2100
    Join Date
    Feb 2006
    Location
    Scunthorpe
    Posts
    562
    Thanks
    15
    Thanked 28 Times in 27 Posts
    Hi,

    I have had similar problems trying to open large files.

    I found a bit of code that helped me:
    PHP Code:
    ini_set("memory_limit", "128M");
    It changes the memory limit just for the duration of the script.
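    Worth noting: ini_set() returns false when the host has locked the setting, so a quick check tells you whether the change actually took (a sketch; the 128M value is just an example):

```php
<?php
// Try to raise the limit for this script only. ini_set() returns
// false if the host has locked memory_limit (e.g. via php_admin_value).
$old = ini_set("memory_limit", "128M");
if ($old === false) {
    die("Host does not allow changing memory_limit via ini_set()");
}
// ... read the large file here ...
?>
```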


  • #3
    Regular Coder
    Join Date
    Nov 2007
    Location
    Leeds, UK
    Posts
    514
    Thanks
    24
    Thanked 19 Times in 19 Posts
    Quote Originally Posted by mic2100 View Post
    Hi,

    I have had similar problems trying to open large files.

    I found a bit of code that helped me:
    PHP Code:
    ini_set("memory_limit", "128M");
    It changes the memory limit just for the duration of the script.

    This will only work if PHP is not in safe mode and your host allows ini_set(). That call can temporarily change a lot of what php.ini controls, so many hosts disable it, since a script could abuse it, for example to get PHP to run a virus from a mod_ file.
    Working towards an Internet where we don't have websites, just browser applications. Kill the hyper-link and say hello to 3D games in the browser :)

  • #4
    Regular Coder mic2100
    Join Date
    Feb 2006
    Location
    Scunthorpe
    Posts
    562
    Thanks
    15
    Thanked 28 Times in 27 Posts
    Quote Originally Posted by barkermn01 View Post
    This will only work if PHP is not in safe mode and your host allows ini_set(). That call can temporarily change a lot of what php.ini controls, so many hosts disable it, since a script could abuse it, for example to get PHP to run a virus from a mod_ file.
    Yeah, sorry, I forgot to mention that the only servers I had used this on were ones I had complete control of.

  • #5
    Master Coder
    Join Date
    Jun 2003
    Location
    Cottage Grove, Minnesota
    Posts
    9,471
    Thanks
    8
    Thanked 1,085 Times in 1,076 Posts
    Try using Perl instead of PHP.

    The Perl script would be uploaded into your cgi-bin directory, and of
    course it would be Perl scripting. Google for some script examples.

    I don't know what the memory limit is for Perl, but I know it's much
    larger than PHP's.

  • #6
    Master Coder
    Join Date
    Dec 2007
    Posts
    6,682
    Thanks
    436
    Thanked 890 Times in 879 Posts
    Quote Originally Posted by kairog View Post
    Hi, I have a problem pulling data from a very large file, about 50 MB.

    I use file_get_contents(), but the script stops working due to a memory problem.

    Is there any solution to this?

    I hope you can help. Thanks in advance.
    For a big file it is better, no matter what language you use (PHP or Perl), not to hold the data in memory.
    Process it as a stream; more exactly, repeat a read data - process - write results cycle until you have processed the whole file.
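    As a sketch of that cycle in PHP (the file names here are placeholders, and strtoupper() just stands in for whatever processing you actually need):

```php
<?php
// Demo input so the sketch runs; in practice this is your 50 MB file.
file_put_contents("input.txt", "first line\nsecond line\n");

// Read - process - write cycle: only one line is held in memory at a
// time, so the footprint stays small no matter how big the file is.
$in  = fopen("input.txt", "r");
$out = fopen("results.txt", "w");
while (($line = fgets($in)) !== false) {
    $processed = strtoupper($line); // stand-in for real processing
    fwrite($out, $processed);
}
fclose($in);
fclose($out);
?>
```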

    best regards

  • #7
    Master Coder
    Join Date
    Dec 2007
    Posts
    6,682
    Thanks
    436
    Thanked 890 Times in 879 Posts
    Quote Originally Posted by mlseim View Post
    Try using Perl instead of PHP.

    The Perl script would be uploaded into your cgi-bin directory, and of
    course it would be Perl scripting. Google for some script examples.

    I don't know what the memory limit is for Perl, but I know it's much
    larger than PHP's.
    No matter whether the language used is Perl or PHP, it is not safe to store temporary data in cgi-bin.

    best regards

  • #8
    New to the CF scene
    Join Date
    Dec 2008
    Posts
    2
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by kairog View Post
    Hi, I have a problem pulling data from a very large file, about 50 MB.

    I use file_get_contents()
    Can't you read it a line at a time?

    good luck.

  • #9
    Master Coder
    Join Date
    Jun 2003
    Location
    Cottage Grove, Minnesota
    Posts
    9,471
    Thanks
    8
    Thanked 1,085 Times in 1,076 Posts
    Quote Originally Posted by oesxyl View Post
    No matter whether the language used is Perl or PHP, it is not safe to store temporary data in cgi-bin.
    not safe as in the data might be "sensitive" or "private"?
    or not safe as it might crash the server?

  • #10
    Master Coder
    Join Date
    Dec 2007
    Posts
    6,682
    Thanks
    436
    Thanked 890 Times in 879 Posts
    Quote Originally Posted by mlseim View Post
    not safe as in the data might be "sensitive" or "private"?
    or not safe as it might crash the server?
    My English isn't the best for explaining that, so here's a link instead:

    http://www.verysimple.com/blog/2006/...-your-cgi-bin/

    best regards

  • Users who have thanked oesxyl for this post:

    PappaJohn (12-13-2008)

  • #11
    Senior Coder CFMaBiSmAd
    Join Date
    Oct 2006
    Location
    Denver, Colorado USA
    Posts
    3,027
    Thanks
    2
    Thanked 315 Times in 307 Posts
    The correct way of handling a large amount of data in a file, especially if the file is expected to keep growing, is to "page" through it in smaller, manageable blocks.

    However, parsed/tokenized/interpreted scripting languages like PHP and Perl are on the order of 100 times slower at searching, parsing, and processing than the compiled code of a database engine. Once the amount of data your application uses exceeds a few thousand rows, it is time to put that data into a proper database.
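    For illustration, loading such a file into a database could look roughly like this (a sketch assuming a line-per-record file and SQLite via PDO; the file, table, and column names are all made up):

```php
<?php
// Demo input; in practice this is the existing large text file.
file_put_contents("bigfile.txt", "alpha\nbeta\n");

// Stream the file into SQLite one line at a time, then let the
// database engine's compiled code do the searching and filtering.
$db = new PDO("sqlite:records.db");
$db->exec("CREATE TABLE IF NOT EXISTS records (line TEXT)");
$stmt = $db->prepare("INSERT INTO records (line) VALUES (?)");

$in = fopen("bigfile.txt", "r");
$db->beginTransaction(); // one transaction makes bulk inserts far faster
while (($line = fgets($in)) !== false) {
    $stmt->execute(array(rtrim($line, "\n")));
}
$db->commit();
fclose($in);

// Queries now run inside the database engine, e.g.:
// SELECT COUNT(*) FROM records WHERE line LIKE 'a%';
?>
```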
    If you are learning PHP, developing PHP code, or debugging PHP code, do yourself a favor and check your web server log for errors and/or turn on full PHP error reporting in php.ini or in a .htaccess file to get PHP to help you.

  • #12
    Master Coder
    Join Date
    Jun 2003
    Location
    Cottage Grove, Minnesota
    Posts
    9,471
    Thanks
    8
    Thanked 1,085 Times in 1,076 Posts
    oesxyl,
    Thanks for that ... a good explanation that I was not aware of.

  • #13
    New Coder
    Join Date
    Dec 2007
    Posts
    68
    Thanks
    25
    Thanked 2 Times in 2 Posts
    Thank you for all your helpful posts.

  • #14
    New Coder
    Join Date
    Dec 2007
    Posts
    68
    Thanks
    25
    Thanked 2 Times in 2 Posts
    Quote Originally Posted by oesxyl View Post
    For a big file it is better, no matter what language you use (PHP or Perl), not to hold the data in memory.
    Process it as a stream; more exactly, repeat a read data - process - write results cycle until you have processed the whole file.

    best regards
    Hi oesxyl, I've been coding PHP, but this one has been a tough job for me... looping through the file. Do you have any sample code you could refer me to?

    Thanks in advance.

  • #15
    Master Coder
    Join Date
    Dec 2007
    Posts
    6,682
    Thanks
    436
    Thanked 890 Times in 879 Posts
    Quote Originally Posted by kairog View Post
    Hi oesxyl, I've been coding PHP, but this one has been a tough job for me... looping through the file. Do you have any sample code you could refer me to?

    Thanks in advance.
    Something like this:
    PHP Code:
    <?php
    $chunksize = 8192; // probably bigger
    $handle = fopen("http://www.example.com/", "r");
    if ($handle) {
        while (!feof($handle)) {
            $contents = fread($handle, $chunksize);
            // process $contents here
        }
        fclose($handle);
    }
    ?>
    If you need to, inside the while loop you can store some data in variables for later use, write the results of processing into a file (temporary or not), or insert them into a database.

    best regards

  • Users who have thanked oesxyl for this post:

    kairog (12-15-2008)

