Re: [PHP] fgets function for very large files

2009-05-25 Thread shahrzad khorrami
in the following loop:

// Assumes $handle was opened earlier, e.g. $handle = fopen('large.csv', 'r'),
// and SEPARATOR is a constant holding the file's delimiter.
$row = 1;
while (($line = fgets($handle, 1024)) !== false) {
    // NB: fgets with a length of 1024 returns lines longer than
    // 1023 bytes in pieces.
    $line = str_replace(SEPARATOR, ',', $line);  // normalise the delimiter
    $data = explode(',', $line);                 // split the line into fields
    $row++;
}

How can I write the lines of the CSV file out to new files, 1000 lines at a time?
And, you know, I think this code runs slowly.
What do you recommend?

Thanks in advance,
shahrzad





> 1: http://php.net/fgetcsv
> 2: if( ($row % 10000) == 0 ) {
> 3: http://php.net/fputcsv


Re: [PHP] fgets function for very large files

2009-05-24 Thread Lars Torben Wilson
2009/5/24 shahrzad khorrami :
> :-o
> I want to split this large CSV file into small ones programmatically!

If you're on *nix:

% man split
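
For example, to split into 10,000-line pieces (a sketch; the file name and
prefix are hypothetical, and -l is standard POSIX split):

% split -l 10000 large.csv part_

This writes part_aa, part_ab, and so on, which can then be processed one
file at a time.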


Regards,

Torben

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] fgets function for very large files

2009-05-24 Thread Nathan Rixham

shahrzad khorrami wrote:

:-o
I want to split this large CSV file into small ones programmatically!



1: http://php.net/fgetcsv
2: if( ($row % 10000) == 0 ) {
3: http://php.net/fputcsv

CSV split into multiple CSVs of 10k rows each.
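
A minimal sketch of that recipe (assuming a plain comma-separated file;
the file names are hypothetical):

<?php
// Split big.csv into chunk_0.csv, chunk_1.csv, ... of 10,000 rows each.
$in    = fopen('big.csv', 'r');
$chunk = 0;
$row   = 0;
$out   = fopen("chunk_{$chunk}.csv", 'w');

while (($data = fgetcsv($in)) !== false) {   // 1: read one parsed row
    fputcsv($out, $data);                    // 3: write it back out
    $row++;
    if (($row % 10000) == 0) {               // 2: every 10,000 rows...
        fclose($out);                        // ...start a new output file
        $chunk++;
        $out = fopen("chunk_{$chunk}.csv", 'w');
    }
}

fclose($out);
fclose($in);
?>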

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] fgets function for very large files

2009-05-24 Thread shahrzad khorrami
:-o
I want to split this large CSV file into small ones programmatically!


Re: [PHP] fgets function for very large files

2009-05-24 Thread kranthi
1. open that in a text editor
2. copy a few lines
3. create a new text file
4. paste the copied lines
5. save with the extension .csv

but I doubt this process works for a file this large. I faced exactly the
same problem a few months back, and I found
http://www.tech-recipes.com/rx/2345/import_csv_file_directly_into_mysql/
to be the best solution
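
A minimal sketch of that approach driven from PHP (assuming a mysqli
connection and a table whose columns match the CSV; the host, credentials,
table and file names are hypothetical, and LOAD DATA LOCAL INFILE must be
enabled on both client and server):

<?php
// Bulk-load a CSV with MySQL's LOAD DATA, per the tech-recipes link above.
$db = new mysqli('localhost', 'user', 'pass', 'test');

$sql = "LOAD DATA LOCAL INFILE 'big.csv'
        INTO TABLE people
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'";

if (!$db->query($sql)) {
    die('Import failed: ' . $db->error);
}
$db->close();
?>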

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] fgets function for very large files

2009-05-24 Thread shahrzad khorrami
How can I divide a large CSV file into small ones?

Thanks,
Shahrzad


Re: [PHP] fgets function for very large files

2009-05-23 Thread kranthi
I accept the fact that PMA is full of security holes and that it should
not be used on a production server,
but that does not mean we can never use it on a development server.
You may have a bit of trouble moving from the development server to the
production server, but you can always export your database to an .sql
file and import it into the production server.
But if you don't have access to the mysql command line (which was the case
in nearly all of the projects I worked on), PMA will probably be the
only option (unless you want to rewrite the entire PMA code).

And certainly
http://www.tech-recipes.com/rx/2345/import_csv_file_directly_into_mysql/
is the best option for files with > 5000 lines (but you may have to fall
back on PMA if mysql is not in your PATH env var).

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] fgets function for very large files

2009-05-23 Thread Ashley Sheridan
On Sat, 2009-05-23 at 02:59 -0400, Eddie Drapkin wrote:
> On Sat, May 23, 2009 at 2:58 AM, Ashley Sheridan
> wrote:
> 
> >
> > If it's a CSV, I'd recommend using phpMyAdmin directly to import it into
> > the database, assuming you are using a MySQL database that is. It's
> > using tried and tested code for large files like that.
> >
> >
> Tried and true to be what, exactly? Full of security holes and exploits and
> promoting bad habits?
> 
> Really, if all you need to do for the database is import the .csv, import
> it directly into MySQL from the mysql client:
> 
> http://www.tech-recipes.com/rx/2345/import_csv_file_directly_into_mysql/
> 
> And on a related note, you should never, ever use PMA on a production
> machine as it's so easy to exploit and hack.  Furthermore, if you use it on
> your dev server, you'll get used to managing your database with it and have
> trouble using it on the production server.  Take the time to use a real DB
> administration app (like SQLyog or the one that comes with KDE) or an IDE
> with integrated SQL management (like PDT or Zend Studio, or I think Aptana
> too).
> 
> Bottom line is if you said you used PMA in an interview I had any say in,
> I'd never hire you and I'd never work with a developer who was that
> uncomfortable with SQL.
I guess I don't get the job then! :p


Ash
www.ashleysheridan.co.uk


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] fgets function for very large files

2009-05-23 Thread Michael A. Peters

shahrzad khorrami wrote:

One thing! I need only four of the seven fields in each line.
In my code, the first four fields of the original CSV file are chosen
(mapping those fields to the columns of a table in the db).
For example, a line in the csv file:
"a","b","c","d","e","f","g"

The table in the database has 4 columns: name, ext, tel, date,
and those 4 fields of the CSV file must map to those columns.

So I can't directly import my CSV file into the db; some processing must be
done to insert just my selected fields of the CSV file...


use awk to read the file and create a .sql file with the specified 
fields you want.
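
For example (a sketch that assumes simple comma-separated values with no
embedded commas; the table name, column list and file names are
hypothetical, and the CSV's existing double quotes are reused as string
quotes, which MySQL accepts by default):

awk -F, '{ printf "INSERT INTO people (name,ext,tel,date) VALUES (%s,%s,%s,%s);\n", $1, $2, $3, $4 }' big.csv > inserts.sql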


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] fgets function for very large files

2009-05-23 Thread shahrzad khorrami
One thing! I need only four of the seven fields in each line.
In my code, the first four fields of the original CSV file are chosen
(mapping those fields to the columns of a table in the db).
For example, a line in the csv file:
"a","b","c","d","e","f","g"

The table in the database has 4 columns: name, ext, tel, date,
and those 4 fields of the CSV file must map to those columns.

So I can't directly import my CSV file into the db; some processing must be
done to insert just my selected fields of the CSV file...

Thanks,
Shahrzad


Re: [PHP] fgets function for very large files

2009-05-23 Thread shahrzad khorrami
Thanks for the reply :)

It must be automatic: there is a form where an operator browses to a CSV
file of large size (more than 2-3 GB), and in the next step (page) the
first 1000 lines are inserted into the db; then, by clicking on a Next
button, the next 1000 lines are inserted, and so on.
The operator doesn't know anything about phpMyAdmin or programming; they
just browse to the CSV file and everything is done automatically.
In my code all of the lines are inserted correctly with
@set_time_limit(600), but it's too slow and a bad approach...
No phpMyAdmin and no CSV splitter programs to install..

Thanks,
shahrzad
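
A minimal sketch of that Next-button flow (assuming the file is already on
the server and the byte offset is carried between requests; the file name,
script name and insert logic are hypothetical, and note that fseek/ftell
are unreliable past 2 GB on 32-bit PHP builds):

<?php
// Process 1000 lines per request, resuming at the byte offset passed
// back from the previous page.
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

$handle = fopen('big.csv', 'r');
fseek($handle, $offset);

$count = 0;
while ($count < 1000 && ($data = fgetcsv($handle)) !== false) {
    // insert_row($data);  // hypothetical helper: INSERT the chosen fields
    $count++;
}

$next = ftell($handle);  // where the next chunk should start
$done = feof($handle);
fclose($handle);

if (!$done) {
    echo '<a href="import.php?offset=' . $next . '">Next 1000 lines</a>';
} else {
    echo 'Import complete.';
}
?>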


Re: [PHP] fgets function for very large files

2009-05-23 Thread Eddie Drapkin
On Sat, May 23, 2009 at 2:58 AM, Ashley Sheridan
wrote:

>
> If it's a CSV, I'd recommend using phpMyAdmin directly to import it into
> the database, assuming you are using a MySQL database that is. It's
> using tried and tested code for large files like that.
>
>
Tried and true to be what, exactly? Full of security holes and exploits and
promoting bad habits?

Really, if all you need to do for the database is import the .csv, import
it directly into MySQL from the mysql client:

http://www.tech-recipes.com/rx/2345/import_csv_file_directly_into_mysql/

And on a related note, you should never, ever use PMA on a production
machine as it's so easy to exploit and hack.  Furthermore, if you use it on
your dev server, you'll get used to managing your database with it and have
trouble using it on the production server.  Take the time to use a real DB
administration app (like SQLyog or the one that comes with KDE) or an IDE
with integrated SQL management (like PDT or Zend Studio, or I think Aptana
too).

Bottom line is if you said you used PMA in an interview I had any say in,
I'd never hire you and I'd never work with a developer who was that
uncomfortable with SQL.


Re: [PHP] fgets function for very large files

2009-05-22 Thread Ashley Sheridan
On Sat, 2009-05-23 at 10:41 +0430, shahrzad khorrami wrote:
> hi all,
> 
> I have a csv file with more than 100,000 lines. I want to insert each line
> as a record in a database, but because of the very large number of lines,
> I put a button with the caption Next; when we click on it, 1000 lines are
> inserted into the db, and then we click the Next button again to insert
> the next 1000 lines, and so on.
> Is this a good way? What do you recommend? And how can I do that?
> 
> 
> Thanks in advance,
> shahrzad

If it's a CSV, I'd recommend using phpMyAdmin directly to import it into
the database, assuming you are using a MySQL database that is. It's
using tried and tested code for large files like that.


Ash
www.ashleysheridan.co.uk


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php