Re: Convert a huge SQL file to CSV

2018-06-04 Thread Steven Schveighoffer via Digitalmars-d-learn

On 6/3/18 2:04 AM, biocyberman wrote:

On Friday, 1 June 2018 at 10:15:11 UTC, Martin Tschierschke wrote:

On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
I need to convert a compressed 17 GB SQL dump to CSV. A workable 
solution is to create a temporary MySQL database, import the dump, 
query it with Python, and export. But I wonder if there is some way 
in D to parse the SQL file directly, then query and export the data. 
I imagine this will involve both parsing and querying, because the 
data is stored in several tables. I am in the process of downloading 
the dump now, so I can't give an excerpt of the data.


You don't need Python:
https://michaelrigart.be/export-directly-mysql-csv/

SELECT field1, field2
FROM table1
INTO OUTFILE '/path/to/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n';

Most important:

INTO OUTFILE : here you state the path where you want MySQL to store 
the CSV file. Keep in mind that the file is written by the MySQL 
server itself, so the path must be on the server host and writable by 
the server process
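
If several tables have to be exported this way, the same query can also 
be driven from D rather than Python. Here is a minimal sketch using 
std.process to call the mysql command-line client; the database name and 
table list are hypothetical placeholders, and authentication options are 
omitted:

// Sketch: run an INTO OUTFILE export for each table via the mysql CLI.
// "dumpdb" and the table names are placeholders; add -u/-p as needed.
import std.format : format;
import std.process : execute;
import std.stdio : stderr, writeln;

void main()
{
    immutable tables = ["table1", "table2"];

    foreach (table; tables)
    {
        // The MySQL server writes the file, so the path must be writable
        // by the server process and permitted by secure_file_priv.
        auto sql = format("SELECT * FROM %s INTO OUTFILE '/tmp/%s.csv' " ~
            "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' ESCAPED BY '\\\\' " ~
            "LINES TERMINATED BY '\\n';", table, table);

        auto r = execute(["mysql", "dumpdb", "-e", sql]);
        if (r.status != 0)
            stderr.writeln("export of ", table, " failed: ", r.output);
        else
            writeln("exported ", table);
    }
}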


You could write an SQL parser in D, but even if the import into MySQL 
takes some time, it's only compute time and not yours.



Regards mt.


Ah yes, thank you Martin. I forgot that we can do a "batch" SQL query, 
where the MySQL server parses and runs the query commands itself. So no 
need for Python. But I am still waiting for the import of the MySQL dump 
to finish. It has taken 18 hours and counting! The whole MySQL database 
is 68 GB at the moment. Can we avoid the import and query the database 
dump directly?


Well, it could be done quick-and-dirty by simply ignoring most of the 
SQL syntax and focusing on the constructs actually used in the dump. 
mysqldump is pretty consistent in how it formats its output.


You might even be able to do it with an awk script :)
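
A rough D sketch of that quick-and-dirty approach, for illustration. It 
assumes mysqldump's default extended-insert output, i.e. one-line 
statements of the form INSERT INTO `tbl` VALUES (...),(...); and skips 
everything else; hex blobs, multi-line INSERTs, and per-table output 
files are not handled:

// Quick-and-dirty: turn mysqldump INSERT statements into CSV rows.
// Not a real SQL parser -- it only understands mysqldump's default
// single-line "INSERT INTO `tbl` VALUES (...),(...);" output.
import std.stdio;
import std.algorithm.iteration : map;
import std.algorithm.searching : startsWith;
import std.array : join, replace;
import std.string : indexOf;

void main()
{
    foreach (line; stdin.byLine)
    {
        if (!line.startsWith("INSERT INTO"))
            continue;                       // ignore DDL, comments, etc.
        auto open = line.indexOf("VALUES (");
        if (open < 0)
            continue;

        auto data = line[open + "VALUES (".length .. $];
        bool inString = false;              // inside a '...' SQL string?
        char[] field;
        char[][] row;

        for (size_t i = 0; i < data.length; ++i)
        {
            immutable c = data[i];
            if (inString)
            {
                if (c == '\\' && i + 1 < data.length)
                    field ~= data[++i];     // keep escaped char verbatim
                else if (c == '\'')
                    inString = false;
                else
                    field ~= c;
            }
            else switch (c)
            {
                case '\'':
                    inString = true;
                    break;
                case ',':                   // field separator
                    row ~= field.dup;
                    field.length = 0;
                    break;
                case ')':                   // end of one row tuple
                    row ~= field.dup;
                    field.length = 0;
                    // numbers and NULL pass through as raw text
                    writeln(row.map!(f => `"` ~ f.idup.replace(`"`, `""`) ~ `"`)
                               .join(","));
                    row.length = 0;
                    // skip the ",(" between tuples or the trailing ";"
                    while (i + 1 < data.length &&
                           (data[i + 1] == ',' || data[i + 1] == '(' ||
                            data[i + 1] == ';'))
                        ++i;
                    break;
                default:
                    field ~= c;
                    break;
            }
        }
    }
}

Fed something like zcat dump.sql.gz | ./dump2csv > all.csv (the program 
name is made up), this streams rows from every table into one file; 
writing one CSV per table would mean also capturing the name after 
INSERT INTO.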

-Steve


Re: Convert a huge SQL file to CSV

2018-06-02 Thread biocyberman via Digitalmars-d-learn

On Friday, 1 June 2018 at 10:15:11 UTC, Martin Tschierschke wrote:

On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
I need to convert a compressed 17 GB SQL dump to CSV. A 
workable solution is to create a temporary MySQL database, 
import the dump, query it with Python, and export. But I wonder 
if there is some way in D to parse the SQL file directly, then 
query and export the data. I imagine this will involve both 
parsing and querying, because the data is stored in several 
tables. I am in the process of downloading the dump now, so I 
can't give an excerpt of the data.


You don't need Python:
https://michaelrigart.be/export-directly-mysql-csv/

SELECT field1, field2
FROM table1
INTO OUTFILE '/path/to/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n';

Most important:

INTO OUTFILE : here you state the path where you want MySQL to 
store the CSV file. Keep in mind that the file is written by the 
MySQL server itself, so the path must be on the server host and 
writable by the server process


You could write an SQL parser in D, but even if the import 
into MySQL takes some time, it's only compute time and not 
yours.



Regards mt.


Ah yes, thank you Martin. I forgot that we can do a "batch" SQL 
query, where the MySQL server parses and runs the query commands 
itself. So no need for Python. But I am still waiting for the 
import of the MySQL dump to finish. It has taken 18 hours and 
counting! The whole MySQL database is 68 GB at the moment. 
Can we avoid the import and query the database dump directly?


Re: Convert a huge SQL file to CSV

2018-06-01 Thread Martin Tschierschke via Digitalmars-d-learn

On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
I need to convert a compressed 17 GB SQL dump to CSV. A workable 
solution is to create a temporary MySQL database, import the 
dump, query it with Python, and export. But I wonder if there is 
some way in D to parse the SQL file directly, then query and 
export the data. I imagine this will involve both parsing and 
querying, because the data is stored in several tables. I am in 
the process of downloading the dump now, so I can't give an 
excerpt of the data.


You don't need Python:
https://michaelrigart.be/export-directly-mysql-csv/

SELECT field1, field2
FROM table1
INTO OUTFILE '/path/to/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n';

Most important:

INTO OUTFILE : here you state the path where you want MySQL to 
store the CSV file. Keep in mind that the file is written by the 
MySQL server itself, so the path must be on the server host and 
writable by the server process


You could write an SQL parser in D, but even if the import into 
MySQL takes some time, it's only compute time and not yours.



Regards mt.


Convert a huge SQL file to CSV

2018-06-01 Thread biocyberman via Digitalmars-d-learn
I need to convert a compressed 17 GB SQL dump to CSV. A workable 
solution is to create a temporary MySQL database, import the 
dump, query it with Python, and export. But I wonder if there is 
some way in D to parse the SQL file directly, then query and 
export the data. I imagine this will involve both parsing and 
querying, because the data is stored in several tables. I am in 
the process of downloading the dump now, so I can't give an 
excerpt of the data.