[web2py] Re: Migrating Postgres to MySql

2016-10-12 Thread Ben Lawrence
For anyone using a Raspberry Pi: PostgreSQL can't be used with the scheduler 
there, due to an incompatibility in one of the JSON fields.  I have had no 
trouble using the scheduler with MySql on the Raspberry Pi.


On Sunday, September 18, 2016 at 12:10:45 AM UTC-7, Joe Barnhart wrote:
>
> When I first started using MySql a while ago (after having used Postgres 
> longer) I was screaming mad all the time.  I absolutely hated MySql and the 
> crap it put me through.  That was a few years back, and I can't quite 
> remember what made me so mad.  Other than performance, of course.  I feel 
> Postgres may have the upper hand in some areas of performance, but to be 
> honest, correctly "tuning" a DBMS is such a difficult topic that few of us 
> really master it.  I certainly never have.
>
> My interest in MySql is purely because it is more widely available in the 
> commercial server-for-hire world.  Left alone I probably would have happily 
> continued using Postgres.
>
> On Saturday, September 17, 2016 at 6:39:55 PM UTC-7, 
> junde...@launchpnt.com wrote:
>>
>> Joe, I'd be curious to know how you feel MySql performs in comparison to 
>> Postgres.
>>
>>
>>

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to web2py+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


[web2py] Re: Migrating Postgres to MySql

2016-09-18 Thread Joe Barnhart
When I first started using MySql a while ago (after having used Postgres 
longer) I was screaming mad all the time.  I absolutely hated MySql and the 
crap it put me through.  That was a few years back, and I can't quite 
remember what made me so mad.  Other than performance, of course.  I feel 
Postgres may have the upper hand in some areas of performance, but to be 
honest, correctly "tuning" a DBMS is such a difficult topic that few of us 
really master it.  I certainly never have.

My interest in MySql is purely because it is more widely available in the 
commercial server-for-hire world.  Left alone I probably would have happily 
continued using Postgres.

On Saturday, September 17, 2016 at 6:39:55 PM UTC-7, junde...@launchpnt.com 
wrote:
>
> Joe, I'd be curious to know how you feel MySql performs in comparison to 
> Postgres.
>
>
>



[web2py] Re: Migrating Postgres to MySql

2016-09-17 Thread junderhill
Joe, I'd be curious to know how you feel MySql performs in comparison to 
Postgres.

On Friday, September 16, 2016 at 2:28:09 PM UTC-7, Joe Barnhart wrote:
>
> I've been using PostgreSQL for my production site for quite a while.  My 
> future improved site, however, will feature MySql for its database engine. 
>  (Not a diss on Postgres, it's just a lot easier to find hosting companies 
> who will rent me a supported-and-maintained MySql instance.)  I have 
> thousands of user logins and lots 'o history I want, yea even NEED, to 
> preserve.  I've been reading up on strategies for converting from one 
> database to another.
>
> For example, I can dump the Postgres database in its entirety into CSV 
> using the DAL, then read the data back into MySql.  That has some 
> attractiveness but I couldn't help noticing how s--l-o-w this process is. 
>  I let the dump run for about an hour and it wasn't done yet.  Then there's 
> the little problem of the developer.  He just can't leave things alone. 
>  The MySql database design is "much better" (read: different) than the 
> Postgres design.  (It makes no difference that I am actually the developer 
> of both.)  So I can't just export CSV and import it.  Not without editing 
> the CSV on the way.
>
> What if, instead, I create the MySql tables with an extra column to hold 
> the "old" id from the Postgres database?  I could then develop a 
> "transition" class that would open connections to both databases, read 
> records from the Postgres one, and insert them (with whatever mods are 
> required) into the MySql database.  When I process records from Postgres 
> which have dependencies on the "id" field from another table, I use the 
> "old_id" field to map it to the new id value.
>
> This process could take as long as it wants, and run concurrently with the 
> old system.  I could shut it down and restart it as needed if I kept a 
> "fence" of the highest id processed from the old database.  I would need to 
> do the tables in order of no dependencies to fullest dependencies, to make 
> sure the required dependent records are already in place when the table is 
> processed.
>
> After all is said and done, I could simply delete the "old_id" columns 
> from the affected MySql tables.  If I even care at that point.
>
> Am I missing something?  Or does this seem viable for a system with a 
> large database to migrate?
>
> -- Joe
>
>



[web2py] Re: Migrating Postgres to MySql

2016-09-17 Thread Joe Barnhart
Ha.  Looks like I'm going to have to write the two-db app regardless.  

The surprising thing is that Postgres allowed me to put strings differing 
only in case into tables where the string was declared "unique".  MySql, 
however, considers them identical and throws an IntegrityError exception 
when I try to load the CSV.  Because of the way the CSV file is loaded, 
that means I have to scrub the database, rebuild it empty, and then try 
again.

I've loaded the same database 5 times now and get a new IntegrityError on 
each load. 

Time for Plan B.
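
One way out of the load-fail-rebuild loop is to pre-scan the exported CSV and find every case-insensitive collision in one pass, before attempting the MySql load.  A minimal sketch (the column name and file path are invented for illustration; adapt to the actual unique fields):

```python
import csv
from collections import defaultdict

def find_case_collisions(rows, column):
    """Group values of `column` case-insensitively and report any
    group that contains more than one distinct spelling."""
    groups = defaultdict(set)
    for row in rows:
        groups[row[column].lower()].add(row[column])
    # Keep only the groups that would violate a case-insensitive
    # unique index (i.e. more than one distinct spelling).
    return {key: sorted(vals) for key, vals in groups.items() if len(vals) > 1}

# Example with in-memory rows; in practice, read the DAL's CSV export:
# with open("db_export.csv") as f:
#     rows = list(csv.DictReader(f))
rows = [{"name": "Alice"}, {"name": "alice"}, {"name": "Bob"}]
print(find_case_collisions(rows, "name"))  # {'alice': ['Alice', 'alice']}
```

Running this over each exported table surfaces all the conflicts at once, instead of one IntegrityError per load attempt.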


On Friday, September 16, 2016 at 10:36:35 PM UTC-7, Joe Barnhart wrote:
>
> Huh.  Just tried db.export_to_csv_file() again and it wasn't nearly as 
> slow as I thought before.  It took about 2 minutes for a largish (but not 
> my biggest) database, exported from Postgres.  As a fun thing to try, I ran 
> the same command under the most recent pypy and the same export took only 
> 23 seconds!!  I had to do it twice and compare the output files to convince 
> myself it actually worked.  But it did.
>
> I'm not sure how long the import into MySql will take using Python.  Pypy 
> is not an option here, as I don't have an all-Python MySql driver that 
> works with pypy.  I'm sure it will take longer than the export, but I don't 
> have to do this very often...
>
> -- Joe
>
> The idea about setting a "fence" and getting changes added since the last 
> export is a pretty compelling one, however.
>
>



[web2py] Re: Migrating Postgres to MySql

2016-09-16 Thread Joe Barnhart
Huh.  Just tried db.export_to_csv_file() again and it wasn't nearly as slow 
as I thought before.  It took about 2 minutes for a largish (but not my 
biggest) database, exported from Postgres.  As a fun thing to try, I ran 
the same command under the most recent pypy and the same export took only 
23 seconds!!  I had to do it twice and compare the output files to convince 
myself it actually worked.  But it did.

I'm not sure how long the import into MySql will take using Python.  Pypy 
is not an option here, as I don't have an all-Python MySql driver that 
works with pypy.  I'm sure it will take longer than the export, but I don't 
have to do this very often...

-- Joe

The idea about setting a "fence" and getting changes added since the last 
export is a pretty compelling one, however.
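
The fence-plus-old_id scheme from the earlier post can be sketched in a few lines.  This uses sqlite3 stand-ins for both ends purely so the sketch is self-contained (the real connections would be Postgres and MySql, and the table/column names are invented):

```python
import sqlite3

# Stand-ins for the two databases; in the real migration these would be
# a Postgres (source) and a MySql (target) connection.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

src.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO person (id, name) VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cal")])

# Target table carries the extra old_id column for the duration of the move.
dst.execute("CREATE TABLE person (id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "old_id INTEGER, name TEXT)")

def copy_since(fence):
    """Copy source rows with id > fence into the target; return the new
    fence (highest old id processed) so the job can stop and restart."""
    rows = src.execute(
        "SELECT id, name FROM person WHERE id > ? ORDER BY id", (fence,)
    ).fetchall()
    for old_id, name in rows:
        dst.execute("INSERT INTO person (old_id, name) VALUES (?, ?)",
                    (old_id, name))
    dst.commit()
    return rows[-1][0] if rows else fence

fence = copy_since(0)          # first pass copies everything
src.execute("INSERT INTO person (id, name) VALUES (4, 'Dee')")
fence = copy_since(fence)      # a later pass picks up only the new row

# old_id -> new id map, used to rewrite foreign keys in dependent tables
id_map = dict(dst.execute("SELECT old_id, id FROM person"))
print(fence, id_map)
```

Tables would still have to be processed in dependency order, with `id_map` used to translate any reference fields before insertion; once everything is across, the old_id columns can be dropped.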

On Friday, September 16, 2016 at 2:28:09 PM UTC-7, Joe Barnhart wrote:
>
> I've been using PostgreSQL for my production site for quite a while.  My 
> future improved site, however, will feature MySql for its database engine. 
>  (Not a diss on Postgres, it's just a lot easier to find hosting companies 
> who will rent me a supported-and-maintained MySql instance.)  I have 
> thousands of user logins and lots 'o history I want, yea even NEED, to 
> preserve.  I've been reading up on strategies for converting from one 
> database to another.
>
> For example, I can dump the Postgres database in its entirety into CSV 
> using the DAL, then read the data back into MySql.  That has some 
> attractiveness but I couldn't help noticing how s--l-o-w this process is. 
>  I let the dump run for about an hour and it wasn't done yet.  Then there's 
> the little problem of the developer.  He just can't leave things alone. 
>  The MySql database design is "much better" (read: different) than the 
> Postgres design.  (It makes no difference that I am actually the developer 
> of both.)  So I can't just export CSV and import it.  Not without editing 
> the CSV on the way.
>
> What if, instead, I create the MySql tables with an extra column to hold 
> the "old" id from the Postgres database?  I could then develop a 
> "transition" class that would open connections to both databases, read 
> records from the Postgres one, and insert them (with whatever mods are 
> required) into the MySql database.  When I process records from Postgres 
> which have dependencies on the "id" field from another table, I use the 
> "old_id" field to map it to the new id value.
>
> This process could take as long as it wants, and run concurrently with the 
> old system.  I could shut it down and restart it as needed if I kept a 
> "fence" of the highest id processed from the old database.  I would need to 
> do the tables in order of no dependencies to fullest dependencies, to make 
> sure the required dependent records are already in place when the table is 
> processed.
>
> After all is said and done, I could simply delete the "old_id" columns 
> from the affected MySql tables.  If I even care at that point.
>
> Am I missing something?  Or does this seem viable for a system with a 
> large database to migrate?
>
> -- Joe
>
>
