Re: [GENERAL] Switching Database Engines

2011-05-03 Thread Carlos Mennens
On Sat, Apr 30, 2011 at 4:29 AM, Greg Smith g...@2ndquadrant.com wrote:
 I wouldn't fight with this too much though.  Unless you have some really
 customized stuff in your wiki, there really is nothing wrong with the idea
 of dumping everything into XML, creating a blank PostgreSQL-backed MediaWiki
 install, then restoring into that.  That's what I always do in order to get
 a plain text backup of my server, and to migrate a wiki from one server to
 another.  There are all kinds of issues you could have left here before this
 works, trying to do a database-level export/reload--encoding, foreign key
 problems, who knows what else.  The database-agnostic export/import into XML
 avoids all of those.

Greg,

I'm with you and think that just doing an XML dump of the Wiki itself
is the best way to go. My question is when I do the XML dump as
follows:

/var/www/html/int/main/wiki/maintenance
[root@ideweb1 maintenance]# php dumpBackup.php --full > mw_wiki_2011_05_03.xml
PHP Warning:  PHP Startup: mcrypt: Unable to initialize module
Module compiled with module API=20050922, debug=0, thread-safety=0
PHP compiled with module API=20060613, debug=0, thread-safety=0
[...]

So now I have a backup file of the Wiki on my old server running
MySQL. I have created the database and installed MediaWiki on the new
server using PostgreSQL as the backend. My question now is what are
your recommended steps in order to get the XML data imported on
MediaWiki using PostgreSQL? I know I also have to move the users since
the XML script / backup doesn't do anything in regards to the users.
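
For the import side, MediaWiki's own maintenance scripts are the usual route. A minimal sketch, assuming the dump file name above and a standard maintenance/ directory on the new server (paths are illustrative):

```shell
# On the new PostgreSQL-backed server, from the wiki root (path illustrative)
cd maintenance

# Load the pages from the XML dump produced by dumpBackup.php
php importDump.php < mw_wiki_2011_05_03.xml

# Rebuild derived data that importDump.php does not populate
php rebuildrecentchanges.php
```

As noted, the XML dump carries the page content only; user accounts live in the user table and would have to be re-created (or migrated separately) on the new install.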

Thanks again so much!

-- 
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general


Re: [GENERAL] Switching Database Engines

2011-04-30 Thread Greg Smith

On 04/28/2011 12:19 PM, Carlos Mennens wrote:

It seems that the 'mysql2postgres.pl' tool has instructions embedded
into the file so I ran the command as instructed to take the output
file and insert it into my PostgreSQL server and got the following
error message:

$ psql -p 5432 -h db1 -U wiki -f mediawiki_upgrade.pg
Password for user wiki:
BEGIN
SET
SET
SET
psql:mediawiki_upgrade.pg:25: ERROR:  relation "category" does not exist
   


My guess is that you need to run the main MediaWiki installer for 
PostgreSQL first, to create a blank install, so that the category 
table exists.  The export tool aims to get the data out, not to 
create the schema for all the tables.  After you create a blank 
instance, you then load the exported data into it.


If you have additional problems, try running that like this instead:

$ psql -p 5432 -h db1 -U wiki -e -f mediawiki_upgrade.pg


Note the extra -e on the command line.  That will show you the line it 
is executing as the script runs, so you'll see the one that fails too.  
Very handy for debugging what's gone wrong in this sort of situation.
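
A related psql switch that pairs well with -e is ON_ERROR_STOP, which aborts the script at the first failure instead of plowing on past it; a sketch, using the same host and user as above:

```shell
# -e echoes each statement as it runs; ON_ERROR_STOP=1 makes psql
# abort (and exit nonzero) at the first error instead of continuing
psql -p 5432 -h db1 -U wiki -e -v ON_ERROR_STOP=1 -f mediawiki_upgrade.pg
```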


I wouldn't fight with this too much though.  Unless you have some really 
customized stuff in your wiki, there really is nothing wrong with the 
idea of dumping everything into XML, creating a blank PostgreSQL-backed 
MediaWiki install, then restoring into that.  That's what I always do in 
order to get a plain text backup of my server, and to migrate a wiki 
from one server to another.  There are all kinds of issues you could 
have left here before this works, trying to do a database-level 
export/reload--encoding, foreign key problems, who knows what else.  The 
database-agnostic export/import into XML avoids all of those.


--
Greg Smith   2ndQuadrant US   g...@2ndquadrant.com   Baltimore, MD
PostgreSQL Training, Services, and 24x7 Support  www.2ndQuadrant.us
PostgreSQL 9.0 High Performance: http://www.2ndQuadrant.com/books




Re: [GENERAL] Switching Database Engines

2011-04-28 Thread Carlos Mennens
It seems that the 'mysql2postgres.pl' tool has instructions embedded
into the file so I ran the command as instructed to take the output
file and insert it into my PostgreSQL server and got the following
error message:

$ psql -p 5432 -h db1 -U wiki -f mediawiki_upgrade.pg
Password for user wiki:
BEGIN
SET
SET
SET
psql:mediawiki_upgrade.pg:25: ERROR:  relation "category" does not exist

Obviously this tool isn't written or supported by the Postgres
community but I figured I would ask in case someone understands this
and can advise me of a work around. If not it looks like this just
isn't going to work. :(



Re: [GENERAL] Switching Database Engines

2011-04-27 Thread Carlos Mennens
I was able to export the Wiki database into a single file using the
conversion tool mentioned previously.

root@ideweb1 postgres]# ./mediawiki_mysql2postgres.pl --db=wiki
--user=mediawiki --pass=**
Writing file mediawiki_upgrade.pg

As you can see above that generated a new file in my current working
directory which is output of running the conversion tool:

[root@ideweb1 postgres]# ls -lh mediawiki_upgrade.pg
-rw-r--r-- 1 root root 112M Apr 27 09:24 mediawiki_upgrade.pg

So my question is: now that I have the file above, which I assume
contains the entire SQL database structure and associated data, how
do I insert this into my existing PostgreSQL server? Do I
need to create a wiki database using template0 or template1? Or should
I just create a database as normal w/o templates and assign a wiki
role to that database?
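
For the database itself, a plain database owned by the wiki role is enough; template0 is only needed when you want a non-default encoding or locale. A sketch, with hypothetical role and database names:

```shell
# Create the wiki role and a database it owns (names are illustrative)
createuser --no-superuser --no-createdb --no-createrole --pwprompt wiki
createdb -O wiki wikidb

# Equivalent SQL, if you prefer doing it from psql:
#   CREATE ROLE wiki LOGIN PASSWORD '...';
#   CREATE DATABASE wikidb OWNER wiki;
```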



[GENERAL] Switching Database Engines

2011-04-26 Thread Carlos Mennens
We've been using a Wiki server at the office for years. It was
originally configured to use MySQL and finally after 8+ years we're
moving the Wiki to a new platform of hardware. My concern is that the Wiki
software (MediaWiki) is the only thing still tied to and using MySQL,
which we want to decommission, but we've been using it for years, so I'm
worried we will lose the data. I've done some Googling to find out
how I can take the MySQL database dump and successfully import it
into my new PostgreSQL database, however I don't know how practical or
recommended this process is. I found sites like the following:

http://en.wikibooks.org/wiki/Converting_MySQL_to_PostgreSQL

Can you guys tell me if this is something that will work? I don't mean
the exact link above but just in general taking a database from MySQL
and successfully migrating it for PostgreSQL use?

From what I can see in the MySQL database, there appears to be 43
tables with lots of column data and who knows what else:

mysql> show tables;
+----------------------+
| Tables_in_wiki       |
+----------------------+
| dp_archive           |
| dp_category          |
| dp_categorylinks     |
| dp_change_tag        |
| dp_externallinks     |
| dp_filearchive       |
| dp_hitcounter        |
| dp_image             |
| dp_imagelinks        |
| dp_interwiki         |
| dp_ipblocks          |
| dp_ipblocks_old      |
| dp_job               |
| dp_langlinks         |
| dp_logging           |
| dp_math              |
| dp_objectcache       |
| dp_oldimage          |
| dp_page              |
| dp_page_props        |
| dp_page_restrictions |
| dp_pagelinks         |
| dp_protected_titles  |
| dp_querycache        |
| dp_querycache_info   |
| dp_querycachetwo     |
| dp_recentchanges     |
| dp_redirect          |
| dp_revision          |
| dp_searchindex       |
| dp_site_stats        |
| dp_tag_summary       |
| dp_templatelinks     |
| dp_text              |
| dp_trackbacks        |
| dp_transcache        |
| dp_updatelog         |
| dp_user              |
| dp_user_groups       |
| dp_user_newtalk      |
| dp_valid_tag         |
| dp_validate          |
| dp_watchlist         |
+----------------------+
43 rows in set (0.01 sec)



Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Greg Sabino Mullane



 http://en.wikibooks.org/wiki/Converting_MySQL_to_PostgreSQL

 Can you guys tell me if this is something that will work? I don't mean
 the exact link above but just in general taking a database from MySQL
 and successfully migrating it for PostgreSQL use?

In general, yes. For your specific use case, it might be best to use 
MediaWiki's XML dump and restore. You could also use the conversion 
script that comes with MediaWiki, at:

maintenance/postgres/mediawiki_mysql2postgres.pl

It's a little old so I can't promise it will work with recent versions 
of MediaWiki, but should be enough to get you started testing.

-- 
Greg Sabino Mullane g...@turnstep.com
End Point Corporation http://www.endpoint.com/
PGP Key: 0x14964AC8 201104261344
http://biglumber.com/x/web?pk=2529DF6AB8F79407E94445B4BC9B906714964AC8





Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Steve Atkins

On Apr 26, 2011, at 10:24 AM, Carlos Mennens wrote:

 We've been using a Wiki server at the office for years. It was
 originally configured to use MySQL and finally after 8+ years we're
 moving the Wiki to a new platform of hardware. My question is the Wiki
 software (MediaWiki) is the only thing still tied to and using MySQL
 which we want to decommission but we've been using it for years so I'm
 worried we will lose the data. I've done some Google'ing to find out
 how can I change the MySQL database dump and successfully export it
 into my new PostgreSQL database however I don't know how practical or
 recommended this process is. I found sites like the following:

It's certainly possible to dump a mysql database and import it into
postgresql, without too much difficulty in most cases. The problem
with porting the data tends to be bad data in the mysql database
that was allowed by mysql but is caught by postgresql.

Changing the app to support it is usually the bigger problem.

 
 http://en.wikibooks.org/wiki/Converting_MySQL_to_PostgreSQL

At a quick glance, that doesn't look like a great resource. It's
suggesting using password rather than md5 authentication,
amongst other things. There are also some obvious thinkos
or copy/paste problems (suggesting that '=' in mysql is equivalent
to '<>' in postgresql, for instance). While much of what it says
looks reasonable, I wouldn't rely on it.

http://www.mediawiki.org/wiki/Manual:PostgreSQL is a better
place to look, perhaps.

Most of the core mediawiki runs OK with postgresql, but most
addons don't.

You definitely want to set up a test wiki instance, running on
postgresql - that's the first thing to do regardless of how you
migrate the data.

Then doing an XML dump from your existing mediawiki
instance and importing it into your test instance will give
you an idea of how well that will work. If that's good enough,
you don't need to care about the underlying database.
There are several ways to import xml dumps, with different
tradeoffs - check the mediawiki docs.
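
For reference, the two common import routes differ mainly in scale; a sketch of both, assuming a standard MediaWiki layout (paths and file names illustrative):

```shell
# Option 1: maintenance script; handles large dumps, no upload limits
php maintenance/importDump.php < full_history_dump.xml

# Option 2: the Special:Import page in the browser, for small dumps only
# (subject to PHP's upload size and execution time limits)
```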

 Can you guys tell me if this is something that will work? I don't mean
 the exact link above but just in general taking a database from MySQL
 and successfully migrating it for PostgreSQL use?
 
 From what I can see in the MySQL database, there appears to be 43
 tables with lots of column data and who knows what else:

Cheers,
  Steve




Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Carlos Mennens
On Tue, Apr 26, 2011 at 1:44 PM, Greg Sabino Mullane g...@turnstep.com wrote:
 In general, yes. For your specific use case, it might be best to use
 MediaWiki's XML dump and restore. You could also use the conversion
 script that comes with MediaWiki, at:

 maintenance/postgres/mediawiki_mysql2postgres.pl

 It's a little old so I can't promise it will work with recent versions
 of MediaWiki, but should be enough to get you started testing.

Just to be clear and make sure I understand correctly, I can export
the Wiki info using the MediaWiki XML export tool (which I found) or I
can try to use the MediaWiki tool referenced as
'mediawiki_mysql2postgres.pl', right? I think from reading the
options, I should try the XML export 1st.



Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Greg Sabino Mullane



 Just to be clear and make sure I understand correctly, I can export
 the Wiki info using the MediaWiki XML export tool (which I found) or I
 can try to use the MediaWiki tool referenced as
 'mediawiki_mysql2postgres.pl', right? I think from reading the
 options, I should try the XML export 1st.

Correct. Keep in mind I don't think the XML route will convert the users 
table, just the wiki data itself. As someone else mentioned, the 
wiki itself will work fine, but support for any MediaWiki extensions 
is hit or miss because MediaWiki was a MySQL-only shop for so long 
(and Wikimedia still uses it, and they influence a lot of the MW 
development).

-- 
Greg Sabino Mullane g...@turnstep.com
End Point Corporation http://www.endpoint.com/
PGP Key: 0x14964AC8 201104261505
http://biglumber.com/x/web?pk=2529DF6AB8F79407E94445B4BC9B906714964AC8





Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Carlos Mennens
On Tue, Apr 26, 2011 at 3:06 PM, Greg Sabino Mullane g...@turnstep.com wrote:
 Correct. Keep in mind I don't think the XML route will convert the users
 table, just the wiki data itself. As someone else mentioned, the
 wiki itself will work fine, but support for any MediaWiki extensions
 is hit or miss because MediaWiki was a MySQL only shop for so long
 (and Wikimedia still uses it, and they influence a lot of the MW
 development).

So what is the ideal or best suggested approach on tackling this task?
I built the new Apache server and installed the new (latest) version
of MediaWiki on it. I then pointed the installation of the new server
to my existing PostgreSQL database server. I created a user and
database for MediaWiki to use but it self generated a new table schema
during the installation. Should I proceed, or do I somehow need to
blast away what the new installation did and migrate my old data in its
place? I guess I just don't have a clear understanding of what I need
to do, and in what particular order.

I tried running the tool as suggested before and unless I'm doing
something wrong, I have no idea why it didn't work:

[root@db_old postgres]# pwd
/var/www/html/int/main/wiki/maintenance/postgres

[root@db_old postgres]# ls -l
total 60
drwxr-xr-x 2 root root  4096 Jul 13  2009 archives
-rw-r--r-- 1 root root 13988 Mar 12  2009 compare_schemas.pl
-rw-r--r-- 1 root root 14063 Nov 22  2008 mediawiki_mysql2postgres.pl
-rw-r--r-- 1 root root 23596 Mar 19  2009 tables.sql

[root@db_old postgres]# sh mediawiki_mysql2postgres.pl
mediawiki_mysql2postgres.pl: line 12: use: command not found
mediawiki_mysql2postgres.pl: line 13: use: command not found
mediawiki_mysql2postgres.pl: line 14: use: command not found
mediawiki_mysql2postgres.pl: line 15: use: command not found
mediawiki_mysql2postgres.pl: line 17: syntax error near unexpected token `('
mediawiki_mysql2postgres.pl: line 17: `use vars qw(%table %tz %special
@torder $COM);'



Re: [GENERAL] Switching Database Engines

2011-04-26 Thread John R Pierce

On 04/26/11 12:24 PM, Carlos Mennens wrote:


I tried running the tool as suggested before and unless I'm doing
something wrong, I have no idea why it didn't work:

[root@db_old postgres]# pwd
/var/www/html/int/main/wiki/maintenance/postgres

[root@db_old postgres]# ls -l
total 60
drwxr-xr-x 2 root root  4096 Jul 13  2009 archives
-rw-r--r-- 1 root root 13988 Mar 12  2009 compare_schemas.pl
-rw-r--r-- 1 root root 14063 Nov 22  2008 mediawiki_mysql2postgres.pl
-rw-r--r-- 1 root root 23596 Mar 19  2009 tables.sql

[root@db_old postgres]# sh mediawiki_mysql2postgres.pl
mediawiki_mysql2postgres.pl: line 12: use: command not found
mediawiki_mysql2postgres.pl: line 13: use: command not found
mediawiki_mysql2postgres.pl: line 14: use: command not found
mediawiki_mysql2postgres.pl: line 15: use: command not found
mediawiki_mysql2postgres.pl: line 17: syntax error near unexpected token `('
mediawiki_mysql2postgres.pl: line 17: `use vars qw(%table %tz %special
@torder $COM);'


wild guess says, use perl, not sh to run a .pl
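
Concretely, running the script under perl with the options used earlier in the thread (the password value is a placeholder):

```shell
cd /var/www/html/int/main/wiki/maintenance/postgres

# Run the converter under perl, not sh; --pass value is hypothetical
perl mediawiki_mysql2postgres.pl --db=wiki --user=mediawiki --pass=secret
```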





Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Alban Hertroys
On 26 Apr 2011, at 19:24, Carlos Mennens wrote:

 We've been using a Wiki server at the office for years. It was
 originally configured to use MySQL and finally after 8+ years we're
 moving the Wiki to a new platform of hardware. My question is the Wiki
 software (MediaWiki) is the only thing still tied to and using MySQL
 which we want to decommission but we've been using it for years so I'm
 worried we will lose the data. I've done some Google'ing to find out
 how can I change the MySQL database dump and successfully export it
 into my new PostgreSQL database however I don't know how practical or
 recommended this process is. I found sites like the following:


I don't know much about mediawiki (except for how to use it), but it's not 
unusual for modern web-apps to have some functionality to dump their contents 
in a consistently formatted file (often XML) that they can subsequently import 
into a new environment. Might be worth looking into.

Alban Hertroys

--
If you can't see the forest for the trees,
cut the trees and you'll see there is no forest.







Re: [GENERAL] Switching Database Engines

2011-04-26 Thread Carlos Mennens
On Tue, Apr 26, 2011 at 4:34 PM, Alban Hertroys
dal...@solfertje.student.utwente.nl wrote:
 I don't know much about mediawiki (except for how to use it), but it's not 
 unusual for modern web-apps to have some functionality to dump their contents 
 in a consistently formatted file (often XML) that it can subsequently import 
 into a new environment. Might be worth looking into.

Yes, MediaWiki can dump pages into XML but that's what scares me. It
does it in pages, so I would have to dump every Wiki page into a
separate XML file rather than doing one huge Wiki dump. I guess I need
to check the MediaWiki forums and find out how I can export everything
into XML. But let's say I do export everything to XML. Now I have XML
file(s) and a new database for MediaWiki. How do I get all the old
data on the new server? Do I do a fresh install 1st and let the
installer configure my database as the assigned role, then import the
XML data through the MediaWiki tool? I guess I should check their
forums.
