On 05/03/2012 08:29 PM, Robert Chase wrote:
Hello,
We have an existing Galaxy instance that uses MySQL, has workflows and
data libraries, and countless changes to the source code and tools. We
would like to save the database, workflows, and data libraries, but bring
all the source code up to date with the standard Galaxy code. Because
over 3000 of our files have been changed, we do not believe that we can
realistically run hg update and then hg merge.
We have built a test instance of galaxy that runs on a different virtual
machine. One thought is to clone it, migrate all the datasets to it and
then use the clone as the new production instance.
One specific piece of information would be very helpful. Can we just
copy the ~/data files to the test instance and have Galaxy recognize
them, or is there metadata stored in the database?
Hi Rob
You need a copy of your MySQL database for your new test instance of
Galaxy to connect to. This should get you going initially.
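For instance, the database copy could be made with mysqldump. This is a sketch only: the host names, user, and database names below are placeholders, not values from this thread.

```shell
# Hypothetical hosts and database names -- adjust to your setup.
# Dump the production Galaxy database...
mysqldump -h prod-db-host -u galaxy -p galaxy_prod > galaxy_prod.sql
# ...and load it into the database the test instance will use.
mysql -h test-db-host -u galaxy -p galaxy_test < galaxy_prod.sql
```

The test instance's database_connection setting (in universe_wsgi.ini) then needs to point at this copy rather than at the production database.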
Most likely, you will then stumble across a few problems:
- you need to sync ~/galaxy_dist/tool-data/shared/
- you need to check the data libraries: are they copies or symlinks?
- you need to decide how many of your changes to the tools you want to
port to the new server. In an ideal world you would need all of them in
order to maintain reproducibility... this might be tricky,
especially if you have defined your own data types. In that case a
simple copy of the 'tools/' directory and 'tool_conf.xml' is not
sufficient anymore
- and more...
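The symlink question in the list above can be answered mechanically. A minimal sketch, using an illustrative directory built in a temp dir so the commands can be run as-is (real data-library paths will differ):

```shell
# Build a demo data-library directory: one real copy, one symlink.
libdir=$(mktemp -d)
echo "reads" > "$libdir/sample.fastq"           # a real copy: safe to move
ln -s /nonexistent/target "$libdir/linked.fa"   # a symlink into old storage
# Symlinks found here would break once the old server's storage goes away:
symlinks=$(find "$libdir" -type l)
echo "$symlinks"
```

Any symlinked library datasets either need their targets migrated too, or need to be converted to real copies before the old storage disappears.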
Nevertheless, you said you have made more than 3000 changes to the code
and tools, so I guess you are very proficient with the inner workings of
Galaxy. Hence, I see no reason why you should not be successful.
Regards, Hans
-Rob
On Wed, Mar 28, 2012 at 10:16 AM, Hans-Rudolf Hotz <h...@fmi.ch> wrote:
Hi Rob
I don't know what exactly you mean by "new server". Assuming you are
just talking about different hardware:
Have you considered moving 'everything' to the new server? I.e.
copying the Galaxy directory tree (especially make sure you copy
~/database/) and connecting to the same PostgreSQL (or MySQL) database.
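The 'copy everything' approach can be sketched as below. The toy directory tree here is built in a temp dir purely so the copy step is demonstrable; for a real move between machines you would copy the actual galaxy-dist tree (e.g. with rsync) instead.

```shell
# Stand-in for the old server's Galaxy tree (paths are illustrative).
old=$(mktemp -d)/galaxy-dist
new=$(mktemp -d)/galaxy-dist
mkdir -p "$old/database/files" "$old/tool-data/shared"
echo "dataset" > "$old/database/files/dataset_1.dat"
# Archive-mode copy preserves permissions, timestamps, and symlinks --
# database/ must come along, since it holds the actual dataset files.
cp -a "$old" "$new"
ls "$new/database/files"
```

With the tree copied and the new instance pointed at the same database, dataset paths recorded in the database keep resolving on the new host.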
If you want, I can give you more details on how we successfully changed
the hardware of our server last year.
Regards, Hans
On 03/28/2012 03:31 PM, Robert Chase wrote:
Thank you, that could be very helpful to our users if we can't download
them all as one big batch. I was hoping to be able to do something to
save the users the headache of having to export and upload their
workflows to the new server.
Regards,
-Rob
On Wed, Mar 28, 2012 at 4:07 AM, Hans-Rudolf Hotz <h...@fmi.ch> wrote:
Hi Robert
Have you looked at the "Download or Export" option? You get this
option when you click on the triangle next to the names of your
workflows.
Regards, Hans
On 03/27/2012 08:36 PM, Robert Chase wrote:
Hello,
We are going to migrate to a new instance of the Galaxy server. Our
current instance contains a number of workflows and other data that we
would like to have available on the new server. Is there a way to back
up the workflows and then reload them on the new server?
Regards,
-Robert Paul Chase
Channing Lab
_________________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client. To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
http://lists.bx.psu.edu/