Hello Nik,

I have put together a project that builds heavily on mi-deployment
that does some of what you want to do (and probably a bunch of things
you don't), but you could hopefully pull out just the parts you need.

The idea behind the project is to build really cheap cloud Galaxy
instances by not using EBS or S3. From one simple command-line
execution you can launch an instance, configure Galaxy, and transfer
data into it (including optionally placing the data in a configured
data library).

The command for doing this would be:

./run.sh --action=configure --action=transfer file1 file2 file3

The entire system is configured from one YAML file, including tools,
genomes, and Galaxy data (users, passwords, workflows, histories),
etc. See settings.yaml-sample for the myriad of options.

This project is the driver; configure everything you need here and run
the launch/configure/transfer script:

This project is a fork of galaxy-dist with the changesets needed to
implement this functionality:

So to prepopulate a history the way you described, you will want to
define a user and create a history for that user by modifying the
"galaxy" section of settings.yaml:

  ## In order to create data libraries, the first user should be admin@localhost
  ## and an API key must be specified. Be sure to change the API keys and passwords.
    - username: admin@localhost
      password: adminpass
      api_key: 1234556789
    - username: us...@example.com
      password: pass1
      api_key: 987654321
      ## Histories to create for this user
      - TransferExampleHistory
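
For reference, a well-formed version of that users block might look
like the following. The key names here (in particular "users:" and
"histories:") are my guess at how the list nests; check
settings.yaml-sample for the exact structure before using it:

```yaml
# Illustrative sketch of the users section of settings.yaml.
# Key names "users:" and "histories:" are assumptions; the email
# addresses, passwords, and API keys are placeholders to change.
users:
  - username: admin@localhost      # first user should be the admin
    password: adminpass
    api_key: 1234556789
  - username: user@example.com
    password: pass1
    api_key: 987654321
    histories:                     # histories to create for this user
      - TransferExampleHistory
```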

Then, at the top level of the YAML file, you will need to specify that
the transferred data should be loaded into the history defined above:

transfer_history_name: TransferExampleHistory
transfer_history_api_key: 987654321

I think you will want to look at the seed_database function in
lib/galaxy.py of galaxy-vm-launcher and the
scripts/api/handle_uploads.py script in cloud-galaxy-dist.
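
If you end up rolling your own seeding step instead, the core of it is
just a Galaxy API call or two. Here is a minimal sketch of creating a
history over the REST API (the URL layout follows the standard Galaxy
API's POST /api/histories; the server address and key below are
placeholders):

```python
# Hedged sketch: create a Galaxy history via POST /api/histories.
# Server URL and API key are placeholders; adjust for your instance.
import json
from urllib import request


def build_create_history_request(galaxy_url, api_key, name):
    """Build the URL and JSON body for a POST /api/histories call."""
    url = "%s/api/histories?key=%s" % (galaxy_url.rstrip("/"), api_key)
    body = {"name": name}
    return url, body


def create_history(galaxy_url, api_key, name):
    """Send the request and return the decoded response (the new history)."""
    url, body = build_create_history_request(galaxy_url, api_key, name)
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Attaching the transferred files to that history is the part that
handle_uploads.py handles, so I would start from that script rather
than reimplementing it.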


Hope this is helpful,

John Chilton
Senior Software Developer
University of Minnesota Supercomputing Institute
Office: 612-625-0917
Cell: 612-226-9223

On Tue, Jun 5, 2012 at 4:43 PM, Nikhil Joshi <najo...@ucdavis.edu> wrote:
> Hi all,
> I have a galaxy instance running in the Amazon Cloud that I have
> customized and I want to be able to transfer files (via scp) to the
> instance (or its attached storage) and then have those files show up
> automatically in the history.  Is this possible?  Alternatively, is
> there some way to automatically populate the history of a registered
> user, so that when that user logs in, the files are automatically in a
> saved history?  A third possibility would be to transfer the files and
> restart galaxy and then have the files show up automatically?  Just to
> be clear, I do not want to use the Upload Files tool at all.  Is there
> any way to implement any of these options?  Any help would be highly
> appreciated.  Thanks!
> - Nik.
> --
> Nikhil Joshi
> Bioinformatics Analyst/Programmer
> UC Davis Bioinformatics Core
> http://bioinformatics.ucdavis.edu/
> najoshi -at- ucdavis -dot- edu
> 530.752.2698 (w)
> ___________________________________________________________
> Please keep all replies on the list by using "reply all"
> in your mail client.  To manage your subscriptions to this
> and other Galaxy lists, please use the interface at:
>  http://lists.bx.psu.edu/
