Re: [galaxy-dev] Bx-python and numpy eggs
Hello Bjoern,

Yes, that solved the problem. Thank you!

Regards,
Jose

2015-11-27 17:59 GMT+01:00 Björn Grüning <bjoern.gruen...@gmail.com>:
> Hi Jose,
>
> can you please try to set:
>
> enable_beta_tool_command_isolation = True
>
> in your galaxy.ini file?
>
> Thanks,
> Bjoern
>
> On 27.11.2015 at 16:01, Jose Juan Almagro Armenteros wrote:
> > A bit more information: I have this problem when running tools from the
> > latest version of deepTools (1.5.11.0), such as bamCompare or bamCorrelate.
> > However, when I run the same tools from the previous version (1.5.9.1.0),
> > they work correctly. I have reinstalled the latest version, but nothing
> > changed. Is it possible that, because these two versions use different
> > numpy versions, bx-python is using the numpy from the deepTools package
> > and not the one from the eggs?
> > [...]

___
Please keep all replies on the list by using "reply all" in your mail client.
To manage your subscriptions to this and other Galaxy lists, please use the
interface at: https://lists.galaxyproject.org/

To search Galaxy mailing lists use the unified search at:
http://galaxyproject.org/search/mailinglists/
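For reference, the setting Björn suggests lives in the main application section of galaxy.ini. A sketch of where it goes, assuming a stock galaxy-dist layout of that era (only the option name itself comes from the thread):

```ini
; galaxy.ini -- sketch; section layout assumed from a stock config
[app:main]
; Isolate the shell in which tool commands run, so libraries shipped
; with a tool package (e.g. deepTools' numpy) do not leak into the
; environment Galaxy's own eggs were built against.
enable_beta_tool_command_isolation = True
```

A restart of the Galaxy server is needed for galaxy.ini changes to take effect.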
Re: [galaxy-dev] Bx-python and numpy eggs
I have tried removing the bx-python and numpy eggs and then fetching them again, but that didn't work either.

Any idea why bx-python is not recognizing the proper numpy version?

Regards,
Jose

2015-11-27 0:53 GMT+01:00 Jose Juan Almagro Armenteros <jjalma...@gmail.com>:
> Hello,
>
> I am on the master branch and recently pulled all the changes into my
> local Galaxy. However, when I run one tool I get this error:
>
> [...]
> File "numpy.pxd", line 155, in init bx.bbi.bbi_file (lib/bx/bbi/bbi_file.c:12669)
> ValueError: numpy.dtype has the wrong size, try recompiling
>
> I suppose there is a problem with the bx-python or numpy egg, but I am
> not really sure. I ran check_eggs.py and it reported nothing, and I also
> tried updating the system numpy, which didn't work either.
>
> Do you know which numpy I should recompile in order to get this to work?
>
> Regards,
> Jose
Re: [galaxy-dev] Bx-python and numpy eggs
A bit more information: I have this problem running tools from the latest version of deepTools (1.5.11.0), such as bamCompare or bamCorrelate. However, when I run the same tools from the previous version (1.5.9.1.0), they work correctly. I have reinstalled the latest version, but nothing changed. Is it possible that, because these two versions use different numpy versions, bx-python is using the numpy from the deepTools package and not the one from the eggs?

2015-11-27 13:03 GMT+01:00 Jose Juan Almagro Armenteros <jjalma...@gmail.com>:
> I have tried removing the bx-python and numpy eggs and then fetching them
> again, but that didn't work either.
>
> Any idea why bx-python is not recognizing the proper numpy version?
>
> Regards,
> Jose
>
> 2015-11-27 0:53 GMT+01:00 Jose Juan Almagro Armenteros <jjalma...@gmail.com>:
>> Hello,
>>
>> I am on the master branch and recently pulled all the changes into my
>> local Galaxy. However, when I run one tool I get this error:
>>
>> [...]
>> ValueError: numpy.dtype has the wrong size, try recompiling
>>
>> Do you know which numpy I should recompile in order to get this to work?
>>
>> Regards,
>> Jose
[galaxy-dev] Bx-python and numpy eggs
Hello,

I am on the master branch and recently pulled all the changes into my local Galaxy. However, when I run one tool I get this error:

Fatal error: Matched on Error:
Traceback (most recent call last):
  File "/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/024/24973/set_metadata_GULyCa.py", line 1, in
    from galaxy_ext.metadata.set_metadata import set_metadata; set_metadata()
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy_ext/metadata/set_metadata.py", line 27, in
    import galaxy.model.mapping  # need to load this before we unpickle, in order to setup properties assigned by the mappers
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/model/mapping.py", line 21, in
    from galaxy.model.custom_types import JSONType, MetadataType, TrimmedString, UUIDType
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/model/custom_types.py", line 15, in
    from galaxy import app
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/app.py", line 14, in
    from galaxy.visualization.data_providers.registry import DataProviderRegistry
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/visualization/data_providers/registry.py", line 2, in
    from galaxy.visualization.data_providers import genome
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/visualization/data_providers/genome.py", line 16, in
    from bx.bbi.bigbed_file import BigBedFile
  File "bigbed_file.pyx", line 1, in init bx.bbi.bigbed_file (lib/bx/bbi/bigbed_file.c:6272)
  File "numpy.pxd", line 155, in init bx.bbi.bbi_file (lib/bx/bbi/bbi_file.c:12669)
ValueError: numpy.dtype has the wrong size, try recompiling

I suppose there is a problem with the bx-python or numpy egg, but I am not really sure. I ran check_eggs.py and it reported nothing, and I also tried updating the system numpy, which didn't work either.

Do you know which numpy I should recompile in order to get this to work?

Regards,
Jose
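The ValueError above is the classic symptom of a compiled extension (here bx-python's Cython modules) being loaded against a different numpy than the one it was built with. A generic diagnostic that can help confirm Jose's suspicion is to list every location on the import path that provides a given package; two or more hits mean the import depends on path order. This is a hypothetical helper, not Galaxy code:

```python
import os
import sys


def find_package_copies(pkg, paths=None):
    """Return every path entry that contains a copy of package `pkg`.

    More than one hit means imports depend on path order -- exactly how a
    tool-shipped numpy can shadow the numpy an egg was compiled against.
    """
    hits = []
    for entry in (paths if paths is not None else sys.path):
        candidate = os.path.join(entry, pkg)
        # A package directory or a single-file module both count.
        if os.path.isdir(candidate) or os.path.isfile(candidate + ".py"):
            hits.append(entry)
    return hits


if __name__ == "__main__":
    # Run inside the environment that launches Galaxy jobs.
    for entry in find_package_copies("numpy"):
        print(entry)
```

If this prints more than one directory for numpy, whichever comes first on sys.path is the one bx.bbi actually links against at import time.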
Re: [galaxy-dev] Bismark installation dependencies error
Hello Björn,

I am using Galaxy 15.07, so everything should be up to date.

Regards,
Jose

2015-10-28 15:43 GMT+01:00 Björn Grüning <bjoern.gruen...@gmail.com>:
> Hi Jose,
>
> which version of Galaxy are you using?
> @Dave, can this be one of the old un-compressing changes?
>
> Cheers,
> Bjoern
>
> On 28.10.2015 at 15:31, Jose Juan Almagro Armenteros wrote:
> > Hello,
> >
> > I have tried to install the latest version of Bismark from the Tool Shed
> > and got an error while installing. Basically, it can't install bowtie
> > and bowtie2 because the folder name is written twice in the download
> > directory. Here is the error:
> > [...]
> > [Errno 2] No such file or directory:
> > './database/tmp/tmp-toolshed-mtdM4XBx_/bowtie-0.12.8/bowtie-0.12.8/'
> >
> > As you can see, "bowtie-0.12.8" appears twice in the path, and when I
> > check the directory, the final "bowtie-0.12.8" folder doesn't exist.
> >
> > Do you know what could be going wrong, or an alternative way of
> > specifying the correct path?
> >
> > Best regards,
> > Jose
[galaxy-dev] Bismark installation dependencies error
Hello,

I have tried to install the latest version of Bismark from the Tool Shed and got an error while installing. Basically, it can't install bowtie and bowtie2 because the folder name is written twice in the download directory. Here is the error:

  File "/steno-internal/projects/galaxy/galaxy-dist/lib/tool_shed/galaxy_install/install_manager.py", line 142, in install_and_build_package_via_fabric
    tool_dependency = self.install_and_build_package( tool_shed_repository, tool_dependency, actions_dict )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/tool_shed/galaxy_install/install_manager.py", line 100, in install_and_build_package
    initial_download=True )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/tool_shed/galaxy_install/tool_dependencies/recipe/recipe_manager.py", line 32, in execute_step
    initial_download=initial_download )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/tool_shed/galaxy_install/tool_dependencies/recipe/step_handler.py", line 685, in execute_step
    dir = self.url_download( work_dir, downloaded_filename, url, extract=True, checksums=checksums )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/tool_shed/galaxy_install/tool_dependencies/recipe/step_handler.py", line 223, in url_download
    extraction_path = archive.extract( install_dir )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/tool_shed/galaxy_install/tool_dependencies/recipe/step_handler.py", line 89, in extract
    os.chmod( absolute_filepath, unix_permissions )

[Errno 2] No such file or directory: './database/tmp/tmp-toolshed-mtdM4XBx_/bowtie-0.12.8/bowtie-0.12.8/'

As you can see, "bowtie-0.12.8" appears twice in the path, and when I check the directory, the final "bowtie-0.12.8" folder doesn't exist.

Do you know what could be going wrong, or an alternative way of specifying the correct path?

Best regards,
Jose
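The failing os.chmod suggests the extraction step appends the archive's top-level directory to a path that already ends with it. A defensive join along these lines avoids the bowtie-0.12.8/bowtie-0.12.8 doubling; this is a hypothetical helper for illustration, not the actual step_handler code:

```python
import os


def join_archive_root(install_dir, archive_root):
    """Join the archive's top-level directory onto install_dir, unless
    install_dir already ends with it.

    Guards against producing paths like
    .../bowtie-0.12.8/bowtie-0.12.8 when the caller has already descended
    into the extracted directory.
    """
    # normpath drops a trailing slash so the basename comparison is reliable
    if os.path.basename(os.path.normpath(install_dir)) == archive_root:
        return install_dir
    return os.path.join(install_dir, archive_root)
```

For example, `join_archive_root("/tmp/x/bowtie-0.12.8", "bowtie-0.12.8")` leaves the path alone instead of doubling the last component.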
Re: [galaxy-dev] Bismark installation dependencies error
Hello again,

I have release_15.07 from Bitbucket, so if I want that commit I suppose I would have to switch my Galaxy instance from Bitbucket to GitHub. Is that correct?

Regards,
Jose

2015-10-28 17:00 GMT+01:00 Dave Bouvier <d...@bx.psu.edu>:
> Yes, this is definitely related to the issue I fixed part of, and (I
> think) Nicola fixed the rest of.
>
>    -
> Dave Bouvier
> http://galaxyproject.org
> http://usegalaxy.org
>
> On 10/28/2015 11:46 AM, Nicola Soranzo wrote:
>> Hi Jose,
>> can you check that your release_15.07 is updated and contains git commit
>> 197c7b21b209a7b6fbdb9a8a11c232a2b88523fc (
>> https://github.com/galaxyproject/galaxy/commit/197c7b21b209a7b6fbdb9a8a11c232a2b88523fc
>> )?
>>
>> Cheers,
>> Nicola
>>
>> [...]
[galaxy-dev] Plot-bamstats
Hello,

I have just installed the tool "samtools_stats" from the Tool Shed. The synopsis says: "This tool runs the samtools stats command in the SAMtools toolkit, collecting statistics from BAM files. The output can be visualized using plot-bamstats", but I haven't found any tool to visualize it with plot-bamstats. Do you know if there is such a tool, or another similar one?

Best regards,
Jose
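Outside Galaxy, the visualization the tool help refers to is the plot-bamstats script distributed with samtools itself. A typical command-line sketch (file and directory names are placeholders):

```shell
# Collect statistics from a BAM file
samtools stats input.bam > input.stats

# Render the HTML report and PNG plots under plots/ from the stats file
plot-bamstats -p plots/ input.stats
```

The `-p` argument is the output prefix; plot-bamstats writes an index HTML page plus the individual plot images there.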
[galaxy-dev] set_user_disk_usage.py as a cron job?
Hello all,

I have just encountered a bug with the disk quota for many users, so I ran the set_user_disk_usage.py script. It recalculated the disk usage for most of them, and now everything seems to be okay. My question is: should I add this script as a cron job, or is this kind of bug just something unusual that happens after upgrading?

Best regards,
Jose
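Since re-running the script is harmless, a nightly cron entry is a reasonable safety net either way. A sketch in /etc/cron.d style; the Galaxy path, system user, and log location are placeholders, and the invocation simply mirrors running the script by hand as above:

```
# /etc/cron.d/galaxy-disk-usage -- hypothetical entry; adjust paths and user
# Recalculate every user's disk usage at 02:30 each night
30 2 * * * galaxy cd /path/to/galaxy-dist && python ./scripts/set_user_disk_usage.py >> /var/log/galaxy/disk_usage.log 2>&1
```

Redirecting stdout and stderr to a log keeps a record of what was recalculated in case the quota numbers drift again.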
Re: [galaxy-dev] SFTP configuration with ProFTPD
Hello Nate,

Yes, I am using ProFTPD's SFTP server: connections show up in proftpd's sftp.log when I try to connect. The problem is that FTP asks directly for the username and then the password, but SFTP only tries keys and the system password. Even if I try to log in with

  sftp -P xx email@server.address

and the Galaxy user password, it doesn't work. So I don't know if it is possible to connect over SFTP using the usernames and passwords from Galaxy. (I don't really know much about ssh, sftp, or ftp, so sorry if this is nonsense.)

Best regards,
Jose

2015-09-08 20:27 GMT+02:00 Nate Coraor <n...@bx.psu.edu>:
> On Tue, Sep 8, 2015 at 2:05 PM, Jose Juan Almagro Armenteros
> <jjalma...@gmail.com> wrote:
>> [...]
>
> Hi Jose,
>
> Are you certain that when you're testing SFTP, you are using ProFTPD's
> SFTP server, and not an sftp service (i.e. via OpenSSH) already running
> on the system? One important distinction: SFTP and FTPS are not the same
> thing.
>
> --nate
[galaxy-dev] SFTP configuration with ProFTPD
Hello everyone,

I am a bit stuck trying to set up SFTP on my local Galaxy. I already have an FTP server as described in the Galaxy wiki and in this post:
http://galacticengineer.blogspot.co.uk/2015/02/ftp-upload-to-galaxy-using-proftpd-and.html
which works correctly with the user authentication.

The problem is that when I switched to SFTP (using some posts that describe how to do this on ProFTPD with mod_sftp.c), the user authentication stopped working. I don't really know what specific modifications I should make to the proftpd.conf file, which is the same as the one in the wiki, to make it work.

I would really appreciate any ideas about how to change from FTP to SFTP in Galaxy, or a proftpd.conf that any of you have working for SFTP.

Thank you and best regards,
Jose
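For orientation, mod_sftp is enabled through its own directive block in proftpd.conf. The directive names below are from mod_sftp, but the values, the port, and the interaction with a SQL-backed Galaxy user table are untested assumptions, not a verified working config:

```
# Hypothetical mod_sftp sketch for proftpd.conf -- adjust to your setup
<IfModule mod_sftp.c>
    SFTPEngine on
    Port 2222
    SFTPLog /var/log/proftpd/sftp.log

    # Host key presented to SFTP clients
    SFTPHostKey /etc/ssh/ssh_host_rsa_key

    # Accept password logins only, so the same credential backend used
    # for FTP (e.g. mod_sql against Galaxy's user table) is consulted
    # instead of SSH public keys
    SFTPAuthMethods password
</IfModule>
```

The key point for Galaxy-style authentication is SFTPAuthMethods: if public-key methods are attempted first, clients may never fall back to the password prompt that reaches the SQL backend.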
[galaxy-dev] Fwd: History panel error
Hello Dannon,

This is the log when a job is executed:

galaxy.jobs DEBUG 2015-04-24 17:25:27,266 (9568) Working directory for job is: /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568
galaxy.jobs.handler DEBUG 2015-04-24 17:25:27,313 (9568) Dispatching to local runner
galaxy.jobs DEBUG 2015-04-24 17:25:27,600 (9568) Persisting job destination (destination id: local)
galaxy.jobs.handler INFO 2015-04-24 17:25:27,686 (9568) Job dispatched
galaxy.jobs.runners DEBUG 2015-04-24 17:25:28,465 (9568) command is: python /steno-internal/projects/galaxy/galaxy-dist/tools/table_gen/generator.py /steno-internal/projects/galaxy/galaxy-dist/database/files/021/dataset_21041.dat 1; return_code=$?; python /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/set_metadata_MH6prK.py /steno-internal/projects/galaxy/galaxy-dist/database/tmp/tmpw80FFi /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/galaxy.json /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_in_HistoryDatasetAssociation_14967_7w0ALE,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_kwds_HistoryDatasetAssociation_14967_uH3ZBP,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_out_HistoryDatasetAssociation_14967_5iEavP,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_results_HistoryDatasetAssociation_14967_6FT2qV,/steno-internal/projects/galaxy/galaxy-dist/database/files/021/dataset_21041.dat,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_override_HistoryDatasetAssociation_14967_iOPJoQ; sh -c "exit $return_code"
galaxy.jobs.runners.local DEBUG 2015-04-24 17:25:28,475 (9568) executing job script: /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/galaxy_9568.sh
galaxy.jobs DEBUG 2015-04-24 17:25:28,617 (9568) Persisting job destination (destination id: local)
galaxy.jobs.runners.local DEBUG 2015-04-24 17:25:31,960 execution finished: /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/galaxy_9568.sh
galaxy.datatypes.metadata DEBUG 2015-04-24 17:25:32,207 loading metadata from file for: HistoryDatasetAssociation 14967
galaxy.jobs INFO 2015-04-24 17:25:34,775 Collecting job metrics for <galaxy.model.Job object at 0x7f3154cacad0>
galaxy.jobs DEBUG 2015-04-24 17:25:34,799 job 9568 ended
galaxy.datatypes.metadata DEBUG 2015-04-24 17:25:34,800 Cleaning up external metadata files

I am not sure at all, but I think this is the internal error that I got with the history panel problem:

130.225.125.174 - - [24/Apr/2015:15:22:19 +0200] "GET /api/histories/32c70b5011288952/contents HTTP/1.1" 500 - "http://galaxy.bric.dk/history/view_multiple" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.120 Safari/537.36"
galaxy.web.framework.decorators ERROR 2015-04-24 15:22:20,033 Uncaught exception in exposed API method:
Traceback (most recent call last):
  File /steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/web/framework/decorators.py, line 251, in decorator
    rval = func( self, trans, *args, **kwargs)
  File /steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/webapps/galaxy/api/history_contents.py, line 105, in index
    hda_dict = self.hda_serializer.serialize_to_view( trans, content, view=view )
  File /steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/managers/base.py, line 621, in serialize_to_view
    return self.serialize( trans, item, all_keys )
  File /steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/managers/hdas.py, line 453, in serialize
    serialized = super( HDASerializer, self ).serialize( trans, hda, keys )
  File /steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/managers/base.py, line 566, in serialize
    returned[ key ] = self.serializers[ key ]( trans, item, key )
  File /steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/managers/hdas.py, line 403, in lambda
    'resubmitted' : lambda t, i, k: i._state == t.app.model.Dataset.states.RESUBMITTED,
AttributeError: 'Bunch' object has no attribute 'RESUBMITTED'

Thanks a lot for your help!

Regards,
Jose

2015-04-24 17:16 GMT+02:00 Dannon Baker <dannon.ba...@gmail.com>:
> Hi Jose,
>
> Can you review (and share, maybe) the relevant part of the logs from when
> you attempt to execute a job? It'd be worth it, if you still have the
> logs, to know what the Internal Server Error indicated in your first
> message was, as well.
>
> -Dannon
>
> On Fri, Apr 24, 2015 at 11:00 AM Jose Juan Almagro Armenteros
> <jjalma...@gmail.com> wrote:
>> Hi again,
>>
>> Now it is working; I just went back to the previous version and updated
>> it again. Everything is okay except that the job table in the PostgreSQL
>> database is empty and is not being updated with new jobs (the problem is
>> only with this table). I run
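The AttributeError above comes from serializer code reading a state name off a Bunch of dataset states that this Galaxy version does not define. A minimal, self-contained illustration of the failure mode and a defensive lookup; the Bunch class here is a stand-in written for this example, not Galaxy's own implementation:

```python
class Bunch(dict):
    """Stand-in for a Galaxy-style Bunch: a dict with attribute access."""

    def __init__(self, **kwargs):
        super(Bunch, self).__init__(kwargs)
        # Attribute access and item access share the same storage.
        self.__dict__ = self


# An older model defines fewer states than a newer serializer expects:
states = Bunch(OK="ok", ERROR="error")  # no RESUBMITTED member


def state_matches(item_state, states, name):
    """Compare against a named state, tolerating states that don't exist."""
    wanted = getattr(states, name, None)  # None instead of AttributeError
    return wanted is not None and item_state == wanted


# states.RESUBMITTED would raise AttributeError, mirroring the traceback;
# the guarded lookup above simply reports "no match" instead.
```

This only papers over the symptom; the actual fix, as the thread goes on to say, was in the resubmission/serialization code in a later Galaxy release.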
Re: [galaxy-dev] Fwd: History panel error
Hello Carl, I can wait until the next stable release, my Galaxy distribution is working correctly even though the job table in the database is always empty. This table was used to detect job errors by running a cron job that sends an email. Do you know if there is another way of doing this? Thank you! Regards, Jose 2015-04-27 20:46 GMT+02:00 Carl Eberhard carlfeberh...@gmail.com: Hi, Jose Our next cycle includes some updates to the resubmission/hda-serialization code that's mentioned in your stack trace. You can try two things: 1) You can wait for us to officially publish our next stable release (15.05) at the beginning of next month and then update your instance. 2) You can update *now* to the github version that *will be* our 15.05 release: https://github.com/galaxyproject/galaxy/tree/release_15.05 We are still stabilizing 15.05 and not all bug fixes have been committed, but it may be the best option if you need to get past this problem quickly. If you're unable to do one of the above or are still having issues after doing one of them, let me know and we can work on it further. 
Carl

On Mon, Apr 27, 2015 at 4:28 AM, Jose Juan Almagro Armenteros jjalma...@gmail.com wrote:

Hello Dannon,

This is the log when a job is executed:

galaxy.jobs DEBUG 2015-04-24 17:25:27,266 (9568) Working directory for job is: /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568
galaxy.jobs.handler DEBUG 2015-04-24 17:25:27,313 (9568) Dispatching to local runner
galaxy.jobs DEBUG 2015-04-24 17:25:27,600 (9568) Persisting job destination (destination id: local)
galaxy.jobs.handler INFO 2015-04-24 17:25:27,686 (9568) Job dispatched
galaxy.jobs.runners DEBUG 2015-04-24 17:25:28,465 (9568) command is: python /steno-internal/projects/galaxy/galaxy-dist/tools/table_gen/generator.py /steno-internal/projects/galaxy/galaxy-dist/database/files/021/dataset_21041.dat 1; return_code=$?; python /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/set_metadata_MH6prK.py /steno-internal/projects/galaxy/galaxy-dist/database/tmp/tmpw80FFi /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/galaxy.json /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_in_HistoryDatasetAssociation_14967_7w0ALE,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_kwds_HistoryDatasetAssociation_14967_uH3ZBP,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_out_HistoryDatasetAssociation_14967_5iEavP,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_results_HistoryDatasetAssociation_14967_6FT2qV,/steno-internal/projects/galaxy/galaxy-dist/database/files/021/dataset_21041.dat,/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/metadata_override_HistoryDatasetAssociation_14967_iOPJoQ; sh -c exit $return_code
galaxy.jobs.runners.local DEBUG 2015-04-24 17:25:28,475 (9568) executing job script:
/steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/galaxy_9568.sh
galaxy.jobs DEBUG 2015-04-24 17:25:28,617 (9568) Persisting job destination (destination id: local)
galaxy.jobs.runners.local DEBUG 2015-04-24 17:25:31,960 execution finished: /steno-internal/projects/galaxy/galaxy-dist/database/job_working_directory/009/9568/galaxy_9568.sh
galaxy.datatypes.metadata DEBUG 2015-04-24 17:25:32,207 loading metadata from file for: HistoryDatasetAssociation 14967
galaxy.jobs INFO 2015-04-24 17:25:34,775 Collecting job metrics for galaxy.model.Job object at 0x7f3154cacad0
galaxy.jobs DEBUG 2015-04-24 17:25:34,799 job 9568 ended
galaxy.datatypes.metadata DEBUG 2015-04-24 17:25:34,800 Cleaning up external metadata files

I am not at all sure, but I think this is the internal error that I got with the history panel problem:

130.225.125.174 - - [24/Apr/2015:15:22:19 +0200] "GET /api/histories/32c70b5011288952/contents HTTP/1.1" 500 - "http://galaxy.bric.dk/history/view_multiple" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.120 Safari/537.36"
galaxy.web.framework.decorators ERROR 2015-04-24 15:22:20,033 Uncaught exception in exposed API method:
Traceback (most recent call last):
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/web/framework/decorators.py", line 251, in decorator
    rval = func( self, trans, *args, **kwargs )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/webapps/galaxy/api/history_contents.py", line 105, in index
    hda_dict = self.hda_serializer.serialize_to_view( trans, content, view=view )
  File "/steno-internal/projects/galaxy/galaxy-dist/lib/galaxy/managers/base.py", line 621, in serialize_to_view
    return self.serialize( trans, item, all_keys )
  File "/steno-internal/projects/galaxy/galaxy
[galaxy-dev] History panel error
Hello all,

I have updated my Galaxy distribution today to the latest version, 15.03, and the history panel is not being shown. The history seems to be there, but Galaxy can't display the datasets. Even if I create a new history and run a tool, the dataset is not shown. Instead an error is displayed which says "An error occurred while getting updates from the server...", and the details show this:

{
  agent: "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.120 Safari/537.36",
  url: "http://galaxy.bric.ku.dk/history/view_multipl/api/histories/782c6cc7ec4cfd9f/contents",
  data: "",
  options: { data: {}, silent: true, parse: true, emulateHTTP: false, emulateJSON: false },
  xhr: {
    readyState: 4,
    responseText: "{\"err_msg\": \"Uncaught exception in exposed API method:\", \"err_code\": 0}",
    responseJSON: { err_msg: "Uncaught exception in exposed API method:", err_code: 0 },
    status: 500,
    statusText: "Internal Server Error",
    responseHeaders: {
      Date: "Fri, 24 Apr 2015 12:09:40 GMT\r",
      cache-control: "max-age=0,no-cache,no-store\r",
      Transfer-Encoding: "chunked\r",
      Server: "PasteWSGIServer/0.5 Python/2.7.3\r",
      Connection: "close\r",
      x-frame-options: "SAMEORIGIN\r",
      content-type: "application/json\r"
    }
  },
  source: [],
  user: { username: "jose", quota_percent: 0, total_disk_usage: 101352147, nice_total_disk_usage: "96.7 MB", email: "jjalma...@gmail.com", is_admin: true, tags_used: [], model_class: "User", id: "72ad249754f05d26" }
}

Do you have any idea what could go wrong?

Thank you and best regards,
Jose

___
Please keep all replies on the list by using "reply all" in your mail client. To manage your subscriptions to this and other Galaxy lists, please use the interface at: https://lists.galaxyproject.org/ To search Galaxy mailing lists use the unified search at: http://galaxyproject.org/search/mailinglists/
[galaxy-dev] Sorting users by last login
Hello all,

I would like to have the option of sorting users by last login so that I can delete the inactive ones. Is there any way of doing this, or which files should I modify?

Thank you!
Jose
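[Editor's note] One way to answer this without modifying Galaxy's admin UI is a direct database query: each login creates session records, so the most recent session `update_time` per user approximates "last login". The sketch below assumes a simplified `galaxy_user`/`galaxy_session` schema (column names may differ in your Galaxy version; check the model first) and uses in-memory SQLite with made-up data as a stand-in for the real PostgreSQL database:

```python
# Sketch: approximate "last login" per user from session records, then
# sort oldest-first to surface inactive accounts. Schema and data are
# simplified assumptions for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE galaxy_user (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE galaxy_session (
        id INTEGER PRIMARY KEY, user_id INTEGER, update_time TEXT);
    INSERT INTO galaxy_user (id, email) VALUES
        (1, 'active@example.org'), (2, 'inactive@example.org');
    INSERT INTO galaxy_session (user_id, update_time) VALUES
        (1, '2015-04-20 09:00:00'),
        (1, '2015-04-27 12:00:00'),
        (2, '2014-01-03 08:00:00');
""")

# Latest session time per user, oldest logins first (deletion candidates):
rows = conn.execute("""
    SELECT u.email, MAX(s.update_time) AS last_login
    FROM galaxy_user u
    LEFT JOIN galaxy_session s ON s.user_id = u.id
    GROUP BY u.id
    ORDER BY last_login ASC
""").fetchall()

for email, last_login in rows:
    print(email, last_login)
```

The `LEFT JOIN` keeps users who have never logged in (no session rows at all); with SQLite their `last_login` is NULL and they sort first under `ASC`, which is usually what you want when hunting inactive accounts.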