Hello,
I am using the Galaxy built-in database: SQLAlchemy on top of SQLite.
May I switch to PostgreSQL?
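From what I can tell, the switch would amount to pointing database_connection at a PostgreSQL URL in the Galaxy config file, something like this (a sketch with placeholder credentials and database name, assuming a local PostgreSQL server):

```ini
# universe_wsgi.ini -- database section
# user "galaxy", password "secret" and db "galaxy" are placeholders
database_connection = postgresql://galaxy:secret@localhost:5432/galaxy
```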
Regards
On 5 February 2015 at 14:41, Hans-Rudolf Hotz <[email protected]> wrote:
Hi Roberto
just double checking: are you using a PostgreSQL database, or are
you relying on the built-in SQLite?
Hans-Rudolf
On 02/05/2015 02:26 PM, Roberto Alonso CIPF wrote:
Hello,
I am trying to use parallelism in Galaxy. I added this entry to the
tool XML config:

<tool id="fa_gc_content_1" name="Compute GC content">
  <description>for each sequence in a file</description>
  <parallelism method="basic" split_size="8" split_mode="number_of_parts"></parallelism>

But when I run the job, the log shows the following:
Traceback (most recent call last):
  File "/home/ralonso/galaxy-dist/lib/galaxy/jobs/runners/__init__.py", line 158, in prepare_job
    job_wrapper.prepare()
  File "/home/ralonso/galaxy-dist/lib/galaxy/jobs/__init__.py", line 1607, in prepare
    tool_evaluator.set_compute_environment( compute_environment )
  File "/home/ralonso/galaxy-dist/lib/galaxy/tools/evaluation.py", line 53, in set_compute_environment
    incoming = self.tool.params_from_strings( incoming, self.app )
  File "/home/ralonso/galaxy-dist/lib/galaxy/tools/__init__.py", line 2810, in params_from_strings
    return params_from_strings( self.inputs, params, app, ignore_errors )
  File "/home/ralonso/galaxy-dist/lib/galaxy/tools/parameters/__init__.py", line 103, in params_from_strings
    value = params[key].value_from_basic( value, app, ignore_errors )
  File "/home/ralonso/galaxy-dist/lib/galaxy/tools/parameters/basic.py", line 162, in value_from_basic
    return self.to_python( value, app )
  File "/home/ralonso/galaxy-dist/lib/galaxy/tools/parameters/basic.py", line 1999, in to_python
    return app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( value ) )
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/orm/query.py", line 775, in get
    return self._load_on_ident(key)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/orm/query.py", line 2512, in _load_on_ident
    return q.one()
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/orm/query.py", line 2184, in one
    ret = list(self)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/orm/query.py", line 2227, in __iter__
    return self._execute_and_instances(context)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/orm/query.py", line 2242, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/engine/base.py", line 1449, in execute
    params)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/engine/base.py", line 1584, in _execute_clauseelement
    compiled_sql, distilled_params
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/engine/base.py", line 1698, in _execute_context
    context)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/engine/base.py", line 1691, in _execute_context
    context)
  File "/home/ralonso/galaxy-dist/eggs/SQLAlchemy-0.7.9-py2.7-linux-x86_64-ucs4.egg/sqlalchemy/engine/default.py", line 331, in do_execute
    cursor.execute(statement, parameters)
OperationalError: (OperationalError) database is locked u'SELECT
history_dataset_association.id AS history_dataset_association_id,
history_dataset_association.history_id AS history_dataset_association_history_id,
history_dataset_association.dataset_id AS history_dataset_association_dataset_id,
history_dataset_association.create_time AS history_dataset_association_create_time,
history_dataset_association.update_time AS history_dataset_association_update_time,
history_dataset_association.state AS history_dataset_association_state,
history_dataset_association.copied_from_history_dataset_association_id AS history_dataset_association_copied_from_history_dataset_association_id,
history_dataset_association.copied_from_library_dataset_dataset_association_id AS history_dataset_association_copied_from_library_dataset_dataset_association_id,
history_dataset_association.hid AS history_dataset_association_hid,
history_dataset_association.name AS history_dataset_association_name,
history_dataset_association.info AS history_dataset_association_info,
history_dataset_association.blurb AS history_dataset_association_blurb,
history_dataset_association.peek AS history_dataset_association_peek,
history_dataset_association.tool_version AS history_dataset_association_tool_version,
history_dataset_association.extension AS history_dataset_association_extension,
history_dataset_association.metadata AS history_dataset_association_metadata,
history_dataset_association.parent_id AS history_dataset_association_parent_id,
history_dataset_association.designation AS history_dataset_association_designation,
history_dataset_association.deleted AS history_dataset_association_deleted,
history_dataset_association.purged AS history_dataset_association_purged,
history_dataset_association.visible AS history_dataset_association_visible,
history_dataset_association.hidden_beneath_collection_instance_id AS history_dataset_association_hidden_beneath_collection_instance_id,
history_dataset_association.extended_metadata_id AS history_dataset_association_extended_metadata_id,
dataset_1.id AS dataset_1_id, dataset_1.create_time AS dataset_1_create_time,
dataset_1.update_time AS dataset_1_update_time, dataset_1.state AS dataset_1_state,
dataset_1.deleted AS dataset_1_deleted, dataset_1.purged AS dataset_1_purged,
dataset_1.purgable AS dataset_1_purgable, dataset_1.object_store_id AS dataset_1_object_store_id,
dataset_1.external_filename AS dataset_1_external_filename,
dataset_1._extra_files_path AS dataset_1__extra_files_path,
dataset_1.file_size AS dataset_1_file_size, dataset_1.total_size AS dataset_1_total_size,
dataset_1.uuid AS dataset_1_uuid
\nFROM history_dataset_association LEFT OUTER JOIN dataset AS dataset_1 ON dataset_1.id = history_dataset_association.dataset_id
\nWHERE history_dataset_association.id = ?' (1,)
galaxy.jobs.runners ERROR 2015-02-05 12:58:11,431 (89_486) Failure preparing job
So when one task tries to run, it fails; it seems the database is
locked by another task. With 4 splits it never happens; with 5 it
happens intermittently; from 6 splits on it always occurs.
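The contention can be reproduced outside Galaxy with plain sqlite3 (an illustrative sketch, not Galaxy code): one connection holds the write lock while a second connection that refuses to wait fails with the same "database is locked" error.

```python
import os
import sqlite3
import tempfile

# Minimal reproduction of SQLite's single-writer limitation.
path = os.path.join(tempfile.mkdtemp(), "demo.sqlite")

# isolation_level=None -> autocommit; we manage transactions explicitly.
writer = sqlite3.connect(path, isolation_level=None)
writer.execute("CREATE TABLE job (id INTEGER PRIMARY KEY, state TEXT)")
writer.execute("BEGIN EXCLUSIVE")  # take and hold the write lock
writer.execute("INSERT INTO job (state) VALUES ('running')")

# timeout=0 makes the second connection give up immediately instead of
# retrying -- effectively what happens under heavy contention.
other = sqlite3.connect(path, timeout=0)
try:
    other.execute("INSERT INTO job (state) VALUES ('queued')")
    error = None
except sqlite3.OperationalError as exc:
    error = str(exc)

print(error)  # -> database is locked
writer.execute("COMMIT")
```

With more concurrent split tasks, the chance that two of them hit the database inside its lock-retry window grows, which matches seeing failures only from about 5-6 splits upward.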
Could you please help me?
Regards
--
Roberto Alonso
Functional Genomics Unit
Bioinformatics and Genomics Department
Prince Felipe Research Center (CIPF)
C./Eduardo Primo Yúfera (Científic), nº 3
(junto Oceanografico)
46012 Valencia, Spain
Tel: +34 963289680 Ext. 1021
Fax: +34 963289574
E-Mail: [email protected]
_____________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client. To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
https://lists.galaxyproject.org/
To search Galaxy mailing lists use the unified search at:
http://galaxyproject.org/search/mailinglists/
--
Roberto Alonso
Functional Genomics Unit
Bioinformatics and Genomics Department
Prince Felipe Research Center (CIPF)
C./Eduardo Primo Yúfera (Científic), nº 3
(junto Oceanografico)
46012 Valencia, Spain
Tel: +34 963289680 Ext. 1021
Fax: +34 963289574
E-Mail: [email protected] <mailto:[email protected]>