Hello!
I have a huge dictionary table with series data generated by a third-party
service. The table consists of two columns:
- id : serial, primary key
- series : varchar, not null, indexed
From time to time I need to apply a "patch" to the dictionary; the patch file
consists of "series" data, one
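For reference, a minimal sketch of the table as described (the table name dictionary is an assumption):

CREATE TABLE dictionary (
    id     serial PRIMARY KEY,
    series varchar NOT NULL
);
CREATE INDEX dictionary_series_idx ON dictionary (series);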
If I exclude the large tables (>30GB) from the parallel dump it succeeds, and a
normal (non-parallel) dump also succeeds, so I am not sure the network is at
fault. Is there any other option that might help make parallel dump usable for
large tables?
thanks
shanker
-----Original Message-----
From: Tom Lane
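One workaround worth trying (a sketch with placeholder names, not something confirmed in this thread): exclude the large table from the parallel run and dump it separately in a single non-parallel job:

pg_dump -Fd -j 8 -f dump_dir -T big_schema.big_table mydb
pg_dump -Fc -f big_table.dump -t big_schema.big_table mydb

Here mydb, dump_dir and big_schema.big_table are placeholders; -Fd is the directory format required for -j (parallel jobs), and -T/-t exclude or select a single table.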
On 02/14/2015 08:58 AM, Ramesh T wrote:
exactly what I am trying: converting Oracle to Postgres, as
follows:
1) first I am creating a type in Oracle
CREATE TYPE suborder_list AS (suborder_id int);
2) second, creating a table type in Oracle
create or replace type suborder_list_table as table of suborder_list;
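PostgreSQL has no direct equivalent of Oracle's "table of" types; a common substitute (a sketch, not confirmed in this thread) is a composite type plus an array of it:

-- composite type, as in the original example
CREATE TYPE suborder_list AS (suborder_id int);
-- where Oracle used suborder_list_table, use an array of the composite type
SELECT ARRAY[ROW(1)::suborder_list, ROW(2)::suborder_list] AS suborders;

A column or function parameter can likewise be declared as suborder_list[].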
On 02/14/2015 08:22 AM, Ramesh T wrote:
dbms_scheduler.create_job(
job_name => 'DELETE_EMPTY_PART_NUMS'
,job_type => 'PLSQL_BLOCK')
Is it not possible without pgagent or cron?
It has already been stated a couple of times that dbms_scheduler does not
exist in the community version of PostgreSQL.
2015-02-14 17:22 GMT+01:00 Ramesh T :
> dbms_scheduler.create_job(
> job_name => 'DELETE_EMPTY_PART_NUMS'
> ,job_type => 'PLSQL_BLOCK')
>
> Is it not possible without pgagent or cron?
>
Not in PostgreSQL
Regards
Pavel Stehule
>
> On Mon, Feb 9, 2015 at 11:35 AM, Pavel Stehule
> wrote:
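Since community PostgreSQL has no built-in scheduler, the Oracle job would typically become a function invoked from cron or pgAgent. A minimal sketch (the database, function, and table names are assumptions mirroring the Oracle job name):

-- hypothetical PL/pgSQL function standing in for the PL/SQL block
CREATE OR REPLACE FUNCTION delete_empty_part_nums() RETURNS void
LANGUAGE plpgsql AS $$
BEGIN
    DELETE FROM part_nums WHERE part_num IS NULL;  -- placeholder body
END;
$$;

# crontab entry: run the cleanup every night at 03:00
0 3 * * * psql -d mydb -c "SELECT delete_empty_part_nums();"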
"Raymond O'Donnell" writes:
> On 14/02/2015 15:42, Shanker Singh wrote:
>> Hi,
>> I am having problem using parallel pg_dump feature in postgres release
>> 9.4. The size of the table is large(54GB). The dump fails with the
>> error: "pg_dump: [parallel archiver] a worker process died
>> unexpectedly".
On 14/02/2015 15:42, Shanker Singh wrote:
> Hi,
> I am having problem using parallel pg_dump feature in postgres release
> 9.4. The size of the table is large(54GB). The dump fails with the
> error: "pg_dump: [parallel archiver] a worker process died
> unexpectedly". After this error the pg_dump aborts.
Hi,
I am having a problem using the parallel pg_dump feature in Postgres release 9.4.
The table is large (54GB). The dump fails with the error: "pg_dump:
[parallel archiver] a worker process died unexpectedly". After this error
pg_dump aborts. The error log file gets the following message
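For reference, parallel dump requires the directory output format; a typical invocation looks like this (database name, output path, and job count are placeholders):

pg_dump -Fd -j 4 -f /backups/mydb.dir mydb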
AI Rumman writes:
> I started the following query in Postgresql 9.1 where only this sql is
> running on the host and it has been taking more than an hour and still
> running.
> alter table userdata.table1 alter column name type varchar(512);
Pre-9.2 releases don't realize that that doesn't require a table rewrite.
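As a side note, on 9.2 and later the same ALTER is a catalog-only change and returns immediately. On 9.1 a widely circulated but unsupported workaround is to widen the stored typmod directly; a sketch, only relevant when increasing the length of a plain varchar column with no dependent views or constraints:

-- requires superuser; take a backup first
-- atttypmod for varchar(n) is stored as n + 4
BEGIN;
UPDATE pg_attribute
   SET atttypmod = 512 + 4
 WHERE attrelid = 'userdata.table1'::regclass
   AND attname  = 'name';
COMMIT;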
I solved my problem using string_agg in place of tab_to_string, for example:
select string_agg(s.Rnumber::text, ',') AS number
On Fri, Feb 13, 2015 at 10:40 PM, Raymond O'Donnell wrote:
> On 13/02/2015 13:13, Ramesh T wrote:
> > cast(COLLECT (r_id) as num) in oracle..
> >
> > is there a *collect* function in postgres?
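The closer analogue of Oracle's COLLECT in PostgreSQL is array_agg, while string_agg builds a delimited string; a sketch with assumed table and column names:

-- gather values into an array, roughly what Oracle's COLLECT does
SELECT array_agg(r_id) FROM some_table;
-- or produce a comma-separated string directly
SELECT string_agg(r_id::text, ',') FROM some_table;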
Guillaume Lelarge wrote:
2015-02-14 14:07 GMT+01:00 Berend Tober <bto...@broadstripe.net>:
Saimon Lim wrote:
Thanks for your help
I want to restrict some postgres users as much as possible and allow
them to execute a few my own stored procedures only.
Create the function that you want to restrict access to in a separate 'private' schema to which usage is not granted.
2015-02-14 14:07 GMT+01:00 Berend Tober :
> Saimon Lim wrote:
>
>> Thanks for your help
>>
>> I want to restrict some postgres users as much as possible and allow
>> them to execute a few my own stored procedures only.
>>
>
> Create the function that you want to restrict access to in a separate
> 'private' schema to which usage is not granted.
Saimon Lim wrote:
Thanks for your help
I want to restrict some postgres users as much as possible and allow
them to execute a few my own stored procedures only.
Create the function that you want to restrict access to in a separate
'private' schema to which usage is not granted.
Create the func
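A minimal sketch of that pattern (the role app_user, the schema, and the function names are assumptions):

-- the role to be restricted
CREATE ROLE app_user LOGIN;

-- keep internal routines in a schema the role has no USAGE on
CREATE SCHEMA private;
CREATE FUNCTION private.internal_work() RETURNS integer
LANGUAGE sql AS $$ SELECT 1 $$;          -- placeholder body

-- the public entry point runs with the definer's privileges
CREATE FUNCTION public.app_entry_point() RETURNS integer
LANGUAGE sql SECURITY DEFINER
SET search_path = private, pg_temp
AS $$ SELECT private.internal_work() $$;

-- EXECUTE is granted to PUBLIC by default, so revoke it, then
-- grant it only to the restricted role
REVOKE EXECUTE ON FUNCTION public.app_entry_point() FROM PUBLIC;
GRANT  EXECUTE ON FUNCTION public.app_entry_point() TO app_user;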