>Haven't tried it myself, but you may be able to connect the DB2 database
>to your PostgreSQL cluster using this FDW module:
>https://github.com/wolfgangbrandl/db2_fdw
>Looks like db2_fdw is DB2 LUW only though, so you might be out of luck
>if your DB2 is on IBM i (or z ;-)
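If the FDW route works for your DB2 edition, the PostgreSQL-side setup would look roughly like this. This is a sketch only: the server name, database alias, credentials, and schema below are placeholders, and the exact option names should be checked against the db2_fdw README.

```sql
CREATE EXTENSION db2_fdw;

-- 'SAMPLE' is an illustrative DB2 database alias, not a real one
CREATE SERVER db2_src FOREIGN DATA WRAPPER db2_fdw
    OPTIONS (dbserver 'SAMPLE');

-- placeholder credentials
CREATE USER MAPPING FOR CURRENT_USER SERVER db2_src
    OPTIONS (user 'db2inst1', password 'secret');

-- pull the table definitions across, then query them like local tables
IMPORT FOREIGN SCHEMA "DB2INST1" FROM SERVER db2_src INTO public;
```

A daily refresh could then be a plain INSERT ... SELECT (or a TRUNCATE plus INSERT) from the foreign tables, with no intermediate export file at all.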
On 01.11.2018 18:27, Ravi Krishna wrote:
> I have a project to develop a script/tool to copy data from DB2 to PG. The
> approach I am thinking of is:
> 1. Export data from db2 to a text file, with, say, pipe as the delimiter.
> 2. Load the data from the text file into PG using the COPY command.
> In order to make it faster, I can parallelize the export and load.
On 1/11/18 7:27 p.m., Ravi Krishna wrote:
> I have a project to develop a script/tool to copy data from DB2 to PG. The
> approach I am thinking of is:
> 1. Export data from db2 to a text file, with, say, pipe as the delimiter.
> 2. Load the data from the text file into PG using the COPY command.
> In order to make it faster, I can parallelize the export and load.
On Thu, Nov 1, 2018 at 10:50 AM Ravi Krishna wrote:
> [...] What I need is a constant refresh.
> We plan to use it daily to replicate data from db2 to pg.
Perhaps you've already considered and discarded the idea, but your use
case made me think back to when I was looking at AWS SCT as a way to
> I've never used it, but there is this in case it's helpful:
> https://github.com/dalibo/db2topg/
I looked into it. It is a schema converter plus data load; in other words, one
of those one-time migration scripts. What I need is a constant refresh.
We plan to use it daily to replicate data from db2 to pg.
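For a daily refresh rather than a one-time migration, one common approach is to export only the rows changed since the previous run. The sketch below assumes the DB2 tables carry a last-modified timestamp column; the column name UPDATED_AT and table name SALES are hypothetical, not from the thread.

```python
from datetime import datetime

def incremental_export_sql(table, since):
    """Build the DB2 SELECT for rows touched since the previous refresh.

    Assumes an UPDATED_AT timestamp column exists on the source table
    (an assumption -- adapt to whatever change-tracking column you have).
    """
    return (f"SELECT * FROM {table} "
            f"WHERE UPDATED_AT >= TIMESTAMP '{since:%Y-%m-%d %H:%M:%S}'")

# e.g. the query for everything changed since the last nightly run
sql = incremental_export_sql("SALES", datetime(2018, 11, 1, 0, 0, 0))
```

On the PG side, the delta would typically be COPYed into a staging table and merged with INSERT ... ON CONFLICT, so the daily run handles updates as well as inserts.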
On Thu, Nov 1, 2018 at 10:28 AM Ravi Krishna wrote:
>
> I have a project to develop a script/tool to copy data from DB2 to PG. The
> approach I am thinking of is:
>
> 1. Export data from db2 to a text file, with, say, pipe as the delimiter.
> 2. Load the data from the text file into PG using the COPY command.
I have a project to develop a script/tool to copy data from DB2 to PG. The
approach I am thinking of is:
1. Export data from db2 to a text file, with, say, pipe as the delimiter.
2. Load the data from the text file into PG using the COPY command.
In order to make it faster, I can parallelize the export and load.
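One way to parallelize both steps is to split the table into primary-key ranges and give each worker one slice to export and load. The sketch below only builds the per-slice commands rather than executing them; the table SALES, the key column ID, and the db2/psql invocations are assumptions to adapt to your schema and environment.

```python
from concurrent.futures import ThreadPoolExecutor

def key_ranges(min_id, max_id, workers):
    """Split [min_id, max_id] into contiguous ranges, one per worker."""
    step = (max_id - min_id + 1 + workers - 1) // workers
    ranges, lo = [], min_id
    while lo <= max_id:
        hi = min(lo + step - 1, max_id)
        ranges.append((lo, hi))
        lo = hi + 1
    return ranges

def copy_chunk(lo, hi):
    # One worker: export a pipe-delimited slice from DB2, then COPY it into PG.
    # Shown as command strings for clarity; in a real run you would execute
    # them with subprocess.run and check the return codes.
    export_cmd = (f"db2 \"EXPORT TO /tmp/sales_{lo}.del OF DEL MODIFIED BY coldel| "
                  f"SELECT * FROM SALES WHERE ID BETWEEN {lo} AND {hi}\"")
    load_cmd = (f"psql -c \"\\copy sales FROM '/tmp/sales_{lo}.del' "
                f"WITH (FORMAT csv, DELIMITER '|')\"")
    return export_cmd, load_cmd

# four parallel slices over a hypothetical 1..1,000,000 key space
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(lambda r: copy_chunk(*r), key_ranges(1, 1_000_000, 4)))
```

Range-based slicing keeps each exported file independent, so a failed slice can be retried without redoing the whole table.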