I'll have a look at QueryDatabaseTable - the previous workflow I developed for
a single large table used GenerateTableFetch followed by ExecuteSQLRecord.
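
As an aside, the "starting point" control you mention below appears to be the
initial.maxvalue.<column> dynamic property that QueryDatabaseTable and
GenerateTableFetch both accept, which seeds the first incremental run so the
copy doesn't begin with a full table scan. Roughly, the relevant processor
properties would look like this (a sketch from memory, with a hypothetical
table - worth verifying against the docs for your NiFi version):

    # Sketch only: property names are from memory; check the documentation
    # for your NiFi version before relying on them.
    properties = {
        "Table Name": "orders",                 # hypothetical example table
        "Maximum-value Columns": "updated_at",
        # Dynamic property: start the incremental copy from this watermark
        # instead of fetching the whole table on the first run.
        "initial.maxvalue.updated_at": "2025-01-01 00:00:00",
    }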

Did you develop any useful strategies for larger numbers of tables, or was
doing everything by hand the best way in the long run? I'm thinking of code
to generate parts of the workflow, or some kind of template to describe key
parts of the table structure.
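
To make the question concrete, the kind of generation script I have in mind
would read a small manifest describing each table and create one
GenerateTableFetch processor per table through the NiFi REST API. The
/process-groups/{id}/processors endpoint is standard, but the manifest format
and the property keys below are my guesses, so treat this as an untested
sketch:

    # Sketch: one GenerateTableFetch processor per table, from a manifest.
    # Assumes an unsecured dev instance; a secured NiFi needs a bearer token.
    import requests

    NIFI_API = "http://localhost:8080/nifi-api"    # adjust for your instance
    PROCESS_GROUP_ID = "root"                      # target process group id
    DBCP_SERVICE_ID = "<controller-service-uuid>"  # your DBCPConnectionPool

    # Minimal per-table manifest; in practice this could be generated from
    # the source database's information_schema.
    TABLES = [
        {"name": "orders",    "max_value_column": "updated_at"},
        {"name": "customers", "max_value_column": "last_modified"},
    ]

    for table in TABLES:
        body = {
            "revision": {"version": 0},
            "component": {
                "type": "org.apache.nifi.processors.standard.GenerateTableFetch",
                "name": "GTF " + table["name"],
                "config": {
                    # Property keys follow the UI labels; some versions use
                    # internal hyphenated keys, so check an existing processor
                    # via GET /nifi-api/processors/{id} for the exact names.
                    "properties": {
                        "Table Name": table["name"],
                        "Maximum-value Columns": table["max_value_column"],
                        "Database Connection Pooling Service": DBCP_SERVICE_ID,
                    },
                    "schedulingPeriod": "5 min",
                },
            },
        }
        resp = requests.post(
            NIFI_API + "/process-groups/" + PROCESS_GROUP_ID + "/processors",
            json=body,
            timeout=30,
        )
        resp.raise_for_status()
        print("created", table["name"], "->", resp.json()["id"])

The same loop could presumably wire each processor to a shared
ExecuteSQLRecord via the connections endpoint, which is where a manifest
would really start to pay off.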

On Mon, Jan 27, 2025 at 9:56 AM Dennis N Brown <mr.dnbr...@gmail.com> wrote:

> I have always used "QueryDatabaseTable" for this... it does tend to get
> complicated when you have a large number of tables, but you have a lot of
> control, like setting a starting point for the "copy".
>
> On Sun, Jan 26, 2025, 17:04 Richard Beare <richard.be...@gmail.com> wrote:
>
>> Hi,
>> I have a project for which mirroring a complex DB, potentially in close to
>> real time, could be very useful. We have only limited access to the source
>> systems and therefore can't use some of the more conventional approaches to
>> live replication. I'm aware that GenerateTableFetch is useful for fetching
>> recent changes from a remote table. Does anyone have experience using it,
>> or an alternative, in a case where there may be hundreds of tables? If so,
>> can you recommend strategies for keeping the system maintainable?
>> Thanks
>>
>
