Unless PythonAnywhere has some kind of direct integration, the simplest 
approach would be to back up the database on PythonAnywhere incrementally 
and restore it incrementally on Redshift.
At the moment the pain point with this simple architecture (the one you'd 
expect between two PostgreSQL service providers) is that, as far as I know, 
Redshift only lets you use COPY to load data in, so an incremental restore 
is not feasible (no pg_restore).
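To make the COPY point concrete, here's a minimal sketch of the statement you'd end up running on Redshift after uploading a CSV dump to S3. The table, bucket path, and IAM role ARN are all placeholders, not anything from your setup:

```python
# Hypothetical helper: builds the Redshift COPY statement you'd execute
# (e.g. via psycopg2) after uploading a CSV dump to S3. Redshift ingests
# bulk data with COPY from S3; pg_restore is not supported.
def build_copy_sql(table, s3_path, iam_role):
    return (
        f"COPY {table} FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

sql = build_copy_sql(
    "events",
    "s3://my-bucket/dumps/events.csv",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",
)
print(sql)
```

You'd run the resulting SQL through any Postgres-protocol client connected to the Redshift cluster.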
Back to the point: it'd be pretty easy to roll "your own" incremental 
dump/restore, but only if you have a limited set of tables that aren't 
heavily correlated (not lots of foreign keys) and "timestamped" data 
written in an append-only fashion. Alternatively, if your entire dataset 
is rather small, you can just do a full dump and a full restore each time.
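The "roll your own" incremental dump for an append-only, timestamped table boils down to remembering a watermark and exporting only newer rows. A minimal sketch, using sqlite3 as a stand-in for the PythonAnywhere Postgres database (with psycopg2 the queries would look the same); table and column names are made up for the example:

```python
import csv
import io
import sqlite3

def incremental_dump(conn, table, last_seen):
    """Return (csv_text, new_watermark) for rows created after last_seen."""
    rows = conn.execute(
        f"SELECT id, payload, created_at FROM {table} "
        "WHERE created_at > ? ORDER BY created_at",
        (last_seen,),
    ).fetchall()
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["id", "payload", "created_at"])  # header for COPY ... IGNOREHEADER 1
    writer.writerows(rows)
    # Advance the watermark only if we actually saw new rows
    new_watermark = rows[-1][2] if rows else last_seen
    return buf.getvalue(), new_watermark

# Demo with an in-memory stand-in table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT, created_at TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    (1, "a", "2016-07-01T00:00:00"),
    (2, "b", "2016-07-02T00:00:00"),
])
csv_text, watermark = incremental_dump(conn, "events", "2016-07-01T00:00:00")
print(watermark)  # only row 2 is newer than the starting watermark
```

Each run you'd upload the CSV to S3, COPY it into Redshift, and persist the new watermark for the next run.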

On Monday, July 25, 2016 at 9:24:36 PM UTC+2, Alex Glaros wrote:
>
> Am not well versed in architecture so hope this question even makes sense
>
> I like how Amazon Redshift looks and was wondering if it's possible to 
> have my web2py transactional Postgres database on PythonAnywhere 
> continually copy data in real time to Amazon Redshift for users to create 
> their own reports with?
>
> If yes, what is conceptual overview to accomplish that?
>
> thanks,
>
> Alex Glaros
>

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.