This is a standard issue with any properly managed development environment.
You could add the images to a git/dvcs repo. You could do the same for SQL
dumps, though SQL files tend to get large.

I'd say you can't go too wrong adding product images to a repo. We do this
with a huge repo (~2 GB of images). We have scripted syncs that dump the
dbs, auto-add & commit any image changes on one server, then ssh to another
server where we pull & merge the changes and apply the rsync'd SQL file to
our Postgres dbs.
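To give you the flavor, here's a minimal sketch of the auto-add & commit
step, exercised against a local throwaway repo. All paths, names, and the
commit message here are placeholders, not our actual setup -- adapt them to
your own servers.

```shell
#!/bin/sh
# Sketch: auto-add & commit any image changes, skip if nothing changed.
set -e

IMAGES_DIR=$(mktemp -d)              # stand-in for your product-images repo
cd "$IMAGES_DIR"
git init -q
git config user.email "sync@example.com"   # placeholder identity
git config user.name  "sync-bot"
git commit -q --allow-empty -m "init"

# Simulate a new product image landing in the repo.
echo "fake image bytes" > product-123.jpg

# Auto-add and commit anything that changed; do nothing if the tree is clean.
git add -A
if ! git diff --cached --quiet; then
    git commit -q -m "auto-sync: image changes $(date -u +%Y-%m-%d)"
fi

# On the receiving server (not run here) you would then do something like:
#   git pull
#   psql -d storedb -f /path/to/rsynced_dump.sql
git log --oneline -1
```

The `git diff --cached --quiet` guard is what keeps the cron job from
piling up empty commits on quiet days.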

Bash scripting is great for this. Heck, you can toss a little Python in
there, too, if you wish. Then just put it on a schedule.
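Putting it on a schedule is one crontab line. The script path and log path
below are hypothetical; point them at wherever your sync script lives.

```shell
# Hypothetical crontab entry: run the sync nightly at 02:30,
# appending stdout and stderr to a log file.
30 2 * * * /usr/local/bin/sync-store.sh >> /var/log/sync-store.log 2>&1
```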

Easy.

On Wed, Oct 7, 2009 at 7:40 AM, bab <[email protected]> wrote:

>
> After *finally* getting Satchmo to work (thanks all for the help), I
> was wondering what is the best (problem-free) way to transfer
> products, pictures, option group, and shipping data between stores?
>
> I have a local instance of Satchmo running in a VMware virtual machine
> on my laptop and a live instance hosted online (both using the latest
> trunk).
>
> I tried creating and configuring all the products and shipping data
> etc. in the local instance and then exporting (as XML and Python)
> everything and importing the data to the live shop, but the import
> always fails.
>
> I am thinking of just copying the info from the local database to the
> live database (both postgresql) but am unsure how to do this as there
> are so many tables and I'm not sure which ones are needed.
>
> I am sure others have had this problem. How did you solve it?
>
> Thanks.
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Satchmo users" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/satchmo-users?hl=en
-~----------~----~----~----~------~----~------~--~---
