Hi,
I've had to do something similar to this myself... my suggestion would be to
have a look at the multiple-db-support branch. Here is some code that I use
on top of that branch to connect to different databases that share the same
schema:


This function is from the wiki article
http://code.djangoproject.com/wiki/DynamicModels with some changes:

from django.db import models

def create_model(name, fields=None, app_label='', module='', options=None,
                 admin=None):
    "One example of how to create a model dynamically at run-time"

    class Meta:
        # Using type('Meta', ...) gives a dictproxy error during model creation
        pass

    if app_label:
        # app_label must be set using the Meta inner class
        setattr(Meta, 'app_label', app_label)

    # Update Meta with any options that were provided
    if options is not None:
        for key, value in options.items():
            setattr(Meta, key, value)

    # Set up a dictionary to simulate declarations within a class
    attrs = {'__module__': module, 'Meta': Meta}

    # Add in any fields that were provided
    if fields:
        attrs.update(fields)

    # Create an Admin inner class if admin options were provided
    # (admin, like options, is expected to be a dict of attribute names to values)
    if admin is not None:
        class Admin:
            pass
        for key, value in admin.items():
            setattr(Admin, key, value)
        attrs['Admin'] = Admin

    # Create the class, which automatically triggers ModelBase processing
    return type(name, (models.Model,), attrs)
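
For example (the app name, module path and fields here are made up purely for
illustration), you could create a simple model at run-time like this:

# Purely illustrative: 'myapp', the module path and the fields are hypothetical.
from django.db import models

person_fields = {
    'name': models.CharField(maxlength=100),   # max_length on newer Django versions
    'created': models.DateTimeField(auto_now_add=True),
}
Person = create_model('Person', person_fields, app_label='myapp',
                      module='myapp.models')
# Person now behaves like any other model class, e.g. Person.objects.all()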


The next part depends on the multiple-db-support branch, which uses a
dictionary-like structure in settings.py to specify connections to the
additional databases, like so:

OTHER_DATABASES = {
    'oss1': {'DATABASE_ENGINE': 'oracle',
             'DATABASE_NAME': '....',
             'DATABASE_USER': '....',
             'DATABASE_PASSWORD': '....',
             },
    'oss2': {'DATABASE_ENGINE': 'oracle',
             'DATABASE_NAME': '....',
             'DATABASE_USER': '....',
             'DATABASE_PASSWORD': '.....',
             },
    'dumposs1': {'DATABASE_ENGINE': 'sqlite3',
                 'DATABASE_NAME': '/path/to/sqlite.db',
                 },
    'dumposs2': {'DATABASE_ENGINE': 'sqlite3',
                 'DATABASE_NAME': '/path/to/sqlite.db',
                 },
}
There's no reason that you couldn't build this structure manually, but there
are a few caveats that you'd have to read through the source to grok.
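
For instance, nothing stops you from assembling that dictionary in settings.py
rather than writing every entry out by hand; this is only a sketch and just
reuses the same keys as the snippet above:

# Sketch: build OTHER_DATABASES in a loop instead of spelling each entry out.
OTHER_DATABASES = {}
for alias in ('oss1', 'oss2'):                       # hypothetical aliases
    OTHER_DATABASES[alias] = {
        'DATABASE_ENGINE': 'oracle',
        'DATABASE_NAME': '....',                     # fill in per database
        'DATABASE_USER': '....',
        'DATABASE_PASSWORD': '....',
    }
for alias in ('dumposs1', 'dumposs2'):
    OTHER_DATABASES[alias] = {
        'DATABASE_ENGINE': 'sqlite3',
        'DATABASE_NAME': '/path/to/%s.db' % alias,   # hypothetical path
    }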


Here is my function that dynamically creates a copy of your model (and of any
other models in the same database that are referred to by a RelatedField):
from django.db import connections   # provided by the multiple-db-support branch

def duplicate_model(model, other_db):
    """Given model and other_db (which is a key in connections), create a new
    model called model.__name__ + other_db, where the _default_manager's db
    points to the correct connection object."""
    # Copy the Meta options we care about from the original model
    meta_opts = ['db_table', 'db_tablespace', 'get_latest_by',
                 'order_with_respect_to', 'ordering', 'unique_together',
                 'verbose_name', 'verbose_name_plural']
    options = dict([(k, getattr(model._meta, k)) for k in meta_opts])
    name = "%s_%s" % (model.__name__, other_db)
    app_label = model._meta.app_label
    module = model.__module__
    # Reuse the original model's field instances
    fields = dict([(f.name, f) for f in model._meta.fields])
    # Copy the __str__ method
    fields['__str__'] = lambda self: model.__str__.im_func(self)
    new_cls = create_model(name, fields, app_label, module, options)
    # Point the new model's default manager at the other connection
    new_cls._default_manager.db = connections[other_db]
    # Change related fields to also point to a model in the other_db
    from django.db.models.fields.related import RelatedField
    flds = new_cls._meta.fields + new_cls._meta.many_to_many
    for f in flds:
        if isinstance(f, RelatedField):
            to = f.rel.to
            if to == model:
                new_to = new_cls
            else:
                new_to = duplicate_model(to, other_db)
            f.rel.to = new_to
    return new_cls
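
As a usage sketch (the Item model and the 'oss2' alias are just examples),
given a model defined against your default database you'd get a copy bound to
another connection like this:

# Hypothetical usage: Item is an ordinary model, 'oss2' a key in OTHER_DATABASES.
from myapp.models import Item

Item_oss2 = duplicate_model(Item, 'oss2')

# Queries on the duplicated class go through the 'oss2' connection
items = Item_oss2._default_manager.all()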

Using something like this you should be able to define all of your models in
one place and "translate" them to different databases as needed. Note that
my project is small (and internal), so performance isn't an issue for me; I
don't know how much overhead this approach would entail. Maybe some more
experienced Pythonistas could shed some light or offer opinions..?

Just to note, I've also used this to copy databases from one type to another
(which my settings.py snippet above might allude to). It's a nice way to copy
one QuerySet's worth of data from one database to another (in my case, from
the Oracle db at work to an SQLite db on my laptop so I can work at home).
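
A minimal sketch of that copying trick, again with a hypothetical Item model
and the 'dumposs1' alias from my settings snippet (the target tables have to
exist in the sqlite db already):

# Sketch: copy one QuerySet's worth of rows into the 'dumposs1' database.
from myapp.models import Item

Item_dump = duplicate_model(Item, 'dumposs1')

for obj in Item.objects.all():       # or whatever QuerySet you want to copy
    data = dict((f.attname, getattr(obj, f.attname)) for f in Item._meta.fields)
    Item_dump(**data).save()         # saved through the 'dumposs1' connection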

Feel free to give me a shout if this is the sort of thing that you need and
I can help further.
Ben

On 02/10/2007, Russell Keith-Magee <[EMAIL PROTECTED]> wrote:
>
>
> On 10/2/07, bluesky <[EMAIL PROTECTED]> wrote:
> >
> > So information about the connection to db should be stored on session,
> > not globally. Is this possible with django?
>
> Well, anything is possible, but not necessarily easy. It certainly
> isn't a built-in feature - at least, not like you describe.
>
> There are possibly other ways to achieve what you want - you might
> want to look at the sites framework, and consider the ways that you
> could deploy multiple instances of your Django application, with a
> common login interface that redirects to the appropriate site after
> login.
>
> Yours,
> Russ Magee %-)
>


-- 
Regards,
Ben Ford
[EMAIL PROTECTED]
+6281317958862
