Short version: Is there any way to override the ReadOnly behavior of AR3 when a SELECT clause is specified?
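What I'm after is something along these lines (untested; readonly(false) is the relation method I'm hoping will override it, and the model/column names are just placeholders):

  # Untested -- just illustrating the intent; model/column names are placeholders.
  people = Person.select("id, first_name, last_name").readonly(false)

  person = people.first
  person.first_name = "Greg"
  person.save  # hoping this saves instead of raising ActiveRecord::ReadOnlyRecord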
I suspect I know why that change was made, but I have a Rails conversion / legacy data situation where I really need those returned models to be updatable.

Long version: a collection of related apps shares 4 databases and over 250 tables, with some of those tables having over 200 fields. The schema was designed around dbm philosophies, not OO/ORM philosophies. From a Rails perspective those tables should be broken down into smaller tables with a boatload of has_one associations defined, but we can't do that. While I'm converting this one app to Rails, the other apps using the databases won't be converted just yet, so I have to leave the schema 99.9% alone until all the apps are on Rails; only then can we refactor the schema.

That means I need a plan for evolving the models and the schema over time. I'm hoping to find ways to define small models that each use only a portion of those large tables (rough sketch below). ActiveRecord probably won't be very good at that, given the way it reflects on the table schema, so having SELECTs not be read-only is critical. I know I could use AR2 syntax, but I'd rather not if I don't have to. If I do, then fine.

I'm also looking at DataMapper, since it might let me define many classes that each use only a subset of the fields from the large tables. Whichever of the two ORMs gives me the most efficient path to evolving the schema wins.

-- gw
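
P.S. To make the "small models over a portion of a big table" idea concrete, here's roughly what I'm picturing (the table and column names are invented, and I haven't tested whether AR3's schema reflection tolerates a select-only default scope):

  # Sketch only -- table/column names are invented.
  class ContactInfo < ActiveRecord::Base
    set_table_name "tbl_customer_master"   # one of the 200+ column legacy tables

    # Pull only the handful of columns this model cares about.
    default_scope select("id, cust_name, cust_email, cust_phone")
  end

  # Records loaded through a custom select currently come back read-only,
  # hence the original question -- force them writable for updates.
  contact = ContactInfo.readonly(false).first
  contact.cust_email = "[email protected]"
  contact.save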

