On Sun, Aug 29, 2021 at 7:51 PM Rob Galanakis <[email protected]> wrote:

> I am trying to optimize some code to use bulk creation of models - on the
> order of dozens, not hundreds - and I still need model instances, so raw
> dataset methods aren't suitable.
>
> I don't see a way to do this with Sequel - we have Dataset#import and
> #multi_insert, and they can return an array of primary keys. But there is
> internal machinery in Model (@new and @modified attributes among others)
> that is unavailable, so even if I set the primary key value, the model
> still thinks it is new. It'd be nice at times to "enqueue" an unsaved model
> for bulk insert, save it, and then use it once saved.
>
> Is this something reasonable to add? Django, which for the most part I
> loathe, has this sort of thing (of course its bulk create is entirely
> hand-rolled and does not use the underlying database machinery, so it is
> filled with other issues). It would also seem possible to work around this by
> being able to update a model in-place, or something like 'refreshing' it
> from some data rather than the database (`_clear_changed_columns` is not
> enough because of the internal attributes).
>
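
For reference, the situation described above looks roughly like this (just a
sketch; "Album" is a hypothetical model, and it assumes an adapter such as
PostgreSQL where multi_insert can return primary keys):

  rows = [{name: 'A'}, {name: 'B'}]
  # Returns an array of primary key values on adapters that support it.
  pks = Album.dataset.multi_insert(rows, return: :primary_key)

  album = Album.new(rows.first)
  album.id = pks.first
  album.new?  # => true -- the instance still considers itself unsaved
  album.save  # would issue another INSERT rather than an UPDATE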

It doesn't sound like something I'd want to add to Sequel::Model, but I'd
have to see an implementation.  There would have to be a significant
advantage over the equivalent of:

  enumerable.each{ModelClass.create(...)}

It would also have to be general enough that I could see a substantial
number of people using it.

You might want to start out creating a plugin for what you want.
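
A rough sketch of what such a plugin could look like (the bulk_create name
and the whole approach are hypothetical; it assumes the same
:return=>:primary_key support, a single-column primary key, and it would not
reflect any server-set defaults other than the primary key):

  module Sequel
    module Plugins
      module BulkCreate
        module ClassMethods
          # Insert all hashes in a single statement, then build instances
          # that already know they are persisted (Model.load marks them as
          # existing rather than new).
          def bulk_create(hashes)
            pks = dataset.multi_insert(hashes, return: :primary_key)
            hashes.zip(pks).map do |values, pk|
              load(values.merge(primary_key => pk))
            end
          end
        end
      end
    end
  end

  # Usage (hypothetical):
  #   Album.plugin :bulk_create
  #   albums = Album.bulk_create([{name: 'A'}, {name: 'B'}])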

Thanks,
Jeremy
