On Tue, Nov 8, 2016 at 12:34 PM, Patryk Zawadzki <pat...@room-303.com>
wrote:

> I've just hit another problem related to custom fields.
>
> Currently migrations contain information about "rich" fields. If you use a
> custom field type, the migration code will currently import your field type
> from its Python module. This is highly problematic in case either the code
> moves or you later stop using that field type and want to remove the
> dependency.
>
> I am currently in the process of rewriting some of my existing migrations
> by hand to replace all instances of a custom field type with the type it
> actually uses for storage. This will eventually allow me to drop the
> dependency but it's not very nice.
>
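The hand edit described above can be sketched roughly like this (the `myapp.fields.JSONField` path is invented for illustration). A migration records each field by its deconstructed import path, so swapping the custom class for its storage type is what removes the external import:

```python
# Deconstructed field triples as a migration would record them:
# (field name, import path, positional args, keyword args).
# "myapp.fields.JSONField" is a made-up custom field stored as text.
before = ("data", "myapp.fields.JSONField", [], {})
after = ("data", "django.db.models.TextField", [], {})

def external_dep(field_triple):
    """Return the non-Django module this field would force the migration
    to import, or None if only Django core is needed."""
    _, path, _, _ = field_triple
    module = path.rsplit(".", 1)[0]
    return None if module.startswith("django.") else module

print(external_dep(before))  # -> "myapp.fields"
print(external_dep(after))   # -> None
```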

This was a hard choice to make - I was obviously aware of the risks here,
but I eventually chose the current system given that it's far easier to
reset migrations and start over in Django than it was in South, and that
removing code is generally rarer than adding it.


>
> Another problem is that for many custom field types makemigrations detects
> changes to arguments that do not affect the database in any way (as
> they are returned by deconstruction).
>

This has to be done unless fields come with a list of keyword arguments
that are "known safe", and all subclasses of those fields also implement
that method (in case you e.g. subclass StringField and make the `choices`
kwarg actually use a MySQL ENUM).
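To sketch the deconstruct() contract without Django installed (ColorField and its `palette` kwarg are invented for illustration): everything a field returns from deconstruction is compared between runs, and makemigrations has no reliable way to know which kwargs are database-neutral:

```python
class ColorField:
    """Hypothetical custom field stored as varchar(7); `palette`
    only affects the form widget, never the column."""

    def __init__(self, palette="web", max_length=7):
        self.palette = palette
        self.max_length = max_length

    def deconstruct(self):
        # Everything returned here is compared by makemigrations, so a
        # changed `palette` triggers a new migration even though the
        # column is identical either way. Omitting it would hide the
        # change - but a field reloaded from the migration would then
        # silently lose the setting.
        kwargs = {"max_length": self.max_length, "palette": self.palette}
        return ("myapp.fields.ColorField", [], kwargs)

_, _, kwargs = ColorField(palette="print").deconstruct()
```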


>
> If we could ever break backwards compatibility, I'd suggest having field
> deconstruction only return the column type (and necessary arguments) it
> wants the schema editor to create. This would prevent the migrations from
> having external dependencies (which is a major win in itself).
>

That's not possible if you want to keep the migrations database-agnostic,
as the type of a column varies based on the backend (and sometimes other
things). If you want a system that is fixed to an exact database, at some
point it might be better to just use SQL.

(There is totally room for generating migrations as raw SQL and still
having them work in the current system, which would also get around the
field problem you describe.)
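As a rough sketch of what a raw-SQL step amounts to (table and column names invented; exercised here against stdlib sqlite3 rather than through Django's machinery): the migration carries a forward/reverse SQL pair and never needs to import any field class:

```python
import sqlite3

# Forward/reverse SQL pair - the shape a raw-SQL migration step would
# take. No Python field type is imported; the schema change is the SQL.
FORWARD = "ALTER TABLE entry ADD COLUMN data text"
REVERSE = "ALTER TABLE entry DROP COLUMN data"  # for rolling back

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entry (id integer PRIMARY KEY)")
conn.execute(FORWARD)

# Inspect the resulting schema: PRAGMA table_info rows are
# (cid, name, type, notnull, default, pk).
cols = [row[1] for row in conn.execute("PRAGMA table_info(entry)")]
print(cols)  # -> ['id', 'data']
```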


>
> I'd also consider having apps.get_model() just use introspection to read
> the schema and return transient models with default field types for each
> underlying column type (so a custom JSONField would become a regular boring
> TextField inside migration code). This would save us tons of "rendering
> model states" time for the relatively small cost of having to cast certain
> columns to your preferred Python types inside a couple of data migrations.
>
>
This runs into issues when the schema you read does not give you enough
information - e.g. some field types (especially geospatial ones) are more
than just a column: there can also be a sequence, some indexes, constraints
etc. involved.
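The quoted suggestion amounts to a lookup along these lines (the mapping and fallback are invented for illustration), and the failure mode is visible in the fallback: any column type the mapping doesn't know collapses into a generic field, and the companion sequences, indexes and constraints vanish entirely:

```python
# Hypothetical column-type -> "default boring field" mapping, as the
# introspection-based apps.get_model() suggestion would imply.
DEFAULT_FIELD_FOR_COLUMN = {
    "text": "TextField",
    "varchar": "CharField",
    "integer": "IntegerField",
}

def transient_field(column_type):
    """Map an introspected column type to a default field name.
    A custom JSONField stored as text comes back as a plain TextField;
    anything unrecognised (e.g. a geometry column) falls back too."""
    base = column_type.split("(")[0].lower()
    return DEFAULT_FIELD_FOR_COLUMN.get(base, "TextField")

print(transient_field("varchar(100)"))  # -> "CharField"
print(transient_field("geometry"))      # -> "TextField" (information lost)
```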

I wrote a more advanced introspection backend as part of the migrations
work, but it would need to be extended even further, with improvements to
features like foreign key inference, before this would be possible.

Andrew

-- 
You received this message because you are subscribed to the Google Groups 
"Django developers  (Contributions to Django itself)" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to django-developers+unsubscr...@googlegroups.com.
To post to this group, send email to django-developers@googlegroups.com.
Visit this group at https://groups.google.com/group/django-developers.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/django-developers/CAFwN1uq9d6h79uujMr%2BKtOTim_1HQ810qA9ZTu%2BnewF%2BdRZ%2B1g%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
