On Fri, Jun 29, 2012 at 11:08 AM, Nick Sutterer <[email protected]> wrote:
> I just got educated by drogus that the problem is the module_eval with
> include:
This is a gotcha in Ruby.
As you probably know, the interpreter keeps a linear representation of the
ancestors of a class or module. Algorithms that walk up the ancestors
iterate over that linearization following pointers, as in a linked list,
rather than recursing through the tree of includes, which would be the
immediate mental model for this.
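That flat list is visible from Ruby itself via Module#ancestors. A minimal sketch (the class and module names here are made up for illustration):

```ruby
module M
end

class C
  include M
end

# The interpreter stores the ancestors as a precomputed flat list,
# not as a tree of includes:
p C.ancestors # => [C, M, Object, Kernel, BasicObject]
```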
OK, if you add methods or constants to any class or module in that ancestry
chain, they are seen:
module M
end

class C
  include M
end

module M
  def foo
  end
end

C.new.foo # WORKS
If you include more modules into the base class or module, their methods
are found:
class C
end

module M
  def foo
  end
end

class C
  include M
end

C.new.foo # WORKS
The linear ancestry that exists behind the scenes gets updated to reflect
the new ancestor.
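For example (a sketch, assuming plain CRuby), you can watch the stored linearization grow as the module is included:

```ruby
class C
end

module M
  def foo
    :foo
  end
end

p C.ancestors.include?(M) # => false

class C
  include M
end

# The stored linearization has been updated in place, so method
# lookup now finds M#foo:
p C.ancestors.include?(M) # => true
p C.new.foo               # => :foo
```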
But once the linearization is done, if you modify the 2nd-degree ancestors,
those changes are not propagated:
module M
end

class C
  include M
end

module N
  def bar
  end
end

module M
  include N
end

C.new.bar # DOES NOT WORK
As you can see, the linearization of the ancestry chain of C does not get
updated with N, even though if you computed the ancestry chain at that very
moment, N should belong to it.
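One workaround is to re-include M into C after N has been mixed in, which makes the interpreter recompute the insertion. A sketch (note: Ruby 3.0 later changed Module#include to propagate to classes that already include the receiver, so on modern Rubies the original example works without this step):

```ruby
module M
end

class C
  include M
end

module N
  def bar
    :bar
  end
end

module M
  include N
end

# Re-including M walks M's (now longer) ancestor list again and
# inserts any modules missing from C's linearization, here N:
class C
  include M
end

p C.new.bar # => :bar
```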
I asked for the rationale behind this; conceptually it does not seem coherent.
Matz said it could be changed if someone came up with an implementation that
was performant enough[*].
Xavier
[*] http://www.ruby-forum.com/topic/1458576