1) during/after the loading of a webpage finishes, it is mapped to a DOM tree to be rendered, right?


It's not mapped, it _is_ a DOM tree.  But the DOM is not what's rendered.

2) and the algorithm used for rendering must walk through this DOM tree, right?!


No, the rendering algorithm walks the layout object tree, which is constructed from the DOM tree. And it doesn't walk the whole thing in order, since that wouldn't work with z-index...
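To illustrate why a plain in-order walk isn't enough: positioned children with z-index are painted in stacking order, not document order. A toy sketch (plain objects standing in for layout objects; the names are mine, not Gecko's actual frame classes):

```javascript
// Toy "layout nodes": { name, zIndex, children }.
function paintSubtree(node, out) {
    // Paint children in stacking order: sort by z-index (a stable
    // sort keeps document order for equal z-indices).
    var kids = node.children.slice().sort(function (a, b) {
        return (a.zIndex || 0) - (b.zIndex || 0);
    });
    out.push(node.name);
    for (var i = 0; i < kids.length; i++) {
        paintSubtree(kids[i], out);
    }
    return out;
}

// "b" comes later in the tree, but its z-index of -1 makes it
// paint before "a".
var tree = {
    name: "root", zIndex: 0, children: [
        { name: "a", zIndex: 0, children: [] },
        { name: "b", zIndex: -1, children: [] }
    ]
};
// paintSubtree(tree, []) -> ["root", "b", "a"]
```

The real painting algorithm is much more involved (stacking contexts, backgrounds vs. content, floats), but the point is the same: paint order is not tree order.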

3) if so, I would make some changes to it, making good use of this walking. Actually, during this walking, I will just compare if the current node


The layout objects do not map 1-1 to nodes. It's actually a many-many relationship, in general.

** Does my rationale make sense to you?


Sort of, yes.

Got it! Excellent. Your answer clarified many points in my mind.


** in the worst case (I am totally wrong), what can I do?


I still suggest using a TreeWalker or NodeIterator. Or caching your list of relevant DOM nodes and invalidating the cache on DOM mutations as needed.
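For reference, a TreeWalker with an acceptNode filter might look roughly like this; the `acceptZoomable` predicate is mine, just to mirror the IMG check, and I'm guarding the DOM call so the sketch stands alone:

```javascript
// Filter predicate: accept only the nodes the zoom pass cares
// about (here, just images, as an example).
function acceptZoomable(node) {
    return node.nodeType === 1 && node.localName === "img";
}

if (typeof document !== "undefined") {
    // Walk element nodes only; the filter skips everything but <img>.
    var walker = document.createTreeWalker(
        document.documentElement,
        NodeFilter.SHOW_ELEMENT,
        {
            acceptNode: function (node) {
                return acceptZoomable(node)
                    ? NodeFilter.FILTER_ACCEPT
                    : NodeFilter.FILTER_SKIP;
            }
        },
        false  // entityReferenceExpansion (DOM 2 Traversal signature)
    );
    for (var n = walker.nextNode(); n; n = walker.nextNode()) {
        // ... apply the scale factor to n here ...
    }
}
```

Note the filter is still evaluated on every traversal; by itself a TreeWalker doesn't avoid the per-walk comparisons, it just structures them.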

I got that the usage of a TreeWalker would be really useful. The problem with this approach is that, afaik, it (the TreeWalker) uses filters to select the nodes (elements, attrs, and so on) according to its purpose, right? My current algorithm does something like that (it is a pretty simple JS file):

function czZoomElements(node, scaleFactor) {
    var childNodes = node.childNodes;

    var childNode = null;
    for (var i = 0; i < childNodes.length; i++) {
        childNode = childNodes[i];
        if (childNode.nodeType == 1) {  // element nodes only
            czZoomElements(childNode, scaleFactor);
        }
    }

    // for frames:
    if (node.contentDocument) {
        czZoomElements(node.contentDocument, scaleFactor);
    }

    try {
        if (node.localName == "IMG") {
            // ... quoted ...
        }
    } catch (e) { /* ... */ }

    // text size handling
    // (...)
}

So, as far as I understood, this filter is applied during every walk through the tree. If so, it would not speed up the performance as I expected. My current algorithm above does many comparisons as well (filtering information).

IMHO, the best case would be if this new data structure (a linked list, or another DOM tree with just the relevant elements stored in it, as you suggested) were created once, cached in memory, and invalidated on DOM mutations as needed (as you said).

any thoughts ?

regards

---Antonio Gomes
Nokia Technology Institute
_______________________________________________
mozilla-layout mailing list
[email protected]
http://mail.mozilla.org/listinfo/mozilla-layout
