The real problem is never exactly where you are looking.
By process of elimination (and Venkman), I discovered that the script
was stalling because an entirely different part of it was trying to
find any flash messages on the page and make them auto-hide after 5
seconds.
There weren't any, but because there was 135K of table code and a
zillion potential elements on the page that might match $$('.flash'),
that's where the stall was happening. As soon as I tightened the
selector to $$('div#file_table div.flash'), the page loaded almost
immediately.
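For reference, here is a minimal sketch of what the auto-hide code looks like with the tightened selector. The 5-second delay and the selector come from the description above; everything else (the function name, the load hook) is illustrative, not the original script:

```javascript
// Hedged sketch, assuming Prototype.js is loaded on the page.
function autoHideFlashes() {
  // Scoping $$() to the table container keeps Prototype from scanning
  // every element in 135K of markup looking for '.flash'.
  $$('div#file_table div.flash').each(function (flash) {
    window.setTimeout(function () { flash.hide(); }, 5000);
  });
}
// On the page, run it once the document has loaded:
// Event.observe(window, 'load', autoHideFlashes);
```

The win is entirely in the selector: an id-rooted expression like 'div#file_table div.flash' lets the engine start from one known element instead of testing every node in the document against '.flash'.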
Thanks again to Jerod for the idea -- this is amazingly quick now that
I've found the rest of the problem.
Walter
On Aug 27, 2007, at 8:46 PM, Walter Lee Davis wrote:
>
> Thanks, that speeds it up by over 100%, but it's still throwing one
> "unresponsive script" warning per page load. I'll try a little more
> tinkering...
>
> Thanks again,
>
> Walter
>
> On Aug 27, 2007, at 6:53 PM, Jerod Venema wrote:
>
>> Assuming I'm understanding your issue....
>>
>> On 8/27/07, Walter Lee Davis <[EMAIL PROTECTED]> wrote:
>>> I have a script which attaches an "add this to my cart" control to
>>> one
>>> of N rows of files in a Web page using invoke on a collection
>>> gathered
>>> with $$('#mytable .mycontrol'). On most pages, this works great.
>>>
>>> But I have this one killer page with over 3,000 rows in the table.
>> Can you limit the number of rows returned? Some pagination might help.
>> I can't think of many situations where anyone would really need to see
>> 3000+ rows of information. In the worst case, maybe you can somehow
>> group the elements, and only retrieve the elements in the group if
>> someone selects the group?
>>
>>> Once the script finishes loading, it runs really well. But that
>>> initial
>>> load takes so long that I have to press the "keep trying" button
>>> three
>>> times in Firefox.
>>>
>>> I seem to remember seeing a mention of using window.setTimeout to
>>> trigger a kind of interrupt in long-running scripts, and keep them
>>> from
>>> timing out like this. But I can't find any examples in my archives of
>>> this list or on Google.
>>> Does anyone have any idea how to make this sort of thing work, or any
>>> other suggestions about how I could speed up this initial load? Is
>>> there another more efficient way to iterate over 3,000 items like
>>> this?
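For the record, the setTimeout "interrupt" pattern asked about here looks like the sketch below: walk the collection in small chunks and yield control back to the browser between chunks, so the long loop never trips the "unresponsive script" warning. The function name, chunk size, and callbacks are all illustrative, not from the original script:

```javascript
// Hedged sketch of chunked iteration with setTimeout.
function processInChunks(items, perChunk, eachItem, done) {
  var i = 0;
  function nextChunk() {
    var end = Math.min(i + perChunk, items.length);
    for (; i < end; i++) {
      eachItem(items[i]);
    }
    if (i < items.length) {
      window.setTimeout(nextChunk, 0);  // yield, then continue where we left off
    } else if (done) {
      done();
    }
  }
  nextChunk();
}
// e.g. processInChunks($$('#mytable .mycontrol'), 50, attachControl);
```

This doesn't make the work faster -- it just spreads it across the event loop so the page stays responsive while the 3,000 rows are wired up.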
>> I'm assuming from your description that you're observing all 3000+
>> items. If there's no way to reduce the number of elements on the page,
>> I'd suggest you just observe the table, and when the click happens,
>> determine what row/cell was clicked in the handler. That way, you only
>> have to call the "invoke" method once.
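A minimal sketch of the delegation approach Jerod describes, assuming Prototype.js; the element ids, class names, and addToCart function are illustrative placeholders, not the original code:

```javascript
// Hedged sketch: one observer on the table instead of 3,000+ per-row handlers.
function wireUpCart(tableId) {
  Event.observe(tableId, 'click', function (event) {
    var source = Event.element(event);            // the element actually clicked
    var control = source.hasClassName('mycontrol')
      ? source
      : source.up('.mycontrol');                  // click may land on a child node
    if (control) addToCart(control.up('tr'));     // hand the enclosing row to the cart code
  });
}
// On the page: Event.observe(window, 'load', function () { wireUpCart('mytable'); });
```

Since the single handler is attached up front, the initial page load no longer iterates the whole collection at all; the per-row work moves to click time.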
>>
>>> Thanks in advance,
>>>
>>> Walter
>>>
>
>
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Ruby
on Rails: Spinoffs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/rubyonrails-spinoffs?hl=en
-~----------~----~----~----~------~----~------~--~---