There's an extension called LiveHTTPHeaders which lets you capture the
relevant request information and save it to a file.

http://livehttpheaders.mozdev.org/

Open it from the Tools menu, do the things that are slow while it's
open, then click "save all" and save the output to a file. Don't post
the file publicly, since it will contain your login cookies, but you
can send it to Krinkle as an email attachment if he wants it.
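
If you'd rather check from the command line, something like the
following Python sketch prints the response headers for a single
resource, including the cache-related ones (Cache-Control, Expires,
Last-Modified) that control how often the browser re-fetches it. The
URL is only an example; substitute whatever bits.wikimedia.org resource
seems to be re-downloaded.

    # Minimal sketch: dump response headers for one resource using only
    # the Python standard library. The URL is an example, not the exact
    # resource from this thread.
    import urllib.request

    url = ("https://bits.wikimedia.org/en.wikipedia.org/"
           "load.php?modules=startup&only=scripts")
    req = urllib.request.Request(url, method="HEAD")

    with urllib.request.urlopen(req) as resp:
        # Cache-Control, Expires and Last-Modified tell you how long
        # the browser may keep the file before asking for it again.
        for name, value in resp.getheaders():
            print(f"{name}: {value}")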

-- Tim Starling

On 10/08/12 11:21, Eugene Zelenko wrote:
> Hi, Krinkle!
> 
> I'm very sorry for the beginner question, but how could I get such a
> log in Firefox 14? Is there an extension available which could dump
> all pages downloaded to view a particular page, with timestamps? Or
> maybe Firefox can do this itself?
> 
> Eugene.
> 
> On Thu, Aug 9, 2012 at 7:59 AM, Krinkle <krinklem...@gmail.com> wrote:
>> On Aug 9, 2012, at 4:49 PM, Eugene Zelenko wrote:
>>
>>> Hi!
>>>
>>> I noticed that content from bits.wikimedia.org (including WikiEditor)
>>> is updated quite regularly - roughly every 20 minutes on Commons.
>>>
>>> Such behavior definitely creates problems for users with slow
>>> connections or with paid data traffic.
>>>
>>> Are JavaScript/CSS really updated so often?
>>>
>>> Eugene.
>>>
>>
>> Can you elaborate a bit? (urls, timestamps, http headers, ..)
>>
>> -- Krinkle
>>



