Re: Filtering webpages

2016-01-28 Thread Vince M Hudd
Gavin Wraith  wrote:

> I would also like to be able to discriminate content by source URL and to
> give permissions for which should be blocked or which allowed through.

At the domain level, you could add entries to your hosts file for sources
you want blocked, along the lines of:

127.0.0.1   google-analytics.com

This will effectively stop websites you visit on that computer *with any
browser* from loading anything from that domain.
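
Hosts files generally don't support wildcards, so each hostname you want to
block needs its own entry, e.g. (domains picked purely as illustrations):

127.0.0.1   google-analytics.com
127.0.0.1   www.google-analytics.com
127.0.0.1   doubleclick.net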

If you can do something similar in your router, you will achieve the same
result for any computer on your network.

-- 
Vince M Hudd
Soft Rock Software

Don't forget to vote in the 2015 RISC OS Awards:
www.riscosawards.co.uk/vote2015.html



Re: Very slow page rendering

2016-01-28 Thread Ole Loots
On Thursday, 28.01.2016, at 21:04 +, Peter Slegg wrote:

> 
> https://www.royalmail.com/track-your-item
> 
> Another page that took ages to display and looked like the css had
> failed as well.
> 

Early versions of the Atari port crashed because of stack-size issues in the
mintlib regex module (called by the NetSurf CSS parser) while it was
processing exorbitant strings: inputs of around 3k expanded to several
megabytes inside the regex module.

I prevented the crash by compiling mintlib with the
+DEFS=-DREGEX_MALLOC
define.

This is also applied to the CI builds. 
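
For anyone curious what the define does: mintlib's regex is derived from the
GNU regex code, where REGEX_MALLOC switches the internal buffers from alloca
(stack) to malloc (heap). A simplified sketch of the idea (not the actual
mintlib source):

#include <stdlib.h>   /* malloc, free */
#include <alloca.h>   /* alloca; header location varies between libcs */

#ifdef REGEX_MALLOC
# define REGEX_ALLOCATE(n)  malloc(n)   /* heap: multi-MB buffers are fine  */
# define REGEX_FREE(p)      free(p)
#else
# define REGEX_ALLOCATE(n)  alloca(n)   /* stack: multi-MB buffers overflow */
# define REGEX_FREE(p)      ((void) 0)  /* released automatically on return */
#endif

With the define in place every match pays for heap allocation instead of
cheap stack growth, which would also fit the slowdown guess below.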

Just a guess: there is some kind of slowdown when doing excessive malloc
operations with MiNT. 

Greets,
Ole


Re: Very slow page rendering

2016-01-28 Thread Peter Slegg

> Date: Sun, 10 Jan 2016 16:43:40 +0100
> From: Jean-François Lemaire 
> Subject: Re: Very slow page rendering
>
> On Saturday 09 January 2016 19:43:53 Peter Slegg wrote:
> > >Date: Sat, 09 Jan 2016 16:11:40 GMT
>
> > >>   Peter Slegg  wrote:
> > >>>http://git.netsurf-browser.org/netsurf.git/tree/atari/gemtk/guiwin.c
>
> > >>>This page takes about 20 mins to download and render; the Highwire browser
> > >>>takes about 6 secs.
>
> > No criticism, I am hoping this might help the devs find some speed
> > improvements.
>
> I have an Atari 2.9 version lying around and with that build it takes 100 secs
> to render that page. Still very slow but much less so than with the 3.*
> builds.
>
> Cheers,
> JFL

https://www.royalmail.com/track-your-item

Another page that took ages to display, and it looked like the CSS had
failed as well.

Peter


Re: Big push on testing needed

2016-01-28 Thread george greenfield
In message <5548c5c513cvj...@waitrose.com>
  Chris Newman  wrote:

> In article <3bc72c4755.davem...@my.inbox.com>,
>Dave Higton  wrote:
>> Big news...

>> Current test (CI) builds are now release candidates.  Yes, a new
>> release of NetSurf is imminent.

>> Please, everybody, download the latest test build (which will,
>> of course, change as bugs are found and fixed), give it a good
>> thrashing, and get your bug reports in.

>> Please also note that, since it's now close to release time, the
>> Javascript setting in Choices->Content is obeyed (and has been
>> for a couple of days or so now).

> Greetings from sunny Australia (gloat, gloat),

> The 38 Degrees petition page at

> https://secure.38degrees.org.uk/Scotland-stop-CETA

> is a bit of a pig's dinner. It takes ages to load, frames overlap, and the
> signing link doesn't work.

> Dev CI #3315   on  Virtual Acorn Adjust 4.39
> Same effects with JS on or off.

> Works OK using Maxthon browser on the Windows side.

> Does anyone see the same effects?

2.6s with JS off, 5.4s with JS on; CI #3312, 5.21 (RC14), Pi 2 @ 900MHz.
Page display is substantially different compared to Otter 0.9.09 on RISC
OS. The signing link doesn't work in NetSurf with JS on or off, but
Otter seems fully functional with JS enabled. I didn't time page loading
precisely in Otter, but it's considerably slower than NS: about
20-30 secs with JS off or on.

-- 
George