On 5/31/11 11:18 AM, Rostyslaw Lewyckyj wrote:
> Justin Wood (Callek) wrote:
>> On 5/29/2011 4:25 PM, W3BNR wrote:
>>>
>>> Personally I don't think we should have to lie about our browser.
>>
>> We (SeaMonkey Council) fully agree with you. That said, our users are
>> higher priority. KaiRo (longtime project member, and Council member) has
>> proposed, years ago, a theoretical solution to the issue (back when
>> Firefox was not even recognized by websites, fwiw), the problem is we
>> don't have the backend gecko support atm, and it would be A LOT of work
>> to try and implement on our own.
>>
>> If the support is ever there, we'll gladly use it though.
>>
> May I naively suggest the following course:
> In effect, build a user-agent switcher into SM. First let SM identify
> itself correctly. This lets the bean counters get SM into their usage
> totals. But if the host objects, then let the user, on the fly, switch
> to one of a select list of fake IDs (which would also carry a token
> identifying it as an SM alias/fake), and try the host again. (Perhaps
> the first pass through the list of fake IDs might even be scripted
> and automatic.)
> I envision the following scenario:
>   - User tries to enter/link to a site
>   - Site rejects with a message
>   - User presses a 'try faking it' button
>   - SM switches to a script of sending fake UAs until success or
>     utter failure.
> Elaborations are obvious: Record the UAs tried at the given host and
> the results. Report these as incidents to SM central, for action to
> try to convince the host owners to correct their behaviour.
> Record successes (host identifier & successful UA) to short-cut
> trial and error when going again to that host, i.e. build in a
> memory.
> 

Such a scheme would be very nice.  However, I think it might be
impossible to implement.
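For reference, the retry loop proposed above could be sketched roughly as follows. This is a minimal illustration, not SeaMonkey code: the `try_host` callback, the fake-UA list, and the dict-based success memory are all hypothetical stand-ins.

```python
# Hypothetical sketch of the proposed "try faking it" loop.
# try_host is a caller-supplied function: (host, ua) -> True on success.

FAKE_UAS = [
    # Per the proposal, each fake ID carries a token marking it as an SM alias.
    "Mozilla/5.0 (Windows NT 6.1; rv:2.0) Gecko/20100101 Firefox/4.0 (SM-alias)",
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 Chrome/11.0 (SM-alias)",
]

ua_memory = {}  # host -> UA that worked last time (the proposal's "memory")

def try_faking_it(host, try_host, fake_uas=FAKE_UAS):
    """Try the remembered UA first, then each fake UA until one succeeds."""
    candidates = ([ua_memory[host]] if host in ua_memory else []) + list(fake_uas)
    for ua in candidates:
        if try_host(host, ua):
            ua_memory[host] = ua  # short-cut trial and error next time
            return ua
    return None  # utter failure
```

The hard part is not this loop; it is implementing a reliable `try_host`, for the reasons below.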

Incorrect sniffing that looks for "SeaMonkey" instead of "Gecko" does
not usually result in an outright rejection of the browser.  Instead,
the Web page might be rendered incorrectly because non-standard HTML was
sent.  Alternatively, you get a valid Web page that states something
such as
> This operation is not currently available.
> Please contact your System Administrator and try again later.
> We are sorry for this interruption in your service. 
or something such as
> XML Parsing Error: mismatched tag. Expected: </input>.
> Location: http://store.spruebrothers.com/recent-arrivals-c373.aspx
> Line Number 60, Column 3:</form>
> --^ 
or a request (demand?) that you install the latest version of Flash even
though you already installed the latest version.  Even a human may have
trouble recognizing that what he or she sees in the browser window is
the result of invalid sniffing.  Think of the software logic to make
such a recognition.
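To illustrate how brittle that logic would be, here is a naive signature-matching heuristic. The signature strings are hypothetical examples; real failure pages vary endlessly, which is exactly the problem.

```python
# Naive, easily defeated heuristic for "this page broke because of UA
# sniffing". The signatures are hypothetical examples; most real failure
# modes (mangled rendering, bogus Flash prompts, site-specific error
# text) produce pages this check would never catch.

SNIFF_FAILURE_SIGNATURES = [
    "This operation is not currently available",
    "XML Parsing Error: mismatched tag",
    "install the latest version of Flash",
]

def looks_like_sniffing_failure(page_text):
    """Return True if the page text matches a known failure signature."""
    return any(sig in page_text for sig in SNIFF_FAILURE_SIGNATURES)
```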

Then there is a cookie problem.  Many Web sites that sniff for the UA
string set a session-only cookie indicating what user agent you are
using.  A session-only cookie persists not merely as long as you are
viewing the Web site; no, it lasts until you terminate your browser.
Having a cookie that says "This is not IE or Firefox or Chrome" often
overrides any subsequent change to your UA string.  (Note that I omitted
Safari and Opera.  Many sniffing Web sites fail to recognize those
browsers.)  You must delete the cookie when you change the UA string.
You must first determine whether a cookie was set and, if more than
one was set, which one, being careful not to delete unrelated cookies.
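Even the cookie bookkeeping alone is non-trivial. A sketch, under heavy assumptions: the cookie-name patterns are guesses, and the dict-based jar stands in for the browser's real cookie store.

```python
# Hypothetical sketch: drop only UA-related session cookies when the UA
# string changes, leaving unrelated cookies (logins, carts) untouched.
# UA_COOKIE_HINTS is a guessed list of name patterns; real sites name
# these cookies however they like, which is why this is fragile.

UA_COOKIE_HINTS = ("browser", "useragent", "ua_", "agent")

def purge_ua_cookies(cookie_jar):
    """Remove cookies whose names suggest a cached UA-sniffing result.

    cookie_jar: dict of cookie name -> value for one host.
    Returns the list of names removed.
    """
    removed = [name for name in cookie_jar
               if any(hint in name.lower() for hint in UA_COOKIE_HINTS)]
    for name in removed:
        del cookie_jar[name]
    return removed
```

A cookie named in a way this misses, or an unrelated cookie named in a way this matches, breaks the scheme either way.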

Are you beginning to see how complicated this can be?  :)

-- 

David E. Ross
<http://www.rossde.com/>

On occasion, I might filter and ignore all newsgroup messages
posted through GoogleGroups via Google's G2/1.0 user agent
because of spam from that source.
_______________________________________________
support-seamonkey mailing list
[email protected]
https://lists.mozilla.org/listinfo/support-seamonkey
