Thomas Michanek wrote:

> I have an older Mac laptop running MacOS 10.12.x
> I have installed Lynx as part of Lynxlet (version 0.8.1), which contains
> a Lynx installation with the following version information:
> > Lynx Version 2.8.6rel.5 (09 May 2007)
> > libwww-FM 2.14, SSL-MM 1.4.1, OpenSSL 0.9.7l, ncurses 5.7.20081102
> > Built on darwin8.9.1 Jun  8 2007 06:18:10

As others have stated, this version of Lynx (and associated libraries)
is just too old to do much with the modern web.  Specifically, almost
all web sites now insist on HTTPS, and most will only accept
relatively recent TLS versions.
That ancient distribution probably also lacks root-of-trust certificates
relevant to the current web.
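You can see how far behind that stack is from the version string alone:
OpenSSL 0.9.7 tops out at TLS 1.0, while virtually all HTTPS sites now
require TLS 1.2 or newer.  A quick check against whatever OpenSSL your
system has (just an illustration; the exact output varies by system):

```shell
# Print the local OpenSSL version string; anything in the 0.9.x line
# (like the 0.9.7l that Lynxlet bundles) predates TLS 1.1/1.2 entirely.
openssl version
```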

I googled 'comprehensive list of sites or sources (such as brew,
macports, and any other) for lynx binary capable of running on macos
10.12', and their Artificial Idiot claimed that you should be able to
get a suitable binary from any of: Homebrew, MacPorts, Fink; it also
mentioned Lynxlet (which you're using and is clearly not suitable), and
something called Rudix that I haven't encountered before.

Of Homebrew, MacPorts, and Fink -- I only have experience with Homebrew,
and I'm not particularly a fan, but it does mostly seem to work.  What I
especially dislike is that Mac OS has poor open source support on its
own, yet there are multiple competing and *clashing* 'ecosystems' you
might need to subscribe to if you want several open source apps.  But
this shouldn't be a concern if you only need Lynx.  See:
See:

https://brew.sh/
https://www.macports.org
https://www.finkproject.org/

(At a glance, 'Fink' looked the most likely to easily support Mac OS
10.12 without additional hassles...)

Google's AI also mentioned building Lynx from source yourself; but if
you are not experienced at such things, that is probably not the best
route.

> I want to use Lynx with the -source option to download the HTML source
> code from websites.

OR... Mac OS ships with `curl` preinstalled, and you may also have
`wget` from elsewhere; both are perfectly capable of downloading the
source code of web pages.  Run `man curl` or `man wget` to see which
you have and learn how to use it.  For instance, these three commands
are roughly equivalent:

$ lynx -source blahblah.com > blahblah.html
$ wget -O blahblah.html blahblah.com
$ curl -o blahblah.html blahblah.com

(In practice you might have to specify different parts of URLs, as each
program is likely to fill in missing details differently; e.g.
'https://', 'www.', '/index.html', etc.  And none of these methods will
help much with modern web sites that build the page dynamically with
JavaScript; you're likely to get an 'empty shell' of a web page with
none of the actual data filled in...)
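Since each tool guesses missing URL parts differently, it can help to
hand them a fully-qualified URL yourself.  A tiny helper along these
lines (my own sketch, not part of any of these tools) shows the idea:

```shell
# Sketch: turn a bare host name into the fully-qualified URL these
# download tools usually want.  Assumes https and a trailing slash
# are the desired defaults; URLs that already carry a scheme pass
# through unchanged.
full_url() {
  case "$1" in
    http://*|https://*) echo "$1" ;;
    *)                  echo "https://$1/" ;;
  esac
}
```

Used as, e.g., `curl -o blahblah.html "$(full_url blahblah.com)"`.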

>Bela<
