As an open-source maintainer of iperf 2, which is basically a network socket and traffic measurement tool, I find this history extremely interesting. Releasing a measurement tool free to all, with transparent code, gives everyone access to a "shared yardstick." While maybe not enough, hopefully it chips away a little at those 40+ years of not much.
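
For anyone who hasn't used such a tool, here is a rough sketch (my own
Python illustration, not iperf 2's actual code) of the kind of
measurement a shared yardstick like this performs: push bytes over a
TCP socket for a fixed interval and report the achieved rate.  The
host address below is a placeholder, and iperf 2 itself does far more
than this.

  import socket
  import time

  def measure_tcp_throughput(host, port, duration=10.0):
      """Send data to a listening TCP sink for `duration` seconds; return Mbit/s."""
      payload = b"\x00" * 65536            # 64 KiB per write
      sent = 0
      with socket.create_connection((host, port)) as sock:
          deadline = time.monotonic() + duration
          while time.monotonic() < deadline:
              sock.sendall(payload)
              sent += len(payload)
      return (sent * 8) / (duration * 1e6)  # bits sent / seconds -> Mbit/s

  if __name__ == "__main__":
      # 192.0.2.10 is a placeholder address; 5001 is iperf 2's default port.
      # Assumes a TCP sink (e.g. "iperf -s") is already listening there.
      print("%.1f Mbit/s" % measure_tcp_throughput("192.0.2.10", 5001))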

Bob
Good point -- "How would I know if an installation was meeting the
specs?"

It *has* been done before.  From a historical perspective...

When TCPV4 was being defined and documented in RFCs (e.g., RFC 793),
circa 1981, other activities were happening in the administrative
bureaucracy of the US government, outside the realm of the "research
community".

The US Department of Defense, which purchases huge quantities of
electronic equipment, declared TCP to be a "DoD Standard" in the early
1980s.  Further, they changed their purchasing rules so that all
equipment purchased, which might need to communicate to other
equipment, had to implement TCP.  If you wanted to sell your networked
products to the government, they had to implement TCP.   This caused
industry to suddenly pay attention to what we crazy researchers had
done in creating this TCP thing.

A separate piece of government, the US National Bureau of Standards
(now called NIST), defined a testing procedure for verifying that a
particular TCP implementation actually conformed to the documented DoD
Standard.   Further, they also created a program which would certify
third-party labs as qualified to perform those tests and issue
conformance certificates.   Such conformance proof could be submitted
by companies as part of their sales process to supply equipment for
DoD contracts.

I remember this pretty well, since I set up one such TCP Conformance
Lab, got it certified, and we performed a lot of testing and
consulting to help traditional government contractors figure out what
TCP was all about and get their products certified for DoD
procurement.  I've never learned who was orchestrating those
bureaucratic initiatives, but it seemed like a good idea.  There may
have been other similar efforts in other countries over the decades
since 1981 that I don't know anything about.

In the last 40+ years, AFAIK little else has happened for testing,
certification, or regulation of Internet technology.   Hundreds,
perhaps thousands, of "standards" have been created by IETF and
others, defining new protocols, algorithms, and mechanisms for use in
the Internet.  I'm not aware of any testing or certification for any
Internet technology today, or any way to tell if any product or
service I might buy actually has implemented, correctly, any
particular "Internet Standard".

Governments can create such mechanisms around important
infrastructures, and have done so for transportation and many others.
IMHO they could do the same for the Internet, and seem to be trying to do
so.

But to be effective the administrators, politicians, and regulators
need to know more about how the Internet works.   They could create
"Conformance Labs".   They could involve organizations such as the
Underwriters Lab in the US, CSA in Canada, CE (European Conformity) et
al.

If they knew they could and decided they should .... Education...

Jack Haverty

On 10/12/23 12:52, Hal Murray via Nnagain wrote:

Jack Haverty said:

A few days ago I made some comments about the idea of "educating" the
lawyers, politicians, and other smart, but not necessarily technically
adept, decision makers.

That process might work.

Stanford has run programs on cyber security for congressional
staffers.

From 2015:
Congressional Staffers Headed to Stanford for Cybersecurity Training

https://cisac.fsi.stanford.edu/news/congressional-staffers-headed-stanford-cybersecurity-training

Today I saw a news story about a recent FCC action, to mandate
"nutrition
labels" on Internet services offered by ISPs:

Is there a chicken-egg problem in this area?

Suppose I had a nutrition-label sort of spec for a retail ISP
offering.  How would I know if an installation was meeting the specs?
That seems to need a way to collect data -- either stand alone
programs or patches to existing programs like web browsers.

Would it make sense to work on those programs now?  How much could we
learn if volunteers ran those programs and contributed data to a
public data base?  How many volunteers would we need to get off the
ground?

Could servers collect useful data?  Consider Zoom, YouTube, gmail,
downloads for software updates...
_______________________________________________
Nnagain mailing list
Nnagain@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/nnagain