[EMAIL PROTECTED] wrote:
>
> The government has decided to regulate frequencies and require licenses to
> operate within those frequencies. A much better example here would be cell
> phones. A cell phone also uses frequencies, but the manufacturer is
> responsible for making sure that the cell phone stays within those
> frequencies, not the cell phone user. As long as I do not modify the cell
> phone outside of the boundaries of the manual, I can expect that cell phone
> to stay within FCC regulations.
What if cell phones allowed you to run code of your choice on them? Wouldn't
that be modifying them?
> 1) Require licenses and create laws for security requirements to be on the
> Internet and revoke Internet privileges for people who don't obey the
> rules.
If we look at the Internet like we look at the airwaves, there is
precedent for testing and licensing. We have the citizens band, the
amateur radio band, commercial bands, and emergency bands, each
with its own requirements.
> 2) Hold software manufacturers responsible.
<snipped and re-ordered gf>
> Am I responsible for a product that is defective before it reaches my
> doorstep?
No, but I'm suggesting we may be held responsible for our computers' actions,
which in turn may affect what software we choose to run on them.
In defense of the vendors, I think they just gave us what we asked for.
One could say that we had much more secure products with mainframes and
terminals, but consumers demanded personal computers. I don't believe
our ability to write software makes it possible to build an operating
system for a mass-market personal computer, connected to an unrestricted
world-wide network, that is secure from its own end user as long as it
retains its general-purpose programming functionality.
In addition, people want the computer to anticipate their actions and
desires, which leads to things like file associations, automated program
execution, and behind-the-scenes communications. Layer on more AI- and
automation-style behavior hidden behind a GUI and the complexity grows,
and we all know what the relationship is between complexity and security.
Besides, regardless of what the software vendors produce, there is a
general-purpose processor at the heart of these machines that can be told
to do anything its owner wants with the associated network connection.
That cannot be prevented unless the machines are designed so that the
operating system is embedded, tamperproof, and sandboxes all code. It
would be a completely different type of machine and environment than we
have today.
I also don't think it impractical or harsh to expect people to upgrade
their systems on a regular basis. Code that has been in use and
heavily examined for years still has bugs reported against it (Kerberos,
for example). It is unrealistic to expect vendors to produce bug-free
code, particularly considering the complexity, schedules, and wide-ranging
environments in which it is expected to work. It will always be necessary
to upgrade software periodically unless we wish to freeze functionality
and products as they are.
> #What about the "attractive nuisance" argument? If I habitually leave my
> #keys in my car next to a playground and a kid climbs in, drives off,
> #and hurts someone, am I responsible?
>
> This is a flawed argument. I have a password (the key) for my box but all
> of these hackers have tools (lock picks) that allow them to bypass my lock.
What about a swimming pool with a flimsy, two-foot fence? Or, more apt, one
with a hole in it that you were warned about?
> If a small business has a Linksys router or a personal firewall and
> passwords then what else should it be required to have to exist on the
> Internet?
Regular patches.
Perhaps a regular vulnerability scan. Perhaps done by the ISP.
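To make the scan idea concrete, here is a very rough sketch of an automated
check. It is not a real vulnerability scanner, just an illustration of what a
scheduled check might report; the host address and port list are hypothetical
placeholders, and a real scan would use a proper tool (nmap, a commercial
scanner, or an ISP-run service) on a schedule.

    # Rudimentary open-port check: connect to a handful of commonly
    # exposed TCP ports and report which ones answer. HOST and PORTS
    # are hypothetical placeholders for this sketch.
    import socket

    HOST = "192.0.2.10"                      # placeholder address (TEST-NET-1)
    PORTS = [21, 23, 25, 80, 135, 139, 445]  # services often left exposed

    def port_open(host, port, timeout=1.0):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        for port in PORTS:
            state = "OPEN" if port_open(HOST, port) else "closed"
            print(f"{HOST}:{port} {state}")

Something like that could be run weekly by the owner or the ISP, with the
results mailed to whoever is responsible for the machine.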
--
Gary Flynn
Security Engineer - Technical Services
James Madison University
Please R.U.N.S.A.F.E.
http://www.jmu.edu/computing/info-security/engineering/runsafe.shtml