Re: IPR at IETF 54
> From: Pekka Savola <[EMAIL PROTECTED]>
> ...
> .. If it takes a lawyer to write (or understand) licensing terms, they're
> probably too complex ..

Something like "the programmer who is his own IP lawyer has a fool for a
client" applies. In other words, if you need to sign a license, then it
takes a lawyer to understand its terms.

10 or 15 years ago, people said that free RFCs versus expensive ITU (ISO)
standards were a major advantage of the DDN Protocol Suite. Today, the cost
of necessary legal advice dwarfs those old documentation fees by orders of
magnitude.

This is all merely sweeping against the tide. I predict all of this noise
will come to nothing but affirming the IETF's IP status quo. The IETF has
always had too many people with neither the faintest clue about nor any
interest in implementing anything, except when "implementing" means "unpack
a box and plug it in." That problem has continued to worsen over the years
and has been joined by another: many of the ever fewer IETF implementors
(or their employers) now hope to get rich (or avoid bankruptcy) by selling
IP licenses.

Vernon Schryver    [EMAIL PROTECTED]
Re: IPR at IETF 54
On Fri, 31 May 2002 [EMAIL PROTECTED] wrote:
> On Fri, 31 May 2002 08:40:17 +0300, Pekka Savola said:
> > A bad thing IETF could do (but not the worst, luckily :-) is to give a
> > signal "Ok.. feel free to patent and give RAND licensing.. depending how
> > good it is, we might give it a standards status or we might not". That
> > _encourages_ doing patents (or trying to), and we want to avoid that.
>
> Patents *in and of themselves* are not a Bad Thing. As far as the IETF goes,
> the problem only arises when the patent is used to enforce a restrictive
> licensing policy.
>
> Can anybody think of a reason the IETF should object to patented tech *per se*,
> as opposed to objecting to *hard-to-license* patented tech (the latter I think
> we have consensus as being a Bad Thing)...

Some reasons have been given already about the evilness of (almost) all
patents. I'll add one that hasn't been widely discussed.

The patent holder is in charge of the licensing. So he could, e.g.:
 - write licensing terms so ambiguous that nobody has the courage to
   implement the spec without long, long consultation with lawyers
 - include a few clauses that can (intentionally or unintentionally) be
   interpreted differently afterwards *)
 - change the license afterwards

*) an example of this is http://www.ietf.org/ietf/IPR/ISI-NGTRANS.txt,
specifically:

--8<--
[...] If SRI's submission (or portions thereof) is accepted and becomes
part of the IETF standard, then SRI will grant royalty-free permission
under such patents for both commercial and non-commercial uses, to the
extent _necessary for compliance_ with the standard. [...]
--8<--

(underlining mine)

This can be interpreted to mean that everyone implementing a MAY or SHOULD
in the specification could be blamed for infringing the licensing terms
afterwards. Fun, eh?

.. If it takes a lawyer to write (or understand) licensing terms, they're
probably too complex ..
-- 
Pekka Savola                 "Tell me of difficulties surmounted,
Netcore Oy                    not those you stumble over and fall"
Systems. Networks. Security. -- Robert Jordan: A Crown of Swords
Re: IPR at IETF 54
Bill Strahm wrote:
> On Thu, 30 May 2002, RJ Atkinson wrote:
> > On Thursday, May 30, 2002, at 09:48 , Melinda Shore wrote:
> > > Here's one for starters: there's no guidance on how or whether to
> > > treat differences in licensing terms for competing proposals. It
> > > would be nice to be able to say that all other things being more-or-
> > > less equal we should prefer technology which will be available
> > > royalty-free,
> >
> > Agree.
> >
> > My druthers would be to have an IETF policy explicitly saying that the
> > first choice is to use unencumbered technology if it can be made to
> > work, second choice is encumbered but royalty-free technology, and
> > last choice is "fair and reasonable licence terms" (or whatever the
> > equivalent correct legal wording might be for that last).
> >
> > And it would be good to have a conventional template for the
> > royalty-free licence -- one that the IETF's legal counsel has reviewed
> > and believes is acceptable for IETF purposes.
>
> I disagree with this. I don't think the IETF can afford to keep a staff
> of lawyers working on determining the licensing statements of all of the
> standards being churned out.
>
> That said, I don't think it would do any good anyway. Let's say the IETF
> lawyer gives his Okey Dokie, then my company implements the standard and
> a problem with the licensing terms comes up... Who do I go sue, the
> IETF???
>
> I hope not, but that could be creating a legal liability for the IETF if
> its lawyers make statements on the licensing terms of protocols...
>
> Bill

Bill,

The IETF isn't incorporated, so there is no way it can make such
statements. The IETF's corporate umbrella is the Internet Society. Now, I
haven't consulted ISOC's CEO and VPs, but I am on pretty safe ground in
asserting that you are correct: ISOC would never accept such a liability.
Our insurance company wouldn't let us.
Brian E Carpenter
Board Chairman, Internet Society
http://www.isoc.org
INET 2002, Washington, DC, 18-21 June  http://www.inet2002.org
Re: IPR at IETF 54
On Thu, May 30, 2002 at 11:13:24PM -0500, Dave Crocker wrote:
> To underscore the point that Marshall has been making:
>
> The IETF has a strong preference to use unencumbered technologies. When
> there is a choice between encumbered and unencumbered, the working group
> includes encumbrance in the range of factors it treats as important for
> evaluating alternatives.
>
> There are some unknowns about licensing. Some holders of IPR are helpful
> in resolving that easily and quickly. Others are more reticent. That
> becomes a component of the evaluation of the IPR factor.
>
> And so on.
>
> Generally this thread seems to be seeking determinacy for a matter that
> can only be made deterministic by a) ignoring IPR encumbrance, or b)
> rejecting all IPR encumbrances. The first is not compatible with IETF
> culture. The latter is not practical in some cases.
>
> So, what exactly do folks think is a practical kind of change to the
> current IETF policies?

I think the problem is that while there is (very) rough consensus
IETF-wide that there is a strong cultural bias against patent encumbrances
(*), this bias is not adequately documented in writing. This is
exacerbated by the fact that some IETF working groups -- particularly
those where a greater number of the participants do not have much IETF
experience and acculturation -- are unaware of that bias. This is
happening more and more as we start doing more cross-collaborations with
other standards bodies, and as technologies which previously had been used
on top of other media are "ported" to IP, and people who had been used to
working in other standards bodies find themselves working within the IETF.

(*) Unless there is a ***very*** good reason why you can't do without the
patent -- RSA signatures/encryption being the classic example, but even
there, RSA DSI's licensing policies were probably far more effective than
the U.S. government's export control regime at preventing the deployment
of secure protocols in the Internet.
Many of these newcomers to the IETF very dutifully read the relevant RFCs
(2026, et al.), and then are surprised when either (a) they get strong
push back from the IESG, or (b) their decisions get attacked at IETF
plenary sessions, on the IETF mailing list, or in other venues. They get
surprised, and there is some fairness to their argument that this bias
against non-RF patents isn't written down anywhere and isn't formally part
of our policies.

Granted, we can't document every tiny detail of cultural biases within the
IETF in our policy documents, but I think this one is important enough
that we need to say something. Once we do decide that we need to say
something, then the next question is exactly where we draw the line, and
that's where all of the discussion and long missives to the IETF mailing
list are coming from. Although it's pretty clear we won't be able to give
working groups an algorithmic flowchart for when a non-RF patent is
acceptable, I do believe that we can give some general guidelines, and
then require that the working group chair work with the area director when
this sort of issue raises its ugly head. This won't solve the "stealth
patent" problem, where the patent problems only reveal themselves very
late in the process, or even after the document is published as an RFC,
but it does handle a large number of other cases.

- Ted
Re: IPR at IETF 54
> I think the most effective thing would be to send a strong signal of some
> kind: "If you patent technologies and give non-RF licenses, _do not
> expect the technology to be supported in the IETF at all_".

The problem is that there are enough companies out there that don't care.
There are some areas of technology out there where companies that don't
need IETF-style interoperability have essentially patented every possible
branch of the decision tree (and are continuing to do so).

What should the IETF do here? Fortunately, we don't do codecs. Whew.
However, there may be other areas where the only remaining way forward is
to actively cut through the existing jungle of utterly bogus patents.

By the way, I do completely agree with your point. Just pointing out that
it may not be enough.

Gruesse, Carsten
RE: IPR at IETF 54
> | And the flip side - we've moved an amazingly SMALL number of documents
> | to Full Standard, and only when we *think* we *fully* understand
> | things.
>
> That's the problem. Or it is with the IPR issues. It is determining
> whether we can make that final step (widespread deployment is what is
> required, expecting full understanding of almost anything is naïve) that
> actually decides whether or not the IPR rights holder is being
> reasonable.

Ten years ago, we were mostly concerned with the "silent patent holder"
problem. It is reasonably easy for a WG to make its own decision when the
existence of the patents and the licensing conditions are disclosed up
front, before the WG agrees on a solution. But the real problem occurs
when the patent holder "ambushes the standard". Products get developed and
fielded, and then the vendors or the users of these products get hit by an
infringement lawsuit.

The current process was designed to minimize this risk, on the belief that
if an issue actually existed, it would surface during the early phase of
testing, i.e. before the standard would move from "proposed" to "draft".
The rationale was that there would not be much usage at that stage, and
that if push came to shove the WG could re-design the standard so as to
not require licensing of a hard-to-get patent. As KRE points out, the
whole mechanism falls apart when vendors field products based on a
proposed standard, not to mention an internet draft.

There are other issues. The first one is the imprecision of the disclosure
requirements. The current process does not exactly say who is required to
disclose the existence of intellectual property. According to some
interpretations, a working group chair whose organization holds patents
affecting a draft discussed in the working group is not required to
disclose these patents if he or she does not contribute or otherwise
participate in the discussion of this specific draft.
A second issue is the interaction between the standardization process and
non-disclosure agreements. For example, an IETF participant may know that
his or her former employer has a patent claim on a technology considered
for standardization; in one case I know of, the participant is in fact one
of the authors of the patent. Yet the agreement signed when leaving an
employer typically prevents disclosure of such information. In another
example, a vendor may have to sign an NDA before learning that its product
infringes on some other organization's patent. This vendor is then legally
prevented from signaling the existence of the patent claim to the IETF.

I would contend that, if we have one urgent problem to solve, it is to
find a way to ensure speedy disclosure of intellectual property issues
that affect a standard.

-- Christian Huitema
Re: IPR at IETF 54
> From: [EMAIL PROTECTED]
> ...
> The problem is that you can publish the same document, and then some
> sleazeball competitor patents it, because the patent office does such
> a poor job of researching "prior art".

That seems to be based on the false notion that the patent office checks
even its own documents for prior art.

> If http://www.bustpatents.com didn't have a reason to exist, I'd agree
> with you

The reasons I see for that and similar publications rest on the assumption
that government bureaucrats and everyone else involved in the "IP
protection" business do see that their own interests lie in what can be
charitably summarized as erring on the side of accepting instead of
rejecting patent applications.

Look for the notion of "blocking patents" in the 19th Century, such as in
the development of firearms. Unless you think everyone in the 19th Century
was a superstitious and ignorant idiot, or that all real science and
technology appeared after 1950, look at the many obviously silly patents
from the first century of the extortion racket.

Vernon Schryver    [EMAIL PROTECTED]
Re: IPR at IETF 54
On Fri, 31 May 2002 07:54:49 MDT, Vernon Schryver <[EMAIL PROTECTED]> said:
> In theory that could happen. It may have happened in practice with
> the Ethernet patent. But what's the point? What is gained by
> winning such a patent from government(s) compared to publishing
> the same document, other than a year or three of jumping through
> hoops and plenty of money and hassles?

The problem is that you can publish the same document, and then some
sleazeball competitor patents it, because the patent office does such a
poor job of researching "prior art". If http://www.bustpatents.com didn't
have a reason to exist, I'd agree with you.
Re: IPR at IETF 54
Date:        Fri, 31 May 2002 11:48:24 -0400
From:        [EMAIL PROTECTED]
Message-ID:  <[EMAIL PROTECTED]>

  | How would that work (having 2 full standards for the same exact thing)?

Depends on what the thing is, and how precisely you mean "the same exact
thing". In some cases it wouldn't - it is common for a new std to obsolete
an old one, no reason that can't continue. But sometimes the new std is
written in such a way that the old one can also continue.

Eg: the standard for IP. There's a new one (not yet full std, but really
should be by now), and the old one. They do the same job, but can co-exist
(the version field works, though other ways are used more commonly). You
might say those are not the same thing, but the new one really is intended
as a replacement (eventually) for the old.

kre
Re: IPR at IETF 54
On Fri, 31 May 2002 22:34:06 +0700, Robert Elz said:
> Yes, that's true - but it would be even easier if the new one were a
> full standard (even if the old one was too).

How would that work (having 2 full standards for the same exact thing)?
Re: IPR at IETF 54
Date:        Fri, 31 May 2002 09:03:43 -0400
From:        [EMAIL PROTECTED]
Message-ID:  <[EMAIL PROTECTED]>

  | OK.. I'll bite - at what point should a not-yet-full standard expire to
  | historic?

Pretty quickly. What the max period at DS should be I'm not sure, but
certainly no more than 2 years. The point not being to shed lots of stuff
necessarily, of course, but to put more pressure on all of us to take the
comparatively small steps needed to move the docs up the chain. Compared
with the work needed to get something to PS state, the rest of it isn't
difficult really.

  | And the flip side - we've moved an amazingly SMALL number of documents
  | to Full Standard, and only when we *think* we *fully* understand
  | things.

That's the problem. Or it is with the IPR issues. It is determining
whether we can make that final step (widespread deployment is what is
required, expecting full understanding of almost anything is naïve) that
actually decides whether or not the IPR rights holder is being reasonable.
That's what we really need to figure out relatively quickly - we can't
wait 10 years with docs in PS (and a lesser number DS) state with everyone
assuming they are *the* standard - then we haven't properly tested any IPR
problems that might exist.

  | One of the *GOOD* things about protocols living at DS is that you can
  | convince vendors to start supporting the *new* DS rather than the
  | *old* one.

Yes, that's true - but it would be even easier if the new one were a full
standard (even if the old one was too).

kre
Re: IPR at IETF 54
> From: [EMAIL PROTECTED]
> > In still other words, don't you remember the years of pain
> > Motorola/Codec caused PPP with those two bogus patents?
>
> I guess what I was asking was how the IETF would feel about an
> organization grabbing a patent on an algorithm and using it the same way
> the GNU crew uses copyright on source code. (Remember - the GNU copyright
> only works for *code* - since algorithms can be (at least in the US)
> patented but not copyrighted, you'd have to do a similar stunt with a
> patent).
>
> (And yes, this would be a case of "the Good Guys file a bull-manure
> patent to pre-empt the Evil Guys from filing a bull-manure patent" - but
> until the Patent Office gets their act together we're stuck with borked
> software patents that are invalid due to prior art, etc)

In theory that could happen. It may have happened in practice with the
Ethernet patent. But what's the point? What is gained by winning such a
patent from government(s) compared to publishing the same document, other
than a year or three of jumping through hoops and plenty of money and
hassles?

If you look at patents, you soon see that there's nothing special about
software patents and that the problems with the system are more than 100
years old. What would you expect from giving lawyers and government
bureaucrats (specifically including examiners) the responsibility and
authority to determine the validity (e.g. no perpetual motion) and novelty
of other people's ideas? Government central planning of economies is more
or less dead (for now), but government central planning for science and
technology lives on in the West.

Vernon Schryver    [EMAIL PROTECTED]
Re: IPR at IETF 54
On Fri, 31 May 2002 07:12:50 MDT, Vernon Schryver <[EMAIL PROTECTED]> said:
> In still other words, don't you remember the years of pain
> Motorola/Codec caused PPP with those two bogus patents?

I guess what I was asking was how the IETF would feel about an
organization grabbing a patent on an algorithm and using it the same way
the GNU crew uses copyright on source code. (Remember - the GNU copyright
only works for *code* - since algorithms can be (at least in the US)
patented but not copyrighted, you'd have to do a similar stunt with a
patent).

(And yes, this would be a case of "the Good Guys file a bull-manure patent
to pre-empt the Evil Guys from filing a bull-manure patent" - but until
the Patent Office gets their act together we're stuck with borked software
patents that are invalid due to prior art, etc)

/Valdis
Re: IPR at IETF 54
>> Right. Standards exist so that we can get interoperability; expensive
>> licenses limit interoperability.
>
> No, expensive licenses place an upper bound on the number of
> interoperable implementations.

I believe it comes to the same thing. Interop is not actually the end
goal; it is a tool to prevent vendor lock-in by maximizing people's choice
of implementations. If a standard is subject to an expensive license, that
raises the bar on who can implement it, which reduces choice.

/=============================================================\
|John Stracke                    |Principal Engineer          |
|[EMAIL PROTECTED]               |Incentive Systems, Inc.     |
|http://www.incentivesystems.com |My opinions are my own.     |
|=============================================================|
|There are footprints on the moon. No feet, just footprints.  |
\=============================================================/
Re: IPR at IETF 54
> From: [EMAIL PROTECTED]
> Patents *in and of themselves* are not a Bad Thing. As far as the IETF
> goes, the problem only arises when the patent is used to enforce a
> restrictive licensing policy.
>
> Can anybody think of a reason the IETF should object to patented tech
> *per se*, as opposed to objecting to *hard-to-license* patented tech (the
> latter I think we have consensus as being a Bad Thing)...

In real life, every patent that is enforced is hard to license. Every
patent that is not ignored by everyone including the holder has a
restrictive licensing policy, official IETF, IEEE, and other dogma
notwithstanding. Any sort of real patent licensing requires dealing with
lawyers, negotiations, and the rest of that very painful, incredibly slow,
and cripplingly expensive show.

The phrase "non-restrictive patent licensing policy" is an oxymoron. The
entire and only point of a patent is to restrict what other people can do.
At best, you can have a "take equal money from anyone" patent licensing
policy.

In still other words, don't you remember the years of pain Motorola/Codec
caused PPP with those two bogus patents?

Vernon Schryver    [EMAIL PROTECTED]
Re: IPR at IETF 54
On Fri, 31 May 2002 16:09:47 +0700, Robert Elz said:
> My suggestion to fix this problem is quite simple
>
> No more last calls before moving protocols to historic,
> except where they're full standards already.
>
> For everything else, going to historic should be automatic. That is,
> the only way to avoid any spec being made historic automatically, is
> for it to become a full standard.

OK.. I'll bite - at what point should a not-yet-full standard expire to
historic?

And the flip side - we've moved an amazingly SMALL number of documents to
Full Standard, and only when we *think* we *fully* understand things. Even
then, things have been known to get "out of sync" with reality:

1119  Network Time Protocol (version 2) specification and implementation.
      D.L. Mills. Sep-01-1989. (Format: TXT=143, PS=518020, PDF=187940
      bytes) (Obsoletes RFC0958, RFC1059) (Obsoleted by RFC1305) (Also
      STD0012) (Status: STANDARD)

1305  Network Time Protocol (Version 3) Specification, Implementation.
      David L. Mills. March 1992. (Format: TXT=307085, PDF=442493 bytes)
      (Obsoletes RFC0958, RFC1059, RFC1119) (Status: DRAFT STANDARD)

One of the *GOOD* things about protocols living at DS is that you can
convince vendors to start supporting the *new* DS rather than the *old*
one. I've had *enough* fun with one vendor who insisted on sticking with
RFC2133 rather than updating to RFC2553 for the IPv6 socket API, even
though both are only Informational.

-- 
Valdis Kletnieks
Computer Systems Senior Engineer
Virginia Tech
Re: IPR at IETF 54
On Fri, 31 May 2002 08:40:17 +0300, Pekka Savola said:
> A bad thing IETF could do (but not the worst, luckily :-) is to give a
> signal "Ok.. feel free to patent and give RAND licensing.. depending how
> good it is, we might give it a standards status or we might not". That
> _encourages_ doing patents (or trying to), and we want to avoid that.

Patents *in and of themselves* are not a Bad Thing. As far as the IETF
goes, the problem only arises when the patent is used to enforce a
restrictive licensing policy.

Can anybody think of a reason the IETF should object to patented tech *per
se*, as opposed to objecting to *hard-to-license* patented tech (the
latter I think we have consensus as being a Bad Thing)...

-- 
Valdis Kletnieks
Computer Systems Senior Engineer
Virginia Tech
Re: IPR at IETF 54
Date:        Thu, 30 May 2002 23:13:24 -0500
From:        Dave Crocker <[EMAIL PROTECTED]>
Message-ID:  <[EMAIL PROTECTED]>

  | So, what exactly do folks think is a practical kind of change to the
  | current IETF policies?

Actually, like many things, I suspect that the underlying problem we're
seeing is in quite a different area than the one we're looking in.

My suggestion to fix this problem is quite simple:

    No more last calls before moving protocols to historic,
    except where they're full standards already.

For everything else, going to historic should be automatic. That is, the
only way to avoid any spec being made historic automatically is for it to
become a full standard.

This is more or less saying what Henning Schulzrinne said before I got the
chance... "The problem is that very few standards make it to Draft."
Exactly. We have developed a culture of getting the work to PS status and
considering it done. We even disband working groups (or move them to
dormant status, which is effectively the same thing) as soon as all their
docs have been published. And the implementors see that - as soon as
something has reached PS status, it is considered finished, and we even
have people getting upset (because of the deployed base we'll be breaking)
if any changes get made.

That's what's really broken here - we actually have almost the perfect IPR
policy in place, but we're not bothering to actually attempt to use it
before it is way too late. What we need to do is make sure that everything
gets to draft standard within a relatively short time after it has reached
PS, say 9-12 months (preserving the current 6 month minimum to make sure
there's enough time for implementations to be attempted). Anything that
hasn't made it to DS by then should simply be shelved, abandoned (DS isn't
much of a hurdle after all).
And we should be actively discouraging distribution in products of
anything that isn't at least DS. (To do that, we probably should get into
the habit of making random innocuous changes to docs when they go to DS
state - like, if we have commands that are issued in binary and they're 1,
2, 3, ... (as usual), we just change which is 1, which is 2, etc., so
everyone actually running the code has to update - which of course is
impossible after something has been shipped.)

Then again, we need to make sure that all DSs get elevated to full
standard quickly, or dropped. This is where the IPR rules really get
exercised, where we see if the proposal has IPR rules that make it
difficult to implement and deploy or not. If the IPR rules cause problems,
then the doc won't reach Std status, and again, that should mean that it
is automatically made historic and abandoned.

It isn't the IPR rules that need changing here; the current formulation
that Christian Huitema came up with is almost certainly the best we can do
- anything different is far more likely to cause problems than to solve
any.

kre