Proprietary Influences in Free and Open Source Software: Lessons to Open and Universal Internet Standards

28 September 2011 - A Workshop at the Sixth Annual Meeting of the Internet Governance Forum in Nairobi, Kenya

Agenda

Before World War II, the Motor Vehicles Manufacturers Association in America shared inventions among its members without a licence fee of any kind, much the same way home cooking recipes have been shared across kitchens since the beginning of human culture. In the field of computers, in the '50s and '60s, software produced by computer science academics was freely shared. Software was generally distributed in the spirit of sharing. Source code, the human-readable form of software, was generally distributed with the software itself, to allow users and developers to use, study, and possibly change and improve its design.

In the 1970s and early 1980s, the software industry began using technical measures (such as distributing only binary copies of computer programs) to prevent computer users from studying and customizing software they had paid for. In the '70s, and to a limited extent in the early '80s, Unix made its source code available. In 1983, Richard Stallman announced the plan to develop the GNU operating system, which would be Unix-compatible and entirely free software. In 1985 the GNU Manifesto was published, and a month later the Free Software Foundation (FSF) was founded. In 1991, Linus Torvalds released the Linux kernel, though it was not at first freely modifiable; he made Linux free software in February 1992. Linux filled the last gap in GNU, so GNU+Linux made a complete free operating system, which attracted the attention of volunteer programmers. In 1998 Netscape released its Internet suite as free software. All of this furthered "software freedom for all".

Netscape's act prompted an examination of how to bring free software principles and benefits to the commercial software industry. Those involved concluded that the Free Software Foundation's social activism was not appealing to companies like Netscape, and looked for a way to rebrand the free software movement to emphasize the business potential of sharing source code. The new name they chose was "open source", and O'Reilly, Linus Torvalds and others quickly signed on to the rebranding. In 1998 the Open Source Initiative was founded.

Over the course of this history of Free and Open Source there have been 'wars' between the Free / Open Source movements and proprietary philosophies, as well as smaller battles within the Free / Open Source ideologies. In many cases these were not "wars" or even "battles" but parallel standards. They were fought over:

1. Browser Standards: Microsoft, Google, Mozilla, Apple Inc., and Opera continue in a rearmament cycle, each trying to create the authoritative web browser.

2. Editor Formats: Unix editor users are divided into two big groups: the users of vi and the users of Emacs.

3. Desktop Environments: The rivalry between the KDE and GNOME desktop environments has the same effect.

4. Operating system advocacy: between NetBSD, OpenBSD, FreeBSD, GNU/Linux, Solaris, Windows and Macintosh. Relations between GNU/Linux and BSD developers are not entirely friendly, and some within the Free and Open Source community do consider it a 'war'.

5. Format Wars: Of greater relevance to the theme are the format wars: competition between mutually incompatible proprietary formats that compete for the same segments. Format wars have happened, and continue, in several segments; in streaming media, for instance, between AVI, QuickTime (MOV), Windows Media (WMV), RealMedia (RA), MPEG, DivX or Xvid, and Ogg. Ogg, as a free and open container format, is unrestricted by software patents and is designed to provide for efficient streaming and manipulation of high-quality digital multimedia. Ogg went through several hurdles in the process of establishing its format.
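Ogg's openness is visible even at the byte level. As a minimal, hypothetical sketch (the function name is an assumption; the only fact used is from the public Ogg specification, RFC 3533, which says every Ogg page begins with the 4-byte capture pattern "OggS"), an open specification makes even trivial identification checks possible without licensing anyone's code:

```python
# Minimal sketch: identify an Ogg stream by its published capture pattern.
# Per RFC 3533, every Ogg page header begins with the 4 bytes b"OggS".
def looks_like_ogg(data: bytes) -> bool:
    """Return True if `data` begins with an Ogg page capture pattern."""
    return data[:4] == b"OggS"

print(looks_like_ogg(b"OggS\x00\x02..."))  # True: starts like an Ogg page
print(looks_like_ogg(b"RIFF....AVI "))     # False: a RIFF/AVI-style header
```

A closed format, by contrast, can only be identified and parsed to the extent its owner documents or licenses it.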

While proprietary software thrives on differentiation, the Free and Open Source community finds itself drawn into a situation of multiple flavors. Why do we have a different set of command lines for some tasks in Red Hat and Ubuntu, which share the same kernel and most application software? Why do GNU/Linux distributions differ in implementation? Why do we face difficulties in some computing tasks when an OpenSolaris standalone machine is connected to a network with GNU/Linux nodes? Why is it difficult to seamlessly import mail from Eudora and migrate to Evolution?
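To make the command-line divergence concrete, here is a hypothetical sketch of the kind of wrapper administrators end up writing because the same task (installing a package) requires different commands on different GNU/Linux families. The distribution IDs and command strings are illustrative assumptions, not an exhaustive or authoritative mapping:

```python
# Illustrative only: the same task, different commands per distro family.
REDHAT_FAMILY = {"rhel", "centos", "fedora"}   # Red Hat family uses yum
DEBIAN_FAMILY = {"debian", "ubuntu"}           # Debian family uses apt-get

def install_command(distro_id: str, package: str) -> str:
    """Return the shell command that installs `package` on `distro_id`."""
    if distro_id in REDHAT_FAMILY:
        return f"yum install -y {package}"
    if distro_id in DEBIAN_FAMILY:
        return f"apt-get install -y {package}"
    raise ValueError(f"unknown distribution: {distro_id}")

print(install_command("ubuntu", "curl"))  # apt-get install -y curl
print(install_command("rhel", "curl"))    # yum install -y curl
```

That such shims are routinely needed between distributions sharing the same kernel is precisely the multiplicity the paragraph above questions.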

Some free software is 'free this far and no further', and some open source code releases are partially closed. Somewhere along the path of the evolution of Free and Open Source, commercial considerations have caused some players to draw a visible or invisible barricade around their 'own' software, distribution or release. While this makes it possible for an Open Source enterprise to be commercially viable and profitable, it has also been a cause of interoperability problems among what originated in free and open source code.

Open Source and Free Software are further evolving and will gain even greater importance. But attention is drawn to the 'wars' and to what Richard Stallman stated in the GNU Manifesto: "Software sellers want to divide the users and conquer them, making each user agree not to share with others."

Open Source and free software have caused immense progress in Information Technology, as well as in other areas. References are drawn here particularly to the differences and gaps, NOT to elicit a debate on Open Source Software, but with the larger purpose of examining this diversity in order to contemplate and avoid multiplicity in the Unified Network of Networks.

How can the World Wide Web remain free of notices on websites that say "Site optimized for Internet Explorer", which annoy Tim Berners-Lee? How can we ensure that the Internet architecture remains free of parallel standards that threaten the universal operability of the Internet?

 

A brief substantive summary and the main issues that were raised:
Workshop Number: 201

Sixth Annual Meeting of the Internet Governance Forum
27-30 September 2011
United Nations Office in Nairobi, Nairobi, Kenya
September 28, 2011 - 11:00AM

Title: Proprietary Influences in Free and Open Source Software: Lessons to Open and Universal Internet Standards


Biographies:
Aina Alain (Mr.)
Hackshaw Tracy (Mr.)
Hariharan Venkatesh (Mr.)
Pisanty Alejandro (Mr.)
Stallman Richard (Mr.)


Summary of views expressed by panelists:

>> ALEJANDRO PISANTY: The workshop is to discuss the various meanings or shades that the word "open" or the word "openness" takes in contexts related to IT for development and to Internet Governance, and their overlap. What's very important is to explore the proprietary influences. Technology can be proprietary and still conform to open standards. Even software can have proprietary components or a proprietary origin and then become open, or vice versa. People can take pieces of open software, and there are rules-based ways to appropriate this software.

>> TRACY HACKSHAW:
The fundamental issues facing the free and open source software movement in the developing world, therefore, are the continuing lack of adoption by large enterprises and Governments, for a myriad of reasons, as well as the inability of open source to effectively compete on a resource level against the larger players in the market. The increased commoditization of open source software through mergers and acquisitions, for example Java and MySQL, stifles and puts a lock on the growth and development of open source software. Even if you are moving forward with open source, a large player buying out the movement you are trying to adopt creates a real problem for advocating and moving within Government in particular, and even within private enterprise. Governments can and should summon the political will to bring a wider appreciation of open source through adoption as a matter of public policy, in the public interest.

>> VENKATESH HARIHARAN: The Indian Government formulated an open standards policy, probably one of the best open standards policies in the world, which said that in e-governance the Indian Government will use standards that are free of royalties and freely available. As a follow-up, India is now working on a policy on device drivers, which says that any Government procurement of peripherals and hardware must mandate that a device driver be available for free and open source software along with the proprietary software. Moving the education sector to free and open source software is also being considered. India can save close to $2 billion by adopting free and open source software.

>> SUNIL ABRAHAM:
Free software can play a critical role in allowing emerging economies to get properly entrenched.
And the way that the proprietary companies treat us these days has changed. At one point, they used to call the GPL cancerous and people like Richard Matthew Stallman a communist, but today an organisation like the Centre for Internet and Society has to compete with the large proprietary giants in Bangalore to host the PHP community group meeting, because they can always afford better cake and coffee than we can. So that's a useful thing. Free software is now at the heart of proprietary commercial business enterprise.

>> SUNIL ABRAHAM: "Open standards" could mean standards that don't have royalty implications. And there is a tendency in the Internet age for companies that are not quite open to use the term "open" constantly as a mantra. For the software developed by the software companies in India, the intellectual property is owned by the companies that contract the entities within India. [Proprietary companies are beginning to acknowledge the growing importance of free software.]

>> SIVASUBRAMANIAN: In a scenario where there are conflicts in the Internet standards-making process, and if the same concepts are broadened to the Internet Protocol, what would happen to the Internet? If there were a proposal for an alternate or additional TCP/IP "fast lane" protocol, kept as a proprietary protocol, what would happen to the Internet?

>> SUNIL ABRAHAM: Plurality in key protocols is not necessarily a useful thing, because it Balkanizes the network and reduces network effects. But in the world of software implementation, pluralism is a slightly better phenomenon. It's better for us to have a choice. So regardless of what the de jure processes are, finally it's whether the community accepts your standards. It's an issue that we need to solve at the policy layer, but also in terms of adoption.

>> VENKATESH HARIHARAN: The question you raise was at the heart of the entire open standards debate in India, and the proprietary standards companies were arguing that it's okay to have multiple standards. [But] a standard is basically a social contract, and everybody should adhere to that contract, because that's in the larger interest of everybody. The Indian Government had the wisdom to understand that logic and decided that for eGovernment in India, we shall use only a single standard. Standards are meant to unify. Standards are meant to bring people together. And multiplicity of standards completely vitiates the very purpose of having a standard.

>> SCOTT BRADNER: Standards, from the IETF point of view, are things that people use or companies agree to implement. If somebody had a proprietary extension to TCP, for example, it would be a significant problem.

What do you mean by open standards? In the Internet Engineering Task Force, anybody can participate. In the ITU, you have to be a member, and pay money to be a member, even to see the working documents, and you can't actually participate in the development of a standard without being a member.

The most common point of view is that open standards are standards developed in an open process. Other people have defined open standards as standards you do not have to pay to get a copy of. Still others look at open standards and say they are standards you can freely implement, with no intellectual property rights and no licensing requirements. That is actually relatively rare. The ITU, the IETF, and most other standards bodies other than the W3C don't insist that their products, the standards or recommendations that come out, are free of intellectual property rights.

In the ITU, if somebody says "we have a patent and we will license it fairly", then they are to be considered on completely equal terms with anybody else with any technology, whether there are patents on it or not, whether it's free. In the IETF, we leave it to the working group to evaluate whether a technology is important enough to deal with the fact that there is a known patent on it and there are licensing fees.
The IETF has produced parallel standards from time to time, multiple standards to do the same thing. We produced two vastly different things to do Internet telephony, Megaco and SIP. The IETF produced both of those standards and left it to the marketplace to decide what to do. And the marketplace actually chose both. We've also had situations where multiple standards have simply confused the marketplace. The IETF now has open standards in the sense of open software, where anybody can modify them. But what is a standard? A standard is a consensus agreement on the right way to do something. If individuals can make their own standards, there can't be consensus. On the point that you made to the Indian Government, having a variety of ways to do things can be useful, but they should be distinctly separate philosophies and architectures of how to approach something. The ITU came up with something called H.323, which was their multiprotocol communication standard for Internet telephony and video conferencing. The IETF came up with SIP. Implementers targeted the same set of customers with both. And SIP completely wiped out H.323 when it came to voice, and the standards go back and forth when it comes to video. It's healthy for the environment to have that level of choice. Both of those standards are consensus-based standards. They are not individual tweaks or modifications to an existing one.
IPR is a very big issue. Standards where the patents show up later, or standards where you have to pay to implement something, are particularly difficult for the developing world. The costs asked by the patent holders can be large in the context of the U.S., but extraordinary in another context.
In the IETF, we decided to let the Working Group think about it. We require disclosure, where we can [get the IPR], from the people who are participating in the standards process. We require disclosure if they've got IPR. Consider the rate at which the patent office considers patents: a few years ago there was an incredibly weird patent issued, a patent to exercise a cat with a laser pointer. We're in an environment where patents are going to be there, not just in the U.S. but around the world. They're not going to go away, so you can't assume that any standard that's developed will ever be IPR-free until all possible patents that were created before it have run out. And that could be 20 years later.
>> FRED BAKER: Scott, … talk about licensing terms.
>> SCOTT BRADNER: Licensing terms are an interesting problem. Under U.S. law, a standards body cannot negotiate licensing terms with an IPR holder; that falls under antitrust laws. The IETF requests that IPR holders provide licensing terms for the Working Group when it is evaluating the technology. Licensing terms are something that we are not allowed to negotiate as a standards body. It would be nice to be able to, but we can't.
>> FRED BAKER: Cisco uses different kinds of licenses in different places. After some issues with claims of infringement on others' patents, Cisco started developing IPR with the idea that it could swap it for IPR held by others. At the ITU, Cisco generally seeks brand licensing. In the IETF, Cisco uses a variety of licenses and offers the IETF the liberty to implement. The price is zero for FRAND/RAND with a covenant not to sue. FRAND/RAND gives the open software people the freedom that they need in order to implement.
>> SCOTT BRADNER: A question that comes up a number of times as an Internet Governance issue is: who is going to make standards? Back in 1994, at a Harvard meeting, I said that one of the two unresolved issues of the Internet was who says who makes the rules; and by the rules there, I meant standards, among other rules. The ITU and others have asked that the technical standards for the Internet be done within the ITU, and that the IETF and others producing standards which are used on the Internet would then put their standards through the ITU process in order to bless them.
This is a very basic Internet Governance issue: many countries believe that having this cacophony of locally developed standards is disadvantageous to their local industries.
Many years ago, ISO declined to be the standards body for TCP/IP because it was concerned that the U.S. companies had too much of a head start, and it wanted an even playing field.
This problem has not gone away. Within the last couple of weeks we've had proposals to turn over technical standardization of the Internet, not simply business standards, to a UN organisation.
>> SUNIL ABRAHAM: On the question of outgoing royalties from a particular Nation State connected to standards with patent implications: royalty-free and FRAND/RAND are not the only options available. There are also options like royalty caps. In India, until two years ago, on any device only 5% of the selling price could be divvied up as royalty. That's another possibility; it may not be TRIPS-compliant, but it was enforced in India until two years ago. Pooling patents, as between CDMA and GSM, is another option. And Government leadership in pushing for greater pools is also a useful policy option for Nation States to exercise. Thank you.
>> ALEX GAKURU: [IPR has not been an issue for the] growth of the Internet. [Everything has been shared and] given out freely. [Is] the Internet now taking the traditional path of some companies, which start very nicely, reach a peak, then start falling down because of IPR? I think the Internet is headed for self-destruction because of IPR, so it's the biggest threat to the Internet.
In terms of the legality of adopting open standards and open documents for eGovernance, certain countries have requirements that a document is admissible and qualifies as a legal electronic record only if it can be read subsequently by other systems. I think that's a good thing, because it enforces that what is considered legal is something that can be read subsequently.
>> SCOTT BRADNER: This is a question that comes up in the IETF all the time. The IETF has used plain text for its standards from the very beginning, despite the limitations of the format for pictures.
>> RICHARD MATTHEW STALLMAN: There is a tremendous danger to the Internet from restrictive standards. Some of these formats or protocols -- I shouldn't call them all standards; some formats and protocols are standards, some are not. When a format or protocol is either secret or restricted by patents, there is a big danger. And what we see now, of course, is that this is spreading even to the ability to boot your computer: Microsoft wants to make computers that will boot only programmes approved by Microsoft.

If you hear someone use the term "intellectual property," you shouldn't think that means he understands deeply. Instead, you should think that person is deeply confused: he's talking about an incoherent collection of unrelated things, and he thinks he's saying something meaningful, but he really doesn't understand. The only intelligent statements to make about these laws are about one at a time. For instance, patents do threaten our use of protocols and formats, but it's hard for a copyright to get in the way.

>> SCOTT BRADNER ?? But we have to be very careful not to use the confusing concept of, quote, intellectual property, unquote, because this is a generalization about a dozen or so totally unrelated laws that have no similarity. Even if we just look at copyright law and patent law, they are different in every possible way, and the way they affect the field of computing is totally different. The core, non-optional protocols are not things which should be inhibited in any way. The optional things, such as a particular CODEC you might want to use, are fine; there are other alternatives. But if it's the wheels that keep the train running, we can't have restraints on that use.
There have been lawsuits claiming that TCP infringes certain patents, but so far none of those have been successful. So, to your point, the IETF does not accept proprietary extensions to the core protocols.
>> SIVASUBRAMANIAN: Some time back, a standard called OOXML was to be introduced as a standard, and it was not adopted. Was that an example of a proprietary standard being introduced to the Internet, and can you tell me the history of what happened?
>> SCOTT BRADNER: I don't know the history of that particular one. There are others which have been. A perfect example is one of the anti-spam proposals, which came out of Microsoft. That was one where Microsoft attempted to do the same thing that Cisco does in terms of licensing, saying "we won't sue you unless you sue us". But they also inserted a "you actually have to take out a license with us before you can implement it", and the open source people did not like it, and the IETF did not reach consensus to support that document.
>> ALEJANDRO PISANTY: This is a critical point to convey to you, a reminder that was made to me by Bernard: UN rules are unfavorable to attacks, so I hope that was taken as a friendly description; it was descriptive.
>> RICHARD MATTHEW STALLMAN: I know about what happened in the case of OOXML. Microsoft invented a basically … document standard so that it could [hinder] the adoption of ODF. Microsoft's so-called open format was designed with a specification that was thousands of pages long, and it was incomplete. So Microsoft invoked a special emergency or exception procedure in ISO, where all it had to do was get enough countries' standards organisations to vote in favor. And then it went about [gathering] the support of the national standards organisations. There was a worldwide fight, and Microsoft won. It succeeded, effectively, in taking control of ISO for its own purposes, which is a very clear example of how the empire of the megacorporations functions. And, of course, it threatens every area of society.
>> VENKATESH HARIHARAN: The final word on OOXML in India was 14 votes against OOXML and 5 votes in favor of it. The key question was whether somebody could independently implement the 6,000-page document, this so-called format or so-called standard, from scratch, without reference to Microsoft. There were certain areas where the proposed standard said "this has to function like [Microsoft's] Word 95". It is not really an independent standard which could be implemented based on the specifications that were given, and there were parts of it that were hidden. That was the key reason why it was voted down in India.
>> MARK BLAFKIN: My question is actually to you, about this concept of a UN agency taking over the role of the IETF. In what form is that proposal coming, and from whom?
>> SCOTT BRADNER: It's not a new proposal. About 12 years ago, the ITU plenipotentiary was asked to vote that the ITU would be the standards organisation for the evolving Internet. Each plenipotentiary since then has done the same. The [recent] proposal from India, China, and Brazil hints at doing the same.
>> SCOTT BRADNER: The best hope we can have is for open standards processes which are truly open, in the sense that what is brought in is thought through and modified by the standards process. Cisco came up with a technology called tag switching and brought it into the IETF, and what came out isn't exactly tag switching; it's quite heavily modified. That was a good result. It's not a universal result by a long shot. There are an awful lot of things worked through the process in many standards organisations, occasionally also in the IETF, which wind up being very much what the particular vendor brought in, and that's unfortunate. It inhibits the openness of the standards process, and we get poorer results out of it.
>> RICHARD MATTHEW STALLMAN: Now, when a format or protocol becomes a de facto or official standard, if there are patents on that standard, it restricts all of us directly. [The patents] restrict us from using software that we control and leave us stuck with using software … that controls the user. My choice is clear. My freedom is more important than being able to use any particular kind of technology... And that makes this problem an issue of overall social concern.
Therefore, governments ought to take action to make sure this cannot happen. Patents should not be allowed to restrict the development, release, and use of software to run on widely used computer hardware. We also have to take care that secret protocols and formats, which you can find in use by many Internet services now, do not become widely used. If they are used at some small level, they are still unfortunate, but perhaps there's no need for public institutions to pay attention. The danger is that what starts small might get big.
>> SIVASUBRAMANIAN: On the ITU attempt to get into the Internet standards process, and if there is a proposal again, do you think the Internet user community should welcome the ITU with open arms to the standards process?
>> RICHARD MATTHEW STALLMAN: I don't know a lot about the ITU, but what I believe I recall I have heard is that it is very closely connected with telecommunications companies and that it will do what those companies want. Now, what do they want? They want to abolish network neutrality. So I'd be very suspicious of ITU involvement. But this is a memory of what people told me quite a few years ago. It's conceivable I'm remembering wrong.
>> PETER RESNICK: I think the best way to keep things like software patents, which will restrict the rest of us, at bay is to have an open standards process where one needs to come to consensus because if the open software community comes to that open standards process and they say we do not agree to use these kinds of technologies because of their patents, then that will create a situation where a standard can't be produced. If a group of people want to get together and say we want to use this technology, I don't think we should do anything to prevent that from happening if that group comes to consensus. But if we are all coming to the IETF to say let's standardize a practice for doing this particular protocol, I think the folks who want to do free and open software should be in the same position as people who want to do commercial software in that conversation.
>> RICHARD MATTHEW STALLMAN: First of all, it's a mistake to contrast free software with commercial software. Many free programmes are commercial. And likewise, there are proprietary programmes which are not commercial.
>> RICHARD MATTHEW STALLMAN: I am in favor of standards committees developing specifications through the [open] process. We do have to worry, though, about the danger that a company with a lot of money can corrupt the process, as Microsoft did with ISO. ISO [unusually] had an exception built into its rules that allowed Microsoft to buy it. And that's one danger we have to worry about.

[ For an understanding of the issue, and for links to additional documents for fair perspectives, please see http://en.wikipedia.org/wiki/Office_Open_XML#cite_note-infoworld-embittered-18 ]

Finally, there's a fundamental error in saying: if some group of people want to use some patented protocol, why should we stop them? Well, presented that way, I would agree, but what does this group really consist of? It consists of some company which is participating because it set the scheme up, plus maybe millions of people who never thought about it and who were pressured in or lured in by their network of friends and perhaps massive publicity. This doesn't fit the idea of some people who knowingly decided to use a patented protocol. If it really were that, I wouldn't try to intervene. If they're adults, they should be able to do this thing that I think is foolish; why should any of the rest of us muscle into their decisions? But that's not the way it is nowadays, at least not in the cases that affect the most people. So it is a very different question what a company should be able to do, using either a patented or, more commonly, a secret protocol, with millions of people who didn't get together and decide they wanted to be restricted in that way.
>> SCOTT BRADNER: I mentioned that event: Lyman offered ISO the TCP/IP standardization. For many years, the OSI protocol stack was mandated in regulated telecommunications. In the U.S., if you were selling to the Government, you had to sell products that met it. There were countries where it was illegal to use TCP/IP on the public network, which was the phone network; you had to use the OSI protocol stack. [ An attempt is being made to propose ] that the UN, or the UN through the ITU, be the Internet standards body and the Internet regulatory body, so that regulators in each country can have a better handle on the Internet. A lot of regulators in a lot of countries are very frustrated by the Internet. It's the only major telecommunications scheme in history that is not heavily regulated everywhere -- regulated to the extent of what products you can offer, what prices you have to charge, what functions you have, and what you have to hand over to the Government, and when.

The argument in favor of pushing Internet standardization into the ITU is one in favor of regulation and of putting control back onto the telecommunication infrastructure. This is a real threat. This is a fundamental Internet Governance issue. It's moving from a nonregulated Internet to an essentially regulated environment. [In such an environment] it takes ten years to develop the simplest application.
Concluding Remarks:

>> RICHARD MATTHEW STALLMAN: There is a difference that may be relevant between the free software movement and the open source idea, which is mainly a development methodology; you can find this discussed at GNU.org/philosophy.

>> TRACY HACKSHAW: Right. In my country, we don't influence the Internet. We are receivers of it. And all this information is very intriguing to us in our part of the world. In our part of the world, we haven't even reached the stages yet to discuss how open standards affect Internet Governance. We are still grappling with issues of even getting our entire country to understand what the Internet is and what open source software means to us. Open source and open thinking has not arrived in our part of the world as yet. Clearly, decisions are being made somewhere else that are affecting us, and clearly we need to get a voice in that.
>> VENKATESH HARIHARAN: It's amazing how much we look at everything through the lens of intellectual property. Public policies need to be drafted keeping the Internet in mind as a public good, as a knowledge commons, and that knowledge commons is something that should be actively cultivated and nurtured.
>> SUNIL ABRAHAM: We are discussing multiple standards-setting organisations, and perhaps multiple fora where a Nation State gets to negotiate key clauses in legal contracts that affect outgoing royalties, incoming royalties, etc. I understand the concerns raised about how slow the UN process is and how slow telecom regulation could be, but a multiplicity of fora does not necessarily serve the cause of developing nations [ because developing countries do not participate as elaborately and with as much preparation as some developed countries ].
>> SCOTT BRADNER: I completely agree that just having more fora is not a good solution. I wasn't worried about -- and I am not worried about -- the speed of development of standards. The final adjudicators of standards in the ITU and many traditional standards bodies are Government regulators who vote on the technical standards. In the IETF, the W3C, and a number of other fora, the final adjudicators are technical people doing technical work, and that's a fundamental difference.
>> SEBASTIEN BACHOLLET: A comment online. Sites like Facebook exemplify what Richard Stallman was talking about. The space is being presented as open and perceived as open, but in fact, it is a closed space that people are being lured into.
>> PETER RESNICK: There's been a lot of discussion about how open Facebook is and that everybody can develop for it. But, of course, developing for something does not make it a protocol that anybody can interact with, and having an interaction is really what makes something open.
>> ALEJANDRO PISANTY: This workshop and the dialogue have had an extraordinary result in showing the different ways the word "open" is used for open standards and open source software: how they come together, where they collide, and where open standards do accept a certain level of proprietary control over the technology while still finding ways of sharing and licensing that are acceptable for open Internet standards. Together they build a platform which is very broad and on which almost everything else can be developed. The negative examples of Facebook as a relatively closed platform, the stories of standards wars, and the ways that standards are developed nationally have been brought under the Internet Governance light, which is very valuable.

National standards-setting bodies - which very often are handed over completely to industry, or to very vague definitions of the public interest - are not necessarily able to perform the same good work as the IETF and other standards-developing organisations, [which do their work] by considering all the public interest objectives yet leaving the technology standardization to the technologists themselves.

 

Conclusions and further comments:
Plurality in key protocols is not necessarily a useful thing because it Balkanizes the network, reducing network effects; a multiplicity of standards completely vitiates the very purpose of having a standard. Open standards are standards developed in an open process. There was a strong view that restrictive standards pose a tremendous danger to the Internet: patents should not be allowed to restrict the development, release, and use of software to run on widely used computer hardware, and we also have to take care that secret protocols and formats, which many Internet services use today, do not become widely used. But the balancing conclusion from the Chair was that "open standards do accept a certain level of proprietary control over the technology yet still find ways of sharing."

Public policies need to be drafted keeping the Internet in mind as a public good, as a knowledge commons, and that knowledge commons is something that should be actively cultivated and nurtured. In the IETF, the W3C, and a number of other fora, the final adjudicators are technical people doing technical work, and that's a fundamental difference.

 
