Governance of Social Media

15 September 2010 - A Workshop on Privacy in Vilnius, Lithuania

Full Session Transcript


Note: The following is the output of the real-time captioning taken during the Fifth Meeting of the IGF, in Vilnius. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.


>> BERTRAND de LA CHAPELLE:  Good morning, everybody, thank you for coming.  This is an attempt at a new type of format for workshops.  As explained in the -- sorry, first of all, let me introduce myself.  I am Bertrand de La Chapelle.  I am the special envoy for the French Foreign Affairs Ministry.  I have been following these processes for quite some time, and I felt that at this IGF, it was interesting to explore a new intermediary format between the workshops that deal with the substance in depth and the plenary sessions that are necessarily focused on very short messages and exchanges.  So the idea, which we agreed with the secretariat to test, is a new format that would be a sort of wrap-up or coordination among the different workshops that deal with similar issues, in order to identify specific messages or specific threads that could be put into the main session on security, openness and privacy.
So without delving into too much detail, the format, if you agree, that was discussed and presented on the page would be as follows.
We would have the first half offering the possibility for people who are organizing or co-organizing workshops dealing with issues related to what we can call governance of social media to make a very quick presentation on what they have discussed or what they intend to discuss in their workshop, and then the second half would be an exchange among ourselves on what could be the few messages, the few threads or the few initiatives that could be presented in the main session, to either organize the work between now and next year's IGF, or to become an objective for the organisation of next year's IGF.
I hope that this is a format that you find acceptable, so two halves.  Do you have any questions on this approach?  Is that acceptable?  Is that okay?  So if it's okay, I suggest that we move forward.  How many people around the table have been organizing or hosting a workshop, either already held or in the coming days, who would like to present it right now in the first half, so that I know how many presentations to expect?  I have one, I know that Lisa Horner was around.  Who else?  Can you raise your hand so I can see how many would want to make a quick presentation?
I think we are missing two other people who were supposed to join, so we should have two or four presentations.  I suppose some people are just joining from other meetings.  So maybe I will first give you the floor -- yes, I know, but what I would like to see is, you have been participating in workshops and organizing workshops, and I would like feedback on -- so could UNESCO maybe start?  Could you start and present the workshops that you are organizing?  I'm sorry about the overall noise.  This is a general problem, but, please.
>> Thank you, Mr. de La Chapelle.  We just held our workshop yesterday morning to launch a new UNESCO report analyzing freedom of expression on the internet.  It is different; I mean, there are so many reports on the subject, but the different approach of this one is that it is not only focused on filtering and blocking, but tries to analyze the subject, free expression, in a larger ecology.
There is a wider range of approaches and policy approaches which have been analyzed, and the authors are from the Oxford Internet Institute.  I don't know if they came.  If not, I will just give a very brief briefing on the different approaches to the information flow.  They identify six different approaches.  The first one is industry policy and regulation, such as copyright and intellectual property.  The second one, we called it the user-centric approach, covers things such as child protection, fraud, defamation and speech.  The third one is about net-centric policy, which is linked to the internationalization of domain names, for example.  And then there is security policy in its relation to privacy --
But generally the participants shared the view that there is a growing trend to restrict information flow on the internet, and so we need to explore further the expanded range of policy and regulatory issues.  A lot of the debate focused on whether there should be a universal standard to block information, but at least the participants agreed that we should have transparency when we have to block some information.  Yes, that's it.
Thank you.  I improvised, so I may not have been comprehensive, but I can add more later on.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you.  One question right now that I would like to have participants ponder and maybe comment upon: mention was made of the user-centric approach, the network-centric approach, and the general question of whether there are overarching principles.  There was another very interesting workshop this morning on the charter of rights and principles.
I know there are many actors -- trade associations, regional bodies, NGOs, international organizations -- that are developing principles.  One of the challenges that we have is to identify the right level to deal with the issues of governance of social media, such as privacy, freedom of expression and so on.
I would be happy to have some comments after the initial presentations on this notion.  What are the right tools?  Should there be international principles?  Should the governance of social media be done social medium by social medium, like a charter for Facebook, a charter for YouTube and so on?  Just think about it, and I will be happy to have your feedback immediately afterward.
If you want to follow suit.

>> Thank you.  Kurt Opsahl, I'm an attorney with the Electronic Frontier Foundation; we are a civil society organisation dedicated to defending freedoms on the internet: free speech, freedom of expression, privacy, innovation rights.  And I just wanted to discuss a few of the principles that we have been thinking about in terms of social networking.
And what we have come up with is what we call a bill of privacy rights for social networking users.  These are some principles that we think are appropriate for all forms of social media to follow, and we will be judging these media by whether or not they follow them, so as to help inform consumers where a good place to go is or where there are challenges.
So the three basic principles are: number one, the right to informed decision-making; number two, the right of control; and number three, the right to leave.  And let me explain each of those in a little bit more detail.  On the right to informed decision-making, this is a very important thing in social media because sometimes it is very complex how it operates, especially in the privacy sphere.  We saw this a lot with respect to some of the changes made by Facebook in April of this year; a lot of users did not understand what they were doing, so they were making choices, but they didn't understand the choices that they were making.
And if you don't understand the choice, then a principle of choice is an ineffective principle.  So the user should have a clear user interface that allows them to make informed choices about who sees their data and how it is used.  They should also be able to readily see who else is entitled to access any particular information about them.  That would include other users, government officials, websites, applications that interact with the social media, advertisers and advertising networks, so that when they are making a decision about what to place on the social networks, they know what the effect of that decision will be and who will get to see it.
In cases of non-consensual access or court-ordered process in criminal or civil courts, the social networking service should give users notice when the government or private parties are using these processes, so they have an opportunity to respond.  They have due process to assert whatever privileges and rights they may have to resist that disclosure of information, because without the notice, the social network is not really in a position to assert those rights on their behalf.
Principle number two is the right of control.  The social networking service should ensure that users retain control over the use and disclosure of their data.  So the network, the service provider, should only have a limited license to use the data for the purpose for which it was originally given.  Of course, over time social networks evolve; they change, come up with new services and new and innovative ways of using data, and that is fine and often very socially beneficial.  But if the social network wants to make a secondary use of the data, it must get further permission from the user, an opt-in permission.  So the user's initial choice when they gave the data does not mean an unlimited license for future uses.
This came up in particular with the evolving privacy policy of Facebook.  When Facebook first started, five or so years ago, it was a very closed system.  It was limited to particular universities, and the information was only shared with one's friends on the system.
And then over time, the privacy policy kept on evolving until more and more information was made public.  People should be entitled to take advantage of those services, but only if they understand what they are doing and are making an informed choice about that.
And the right to control includes the right to decide whether friends can authorize the use of data.  In many instances, your friend on a certain network will have access to your information.  They will see what your favorite movies are, what your favorite books are, that kind of information.  But at the second degree, what say do your friends have to make decisions about who can see your data?
And the originating user should have a level of control over the disclosure to these third parties.  So when information is being shared with a new category of people or used in a new way, as we were just describing, the service must ask permission, and it should be opt-in by default, not opt-out.  This addresses the issue where some people might not go back and check, you know, every day, what the changes are in the privacy policy; they should have the confidence of knowing that if they do nothing, if they have no interaction, then they will have the level of privacy that they had come to expect from their prior interactions.
Number three is the right to leave.  One of the most basic ways that somebody can protect themselves and enhance their privacy is by moving to a service that offers more privacy, and by having competition amongst the various services as to which one provides the right level of control over information and the right balance between interacting with your friends and having control over your information.
But if you are unable to easily move to a new service, that is an impediment to having that sort of competition.  So there are some aspects of this right to leave.  One is the right to delete.  If you decide that a social networking service is not providing the level of service that you are interested in, whether it be for privacy reasons or otherwise, leaving is only effective if your information is deleted and is no longer on that service.
So it is not enough for the service to simply disable access and make the information temporarily unavailable or hard to access; it should not be there anymore, so that you don't have to worry about what will happen to that information in the future.  And the second aspect is to be able to take your uploaded information, the information that you have provided to that social service, and be able to move it somewhere else.  On most of the social networking services you can do this, at least in a clunky way, by going on there and accessing it as you ordinarily would, and then perhaps using cut-and-paste functionality to slowly obtain your information.  But the point is having it in a usable format, so that the information you have provided you can then take away and move to whichever service you choose in the future.  That is data portability.
So these are the three principles that we have been putting forth in our bill of privacy rights.  And I wanted to make an important note: while these are important principles, and principles that we think all social networks should follow, I want to point out there is a risk in regulation, in developing regulatory rules that try to enforce these principles.  And that is because of the fast-moving technologies that are involved in social networks; there is a severe danger that if you come up with specific rules that try to implement these, within a year or two those rules will reflect a technology that is no longer the operating technology, and will have unintended consequences for innovation and unintended consequences for privacy, vis-à-vis the new technologies that we don't know about.
So one of the ways that at least we try to crack this nut is to use these principles to apply pressure on the social networks: to ask them to agree to abide by them, to point out when they fail to abide by them, and to have consumers move from one network to another to take advantage of the services that are providing the privacy that consumers want.

>> BERTRAND de LA CHAPELLE:  Thank you very, very much for the clarity of the three principles.  Like I did for the first intervention, I would like to ask you to think about comments on the following points that have been raised.  I think the notion of portability of data is a very important one.  What do you feel about it?  How can it be implemented?  Is it going in the direction of a true standard for portability?  Or is it on an ad hoc basis, where you can port from one service to another one but not to a third party?
The second thing is the notion of any modification brought to the terms of service basically having to be an opt-in mechanism, so you have the kind of predictability of knowing that if you have not accepted the new rules, then the old rules apply.
How does that work on both sides of the equation?  What kind of burden does it bring on the company regarding some services that need everybody to endorse them?  And at the same time, how does it combine with the user's expectation of a certain stability of the legal environment?  The other point is the importance of a clear user interface, which is an element of transparency regarding the rules.  Today we click on terms of service basically without reading them.  So user-readable language, rather than lawyers' language, is an important element.  Can we, for instance, take inspiration from the Creative Commons approach, which has made logos very simple to understand for a few types of categories?
And a fourth element is the question of enforceability.  I want to throw into the discussion something that has been evoked sometimes about having trust marks or rating agencies: could we envision having third parties doing this kind of monitoring of quality, putting labels?  I know rating agencies do not have a very good reputation at the moment because of the financial sector, but is this something that could be envisioned?  So these are the four points: rating agencies; user interface and clarity of regimes; pros and cons of the opt-in approach for modifications; and a portability standard.  We will come back to that immediately afterwards.  Lisa, can I ask you to give an update, if we can overcome the overall noise, on the workshops you are organizing or have taken part in, including the dynamic coalition this morning?
Thank you.

>> Sure, thank you.  We have just come out of a very interesting and lively workshop held by the Internet Rights and Principles Dynamic Coalition, and the focus of that workshop was an open consultation on a document that the coalition has been working together to produce: the Charter of Human Rights and Principles for the Internet.
And the aim of that process was really to translate existing human rights standards and apply them to contemporary internet policy issues and the evolving internet ecosystem.  The process has been collaborative so far.  We have a document that we are broadly happy with, although we know that it's by no means perfect, and it's not the end product.
So now we have a version 1.0, which we will be taking forward and doing further consultations on with different stakeholder groups.  The coalition is multistakeholder, and so we are soliciting the views and opinions of all groups: the private sector, civil society, governments and intergovernmental organizations.
I guess it's relevant in the context of this discussion because the document that we are creating aims to identify both the human rights and how they apply online, but also what we are terming implementation principles.  So what are the policy principles that are practical and that people can use to help them in day-to-day policy making and practice online?
We have identified, in a way, three levels of rights and principles within the document.  First, we really have the human rights standards as defined in the Universal Declaration of Human Rights, translated to the online environment at a very top level.  So, for example, the right to freedom of expression should be protected, promoted and fulfilled online.
The second level down from that is saying, okay, so what does that actually mean in a bit more detail?  And you have a series of principles which really flow from that.  So, for example, that censorship or control of the flow of information shouldn't be embedded in internet networks themselves, but should be controlled by end users; that's one principle that flows from the right to freedom of expression.  The next level down is saying, okay, so we have those, but they are still fairly broad.  What does that actually mean in terms of the roles and responsibilities of the different stakeholders to whom those principles are relevant?
Whose role is it ultimately to protect these rights and to fulfill them, and what should, and what can, the different stakeholders be doing?  So that's the effort of the coalition, really, to move towards rights and principles, but within a very broad framework.  Many of them are relevant to social media.  Some might be less relevant, but I think that process and the learnings from it are relevant to this discussion.
Also relevant to the notion of creating principles is our approach to the charter as a whole: it's not supposed to be a document that we expect the UN to ratify tomorrow, for example.  That might be an end goal, but for me, on a personal level, that's not really a primary aim.  In the short to medium term, we have three main aims.  Firstly, the process of standard setting, the process of defining these principles, is important in itself, as well as a means of bringing different people together, getting us thinking about the issues, and seeing where we can agree and where we might have to agree to disagree.  That process of trying to foster collaboration and dialogue, I think, is one of the aims of the charter itself, and it's something to bear in mind in terms of any principle and standard setting.
The second aim really is for it to be a norm-setting document, but in quite a bottom-up way: that it's seen as something useful for different people and, therefore, used as a tool to inform policy making and law making.  And then finally, related to that, it really is an educational resource and tool that people might decide they want to use for advocacy, should they think that formal rule-making processes are going the wrong way, or it can be used as a resource for making those policies and rules.
So I just wanted to give you a bit of background in that sense, I guess, talking a little bit about the process of developing principles, which I think is relevant if we are thinking about developing principles for social media.  That process is ongoing, and some of the things within that document might be relevant for social media and social media governance.
Just quickly, the second reason I am here is that I am co-organizing a workshop later on today, at 2:30 in room 2, on freedom of expression and intermediaries: where do we go from here?  And while we don't expect to come out with any defined principles as part of that workshop, it is part of the process of moving forward and trying to think about what the principles might mean.  Already within the group itself, we have some who think that intermediaries should always be protected no matter what, and we have some who are interested in looking at the flip side, asking whether there are any responsibilities involved.
So I think if we are trying to define principles, you are always going to come across diverging viewpoints, but I think the process of discussion is important in terms of norm setting and moving towards an end goal where hopefully we can start to tease out the issues and understand what we can and what we should be doing.
So I will leave that there for now.  I hope that that was useful, and I look forward to this discussion.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you very much, Lisa.  On a personal behalf, I must say, as a French government representative, I have been following the work of the Dynamic Coalition on Rights and Principles from the very beginning, and I want to present, just as a side note, a personal testimonial on the difference between the multistakeholder process in the IGF and what would have happened in a purely intergovernmental framework.  In the very early discussions on the opportunity of a charter on internet rights, or an internet bill of rights, there was a tension over whether the notion of designing a charter would actually reopen the can of worms of the balance of the Universal Declaration of Human Rights, and actually result in a lower standard.
And for that very reason, on a personal basis, I was not very warm to the idea of a new document, because the principal aim was to reaffirm the existing rights and make sure they are fully applicable to the internet.  One of the beauties of the IGF process is that if we had been in an intergovernmental discussion space, we would have argued for ages about whether we should begin the drafting of a charter or not.
In the IGF space, what happened was that the people who really thought it was good to do it gathered in the dynamic coalition, in full view of everybody, and proved by doing that the work they are doing is useful.  I attended the workshop this morning, and the result of this work is absolutely fascinating in terms of the attempt at balance between merely reaffirming the principles, which is very good, finding the concrete applications, and distributing the roles.
So it is certainly not perfect, but I could imagine that in the course of the successive IGF processes, it will improve even further.  So I want to encourage you, in the session on security, openness and privacy, to not only mention the content of what you have done, but to illustrate the process, because it is a very good example of what dynamic coalitions can do.  So like the previous comments, I would like to pick out of what Lisa said a first question, which is: what could be the future of this document?
When would it be endorsed more officially?  What could be the format for making it a very recognized document?  Are we looking at bringing it to one single intergovernmental space?  Are we thinking about getting endorsement by several groupings?  Is it about getting a critical mass of actors that will grow?  So what is the endorsement mechanism for something that has been drafted in a multistakeholder format?  Because we have now explored the decision shaping, the drafting; we still face the question of how we give weight to something that has been elaborated in that manner.  So that's the first question.
The second thing is: do people feel comfortable and see benefits in the three levels that Lisa mentioned, the rights, the principles, and the actual roles?
And the third question is: could we discuss a little bit more whether, in the implementation, we should identify -- and this goes to the second part of your comment regarding the role of intermediaries at the workshop this afternoon -- a distinction between the roles of intermediaries depending on whether they are communication intermediaries like ISPs, hosting intermediaries like YouTube and so on, or space intermediaries like Facebook, which are common spaces.
It's just a question.  I wonder whether there are different constraints for these different types of structures.  That being said, I had hoped to have somebody from the Council of Europe, but apparently they could not manage to reach us for those workshops.  Is there anybody else who wants to give an input regarding a workshop where they speak or that they organize that is connected to the topics we are discussing?  Anybody?

>> Actually, we are also going to organize another workshop this afternoon.  It's on privacy and social networking, but, you know, UNESCO is an organisation with a special mandate.  We are focused on promoting freedom of expression, so our approach is to look at the interrelation between freedom of expression and privacy; we want to see how we can protect both rights equally, without compromise.  I don't know, is that possible?  And I also welcome you to participate this afternoon to see what will come out of the debates.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you.  So I had another question that we might explore, which is whether there is a tension between principles that we want to protect, and this is an issue that pops up very frequently in the implementation of the human rights declaration.  Privacy and freedom of expression is one example, but there are probably other cases.  On a personal note, I will also mention later on the Franco-Dutch initiative on freedom of expression on the internet.
We will have a ministerial meeting in October, and I will give additional information in due course.  So I would suggest, unless there are specific questions at this stage, to open the floor for, let's say, an hour now, maybe until a little bit before 1:00, 45 minutes for discussions and comments on the topics I identified and any others that you might have in mind.  Do you want to speak?

>> Yes.

>> BERTRAND de LA CHAPELLE:  I'm sorry, I didn't hear.

>> I would like to have the floor.

>> BERTRAND de LA CHAPELLE:  You have the floor.  I'm sorry.  Introduce yourself, please.

>> Thank you, Mr. Chairman, ladies and gentlemen.  Sorry for the loudspeaker, maybe nobody can hear me.  But the problem of social networking is --

>> BERTRAND de LA CHAPELLE:  Introduce yourself, please.  Excuse me, could you introduce yourself first, please.

>> My name is Andre Sherbos, Higher School of Economics.
The issue of social networking is extremely important in the development of the internet nowadays, because with Web 2.0 technologies, a lot of websites, a lot of portals, a lot of very important connective links on the internet are built on the ground of social networking.  So I see personally that social networking is a new stage in the development of the internet, and it is quite a new level of internet governance.
We shouldn't forget that in social networking there is a community of users, a community of responsible users, who are creating, moderating and following their rules of behavior on their social networks.
We need a moderator because if someone exhibits destructive behavior on a social network, he will be banned by the internal sanctions of these web resources.  Wikipedia, for example, is one example of this self-governance of web resources.  So our task here, and this is strictly my opinion, is to help educate and grow those communities of users of social networks.  For example, it would be useful to provide some guideline rules for networks like that.  This would help to observe freedom of expression and information accessibility without infringements of the personal information that ends up on these social networks.  Our task is to assist them.  This is strictly my opinion.

The other problem is that we have other types of websites, and what do we do when there is no community of users on the relevant websites, when there is a deliberate effort to infringe freedom of expression there based on, for example, political or other grounds, using hate speech, racist or other expressions?  We should do something about these websites, because in the Russian network, in the Russian internet, we do not have such a high level of culture of communication within the network.
So I think that the issue of information culture is very important at this stage.  Thank you very much.

>> BERTRAND de LA CHAPELLE:  Thank you, thank you very, very much for the contribution.  And this allows me to make a sort of methodological comment here.  In this limited time frame, we are not going to address the issues in substance.  The goal is to have the different headlines to frame the debate and make sure that we cover the different dimensions.  This is why I appreciated very much the comments that were made before, because they provide structure to the discussion.
In that respect, I would like to pick from your intervention some additional elements that we had not covered before and that I would personally suggest are very interesting.  One is the notion that with social networks, Web 2.0, social media, we are entering a new stage: that somehow we have been moving from the internet to the web, and that we now have something we don't name yet, but that is a social net, a real cyberspace where people interact.  Whatever we name it, I would like to get your feedback on whether we do feel this way, that there is a new stage.  The second thing, which is very, very interesting in what you said, is the notion of the internal rules of social spaces, the fact that there is the development, in a certain way, of governance rules within each network.  So the notion that somehow each network, Facebook or YouTube and so on, is becoming a space in its own right, with its own governance rules inside.  Some are more restrictive regarding what you can post.  Some are more open.
You may have followed online the debate about Craigslist, which had a very open posting policy, and which is now beginning to restrict some of the most offensive things.  So the second notion is the notion of an internal governance framework for each entity, and the third point, which is connected to the second -- right on time -- the third point is: what is the degree of involvement of the users and the members themselves in the governance framework and in establishing the rules?  In Wikipedia, there are a lot of layers regarding participation.  Just one question, could someone try to find a seat for Thomas?  Yes, Thomas, there is one here.
So in Wikipedia, we have the participation of the users, but in Facebook, the users do not yet participate in the establishment of the terms of service.  So I won't go into more detail.  Three elements: one, is there a third generation?  Two, how do we name the governance frameworks for each space?  And, three, how do we address the right to participate, to take the charter's formulation, the right to participate in governance mechanisms internally?
Thomas, can I briefly grab you at this moment to ask you, as a short element of presentation, to indicate which workshops the Council of Europe is organizing here, just in terms of the headlines and angles of analysis, so that we can finish the round table and open the discussion a little bit further?  Go ahead.

>> Hello to everybody, I'm not really prepared yet.  I have been busy with other urgencies.  Actually, maybe I'm forgetting a few things, but there is one which is particularly important, which is a workshop merged by the Council of Europe, the Electronic Frontier Foundation and others about internet intermediaries and the question of responsibilities and so on and so forth.  The contribution of the Council of Europe reflects the work of the Council of Europe in the implementation of the ministerial conference, where we are about to develop a new notion of the media which should cover not only the traditional mass media but a whole range of actors in the media field: actors which might not be media in a traditional sense but pure intermediaries with no media functions, and all of the layers in between, where some actors take on some functions of the media.
And this is a discussion that we think is very relevant, not only in Europe but also for other regions, and this is part of the discussion of the workshop on internet intermediaries that is going to take place this afternoon.  Maybe this workshop has already been mentioned.  Maybe there are some other issues or workshops that the Council of Europe is doing, but I would have to get my mind more focused.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you very much.  I pick up on one element that Thomas was mentioning: a discussion that took place yesterday where the head of an audiovisual and media regulator in one country was saying, we are confronted with a problem: we are giving licenses for TV and radio in our country, and at the same time on the internet, people are setting up radios and broadcasting systems through YouTube channels and so on.  What should we do?  Should we expand our regulatory role to the internet, which is not an appropriate way?  And at the same time, how can we continue to give licenses to people when you can go around them with another mechanism?
So I think this notion of what is changing in the media space is important.  Are there comments to add to the discussion?  You first, then you, then you.  It will be hard because I don't know your names.  Go ahead.

>> Okay.  I'm sorry, is this on?

>> BERTRAND de LA CHAPELLE:  Yes, it is.

>> I'm Lilian Edwards, professor of internet law at the University of Sheffield.  I am here doing some work with the Council of Europe and the OECD, but I'm speaking purely as an independent academic, so I can perhaps speak more plainly than some.  I welcome the contribution from the EFF, but I'm a little disappointed that it doesn't go farther, I have to say.  The gist of what was said about the right to informed decision making appears to be really a slightly enhanced version of notice and choice, which has been enshrined in European data protection for a very long time and seems to have made almost no difference to the problems, especially around privacy in social networks in Europe, despite the fact that the major networks do claim to adhere to data protection law.  In some discussions with the European Commission around the upcoming reform of the data protection directive, the idea has been floated that this is essentially a consumer protection problem.  As we well know, users do not read terms of service; even if they did, they probably wouldn't understand them, and, as we have said, they wouldn't be aware when they were changed.  You can deal with that problem, but the problem remains that this is not really the way to approach it, and I think we are all aware of stories about how people downloaded games whose terms said you were signing away your soul, for example, and this didn't prevent anyone from downloading and using the game.
One approach that I am in favor of myself is to regard this as rather akin to an unfair terms problem, and the upcoming European approach to that is one of blacklists, whitelists and graylists of acceptable clauses.  So, for example, model clauses for social network contracts, of course with different categories for different intermediaries, could be developed with industry in a co-regulatory way, in which certain forms of information gathering, data sharing and targeted advertising would simply be unacceptable, or perhaps acceptable only in certain circumstances or for certain target audiences, perhaps not children, for example.  And that, of course, pushes the problem one stage back, to endorsement, as in who will take on the job of auditing the contracts of the social networks and making sure that they do what they say, but at least it's a start.
And I think it's a much more sensible way to start than going on with this 19th century fallacy that 18-year-olds who want to go to parties are going to read terms and conditions and make informed choices.  That's one I feel strongly about.  A couple of other points: data portability is one that comes up a lot, but it has to be expressed as interoperability, because it makes no difference if you can take the data and leave Facebook if your other five million friends don't leave Facebook.  One piece is an interoperability standard that works like OpenID on speed, and secondly, a French concept, the idea that you should be able to delete any data on the network, the right to forget, which I think is very much the way to go.
Thirdly, privacy seals: that is really an approach for which there is no evidence it ever worked.  Indeed, there is contrary evidence: you could correlate the existence of industry seals or privacy seals with the prevalence of that site being involved in consumer fraud, because in fact people just cut and pasted the privacy seal so as to entice consumers to use the site.  My own idea, to throw in a web 2.0 word: what we should be looking to is accreditation bottom up by the users.  We know the Slashdot effect; people vote on reputation with their ratings, and that seems to me to be the way to do it, rather than looking to a top-down model which is 20th century and broken.

>> BERTRAND de LA CHAPELLE:  Thank you very much.  Sir?

>> Hi.  John Laprise, I'm a professor of communication at Northwestern University.  I'm speaking as an independent academic.  I want to echo some of the comments I just heard, and also I'm concerned about the EFF's perspective.  First of all, is there anyone here who actually runs a social networking site, a for-profit social networking site?  Okay.  So when we are talking about limiting the use of personal data, you are attacking a for-profit company's profit centre these days.  You have to make really strong arguments to those companies to get them to voluntarily reduce their profits.  What is your argument for that?  I have heard EFF went against the idea of regulation, which I can understand for the reasons given; however, barring regulation, how do you convince Facebook to voluntarily give up a percentage of their profits, for instance?  I'm very concerned that the whole idea we are approaching here about protecting individual data doesn't have a primary stakeholder at this table or at this meeting, which is the social networking companies that we are concerned about.  We need to bring them to the table to discuss how we can do it in concert.  Because if we don't involve them, barring regulatory sticks, they are going to be unwilling to cooperate in this regard.  I don't see any compelling reason for them to do so.  On a side note, I would disagree with the previous commenter regarding the right to forget.
I think, again, for the same reasons, for profit industry is not interested in that, and to a large extent government is not interested in that.  While it's an interesting idea, I just don't see the practical support for those ideas in the sectors that actually hold that kind of information.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you very much.  I will just piggyback one moment on the last question you asked regarding the right to forget, because, as you know, this is something that the French minister for the digital economy is pushing.  She announced yesterday, in bilateral meetings and in the plenary, that we have a charter that will be signed in France among the actors, including large social networks, on this question, clearly defining the two aspects we are dealing with: behavioral data, that is, the involuntary data being collected, and voluntary data you have posted.  So these are the two legs.  I can give you more information about it, but the argument is not necessarily that the companies do not want it or will be hurt in terms of their profit.  The argument can be made that if they do not participate in that, they will be hurt in their profit because of the disaffection of users.  So it's a give and take, but I agree it's an important subject.  Can you go next?  And then I have you, and you, and you.

>> Thank you.  Kurt Opsahl, Electronic Frontier Foundation.  I guess I will start by responding a little to some of the comments made earlier.  First of all, I tend to agree: no one who wants to go to a party is going to read the EULA, and one of the biggest problems with EULAs is that people do not read them.  To be clear, what we are talking about in terms of the right of informed decision making is more about the controls that are available within the service, on how it is used.  When you post something: to whom is that information available?  Is it available to everyone, to friends of friends, to your friends, and what does that mean?  And if your friends have access to it, what does that mean?  So it is not so much about the EULA.  That is more encompassed in the right to control.  So if you have the right to control how your data is being used on the service, then you need to also have the right to informed decision making so you can effectuate that control.  If the EULA takes away controls and says that in some circumstances you don't have the control, then you are running up against this right to control and have a problem there.  So just to clarify what we mean by those two rights.
Now, turning to the question: how do you convince the social network to give up profits?  This is, of course, a challenge.  Part of what many of these companies are trying to do is to use the data provided by the users in order to advertise to those users and then generate profits based on the advertising.  But nevertheless, we have had some very substantive and effective conversations with a variety of social networks promoting this bill of privacy rights and have received a pretty good reaction in our face-to-face meetings over this.
It is too bad that there are no representatives here right now; I think it would have been a more interesting conversation to have their perspectives here directly.  Those are private conversations, so I don't want to get into the substance, but I would say that one of the concerns they have is that they have to have the trust of their users.
And Facebook is, you know, a very prominent example, but I think you can also look at the reaction to Google Buzz when it was introduced.  Google has Orkut, which is popular in some countries but is not very competitive with Facebook, and they said, okay, let's come out with Google Buzz as a method of competing.
And unfortunately for Google, right out of the gate, the news was all about the privacy problems associated with Google Buzz, and as it turned out, Google Buzz did not end up being a particularly potent competitor to Facebook.  That sends a message not just to Google but to a lot of different social networking services.  If they want to compete, get out of the gate and entice users to come over en masse to a new service, then they have to think about the privacy implications and make sure that, walking out the door, they are not going to be hit with a privacy controversy.  So this makes it worthwhile to have those conversations, and hence they are not giving up profits by implementing privacy protections but actually making profits possible by having a service which successfully gets off the ground.
Briefly, there was one other point that I wanted to make.  We were talking before about different types of governance, and I just wanted, for purposes of discussion and structure, to note: you have the Craigslist communitarian approach, where members of the community will flag things for review, so it is the will of the community, as reflected by the amount of flagging, that determines what gets taken down.  And there is the more structured communitarian approach of Wikipedia, where they have processes and due process.
With respect to social networks, I did want to point out an experiment that Facebook did try.  They have something called the Facebook principles.  They had a process: they sent out a statement of principles (you can find them on the Facebook web site), and they went through a feedback and then a voting process, but it did suffer from the fact that very few users were active in that process.  It ended up passing, of course, not because a large number of users, a majority of users, voted for it, but because a sufficient number of the few that participated did.  So I'm not sure this was particularly democratic.  But it is a form of internal governance worth discussing, and there are ways of expanding upon that idea where members of a community could participate in the development of the principles associated with that community.  Thank you.
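The turnout problem described in this experiment can be made concrete with a small sketch.  The numbers and the 30% quorum are illustrative assumptions (Facebook's site governance votes reportedly required roughly 30% of active users to participate for the result to be binding), not a description of any platform's actual code:

```python
def ratify(votes_for, votes_against, active_users, quorum=0.30):
    """Classify a community vote: binding only if turnout meets the quorum."""
    turnout = (votes_for + votes_against) / active_users
    passed = votes_for > votes_against
    if turnout >= quorum:
        return "binding-pass" if passed else "binding-fail"
    # Below quorum the result is advisory only, however lopsided the tally.
    return "advisory-pass" if passed else "advisory-fail"

# Hypothetical figures in the spirit of the vote described above: a few
# hundred thousand voters out of hundreds of millions of active users.
print(ratify(450_000, 150_000, 200_000_000))  # prints "advisory-pass"
```

Even a 3-to-1 majority carries no binding force at that turnout, which is why the speaker hesitates to call the process democratic.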

>> BERTRAND de LA CHAPELLE:  Thank you for the positions and thank you for the categorization of the types of governance.  I would be interested if anybody has additional categories to those three types; maybe there are others.  The last one points towards a joke I make sometimes, that we need a Facebook governance forum to define the terms of service.  Please, and introduce yourself.

>> Do I need to keep it pressed?  No.  Okay.  My name is Clarissa Smith.  I'm an academic at the University of Sunderland.  I'm here at the invitation of APC.  We talked yesterday, in a session on sexuality, about sexual rights and how they relate to privacy and governance.  And I think it's quite interesting that there hasn't been any discussion of that here so far, but it's clear that when we are talking about community standards and how they might operate on something like Facebook, those community standards are often problematic for groups that don't conform to normative sexual practices or whatever.  And while it might be nice to think of the community as policing particular behaviors, and we talk constantly about the protection of individuals and freedom from things like hate speech, community standards are manipulated by interest groups against supposedly "perverse" activities, and I want to make sure that comes out in quotes because I don't believe they are: activities of feminist groups and queer activists, and, also, of course, of people who wouldn't see themselves as politically identified around their sexuality but who practice things that other people think are disgraceful.  So I think that's an issue that needs to be addressed too.  Thanks.

>> BERTRAND de LA CHAPELLE:  Thank you.  I think you are pointing in a direction that is also likely to concern the actions of one specialized social network versus another, because we are going towards more and more specialized social networks.  There was a very interesting article in Bloomberg Businessweek last week on the fact that around Facebook there are many niche social networks, and the more focused they are on specific behaviors, the more there will be clashes with people who have different values.  So this is an element we could take into account.

>> Hi, my name is Janna.  I am an international lawyer from a university in Poland, and I believe this is a question that is directly linked to what we have been talking about.  I wanted to comment on the EFF proposal.  You convinced me that the second right you mentioned is the key to the problems we are discussing here.  You said that you have to be able to decide what your friends do with your data.  This is my question, because I believe this is the question of privacy: what your friends do with your data, pictures, for example.  How are you going to enforce that?  How are you going to settle two 15-year-olds fighting over a picture from Facebook being transferred onto another social network?  We have had that in Poland with a social network where pictures were taken from one site and put online elsewhere.  The operator says: one user said he agreed for me to use his photo; the other is saying I have never given my approval.  So how are you going to do that?  Would this be a voluntary, DMCA-like mechanism among the platforms, where either there is a court suit or not, and if there isn't a court suit, the content is brought back?  Because I believe this is the core of the problem: how is this right going to be enforced and executed?  The comment I wanted to make: as an international lawyer, I believe in international treaties; however, you have convinced me that working with ethics and users voting with their feet, or voting with their data, is a more practical mechanism.  Lisa, I absolutely appreciate the work that you are doing and the document that is being developed, and I admire you for being on top of the information coming your way, but I have a question.  With the issue being raised that written documents will not keep up with crime, you said that one of the aims is producing a document that would be suitable for adoption by the international community.
I believe an international document should be adopted; however, with the threat mentioned, do you think this should be the aim, or should we stick with ethics, convincing the states and communities that these principles should be enforced with ethical means instead of legal means?  This is my question.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you very much.  Lisa, if you don't mind, we will finish the round of questions and then you will have the opportunity to respond.  I think Victoria was the first one.  Given the time, we will probably, unless there are urgent requests, wrap up this round of discussion to try to identify the messages that we can carry forward.  Sir?

>> Sir, I just wanted to pick up on the point of the lady over here.  My name is Brett Solomon.  I work with Access, which is a global movement for digital freedom, and we work with individuals and organizations who have a human rights or democracy agenda.  I guess I just wanted to pick up the pointy end of this issue, whereby human rights activists and democracy movements around the world are increasingly using social networks as their primary form of communication and to advance their agenda.  So if you think about the Iranian green movement using YouTube, or Venezuelan activists using Facebook, or Burmese activists, in and out of Burma, using Twitter: increasingly, the front line of human rights defense is moving online and into the social networking platforms.  And there are a number of examples whereby, for instance (this is an example that happened last week), the green movement of Shiraz, which is a city in Iran, uses Facebook as its primary point of communication and mobilization, because all other avenues are no longer available to them.
And that site disappeared last week, and it disappeared because thousands of people, organized by the regime, flagged it for violating the terms of service.  And there is a process within Facebook, as in many other social networking platforms, which is automated and doesn't actually look into the specifics of the particular case.
And so we have developed some principles for social networking sites: to develop, I guess, a concierge service for human rights activists and democracy movements, a way to flag such accounts to prevent this kind of situation from happening, and also a process so that the services respond quickly, because in many of these cases it can be a question of life and death.  Compared to the kid who wants to go to a party, you know, it's a question of scale.  So how do we develop those processes internally so that human rights activists and advocates are not mixed up with the group of kids who want to go to a party or take down a photo their girlfriend posted?  When it comes to this, we need to differentiate.
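The failure mode and the proposed concierge fix can be sketched together.  Everything here is illustrative: the threshold, the trusted-page list and the function names are assumptions for the example, not Facebook's actual moderation pipeline:

```python
FLAG_THRESHOLD = 1000                       # assumed automatic takedown trigger
TRUSTED_PAGES = {"green_movement_shiraz"}   # hypothetical concierge list

def handle_flags(page, flag_count):
    """Route a mass-flagged page: trusted pages get human review instead
    of the automatic takedown that coordinated flagging would trigger."""
    if flag_count < FLAG_THRESHOLD:
        return "no-action"
    if page in TRUSTED_PAGES:
        return "human-review"    # the concierge path: a person checks first
    return "auto-takedown"       # the failure mode: nobody looks at specifics

print(handle_flags("green_movement_shiraz", 5000))  # prints "human-review"
print(handle_flags("ordinary_page", 5000))          # prints "auto-takedown"
```

The same coordinated 5,000 flags produce opposite outcomes, which is the differentiation the speaker is asking for.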
The other thing is just to say, if people haven't noticed, there are headphone plugs on either side of the microphones.  Not that I'm trying to create two classes of people, but if you can get to one of these, you can plug your headphones in and get a direct feed.  Thanks.

>> BERTRAND de LA CHAPELLE:  Thank you very much for the last comment.  We should have come with our iPods and other plugs.  That's a good recommendation.  And I would be happy to give some light input on the notion of the concierge service, because this is one of the things that we will discuss in October in the Franco-Dutch initiative on freedom of expression: how to network people who are monitoring human rights implementation in various countries in the world.  I think the next person was Victoria.

>> Thank you.  First, let me briefly state the scenario in Italy.  The point is that the broad majority of people between 15 and 25 in Italy don't use the internet anymore.  They don't use email, they don't use the web, they only use Facebook.  So Facebook is the internet for them, and it is the only instrument through which you can talk to these people, through which you can reach them.  And as you know, Italy's media is internationally recognized as not free.  It is censored, so Facebook is the only non-censored media we have.  It's used for free speech and spreading news.  There are fan pages on Facebook which have 300,000 people signed up, and they are among the top ten newspapers in Italy.
So there are some additional aspects, I would say, on Facebook coming from Italy.  We should really be concerned, because the problem is that all sorts of bad things are happening on Facebook and there apparently is no way to stop them.  There were, for example, profiles and fan pages for mafia bosses, and apparently the mafia was using Facebook to recruit young mobsters, and the reply from Facebook was: yes, it's illegal under Italian law, but we are not subject to Italian law and we don't care.  So even if they have 10 million users in Italy, they don't care.  This maybe is getting better, as they are opening a representative office in Italy, but their reply to this was that we should have a law to regulate the internet.  So the risk is actually irresponsible behavior by the people who manage social networks.
And on the other hand, there are other problems, such as people who have spent years building up their profiles being removed without any advice, without any notice.  You don't even know why.  Some people really do it as a job.  So there are people making a living off managing Facebook profiles, and all of a sudden their profile is gone and they are out of work.  If you contact Facebook, you get an automated reply which effectively says: we don't care, because it would cost us money to have someone manage your case, so we would rather just close your account.  My message actually is that we have to be very careful, because it's not just a matter of privacy.  It's not just a matter of competition.  It's a matter of free speech.  It's a matter of people actually making a living.  And if anyone imagines that competition can solve this, you are off track.  Social networks have a value which is exponentially dependent on the number of people that are there.
So you will just sign up for the network your friends are on.  Once such a network gets the market, it gets it all, and even if the terms and conditions are bad, you will still use that one and not any other social network.  Maybe there will be another one that is so much better than Facebook that it will replace it, but it will still become a monopoly thanks to this mechanism.
The final point is that we think of these as social networks, but in fact they are social databases, because it's not a network: everything is centralized in a single point.  So these services are really subverting the architecture of the internet, and this is a point that not many people realize, but they are changing the nature of the internet, and so we should be doing something to prevent this from happening.  Thank you.

>> BERTRAND de LA CHAPELLE:  This, I think, feeds into the previous comment regarding the third stage.  It is an enhancement of that, and thanks for insisting on the age element.  So I will close this round: I have Thomas, I have you, and I have Hong, and I will finish after you.  I would just like at this moment to explain the next step before we finish.  I would like to finish this session by asking every one of you, one after the other, to give two or three bullet points, among the things that have been discussed, that you would like to see highlighted in the main session.  Just a bullet point among our list of different points that we have identified, to get a weighting of the priorities that you see if we have to make clear messages.  So we will finish the round of comments.  We should be there by 1:00, and we will have some time to go around the table.  Thomas?

>> Thank you.  I do not know whether I properly introduced myself.  I'm Thomas Schneider, deputy director of international affairs at OFCOM, and I'm currently the chair of the Council of Europe Expert Group on New Media.  I would like to quickly react to what the colleague said about how to bring companies to reduce, voluntarily or more or less voluntarily, their profits for some reason.  I think, basically, in a rule-of-law situation, you have hard law possibilities and you have soft law possibilities, binding and non-binding standards, and there are other aspects.  But to take the privacy aspect, for instance: it is a question of consumer protection law or privacy law whether it's legal to have terms and conditions where, when you upload a photo of yourself to a social network, the photo of yourself belongs to the social network.
I'm not a privacy lawyer, but if I'm not mistaken, Swiss privacy law, for instance, says that unless you are a person of public importance, you have the right over your picture.  So just as you can discuss whether it's legal to have a work contract where you can be fired with one month's notice, you can discuss what the legal scope is for the terms of contract of a social network.  This is something you can legally decide, if you want.  If you think it's not necessary, you don't have to do it, but you can.
Then there are soft law provisions that you can recommend, like the notion of the right to forget.  The Council of Europe in 2008 adopted a declaration of the ministers on the fact that children and young people should have the right to get content deleted that might have a negative effect on them in their future life or in their growing up, and so on, so forth.  Of course, this is an illusion nowadays, but there are some technical things, like expiration dates on a picture, or watermarks, or other things you can do.  They are not 100% efficient, and as a government you can decide to create a law that says every person has the right to put an expiration date on any content he puts on the internet, or you can recommend that operators of social networking sites allow the possibility that if people want to put something online, they can decide: I put this online, it should stay two months, three months, one year, and then it should destroy itself or become invisible.  It's also about consumer control, or consumer protection, or consumer empowerment.  There are different levels of how much you interfere in a market.  And with regard to self-regulatory regimes or possibilities, there is, for instance, an example in Germany where one of the biggest German social networks voluntarily engaged in a self-regulatory exercise with the government: they defined principles on how they treat customers' data and how they try to help the users to take informed decisions.  They have some principles that they commit themselves to, and this also fed into the work that we are doing at the moment at the Council of Europe.  We are developing guidelines for social networks, as we have developed human rights guidelines for ISPs and other providers; social networks are one case.  And these are guidelines that we are drafting together with the industry.
It's not that governments impose something and say you should do this; rather, we want to help the industry help the clients to take better decisions.  That's basically the aim, and it's not something you can do against the industry; you have to see where the common ground is, where there is a mutual interest to support the people and help them to move more freely and better informed on those web sites.  And there are other aspects regarding media.  For instance, in the case of this fight, when two users fight over a picture, there is the question: is there a responsibility of the operator to take a decision?  And if we call them social media, then we call them media, and as in a traditional newspaper, if something is on paper and somebody doesn't like it, you have the right to reply.  That is more or less clearly or loosely regulated from country to country.
So you could think of developing something like a right to reply for the people concerned.  How this is done in practice is another question, but you could think of it.  You could think of saying, okay, maybe in a market of social networks, some social networks would say: in case of fights between users, we develop procedures, mediation procedures, and so on, so forth.  You can support this from the government side, or you can force people to do it; there are different ways, different possibilities, and I think we should think about the pros and cons of the different options in these cases.  Thank you.
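The expiration date idea Thomas mentions can be sketched as a simple visibility check.  The field names are illustrative, and a real deployment would also have to deal with copies made before expiry:

```python
import time

def post(content, ttl_seconds=None):
    """Store content with an optional user-chosen expiration date."""
    expires = None if ttl_seconds is None else time.time() + ttl_seconds
    return {"content": content, "expires_at": expires}

def visible(item, now=None):
    """A post past its expiration date becomes invisible (or deletable)."""
    if now is None:
        now = time.time()
    return item["expires_at"] is None or now < item["expires_at"]

# The user decides: this photo should stay online for three months.
photo = post("holiday.jpg", ttl_seconds=90 * 24 * 3600)
print(visible(photo))                                    # prints "True"
print(visible(photo, now=time.time() + 91 * 24 * 3600))  # prints "False"
```

The point of the sketch is that the user, not the operator, sets the lifetime, which is the consumer-empowerment framing Thomas describes.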

>> BERTRAND de LA CHAPELLE:  Thank you, Thomas.  I add to the list the combination of soft law and hard law, which could distinguish what is completely forbidden, what is encouraged and what is enabled; the distinction you made between consumer control, consumer protection and consumer empowerment, which is also a categorization; and the final point regarding the right to reply.  Just as an example, we are facing a problem: if somebody sends via a tweet something defamatory, and this tweet is retweeted, and then you make a correction, the correction will follow the first step but not necessarily be retweeted by the people that propagated the comment.  So how do you handle that kind of thing?  So I think it's you, and then you, and then finally the gentleman there.
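The retweet asymmetry the chair describes can be illustrated with a toy cascade; the graph and the names in it are invented for the example:

```python
# Toy retweet graph: who retweeted the defamatory tweet from whom.
retweets = {"author": ["b", "c"], "b": ["d", "e"], "c": ["f"]}

def reach(start, edges):
    """Everyone the message cascades to from `start` via retweets."""
    seen, queue = set(), [start]
    while queue:
        node = queue.pop()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

original_audience = reach("author", retweets)
# Nobody retweets the correction, so it travels only one hop.
correction_audience = set(retweets["author"])
print(sorted(original_audience - correction_audience))  # prints "['d', 'e', 'f']"
```

Here d, e and f saw the defamatory tweet but never see the correction, which is exactly the right-to-reply gap being raised.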
>> Hi, I'm Reynolds from the Virtual Policy Network in the U.K.  We specialize in computer games and social media.  As an independent scholar, I write about the ethics of technology, and quite specifically about things like community-based norms.  I think in the discourse about social media we tend to get hung up on the specifics of ever-changing new technologies, whereas I think there is a lot to learn from early internet standards and some of the things done in data protection law.  Some things are really pretty much invariant across applications.  So just as we have data protection, we can look at things like transparency of search.  The search ranking that Amazon does, that Google does, that Facebook does: they may do it in different ways, but there are public principles of transparency that could be applied to all of them.  So I think the way to approach this is to look at horizontals, and also to talk to the technical community and look at whether there can be community standards such as data portability.  I think we will then understand where the gray areas are, and the gray areas are the areas where it could be left, and possibly should be left, for communities to decide.  So Facebook versus another social network may be able to decide who can determine whether a comment on somebody else's wall, or the technical equivalent, is deleted or not.  And I think with that systematic type of approach, we will stop worrying about very, very specific cases which, by the time we have written anything coherent, will have disappeared along with that particular technology.  Thank you.

>> BERTRAND de LA CHAPELLE:  Thank you very, very much for this.  I take from your comments, and I add to the list, the notion that I could qualify as virtual territories: basically, those social networks are, in a certain way, territories in cyberspace.  And the notion of invariance, which means going to higher principles, or more permanent horizontal principles, which I think connects very nicely to something that was said at the very beginning regarding the predictability of the framework, so that users do not have to worry about constant changing of the rules.  The more invariant the principles are, the more understandable the evolution or applicability of the rules can be.  So I just wanted to make the connection.
So finally, please.  And then you.

>> Thank you.

>> No, no, wait a second.  First Hung.

>> Yes, we have talked a lot about standards setting.  We would really like to be very cautious, because we work with members and we know how difficult it is to get any standard set and adopted.  Another point is that the internet runs globally, but regulation is still national.  And it's really important to get the involvement of the companies and industries in this process.  We had the Google representative speak at our workshop yesterday, and while we were talking about the liabilities of the company, they had a different idea: they have been asked to restrict, to accept limits, when they run their operations in various countries.  So we think that governments and companies should really work together to promote free expression and protect privacy.  Our workshop will take place this afternoon at 4:30; we will have the Facebook and Google representatives on the same panel to talk about social networking and privacy.  When I was organizing the workshop, I did try to bring a representative from China, because, you know, in China, not unlike Italy, the social network is everything.  But it's very difficult to get companies from developing countries.  I mean, the biggest social network in China, Qzone, has 500 million users.  You can only imagine; it's such a huge company, and most young people use it all of the time.  They receive news, they make friends, they create accounts, but the company is not aware of its responsibility.  They are not interested in talking about public policy.  That's a real challenge for many companies, I presume, in developing countries.  So I will stop here.  It's an ongoing discussion, and that's why we are here. 

>> BERTRAND de LA CHAPELLE:  Thank you.  There were two points.  One is the articulation between national laws and universal standards.  And the other point is basically the dimension of the cultural environments, with the very large social networks having a weight that is basically connected with a cultural environment, like a script basis.  There is a huge Chinese social network, whereas on Facebook things are more in Latin script or Latin characters, and whether this constitutes an evolution that has to be taken into account, with different governance regimes, is probably an element to discuss.  Sir, you have the last comment before I give Lisa the opportunity to answer the questions that were asked before.  Thank you, go ahead.

>> Just a short feedback on the comment of Vitolo of Italy.  There is a problem with the unconventional use of social networks.  In Russia, as in China and other countries, we also have a national social network.  It is named VKontakte, "In Contact", and it has a lot of unconventional things posted there: for example, a lot of child pornography, a lot of animal cruelty materials.  So there is a problem: we need to regulate social networks legally, and we need to ensure a balance between self-regulation and this unconventional use of materials.  As we raise the culture of the users of social networks, we can move the balance toward self-regulation.  But before that, we need to do something to prevent this unconventional use.  Thank you very much.

>> BERTRAND de LA CHAPELLE:  Thank you very much.  I'm picking up the word "legally" here, because it's directly connected to Article 29 of the Universal Declaration of Human Rights, which says that any limitation to the exercise of freedoms has to be done by law, in pursuit of securing the rights of others, and so on.  I think this is a major distinction that I want to insert into the whole debate about freedom of expression and other freedoms on the internet: the clear dividing line is not between those who control and those who do not control.  The clear dividing line is between those who control according to a due process and a legal process, and those who control without a due process and a legal process.
So Lisa and Court, maybe you can answer the questions that were asked and then once again I will recap the main topics that we discussed and ask you to weigh in on the ones you would like to be put forward in the main session.  Lisa.

>> Thank you.  Apologies in advance if I don't answer the question, because I didn't hear it very well; I am falling into Brett's second-class citizenry of not having the headphones.  Bertrand, I think I might ask you to quickly clarify the question.

>> Quickly and loudly.  What was said is that this bill of privacy rights proposed by EFF cannot be enforced as a legal document, because it will change too quickly.  Aren't you afraid of that?  I'm a big fan of international treaties, don't get me wrong, but isn't it the case that by the time you elaborate the document and it is ready for other countries, it will just be outdated?  Is that a risk you are considering?

>> Yes, thank you, that was an important clarification, because I was about to answer a completely different question, so that's useful.  Yes, it's a consideration that we very much discussed within the coalition.  And that's why, I guess, the underlying principle of technology-neutral regulations and principles, where possible, comes in.  That's what we have been trying to do with the charter in terms of the three tiers I mentioned before.  So tiers one and two define the universal right and how it applies online, and then go the next step down in terms of general principles, trying to be as technology-neutral as possible.  And then it is really in the third tier down, where we are actually teasing out the specific issues and talking about roles and responsibilities, that we feel we can't be so technology-neutral, and we have to be addressing specific technologies, specific issues, specific platforms.  So we envision that third tier to be a rolling document that will always need constant updating.  But also, I would like to think that we are trying to build into this approach the notion of bottom-up norm setting as the primary approach of what we are trying to do, rather than that notion of top-down legislation and regulation.  We are hoping, through the very process of what we are doing, bringing people together, to see if we can shift values and start to share common values.  I think that is sometimes overlooked: for me, human rights are very much about the values that underpin them, so for me that's the starting point, the equality of human dignity.  
They are universal and technology-neutral, and we sometimes overlook that, so it's about a dialogue to create the shared values that we have.  And maybe just quickly, in response to some of the things I have heard: I think what is different about the internet environment, and sometimes overlooked and important to take into our session tomorrow, is the importance of openness.  We must never forget that we are where we are today on the internet because of the principle of openness, and in a way the generativity of the internet, the innovation and creativity that it spawns.  So while we must always be protecting human rights online, we also have to remember the notion of fulfilling, of fulfilling freedom of expression in its expansive sense.  For me, underpinning that is the principle of openness.  One of the ways we can fulfill the right to freedom of expression is maintaining openness whenever possible, so we have an internet that is truly creative and empowering, that people can use to achieve the goals that they really want to achieve.  I wanted to add that on the end there.  I think that what is new with this environment is the principle of openness.  If we can keep that, then I think we are going a long way towards fulfilling human rights online.  Thank you.

>> BERTRAND de LA CHAPELLE:  Before giving the floor to Kurt, one point on what you said, to maintain a connection with other comments that have been made.  I sense in the discussion, and we have seen outside, a trend towards the re-closing of the open web.  We had AOL, which was closed and then opened into the web, and now the social networks become the window or entry point; not to mention the iPad or the iPhone, where a selection of apps gives us new entries into something.  There are spaces on the internet that become more closed, with their own internal rules.  So when you talk about openness, I would connect this, in that context, with the notion of interoperability that has been mentioned before, because beyond portability there is the question: can we circulate from one of these virtual territories to the next, or is there going to be a barrier that says, no, you cannot take your data, and so on?  So I just wanted to make the link.  Kurt.

>> Kurt Opsahl, EFF.  Thank you for mentioning the hard problem of when you have competing equities in a piece of information.  This is actually going to come up a lot within social networks.  There are going to be competing equities between freedom of expression and privacy, where one entity wants to express a true fact about the world, but that fact also implicates another person who may have a privacy interest in having that fact not expressed.  This comes up in photos, as in the example you used.  It comes up in address books, where what many people think of as a piece of their information, which they would want to bring from one site to another, consists entirely of the information of other parties.  It also comes up with location-based services, where often what is interesting is not just you saying that you are at a particular location, but who else is there with you.  I am at the IGF forum with these people.
And that is information about what I am doing, but it is also information about what they are doing, and it brings that tension into play.  So in terms of how to think about that, one way is to make a subdivision: rights as against the service provider, versus having an issue with another user.  To put it another way, it's when the service provider is enabling something, versus when the service provider would have to prevent something.  Enabling would be a situation in which there is a third-party application that asks your friend: can I get access to the musical selections of everybody in your network, so that I can tell you what music you might like?  That is a situation where the service provider is enabling it.  The second would be preventing: they have an open text box in which someone can write any string of text, and that person might want to post some private information, so in order to implement a control, there would have to be a way of stopping that from going into the open text box.  And I think the former is what we are speaking about in terms of the right to control, the right to control vis-a-vis the service provider.
And some of the thinking on that borrows a little bit from the rubric of friendship that is part of the construct of social networks.  The notion is that these people are your friends, and you could debate whether they are really your friends or just social network friends, but there is a notion that these are people you have a relationship with, and those people probably should respect your wishes.  If you are actually a friend with somebody and they said, please don't put this photo of me online, it would be the polite, social-norm thing to do to not put the photo online, and the rude, norm-violating thing to put it up against their wishes.
Perhaps there is also the freedom of expression angle to that, and one way of trying to balance that concern would be an override.  You could say: well, I don't care about their wishes, I am going to put it up anyway, and the system could send a notice to your so-called friend and say, it turns out your friend isn't very friendly; maybe you want to defriend them or take some action.  That would allow social norms that exist in the real world to be replicated.  But as an initial position, it makes the most sense to try to replicate that norm of friendship, and have the system that enables transfer of information and enables friends to share information also respect the friends' wishes about that information.  With respect to user-to-user disputes, I think it is perhaps too dangerous to have a system that would sort through everything that is posted, try to determine whether there is potentially multiple-equity information, and take action before it gets posted.  I would come down on the side of freedom of expression there; freedom of expression gets the edge.  The information does get posted, and if there is a controversy between the users about the propriety of posting it, then that controversy would be resolved subject to the applicable laws affecting those users' freedom of expression rights and privacy rights.
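The override-with-notice mechanism Kurt describes can be sketched in a few lines: by default the friend's wish is respected; if the poster overrides it, the subject is notified so real-world social norms can take their course.  This is a minimal illustrative sketch only; `User`, `share_photo` and the notice text are all hypothetical, not any real platform's API.

```python
class User:
    def __init__(self, name):
        self.name = name
        self.blocked_photos = set()  # photos this user asked friends not to post
        self.notices = []            # notifications delivered to this user

def share_photo(poster, subject, photo_id, override=False):
    """Post a photo depicting `subject`.

    By default the subject's wish is respected (the polite social norm).
    If the poster overrides, the photo is posted and the subject is
    notified, so they can defriend or take other action offline.
    """
    if photo_id in subject.blocked_photos:
        if not override:
            return False  # wish respected; nothing is posted
        subject.notices.append(
            f"{poster.name} posted photo {photo_id} against your wishes."
        )
    return True  # photo is posted

alice, bob = User("alice"), User("bob")
bob.blocked_photos.add("party.jpg")

share_photo(alice, bob, "party.jpg")                  # blocked by default
share_photo(alice, bob, "party.jpg", override=True)   # posted, bob is notified
print(bob.notices)
```

The design choice worth noting is that the system does not adjudicate the dispute; it only surfaces the override to the affected friend, leaving resolution to social norms or applicable law, exactly as the speaker suggests.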

>> BERTRAND de LA CHAPELLE:  Thank you very much.  I pick from what you just said the notion of social norms, which is also going to evolve.  As Thomas said, there is soft law and hard law; in social networks or social media, we can expect to see soft social norms and hard rules.  And the second point, regarding user-to-user disputes: the notion of having dispute resolution systems inside, as part of the construct of the social network, may be something that would be worth exploring.

>> BERTRAND de LA CHAPELLE:  It is a quarter past 1:00, and we are supposed to finish this session at 1:30, so I will maybe skip what I intended to do in terms of going through again all of the different points that I highlighted, because I really would like to give the opportunity, in a very quick comment around the table, for you to highlight, among the points listed, one or two elements that you felt you learned or understood better during this session.  Not necessarily the things that are the most important, but the things that became clarified during this session and that would deserve to be put forward in the main session, so that other people get a better understanding as well.  Can I start in this direction?  And no need for comments on who you are, so we can go around.

>> One of the things that was clarified was some practical solutions to address the gray areas between the analogue and digital worlds, and the policies that should come out of discussions like this to address them.

>> I didn't follow the whole workshop, but one thing is that I keep realizing that things are very complicated when you would like to get concrete, and the solutions are not so easy.  But I think there are several layers of possible ways out, and I am, again, realizing that you should weigh the pros and cons of all layers and try to find the best solutions at an appropriate level for the challenge.

>> Okay.  I'm not sure that anything was made more clear than before, but the most important points maybe would be, first, hard law and soft law, ethics versus regulation; and second, portability and the possibility to take one's data out before the account is deleted.

>> Well, I actually feel the same way, I'm not sure if I got too many things clarified, but one thing I realized during this session is that it's great to create these principles, but what's the next step, and how are we going to implement these principles?

>> I would probably say to move from the concept of portability to interoperability, although I'm not sure how that would work, but that's the idea.

>> Okay, hello.  Three points.  I like the idea of giving a license to private companies.  I would like to see a discussion about how the argument of cutting profits from social networks could work, or what enforcement can be used.  And a third one: I am a strong fan of any opt-in option; I would like to see that developed further.  Thank you.

>> So I mean that the content of privacy is that everybody has a right to decide how much information she or he wants to make public.  And, as was pointed out at the end, the violation of the rights of somebody else is a really important issue in that sense.

>> An issue that was not discussed here today is using social media sites for online propaganda.  For example, in 2007, when the attacks against Estonia were launched, directions were posted on Russian online forums on how to launch distributed denial of service attacks against Estonian websites.  So this is definitely an issue to be dealt with.

>> Just the complexity of the legal environment, both in the remedies and in the differing environments in which the legal remedies may or may not be applied.

>> I liked the issue with respect to the balance in the license agreements for users, so that, you know, in terms of real-world scenarios, contracts are not one-sided in favor of the owners of the social networks.  I think that makes a lot of sense.

>> Lastly, the complexity of the issues: the complexity of balancing the tension between full user rights and the speed at which the technology unfolds, and how we actually use it.  How do we match the two?  I'm not sure how we would do that.  And the other big thing is that the people operating social networking sites aren't here, so we are missing important people from the dialogue.

>> I said before it's not necessarily more clear, but I would point here only the emphasis on user friendly and user assumed policies.

>> What do you mean by user assumed policies?  Like they engage in the development?

>> Exactly.  I would say that "assume" includes understanding, participation, involvement, everything: users being self-conscious about their activities in social networks.

>> I have three points.  The first is national laws versus international standards.  The second is the possibility to delete some information.  And the third is what came up about best practices: I have come to the conclusion that there should be some kind of place, institution, whatever it's called, which gathers the possible best practices of this internet governance, where every country, every company, whoever is interested in this issue, can take a look at how the problem is solved in a particular country or company.  We have taken up so many issues here, but probably there is an issue that is solved in one country and a big problem in another one.

>> I think I would stress that market solutions don't operate well in this arena because of network effects, and, therefore, striving for interoperability is crucial.  But I'm not sure it's technically possible, so we have to involve the technical community in a big way.

>> I think one issue is missing; can you hear me?  Under the governance of social media, maybe there should be not only the rights perspective but also, because the internet is now a major information infrastructure, making sure that every country, all languages, have their local content in the production.  And also, for those who have content, can we make sure that the quality of the content is good enough to inform the society, just like the traditional media?

>> I think there is a real importance for technology-neutral regulation, so that you can really go a step forward and not be, the whole time, one step behind the technical world.

>> Well, I have two points.  First, the right to participate, because really, we have to make sure that the right mechanisms are in place for me, you, anyone to participate.  And the right to be... for me it's like a key concept.

>> Thank you.  I confirmed for myself the complexity of issues related to media, to social media governance; that's the first point.  And the second point: I assured myself that a constructive dialogue of interested parties would be very useful in resolving these issues.  Thank you.

>> I think, to reiterate what was said previously: let's look at common standards and not think that everything is new in this sphere.

>> For me it's really the transboundary nature of the internet and so, therefore, the transboundary nature of the issues and the importance of finding places like this where we can come together to engage in a process of international norm creation and value creation.

>> For me, I think it's around decision makers in the room, you know, like actually I think there is a lot of agreement amongst this group and I'm wondering about the development of communication channels with the people who can actually make some effective changes.

>> Thank you.  For me, I think it would be the right to delete and also these issues around the digital traces that both you have made and also others around you on the network.

>> The need to include social networking sites in decision-making processes with regard to privacy, and the complexity of doing that, especially with respect to the different cultural ways in which social networking sites are being used, and especially with respect to different countries where one has actually taken over; it was very interesting to learn about Italy, for instance.

>> For me: privacy, deletion of data and portability of data, consumer control and empowerment; the internationalization of standards, which is ideal but should be very loosely defined because of the dynamic nature of the internet and the dynamic nature of social media; and the possibility of bridging this cultural divide in terms of social media networks, whether we go in a direction now of having something which would give different nations more control over that social media entity.  Thank you.

>> Okay, I have three conclusions.  The first conclusion, quite clearly, is the need for multistakeholder international cooperation; we need to get involved at all levels internationally.  In order to do that, and that is the second point, we need to raise awareness among the communities and the community members; without that, nothing will be achieved.  And for me, the third point: the only solution we have at hand is ethics, ethics for cyber communities.  This is what you mentioned, with cyber communities being elements of the internet governance system.  Thank you.

>> Again, it's complex, and I'm completely confused about what should happen.  What was missing, actually, is any conceptualization not only of the technology and how fast it changes, but of how people's relationships to questions around privacy, for example, might be changing as a result of the use of social media.  For young people, there may be a very different relationship to their understanding of privacy as a result of their engagement online.  And also, I still think that there are particular issues that are not addressed in terms of the ways in which different governments will mobilize criminality online, as in the two examples we have from Russia and Italy, as a means to increasing regulation or the need for intervention, and it needs to be investigated whether or not that's appropriate.  Thanks.

>> For me, it is the need to deepen ideas around self-regulation of the internet, to ensure that the openness of the internet will continue.

>> Thank you.  I think for me, it's the need to think about the economic incentives of the sites to actually implement the privacy solutions.  And also the need for them to be included in the dialogue itself.  Thank you.

>> I would highlight the issue around the governance and understanding different governance models and the balance between self regulation and formal regulation.

>> Basically repeating some of what was already said, but the legal complexity, especially, and balancing openness and freedom of expression.

>> I really appreciated the feedback on our bill of privacy rights which helps us understand perhaps how to better present it and understand it ourselves.  It is also a good reminder that there are a lot of hard issues involved in doing appropriate governance.  It's hard to come up with a good set of principles and even harder to implement those principles in a way that is going to be effective and achieve those principles, but I appreciated the discussion.

>> BERTRAND de LA CHAPELLE:  Does anybody who hasn't spoken want to take a mic?  Anybody in the back of the room who hasn't spoken and would like to make a comment?  Please go ahead.

>> I think one of the things that worried me slightly is that there was no distinction made between social media and social networking.  A lot of the discussion today was about social networking.  Social media, and the notion of people being able to have a two-way conversation, is an important distinction, and I really worry that some of the restrictions that we plan or favor for social networking will kill off the open internet and social media.

>> BERTRAND de LA CHAPELLE:  Thank you very much.  I would like to wrap up this session, with amazing timing; I'm really impressed.  I want to thank you all extremely deeply for the quality of the comments.  I will not summarize what has been said; you heard the different comments.  I encourage you, first of all, to participate actively not only in the workshops that have been mentioned, the ones this afternoon and tomorrow, but also in the main session on security, privacy and openness, to share what you took away from this session and from the workshops you participated in.  I have the list of the participants.  I will take the transcript in the coming days, not this week but afterwards, try to clean it so that we have background material, and maybe do a summary of the different points that were raised, as a reminder: mostly headlines, not the arguments, but the topics.  My email address is my name, Bertrand de La Chapelle.  I encourage all of you who have taken notes or made personal comments of your own during this session to share them, if you want, with me, so that I can circulate them on the list.  And piggybacking on the discussion that emerged at the very end, there is a lingering question that we need to address, and we maybe will do that online: how do we continue this discussion?
I heard calls for continuing the multistakeholder interaction.  There are spaces; there are international institutions like the Council of Europe, UNESCO and others; there are initiatives by NGOs and others.  How could we build a very light framework coming out of the IGF, not exactly a dynamic coalition, but something that could help thread this discussion until next year?  This is an open question.  I don't have a solution.  I will share it and spread it on the web.  Any feedback is welcome, and I thank you very deeply for your participation in this session.