Privacy and Social Networking

15 September 2010 - A Workshop on Privacy in Vilnius, Lithuania

Full Session Transcript

Note: The following is the output of the real-time captioning taken during Fifth Meeting of the IGF, in Vilnius. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.


>> JĀNIS KĀRKLIŅŠ:  Good afternoon.
Good afternoon, ladies and gentlemen.  I hope you hear me well.
Do you hear me?
So my name is Jānis Kārkliņš.  I'm Assistant Director-General of UNESCO in charge of communication and information.
On behalf of UNESCO, let me warmly welcome all of you to the workshop on privacy and social networking.  I thank the Panelists, whom I will introduce as we go along.
In starting, I would like to say that UNESCO has been interested in questions of social networking, blogging, wikis, and new tools and media platforms based on Web 2.0 applications for a while now.  We have been addressing these issues on a number of occasions lately, including at the last Internet Governance Forum in Sharm El Sheikh.
Previous discussions proved that social networking tools have tremendous potential in promoting freedom of expression and political and social cohesion, and that social media is a unique tool to foster all forms of human rights.
The Council of Europe, the European data protection advisory bodies, the Spanish data protection agency, the organisation Article 19 and others have worked on developing regulatory instruments which would guide social networking.
At the same time there are a number of challenges, particularly that the legal frameworks of different countries differ, and a single regulatory framework cannot be applied to social networking since the Internet does not recognize national borders.
Therefore, UNESCO is entertaining this international, multistakeholder discussion in order to address these issues and involve the private sector, Governments and civil society in the search for the right solutions.
Today we are pleased to greet here as Panelists Parliamentarians, representatives of companies running social networks, and representatives of Data Protection Agencies, and we are looking forward to this debate, during which we will try to reconcile the notions of freedom of expression and privacy and to seek whether there are solutions which would boost both at the same time.
We will start today with Mr. Richard Allan, Director of Policy at Facebook.  Richard Allan joined Facebook in June 2009 to lead the company's public policy work in Europe.  In his duties, Richard works on a broad portfolio of issues including privacy, children's online safety, freedom of expression, eCommerce regulation and public sector use of social media.
Prior to joining Facebook, Richard was European Government Affairs Director for Cisco and also a Visiting Academic at the Oxford Internet Institute.
Richard, may I ask you, in maybe seven to ten minutes, to kick off this debate and give your perspective on how you at Facebook address these issues and whether there is a magic recipe for boosting freedom of expression and privacy at the same time.

>> RICHARD ALLAN:  Thank you very much.  I'm grateful for the opportunity to try to address what I think is one of the most complex questions of the day.  And I think it's no surprise that many of the sessions here have covered the subject of social networking; with apologies to anyone who was at one I spoke at earlier in the day, I will cover many of the same issues, but I will stay for questions this time.
The critical challenge that social networking throws up is this: we always had a notion, I think, of regulation of personal data as being about the relationship between what I would call small individual citizens and large organisations, whether those are private companies or Governments.  Those relationships, of course, still exist today.  A lot of our regulatory effort has to be focused on Governments collecting data about people and on the way in which commercial corporations use data for their own commercial purposes.
What social networking has done, I think, is to introduce a whole new set of relationships where the express purpose of a service like Facebook is for individual citizens to share personal data with each other.
I think in many ways that was not a framework that was imagined previously.  So we've created something novel that creates new challenges, great challenges, in two different respects.  The first is that these services tend to be global.  Somebody somewhere builds what they think is a great Web service and makes it available to everyone in the world.  And when they do that, they are typically not sitting there with a group of lawyers contemplating whether users in different countries might use that service, but with a group of engineers trying to build the most compelling and useful service that they can.
That globalisation typically occurs from the day you launch your service, and there is a whole string of consequences that stem from that.  Our experience has very much been one of engaging more and more with the regulatory authorities in different geographies as the service has grown there, almost irrespective of formal questions of jurisdiction.  For example, as the service grew in Spain, colleagues at the Spanish Data Protection Agency naturally raised queries with us, and we found ourselves engaging more and more with them; the same applies to other countries around the world.
Rather than a formal process that was set out from the beginning, it has been a process of evolution of relationships as the user base has grown.
There are some very thorny questions regarding the U.S.-EU safe harbor and the way that services are deemed to be established in various countries for legal purposes, which are untested.  There are arguments for and against, and little has been tested.  Our approach is to work through these with good will with the local agencies, trying to be respectful of their concerns.
The second area in which the growth of social networking is particularly tricky is that every one of us who uses a social network potentially becomes, in data protection speak, a data controller as well as a data subject: one of those who hold the data as well as one of those whom the data is about.  We have both sets of rights and responsibilities.  We are concerned about data other people publish about us, but equally, as soon as I start posting photos and status updates and write about people in this room today, I am a data controller potentially interfering with their privacy.
That is a core issue.  There are legally recognized exemptions; in particular, personal domestic use is exempted from the data controller requirements.  But again, we are in a world where commercial and personal are blended, and I would say this is not entirely new with social networks.  If you look at blogs, as soon as blogs started taking Google ads or something like that, they became somewhere between the commercial and the private.  The Internet is not tidy.  These services are not tidy: not neatly commercial or personal, but a mixture of both.  They are not neatly American, German or Spanish; they are a mixture of all of these, and there will be individuals using services in ways the designers never intended, simply because people picked them up.
That is a world of complexity.  In terms of trying to resolve it, I think there are questions that all of us need to work on together.  We had a very powerful engagement last year with the Canadian Privacy Commissioner, who had received complaints about Facebook.  They brought them to us and we have been working through the process of resolving them.  One of the insightful comments in the report was that we are very much at the starting point of a whole new set of relationships.  No single party, neither the regulator nor civil society nor business on its own, can resolve that; it requires a dialogue, or trilogue, among the parties.  The good thing is that there is a willingness to engage.  I think for that reason this Forum today is very, very significant.
I'll close by touching on some of the areas which I know will be contested, where people will contest whether or not they think Facebook has it right.  I want to touch on some of the areas where we are working hard, so you are aware that we are conscious of them and trying to address them.  There are two touched on in the brief.  The first is freedom of expression, where Facebook's essential approach is set within our own Statement of Rights and Responsibilities: we have some defined content that is inappropriate and not permitted on Facebook, but within that we try to permit as broad a range of speech as we can.
So we work very hard at having user operations staff who understand those rules, who apply them in a consistent manner and ensure that the service is, as far as it can be, a platform for our users to speak to or about each other, and not for Facebook to interfere in that speech unless absolutely necessary.  That's a major focus of the work that I'm certainly looking at at the moment.
The second is the area of mobile networking, which you've referred to.  We now have over 100 million mobile users, out of a user base of 500 million, and mobile usage is growing dramatically.
Mobile is clearly the preferred platform in many geographies for using all Internet services, including Facebook.  We expect that to continue to grow.
We believe that adding location into social networking has huge potential value for our users, but we want to make sure we do that in a responsible way.  So people will have seen a service called Facebook Places, launched in the United States a couple of weeks ago, which will also be spreading to other countries as we roll it out.  This is just one example of how we have tried to capture this balance.  We've tried to ensure that users can very simply and easily say they are at a place, and to make it easy for them to make connections with friends at that place, but we have tried to make sure that that data is not revealed, shared or generated unless there is a conscious action on the part of the user.  Users check into a place.  They say they are there.  And they agree to post that data on the particular Facebook page associated with that place.
The data typically is shared just with their friends.  If they are under 18, it can't be shared any more widely than with their existing friends.
That has been part of an ongoing dialogue with those outside the company who are interested in child protection, with many in civil society, and with the technical people inside the company, to try to come to a product that balances the usability that users are seeking, the willingness of people who want to take part in the service to use it, and the privacy concerns that people naturally have.
I expect to continue discussing it with folks in data protection services and others over the next few months, and it is an example of the sort of challenge we are facing right now in services like Facebook.

>> JĀNIS KĀRKLIŅŠ:  Thank you, Richard.
Very interesting, especially for me, who may be one of the few people in this room who has no Facebook account of his own.  And when I asked a Member of the European Parliament, Mr. Lambert Van Nistelrooij, what his message would be, he said, "I will talk about Facebook."  
So now, Mr. Van Nistelrooij, you have this chance.  Before that, I will tell you that you have a degree in geography and history from the Catholic University in Nijmegen, you have been a Member of the European Parliament since 2004, and you are working in the European People's Party Group.  
So please, the floor is yours. 

>> LAMBERT VAN NISTELROOIJ:  Thank you very much, ladies and gentlemen.  On the side of the European Parliament, we are very engaged with this debate about privacy, about the new media, and indeed the whole Digital Agenda.  Just yesterday Mrs. Kroes, our Commissioner, spoke about this, and we are looking at all aspects of this evolving media.
But a couple of remarks maybe about Facebook, because I saw our representative, Richard Allan, here; that's why I said Facebook.  First, though, a couple of general remarks.  For me, using social media like Twitter, like LinkedIn, like Facebook, is normal.  It is one possibility to communicate, with my photos.  I use it very functionally.  I don't show my personal things on it, everyday things.  I give my opinion and I want to have it heard everywhere.  This is something very, very new in some sense; fifteen years ago it was not there.  I have been in politics a long time, last year I had my eighth election, and there are several fields, but now I use it.
But that is one side of the coin.
The other side is that when we talk about younger people, a lot of people are not aware of what can happen with what they put on such a blog or such a page at this moment, and that it can be used some years later.  And although the colleague said that there is a standard position that under 18 their messages, their opinions, their feelings cannot get to people they don't want them to, I doubt this, because at this moment you have to set this up yourself in Facebook, for instance.  And if you are not aware of what you are doing, there should be a standard that under 18 you can just send to friends, and you can change it.  So I think we have to evolve.
What is the European Parliament doing?  We are, of course, in very good connection and cooperation with the national states.  A lot of things are still happening in Europe at the level of the 27 states; that is still part of the picture.  For instance, some of our legislation is really old.  Take the 1995 Data Protection Directive: that has to be renewed.  There will be an initiative at the end of this year or the beginning of next year, and at the end we get these principles back.
Now we work with framework legislation: the states have to work it out in their own laws, and they did, but we have to renew this legal side.  We will do this.
But in the meantime, there have been a lot of activities.  I'm very glad that most, probably all, of the companies that work in this field agreed last year on a covenant, the Safer Social Networking Principles for the European Union.
I think this is an important step: first you gain your experiences and you try to find the right balance, as the previous speaker said, between this openness and privacy.  And in these principles there are, for instance, important provisions that services should be age appropriate.  In that sense we can really make steps forward and better safeguarding.
So this is the way.  It will end in legislation later next year.
One remark more about how to communicate.  Within Europe we now have Safer Internet Day, on the second Tuesday of February, on which we bring this to the press and to the public debate, and I think this is a very good action.  We have combined it now with a European campaign, Think Before You Post.  This is directly addressed to schools, to parents, and to the organisations that advocate for children and families on the Internet.  In that sense, I think we must invest a lot in good practice, because with all those things that I said, it is a very, very powerful medium and we are just standing at the beginning   
(Loud music). 

>> LAMBERT VAN NISTELROOIJ:  As I said, this is just the beginning, the beginning of having the principles translated into legislation, et cetera.
So I am really willing to have the debate later on and answer your questions, and I think the IGF is the right Forum to exchange experiences, not only about Europe but worldwide.  Thank you very much.

>> JĀNIS KĀRKLIŅŠ:  Thank you, Mr. Van Nistelrooij, for your remarks.  Now I'm turning to Ms. Ceren Unal.  Ceren studied law at Ankara University, where she also studied civil law and wrote a dissertation on the civil liability of Internet service providers for online content.
Ceren's primary areas of interest are ISP liability, privacy and data protection, protection of intellectual property rights over the Internet, eCommerce and Internet regulation.
Ceren, I asked you to share your comments on the state of the discussion on privacy and social networking in Turkey and in the part of the world where Turkey is situated.

>> CEREN UNAL:  Is it okay?  Okay.
Well, the right to privacy was defined as the right to be let alone back in 1890.  Although this definition remains, how to protect such a right in an online environment in which the main idea is sharing personal information, putting yourself out there, is definitely a challenge for social networking.  I wouldn't go as far as arguing that social networking and privacy are incompatible, but it is true that social networking presents some unique problems which are hard to resolve through existing legal protections on privacy.  There have been some initiatives to address these problems that resulted in soft law serving as guidelines and principles relating specifically to these issues; as the previous presenter stated, these include the position paper on security issues and recommendations for online social networks by ENISA and the Article 29 Data Protection Working Party's opinion on online social networking.
All of these instruments define the possible privacy risks and threats in social networking services, establish the roles of the stakeholders involved, and establish principles to prevent privacy violations.
I will share my local experience with you.  Users of social networking services can find themselves in privacy threatening situations in mainly two ways: through content or contact coming from others, or through the conduct of the users themselves, which might be threatening as well.
Content may constitute a violation in itself, such as hate speech, defamatory remarks and child pornography, or it may be age inappropriate for young users and children, like sexual or pornographic material.
Since young users constitute the majority of social networking users, protection from age inappropriate content is crucial.  Threatening online behavior targeting young users, such as sexually explicit messages, cyber bullying, cyber stalking and other forms, also comes to mind.  Cyber bullying is an issue of particular importance for young users, which might have serious consequences, even suicide, as in the case in which a 13-year-old girl took her own life as a result of bullying on Facebook.
The users are also perfectly capable of putting themselves at risk by unconsciously disclosing too much personal information including sensitive data.  This might be in part due to the privacy settings of some social networking platforms as we have all experienced.
In order to prevent these threats, there are duties to assign to all of the stakeholders involved: social networking providers; Governments, regulators, law makers and law enforcement officials; and the users themselves, not forgetting, of course, parents, educators and other caretakers of underage users.  Together they create the social and legal environment for effective privacy protection.
Governments and law makers need to provide effective regulation and make sure they don't over regulate or hinder the free flow of information and free speech.  At this point I should add that I'm a firm believer in industry self-regulation when it comes to data and privacy protection.  Self-regulation of technical standards, in a manner complementing legislation which sets general standards, provides the best solution: it provides flexibility and updates based on technical developments much faster and more effectively than detailed state regulation.  As the Article 29 Working Party perfectly put it, such self-regulation should be disciplinary in nature and coupled with enforcement measures.  Law enforcement needs to combat privacy violations through appropriate training, and along with other public parties it needs to work for international cooperation, which is a key element for combating any kind of illegal online activity as a whole.  Users also need to act responsibly.  I keep remembering the bumper sticker saying: don't drink and tweet.  They have to act cautiously while deciding whether or not to make their personal information available online.
After all, under the right to informational self-determination, which is the underlying principle of data protection provisions, users are the only ones to decide what to do with their personal data, who to share it with, for what purposes, for how long, et cetera.
Of course, this is only possible when the social networking service providers meet the requirements at their end by providing appropriate terms of use, practical solutions to possible threats, and by ensuring technical safety in their networks.
For minors, parents and educators also carry an important burden to make sure that their online activities are appropriate.  Then we have the substantive legal requirements.  First of all, consent: the freely given, specific, unambiguous and informed consent of the data subject, in our case the user of the social networking service, is the most important requirement for fair and lawful processing of personal data, and it is even more significant in social networks, since it is much more difficult to make sure that the consent given by the user is really an informed one.
To achieve such consent, clear, practical, easily accessible privacy policies that include updated warnings on privacy risks, not only at the beginning but during the whole term of subscription, are essential, especially with content provided by third parties.  Privacy settings selected by users should be usable, and making changes should not be burdensome, as we sometimes all experience.
The principles of privacy by design, along with transparency and accountability, should be incorporated in regulation.  In fact, as you said, the EU Data Protection Directive is under review along these lines.
With regard to the protection of minors, appropriate age verification tools should be made available by social network providers.  I believe privacy protection on mobile social networks also needs to be enhanced.
The European Commission's framework for safer mobile use by younger users and children should be a good guideline to follow on that.
Effective and practical mechanisms for reporting abuse should be made available to users.  This can be done by the service providers themselves and by public regulators through take down procedures.  This is important for service providers, since receiving such notice is likely to be considered as having, or at least as ought to have, actual knowledge of the violation, which is the first step towards liability for third party content.
Upon receiving the notice, the service provider would acknowledge and assess it and, if required, remove the content in violation of privacy.
While regulating notice and take down procedures, it should be taken into consideration that such provisions will only be effective if blocking access to Web content is not the simpler route for privacy violations.
This is what happened in my home country, Turkey.  As of 2007 we have a new law which greatly simplified the procedure for blocking content on criminal grounds, which resulted in the infamous ban on YouTube, now in place for two years.  Privacy violation is not listed among these grounds, so grounds such as obscenity and sexual abuse are likely to be invoked instead, making the content subject to this procedure; and even if not, the courts have a tendency to impose blocking as a sanction.  One of the law's few positive provisions, a general framework for a notice and take down procedure for privacy violations, has remained ineffective and is hardly ever resorted to.  Of course, such national provisions are only meaningful when the service provider is located in that country, which is hardly ever the case.
That brings us back to the importance of international cooperation and harmonization.
I would like to emphasize the principles that should be followed at all times while trying to protect privacy on social networks.  Free speech and the free flow of information should not be hindered, nor should user generated content and interaction between users, all of which give social networking platforms enormous potential to promote democracy and diversity.  Preventive measures should be favored over punitive sanctions.  Such measures need to be designed in a practical and user friendly way, or, as we may rephrase it, be usable for the average Internet user.  Sanctions should be proportionate and in line with a democratic society: no more blocking a whole site because of a single file.  Thank you for your patience.

>> JĀNIS KĀRKLIŅŠ:  Thank you for your remarks and your suggestions.
Now I am turning to Mr. Rafael García, the Head of the International Department of the Spanish Data Protection Agency.  Mr. García has held this position since June 2008; before that he worked in the Ministry of the Interior as deputy Director of the asylum and refugee office, handling the international relations of that office.
Rafael, may I ask you to share your thoughts on the subject of reconciling freedom of speech and privacy, and how that works in your opinion.

>> RAFAEL GARCÍA:  Thank you very much.  I would like to thank the organisers for giving us the opportunity to be here with you today to discuss these issues.
I am afraid that I don't have the answer to that question, how to reconcile the different rights which are present in social networks.
I will try to give you the perspective of my agency, which is a data protection authority.  So it is the perspective of a regulator, an agency which is responsible for enforcing data protection law in Spain.
So first of all, I have to repeat something that Richard mentioned at the beginning of his intervention: from our position, there are two main features of social networks that make them different from other Internet based applications, or at least one of them does.  One of these features is that in social networks the user is not a passive subject; he is an actor.  As Richard said at the beginning, it is not some small individual who is faced with a big organisation; it is individual versus individual, if you want to put it that way.
And the second feature I was referring to is the fact that social networks are transnational services.  So they are not confined in most cases, although in Spain we have one example of a social network which is confined to the Spanish borders; I will refer to it later on.  But usually these services are not confined, as I say, to the national territory of a state or even to a geographical region.
So we as regulators, we are faced with a number of challenges or difficulties or problems.
One of them is how to find the applicable legislation to these services.
In order not to go over the five to seven minutes that I am allowed, I will just say that according to European legislation and to Spanish legislation, in some cases it is possible to apply Spanish law to these types of services, but not in all cases.  So it is again a case by case situation.
The second problem that we are faced with is what we can call responsibility.  This is something that Richard mentioned also: the user becomes what we call in European terminology a controller.  So he is someone who is responsible for processing personal data, personal information.
So it is sometimes difficult, it is tricky to identify who is really the controller:  The user or the service provider?
The third challenge, I would say, and I don't quite know how to say it, I would like to refer to as privacy by design or privacy by default; or the absence of it, which in some cases is the situation.
By definition, social networks are a place to share information with others.  So by definition this is a potentially dangerous exercise from the data protection point of view.  In our position, all the settings of such a service should be configured in a privacy friendly way by default.  So how to define these privacy friendly settings is also one of the challenges.
The fourth issue is that of the use of personal data by third parties.  We have the service provider and we have the user, but we also have other parties which are present in or part of the social network and which, in one way or another, are making use of the data of the users: developers of some applications, advertisers and others.
Finally, there is another very important challenge: the fact that social networks are one of the preferred Internet services for minors.  I don't really know the percentage of minors who are social network users, perhaps Richard will have the figure for Facebook, but it is clear that a significant proportion of social network users are minors, somewhere between, I don't know, 12 and 18, around those ages.
So as I say, these are the challenges.  How are we trying to cope with this situation?  I would say first of all we are trying to enforce the law as much as we can; as I say, this is not possible on some occasions, but we are trying to do that.  And it may be interesting for you to know that the Spanish agency is quite famous, or infamous, for its strong sanctions.
And the Spanish agency has sanctioned users of social networks for uploading content which was inappropriate from the data protection perspective.  We are not talking about other perspectives; we are talking about our responsibility, which is data protection.
Secondly, we have established a dialogue with social network providers, not only with Facebook but with other social networks, and I can say that we have achieved some interesting results.  Richard referred to the case of Facebook.  I talked before about the Spanish social network, which is called Tuenti.  I don't know whether you are familiar with it.  It is not "twenty", ten plus ten; it is Tuenti, from "your entity" in Spanish.
This is a social network which is very popular in Spain, with more than 5 million users.  And it is a local, national social network.
We have been in permanent dialogue with this social network since maybe 2008, and I can say that we have agreed on privacy policies and privacy practices which I would say are quite acceptable for the Spanish Data Protection Agency.
Thirdly, an important part of our activity has to do with sensitization.  I always say that Internet technology is something like a car.  A car may be a very good, very safe car, but the driver is important: with the best car in the world, if you don't drive properly and safely, you will have an accident.  And in this case we think it is more or less the same.
We want service providers to use technology in order to protect privacy, again, privacy by design and privacy by default, but at the same time we need people, and particularly young people, to be conscious of the risks, conscious of the fact that what they are doing may harm other people's rights.  As I say, an important part of our activity has to do with this sensitization, mainly of the young.
I think I will stop here.  Thank you very much.

>> JĀNIS KĀRKLIŅŠ:  Thank you, Mr. García, for sharing the Spanish perspective and Spanish experience.
As I said at the beginning, when it comes to one country imposing its own laws on social services and social networks, it is relatively easy.  The challenge starts when you need to reconcile, for instance, the data protection practices of the European Union and of the United States, which on some occasions are close to opposite.  But now I am turning to Mr. Ivo Corrêa, who is here representing another big industry player, Google, this time Google Brazil.  He holds the post of Senior Policy Counsel at Google Brazil.  Before joining Google, Ivo occupied strategic positions in the Brazilian Government: he worked in the Ministry of Justice and also in the Presidential office as deputy legal secretary for the Presidential staff office.
He served as legal counsel to the majority leader of the Brazilian federal state and as special adviser to the São Paulo City Council and the Municipality of São Paulo.  Ivo, now the floor is yours to share Google's perspective on this rather complicated issue.

>> IVO CORRÊA:  Well, can you hear me?  Yeah.
I would like to start by saying thanks to UNESCO for the invitation.  For Google, it is a pleasure to share this room with all of you today.  My impression is that the fact that UNESCO is sponsoring this session is indicative of the role that education can play in this debate, as well as of the commitment, which I believe is shared by everyone in the room, that we need to balance privacy and protection concerns with freedom of speech.
I need to be brief, since Google is just now learning about social networks.  That's one of the reasons I'm here today.  Brazil is home to one of the longest-running experiences with social networking within Google: the network called Orkut.  It has 8 million users around the globe, but the majority of them are Brazilians.  Having said that, I believe that if we focus our discussion on social networking as a broad, overall trend, and not on social networks as a strict type of platform, we can look at a variety of experiences, since the Web is getting more social every day and the challenges are becoming common to other platforms, not only social networks.
I strongly agree with what Richard Allan said before: this is an effort shared by industry, civil society and Governments, and we have to work together to find the best solutions.  In this sense I would like to present Google's five privacy principles that guide our policy and practice, and also highlight specific topics regarding the application of these five principles to Orkut and to the social Web as a whole.
The first privacy principle of Google is: use information to provide our users with valuable products and services.  The explanation is pretty obvious.  Here we are committing ourselves to use information only where we think we can provide value to our users.  A clear example of this is that we do not sell our users' personal information.
Our second privacy principle is to develop products that reflect strong privacy standards and practices.  Here we aim to build privacy and security into our products from the design phase through launch, which interacts with what was just said by Mr. García when he spoke about privacy by design or by default.  And we don't stop at launch: we continue to innovate, and we realise that our users can be the best source on how we should manage privacy in our products, of course.
The first two privacy principles are basic.  Now it is getting more interesting.
The third one, and I think this is huge and really important to Google, is: make the collection of users' personal information transparent.  It is about transparency.  The best materialisation of this commitment that I can think of within Google is something that most of you probably know, called the Google privacy dashboard.  The dashboard was created as a one-stop shop where anyone can manage the use and storage of personal information associated with Google services and products.  From Gmail to search to Picasa and, of course, Orkut, people can find out what kind of information is stored and managed, and they can manage it through the Google privacy dashboard.  If you don't know the initiative, I would invite you to visit it; on your Google search page you can just click on the privacy link at the bottom of the page.
Moreover, specifically in the case of Orkut, we have just implemented a series of changes and improvements in recent weeks, where one of the cornerstone features was to give users the ability to understand and manage the way they share their content on the network.  As it is a pretty Brazilian thing, I imagine most of you don't know it; I would love to show it afterwards, but we are now giving more controls to the user on any page while he or she navigates Orkut.  Finally, an important topic under this transparency principle is also to use the potential of online tools to educate and prevent bad or unexpected results for users and the products.  So Google has been investing a lot in partnering with different organisations to use the Web and our own products, like Orkut or YouTube, to send the right message to our users.
I think the representative of the European Parliament has already mentioned Safer Internet Day, which is a great initiative, but in Brazil we started a series of conversations with the UNESCO office in Brasilia to see exactly how we can produce better content in Portuguese, adapted to the Brazilian context, so we can educate the users of our tools and online tools in general in Brazil.
The fourth privacy principle is: give users meaningful choices to protect their privacy.  This is all about users' control of the data and information that we store about them.
Privacy also means giving our users control of their personal information, including the ability to decide to stop using a product.  As most of you probably already know, Google has an initiative called the Data Liberation Front.  It is a group of Google engineers working to allow users to liberate their data if they decide to leave and go to another provider.  If they decide to go to Yahoo Mail or Hotmail, or even to leave Picasa and move their pictures to another provider, we are developing different tools so it will be possible, and it is already possible, for the user to transport his or her information in an easy way from one provider to another.
In the cloud computing momentum this is very important.  Orkut is part of this initiative, and users can take their data with them.  So I invite you to look into it.
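The data-portability idea Ivo describes here, exporting everything a user has stored so they can take it to another provider, can be sketched roughly as follows.  This is a minimal illustration only; the function name, the store layout and the JSON shape are hypothetical, not Google's actual export format or API.

```python
import json

def export_user_data(store: dict, user_id: str) -> str:
    """Collect one user's records from every service into a single
    portable JSON document they can take to another provider."""
    portable = {
        service: records.get(user_id, [])
        for service, records in store.items()
    }
    return json.dumps({"user": user_id, "data": portable}, indent=2)

# Hypothetical per-service store: service name -> user id -> records.
store = {
    "mail": {"alice": ["msg1", "msg2"]},
    "photos": {"alice": ["pic1"], "bob": ["pic9"]},
}

print(export_user_data(store, "alice"))
```

The point of the sketch is that the export is machine-readable and complete across services, which is what makes moving to a competing provider practical.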
Last: be a responsible steward of the information we hold.  Along with the transparency and user control that I mentioned before, good security is vital to maintaining user trust.  On the security issue, besides the encrypted versions of Google search and Gmail that you have probably heard about, security in the case of Orkut means developing the right technological tools and cooperating with different NGOs and Governments to keep our users safe, focusing on malware, child safety and other rights of users.  We have here two representatives of partners: the President of SaferNet, a Brazilian NGO working on child safety, and at his side Mr. List Costa, a Brazilian prosecutor who has been active on online security issues.  They are partnering with NGOs and Google to keep Orkut secure as well.
Finally, to conclude, regarding the challenge mentioned at the beginning of this session of building a framework for privacy and social networking, I would like to highlight two additional topics.  One is that the data and information stored by Google and by other providers serves not only to avoid threats to our users and assure the security of our products, combating malware and this kind of thing, but can also be an unexpected source of innovation.  The best example, the one Google always uses here, is Flu Trends: from our users' search data we were able to build a fantastic tool for public health.  The thing about innovation, as you know better than me, is that you never know where it will come from.  So any kind of legal framework, or any discussion of attempts to protect users' privacy, should leave openings for innovation.
Based on our culture and traditions, we in South America have been watching the different debates on privacy coming from Europe, from APEC and from the U.S., paying attention to the different approaches that each of these regions brings to the discussion.
My personal Latin American feeling is that we have to respect those differences and try to find our own path in Brazil, or in South America as a whole.  But then again, we get in trouble when we have a global online platform and different cultures and traditions on the ground.  This is another challenge I would like to raise, and I hope we can discuss it.  So thanks a lot.

>> JÄNIS KARKLINŠ:  Thank you.  We were planning to have Mr. Pavan Duggal, the Advocate from India and CYBERLAWASIA, participate in our workshop remotely, but a technical difficulty prevented us from doing that.  Therefore, I would like to ask Mr. Kurt Opsahl from the Electronic Frontier Foundation to share with us his thoughts about data protection, as his organisation is active in the field and contributing to the international debate on the question.
Kurt, if I may ask you to take the floor? 

>> KURT OPSAHL:  Thank you very much and thank you for having me here.  I'm with the Electronic Frontier Foundation.  We are a civil society organisation trying to preserve freedom of expression, privacy and innovation in the digital age.
We have been giving quite a bit of thought to some of the issues surrounding privacy and social networking.
At first glance there may seem to be, as someone suggested, a tension between them: social networking is often about providing and sharing information, while privacy is often about not sharing information.
What many people really want from privacy, and from privacy through social networks, is control over the appropriate context for the information: so that information is shared with the people they want to share it with, but kept from people they don't want to share it with.  And the context is very important.  Things that may be appropriate to share with one group may not be appropriate to share with another, whether because they are embarrassing in one circumstance or simply not interesting to another group; people want control over the appropriate context.  So we've come up with what we call the Bill of Privacy Rights for social network users.  This is a set of principles to help guide social networking service providers in how to effectuate good privacy practices on their services.
These principles are, number one, the right to informed decision making.  Number two, the right to control.  And number three, the right to leave.
Starting with the first right, the right to informed decision making: users should have a right to a clear user interface that allows them to make informed choices about who sees their data and how it is used, so that when they are operating the social networking controls, deciding whether to post something, to upload it, or to share things with others, they know what they are doing and can understand what effect it will have on their privacy.  This means being able to see who is able to access any particular piece of information about them: which other users get to see it, whether governmental officials get to see it and under what circumstances, and under what circumstances advertisers, advertising networks and various other third parties might have access to the data.
In order to allow people to effectuate whatever privacy rights they may have in a case of involuntary disclosure, where a disclosure is compelled by authority of law, be it through subpoena or court order, they need to get notice of that request and an opportunity to assert whatever rights and privileges they may be entitled to as part of due process.
And they should be given a meaningful opportunity to respond to that request.
The second principle is the right to control.  What I mean by that is that users should retain control of the use and disclosure of their data: the social networking service provider takes a sort of limited license to the data, which allows the provider to use it for the purpose for which it was originally given.
But if the business changes, if a new feature comes along as services evolve, and they certainly will, secondary uses may come up.  When the service provider wants to make those secondary uses, it should obtain explicit opt-in permission from the user.  This way the user can have the confidence of knowing that if they do nothing, if they take no further action, their information will still be treated according to the deal they had at the time they started using the service.
So they don't have to go back and check every day to see what new features they may have automatically been opted into.
Importantly, the right to control includes the right to decide whether their friends can authorize the service to disclose their personal information to third-party Web sites and applications.
One of the interesting features of social networks is that you give a friend access to information; one of the main purposes is to share that information with a friend.  This also raises the question of what the friend can do with that information, and many social networks will allow the friend to authorize third parties to have access to it.
And users should be able to control when that access occurs.
Social network services should ask users' permission before making changes that would share data about users, share data with new categories of people or use the data in new ways.
When these changes occur, it should be opt in by default and not opt out.  Again, users have to make an informed decision to share the information before any of these new changes are implemented.
This should work out all right because if a social network is providing a service that is adding functionality that users really want, it should be no problem to convince people to opt into it.
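The opt-in rule Kurt describes can be sketched as a simple consent check: any purpose the user has not explicitly granted is treated as denied, so a new secondary use requires a fresh opt-in.  This is an illustrative toy, with made-up class and purpose names, not any provider's actual consent system.

```python
class ConsentRegistry:
    """Track, per user, which purposes they have explicitly opted into.
    A purpose with no record is treated as denied (opt-in by default)."""

    def __init__(self):
        self._grants = {}  # user_id -> set of granted purposes

    def opt_in(self, user_id: str, purpose: str) -> None:
        self._grants.setdefault(user_id, set()).add(purpose)

    def may_use(self, user_id: str, purpose: str) -> bool:
        return purpose in self._grants.get(user_id, set())

registry = ConsentRegistry()
registry.opt_in("alice", "share_with_friends")  # the original deal

# A later secondary use (say, a new ad-targeting feature) is denied
# until the user explicitly opts in; doing nothing changes nothing.
print(registry.may_use("alice", "share_with_friends"))  # True
print(registry.may_use("alice", "ad_targeting"))        # False
```

The design choice worth noticing is the default in `may_use`: absence of a record means "no", which is exactly the opt-in-by-default posture the principle calls for.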
Turning to the third right, the right to leave: one of the most basic ways a user can protect their privacy is voting with their feet, changing from one service to another depending on which offers them the best combination of controls, features and fellow users, including which one will provide them with the right privacy controls.
So in order to effectuate this right to leave, you need two things.  One, when you go, you are able to leave completely: your information no longer stays with the prior service provider, so you can move on without having to worry about what further activities will be done with that information.
This means a true deletion of the data, so that it cannot be accessed again in the future, as opposed to merely disabling access to the information.
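The distinction Kurt draws between disabling access and true deletion can be shown with a toy in-memory store; the class and method names are invented for illustration and do not reflect any real provider's implementation.

```python
class ProfileStore:
    """Toy store contrasting 'soft' disabling with true deletion."""

    def __init__(self):
        self._profiles = {}

    def add(self, user_id, data):
        self._profiles[user_id] = {"data": data, "disabled": False}

    def disable(self, user_id):
        # 'Soft' removal: hidden from view, but the data still exists
        # and could be re-enabled or accessed later.
        self._profiles[user_id]["disabled"] = True

    def delete(self, user_id):
        # True deletion: the record is gone and cannot be accessed again.
        del self._profiles[user_id]

    def exists(self, user_id):
        return user_id in self._profiles

store = ProfileStore()
store.add("alice", {"photos": ["pic1"]})
store.disable("alice")
print(store.exists("alice"))   # True: only access was disabled
store.delete("alice")
print(store.exists("alice"))   # False: the data is actually gone
```

The privacy difference is the whole point: after `disable` the provider still holds the data; only `delete` satisfies the right to leave completely.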
The second portion is being able to easily, efficiently and freely take the content that they have invested in the social network by uploading into the service, and move it in a usable format to a new service.  This is data portability, and it is important to promote competition and to ensure that if people are sticking with a social networking service provider, they are doing so not because it is too difficult to leave, but because that provider offers the combination of services most important to them.
So these are the three principles we are promoting in this Bill of Privacy Rights, but I want to add one last note: we think these are good practices, in fact best practices, that service providers should abide by, but there would be a lot of challenges if these practices were turned into enforceable governmental regulations.
The principles, I think, are sound and have remained sound since we enunciated them in the spring of this year, but social networking is a rapidly changing field, and what it looks like today will not be what it looks like a year from now.  One of the greatest challenges in taking valuable principles and turning them into regulations is creating regulations that accurately reflect what those principles mean to do, and moreover having them continue to reflect those principles and remain sensible as the technology changes.
And more often than not, the process of developing, creating and subsequently amending regulations will take significantly longer than it takes for the underlying technology to change, such that regulations will often be out of date almost as soon as the ink is dry.
So thank you very much.

>> JÄNIS KARKLINŠ:  Thank you, Kurt, for your remarks and for sharing the information about the three principles.
So now it is time to engage in a dialogue with the audience present in the room.  I would like to invite those who would like to take the floor to indicate so, so that I can see how many people would like to speak in the first round.  I see one there; two, three, four.
Maybe if I may ask somebody who is close to the microphone, since two of our four next speakers are not seated at the table, simply to free up one place where they can come and get access to the microphone, that would be useful.
Secondly, I would like to ask those who intervene to present themselves at the beginning, stating their name and affiliation.  That would be very helpful for the record.
By saying this, I now turn the microphone to the gentleman across the table.  So please, you have the floor. 

>> JOHN LAPRISE:  Thank you.  My name is John Laprise.  I'm a professor at Northwestern University in Qatar.  
I have a question for Richard, in the nature of a hypothetical: were there no Government interest in privacy, what would your privacy regulations look like?  Ivo, you mentioned that in many states you have varying policies based on local conditions, and I imagine there is a range of protections offered depending on those conditions.  I'm wondering, what is the baseline?  What might the baseline look like in the absence of Government intervention? 

>> JÄNIS KARKLINŠ:  So that's a tricky question.  I will give you some time to reflect, and I now turn the microphone to the next speaker, please. 

>> VITTORIO BERTOLA:  I am Vittorio Bertola from the Italian Chapter of the Internet Society.  I have a question for Mr. Richard Allan, or for anyone who wants to take it.  Most of this discourse has been on privacy, which seems to rest on the hypothesis that social networks are for people sharing pictures of their holidays.  But they are becoming more and more like media; given the situation in Italy, they are actually the only free and uncensored media we have, and they are being used for bottom-up activities, political action and so on.  So there is an entire other dimension which in certain cases might clash with privacy, because it is a situation where you want content and pieces of news to be able to circulate as quickly as possible.
Because of the regulatory issues, you have to prevent the risk that the operator of the social network may apply some kind of censorship, or even favor certain content over other content, and in the end stifle the free circulation of news and bottom-up organisation.
So I don't know if anyone has any consideration regarding to this other dimension and how it can cope with privacy regulations.

>> JÄNIS KARKLINŠ:  So thank you for your question.  The next speaker is over there, please. 

>> HENNING MORTENSEN:  Thank you.  Henning Mortensen from The Confederation of Danish Industries.  
Several of the Panelists mentioned the concept of privacy by design.  It would be quite interesting to hear from all the Panelists how you perceive this concept: what does it contain?  Can you also give examples of how you design privacy into a solution?  Just to see if the NGOs agree with industry on how to define this concept.  Thank you.

>> JÄNIS KARKLINŠ:  Thank you.  The next question is over there. 

>> AUDIENCE:  Hi.  This is Matthew from Hong Kong.  I'm one of the Mission Ambassadors.  I just registered my Orkut account, and I searched for the privacy terms as well as the terms of use.  They are like 20 pages long, and I believe it is the same with the Facebook terms we registered under.  So I wonder, are there any methods or new technologies that the big social networking companies can think of, like some buffer time, or more user-friendly terms, that you might suggest or have thought about to help us protect our privacy?  Sometimes it is easy to just quickly agree, and then the thing is closed.
So yes, this is my question.  Thank you.

>> JÄNIS KARKLINŠ:  So thank you.  I think there are a number of questions and we may go now with the answers and then we will take the next series of questions.  So Richard?

>> RICHARD ALLAN:  I'll take them in inverse order.  To Matthew's question: this is a common issue, the length of our privacy policies.  The issue is, if you want full disclosure and you have a complex service, because of the nature of what it offers users, you wind up with an ever-lengthening privacy policy, especially in light of our discussions with regulators.  If somebody dies, we will place their content into a memorialized state.  We were doing that as a support function.
In one of the conversations we had with a regulator, we were told: if you want to do that, you need to fully disclose it.  So another two paragraphs go into the policy.
Somewhere you need to have a long document with full disclosure that people can contest in court, if that's where things end up.
What you can do is add a layer on top of that which is much more straightforward.  That's what we've tried to do.  If you go to our privacy page now, you'll see a pictorial page with simple language that tells you the key things you need to know.  If you want to read the whole policy, you can click through.  We won't get away from having the complex stuff somewhere.
In terms of privacy by design, I think the term is used in massively variable ways.  Just to give you one example: privacy by design can be interpreted as giving users a range of controls.  I would argue, from Facebook's point of view, that one of the reasons people criticize us for being complex is that we have actually built a huge range of user controls into our system.  We would argue that this offering to the user, that ability to have granular control, is privacy by design.
Others will criticize this and say no, the key thing is where you set those controls.  I think this is a very significant area of debate; it is a debate we have with colleagues at EFF and others.  We set the controls to what we perceive to be most user friendly in terms of the way typical users use our service.  We don't have a single set of defaults, all open or all closed.  We have different settings for different kinds of data: status updates for friends only, other data published more widely to a mass audience, and people can change those.
We ended up with a system where lots and lots of controls are in there, but we haven't necessarily set all the defaults as fully closed, as some people argue we should.  The reason we don't is usability: by sharing data, we believe people can find each other and find more value.  In terms of the Italian situation, Vittorio, I met your interior minister last year.  This is an important element for us.  If somebody sets up a Facebook group that says "I hate the Prime Minister" of any country, that's fine.  If they say they want to kill the Minister of any country, that is not fine.  Many of these cases are more mixed.  The standard we apply is: is it a credible threat of violence?  If it's a credible threat of violence, it comes down; if it's robust political speech that irritates a politician, it stays up.  That's one of the values we realise we can bring.
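Richard's description above of per-category defaults, different default audiences for different kinds of data, with user overrides on top, could be modelled like this.  The categories, audience names and function are illustrative assumptions, not Facebook's actual settings model.

```python
# Per-category default audiences; users can override any of them.
DEFAULTS = {
    "status_update": "friends",
    "profile_name": "everyone",
    "contact_email": "only_me",
}

def audience_for(category: str, overrides: dict) -> str:
    """A user's explicit override wins; otherwise fall back to the
    per-category default rather than a single global setting."""
    return overrides.get(category, DEFAULTS[category])

user_overrides = {"status_update": "everyone"}  # user opened this up

print(audience_for("status_update", user_overrides))  # everyone
print(audience_for("contact_email", user_overrides))  # only_me
```

The sketch captures the debate in the transcript: the code structure supports granular control either way; the policy question is what values go into `DEFAULTS`.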
A couple of comments about our population: 87 percent of our users are 18 plus.  Voters are having those conversations more typically than childish ones.  It is a really important part of the platform that we allow that to happen, but it is tough to make the judgments.
Finally, what would the policy principles look like, John?  I personally think that the general framework the European Union has, the notion of principles, things like subject access so that you can view, amend and delete data, is sensible.
The challenge we get into is around the detailed interpretation of how they should be applied.  So we do have differences of opinion about what constitutes effective consent for something, again because we come from this usability perspective.  That's where a lot of these debates happen.  But the general principles that the European Union and other countries have are really sound, if we keep them at the principle level rather than getting into too detailed a technical conversation.

>> JÄNIS KARKLINŠ:  Thank you, Richard.  Now, Ivo, it's your turn.

>> IVO CORRÊA:  Okay.  Let me try here.  Starting with what our privacy policy would look like: I think, as with almost everything at Google, the focus would be the user and, as I said, how we can learn from our users, how user feedback would help us.
If I had to try something here, it would be the three main messages summarized in the last three principles I mentioned: transparency, user control and security of the data.
The problem, as Richard said, is the detailed application of these three principles.  I would say those three would be the pillars of our privacy strategy, our privacy policy.
Regarding the privacy policy in Orkut and the length of the privacy policy and the terms of use, I think Richard also made a really good point.  There are a lot of legal requirements there, and it is difficult to address them all in a very simple and clear way.  Our strategy here has been divided into, I would say, two things.  One is concentrating information so people can, as I said, easily find what the privacy policy is and what kind of data we are gathering from them.  The dashboard is an example, but so is the privacy link on the Google page, which brings you to all the privacy policies across different Google products.  That's an important component here.
The second is simplified language.  We have just released a modification of our terms of use for some services, especially Gmail, to simplify the language and take out some of the legal expressions that were not user friendly.
The third component, which I also mentioned, is that while some of the legal wording can be taken out, some of it we have to keep.  So we also use other channels where we can employ simplified, contextualised language, like the YouTube videos we use, to translate it all into day-to-day language.  This is pretty important, and here our partners, as I mentioned, are key to the success of the initiative.  Google couldn't do this by itself; partners know better how to translate that.

>> RAFAEL GARCÍA:  I would like to add something about this question of privacy policies being long and difficult to understand.
From our perspective as a data protection authority, I think the main problem with these privacy policies is that when they are so complex or so difficult to understand, they may affect something which is very important for us: informed consent, which is the basis of any data processing.  And the operative word is informed.  Think of a privacy policy which is difficult to understand, which nobody gets to read because it is too long or too boring.
The consent given under it is not informed and, I would say, not free in a way.  This is something we have also discussed with some of the service providers.  There is one important issue: the legal aspect of the privacy policy.  The privacy policy has legal implications.  So it is not only something drafted by engineers who know how the service works; it is something carefully drafted by the legal department of the company.  And lawyers are like that: they want to cover everything.  Perhaps something which may be useful is, let's say, a light version of the privacy policy, something not incompatible with the long one.  For instance, Google has developed materials for informing users about a number of issues.  So this is something which may be useful.
But I imagine that the long privacy policy, the original one, the one which has legal effect, will stay there forever.  I mean, that's my impression.

>> JÄNIS KARKLINŠ:  Thank you.  Now, Kurt, your brief response and then we'll do a second tour of questions. 

>> KURT OPSAHL:  I just want to make a few comments.  One, on the subject of privacy by design: an important aspect of designing for privacy is to have the set of defaults match user expectations as closely as possible, so that, through the combination of defaults and user interface, when someone is using the system, say they post something to their page or upload a photo, what happens with that information matches what they expect will happen, without their having to change the settings.  This can be accomplished through user interface design, so that they receive the appropriate notices about what is going to happen in the process of uploading; it can also be a process of understanding the users and what their expectations are.
You know, you can go to a variety of social networks and search the publicly available information for key words concerning embarrassing medical issues or skipping work to go to the beach.
And you will find people publicly sharing things that you might imagine they really did not intend to share with the public.  That is an indication of somebody whose expectations didn't match the defaults.
A privacy policy is not the most effective way of communicating that.  This notion that people have been put on notice by information placed in a privacy policy is not a particularly good method of informing people.  Now, I understand that there certainly are legal reasons why people want to have appropriate disclosures in privacy policies, but sometimes a privacy policy will be so complex as to be meaningless.  I remember seeing one privacy policy that had all this discussion of the various limited uses of data and, at the end, "or any other use allowed by law."
That pretty much meant they could do whatever they wanted with the information and the rest of the restrictions were not very meaningful.
So to make a privacy policy effective, it has to be simple to understand and it has to be clear.  It should be designed so that at the end of the privacy policy the person has understood it; and if there are legal restrictions where you have to clarify that one thing or another will happen to somebody's data, make sure the purpose is that they will understand it, as opposed to the purpose being that when you are in court you can point to some language and hope to get off the hook.
Thank you.

>> JÄNIS KARKLINŠ:  Thank you, Kurt.  Now I have three further requests from the floor.  I'll start with the gentleman there and please, introduce yourself. 

>> JYRKI KASVI:  Thank you for an illuminating discussion.  I am Jyrki Kasvi from the Finnish Parliament.  I would like to go back to the idea that young people should be guided by their parents or teachers when they use social media, because they may not be aware of what will happen to the information they put there.  The trouble is, these young people are mostly more aware of the situation and better educated about it than their parents and their educators.
I have seen it many times: children tell their parents, please do not put that picture there, because you don't know what will happen.
Because in the teenager's world, social standing with their peers is dictated by their virtual reputation on the Web.
So they know how to act.  For example, when I put a couple of pictures of my son on the Web, he immediately objected:  Please, don't tag me.  I don't want my friends to see those particular pictures.
Of course, accidents happen, but do think, how can we ask parents to guide their children when they are so totally lost themselves?

>> JÄNIS KARKLINŠ:  Thank you, Jyrki, for your solution. 

>> MAX SENGES:  My name is Max Senges.  I'm a colleague of Ivo's.  I work for the Google policy team in Berlin, Germany.  I would like to raise one point that came up last year, when we organised a session on social media governance:  if there is interest, we could organise a Dynamic Coalition on social media governance.  This is an ongoing subject that will become more and more important.
If there is interest, contact me or, yeah, speak to me afterward.  I am also on the steering committee of the coalition on Internet rights and principles.  They are developing a charter of rights and principles on the Internet, and several of the points, especially the principles that the colleague from    mentioned, might make for an interesting discussion on the charter's principles section.  So yeah, take a look at that.  It's version 1.0.  Thank you.

>> JÄNIS KARKLINŠ:  Thank you, Max. 

>> XIANHONG HU:  Thank you.  Xianhong Hu, from UNESCO. 
As this session is Webcast online, I have received a lot of remote participation, including three hubs, in Cameroon and Armenia, and several other individuals.  In particular, in Ghana and Cameroon they have gathered some young people to sit together and follow our session.  There has been a lot of participation, a lot of comments.
To sum up what the young people said:  even in Cameroon there is increasing use of social networking, but they really don't know how to use it correctly, how to protect their privacy.  In Ghana they had the same question.  They said they are very happy that Mr. García mentioned educating people on privacy.  They want to know how all the Panelists think about educating people on best practices for protecting their privacy online.
And a young person from the U.K., his name is Victor, said he had the same observation.  He found that even the adults, the parents, know nothing about privacy and social networking.  So how do we educate everybody on privacy?
That is one question.  Another specific question, for Mr. Richard Allan, comes from Robert in Cameroon.  He said that he is a Facebook user, but he wonders:  is there any capacity building programme put in place by Facebook to train new users in how to use Facebook correctly?
I mean, for example, training opportunities, so users can understand the service before they get involved.  Thank you.

>> JÄNIS KARKLINŠ:  Thank you, Xianhong, for bringing to our attention the comments coming in through online participation.  That is very useful, and it proves we are not alone in this room:  the debate is being followed from elsewhere.
Now I am looking around.  I see you; could you find a place where you can get to the microphone?  Could you turn on the mic and introduce yourself? 

>> AUDIENCE:  I'm Jun Luk from Thailand.  I would like to ask Mr. Allan, as a Facebook user; this is a question from a Facebook user.  My first question is about the privacy settings in Facebook.  We have a button where we can set what kind of privacy we want, but it seems to be part of the business model of Facebook to keep asking for access:  okay, please allow this application to access our friends.  So I feel quite uncomfortable with this kind of thing.  It seems like I have to allow these kinds of things every time.
So I think, I don't know, maybe we need a kind of privacy policy based on the common good, rather than a kind of trading of the personal data of our friends and family.  That is the first question.
The second question:  I do believe in open and user generated content, and in these kinds of communities making their own decisions about the platform, but there are kinds of problems that happen, and the question becomes:  how do we deal with them?  In Thailand, Facebook, like many other platforms, has become a political platform for people to express their feelings.
Initially, most people used Facebook for family connections or friend connections.  When these two things come together on the platform, there is already a kind of phenomenon called witch hunting through Facebook:  postings that expose the personal data of people on the opposite side.  When they post such things, we make reports to Facebook.
But it seems it doesn't work well.  I don't know why.  I would like to know about an effective function, an effective feature, to report these kinds of things, because this kind of content has already raised issues of death threats and charges against people as well.

>> JÄNIS KARKLINŠ:  Thank you for your questions.  Now I'm looking around whether there are other people willing to intervene?
Richard, you are the most popular today on the panel.  The mic is back to you.

>> RICHARD ALLAN:  Thank you.  So again I'll start with the last questions first.  In terms of the pop up boxes for things like applications, that's actually something we are being encouraged to do more of.  At the beginning I talked about how we are in a world where individual users have this role as controllers of their own data.  Our model is that if a user decides to install an application, they are making the conscious choice to share with that application.  There are concerns, again, which we have heard about, as to what precisely they are sharing when they do that.  We have strengthened the pop up boxes now so they are more explicit.  If they are scary, that's deliberate.  We want to be clear about what happens when you share.  The process has been one of strengthening them, because of conversations with regulators, rather than making them less scary, which we have been criticized for. 

>> AUDIENCE:  It is not scary.  It is just a decision for us to join or not, just yes or no.  It is not really deciding at all.  Thank you.

>> RICHARD ALLAN:  Very specifically with applications, that essentially is the model:  the application asks permission to be installed, and the question is, do you or don't you want to use that application?  We have done a couple of things to try to help with that.  We made it much more explicit.  Again, we had a lot of criticism; I'm being polite in not singling out other social networks.  We had many queries about applications having access to all of your data, and now they are restricted to the data they need to operate.  That's a change.  Now applications have to have a privacy policy.  There are areas of open debate where people think we should go further, but we have been making decisions to place users in a better situation.  Ultimately, a user of a social networking platform that has third party applications has a responsibility for deciding whether or not to install those applications and whether or not they trust the third party.  We can help them with that decision, but it's still their decision.
In terms of reporting content, there are report links right across Facebook.  They may not deliver the results people want, because of that tension I was talking about earlier.  When something is reported to us, the threshold we apply is one with a presumption of free speech unless there is a clear breach of our rules.  Nudity and pornography are violations; you can see that.  Other judgments are more difficult to make.  Unless it crosses into credible threats of violence off the site, speech can be very, very offensive, particularly in a political context, and Facebook will still not remove it.  That's deliberately part of our philosophy.  We may continue to be challenged on that.
And I'll just pick up a couple of the other questions, from Robert in Cameroon and others there.  We have focused on self teaching materials simply because of the scale of users that we have.  There are two pages now that are very good, much better than anything we did before.  That is important because they give people advice, targeted at educators, parents, children, et cetera.  They are in multiple languages, including French for Cameroon.
Finally, the issue about children and parents.  One of the exciting things about social networking is that it is a technology used cross generationally.  Our fastest growing group is 35 plus.  In the most mature markets we get the highest proportion of 35 pluses.  Whereas gaming and earlier Internet activities were seen as purely kids' activities, we now have an activity that crosses the generations; parents and kids are learning together.  I think that creates exciting opportunities.  We have moved on from the days of the social etiquette question, is it okay to reject a friend request from your mom, which was asked several years ago.  People are making those connections now, and they have more opportunity to talk about things like online safety, with a shared experience of a platform, than ever before.  It's no longer mysterious to the parents.

>> JÄNIS KARKLINŠ:  Okay.  Thank you, Richard, for your answers.  Mr. Van Nistelrooij? 

>> LAMBERT VAN NISTELROOIJ:  Reflecting on the last point of the debate, about children and what information they have compared to their parents, et cetera.
Sometimes it is necessary, from the legal side, for parliaments and Governments to prevent things.  As I said, there is the article here about the Dutch social networking site, Hyves, with 7 million users.  Only 1 million of them have, let me say, protection set, so that their information does not go out to a broader circle of people.
Sometimes we have to say, from the governmental side, that we have to protect, because we can give a lot of information, a lot of education, but at the end of the day there are limits.  We did it, for instance, when we made the law about cross border mobile phone calls and the way the bills have to be made, et cetera.
So I can foresee the Parliament bringing in the idea of the opt out:  for instance, you start with the maximum security, the maximum of privacy, and you have to take steps to make it lower; and sometimes, after half a year, you have to confirm the settings again, because people change and they forget what they did two years ago.  Sometimes a law maker has to prevent.  This might be the way we follow this, and next year we will come into this legal phase.

>> JÄNIS KARKLINŠ:  Yes, please. 

>> CEREN UNAL:  In terms of control you're absolutely right.  But as in the real world, in the online world as well, parents cannot know, and also should not know, everything their kids are doing all the time.  It's impossible.  It would be so difficult.
What I was talking about was this:  if you are providing your kid with a computer, I think it's at least within the borders of parenting to have a general knowledge.  That's why I added educators as well.  The same applies to schools and Universities, even Universities.  For example, I'm working at a University in Ankara.  They have a programme for educating the educators, because the students are way ahead of the professors.  We just have to do our best and hope for the best.  That's what is going to happen.  Real life and the online world run in parallel when it comes to parental control and what minors are doing online.

>> JÄNIS KARKLINŠ:  Thank you for those comments.  I'm looking around the room.  Is there anybody else who would like to take the floor, ask questions, make comments?
Xianhong, something else on the net? 

>> XIANHONG HU:  Yes, one more.  There's a Facebook user from Cameroon who especially wants to ask Mr. Allan again, but also everybody.  He said:  does the user really have rights over his information on a social network?  And if yes, once he leaves a social networking Web site, how can he make sure that all this information is deleted?  Really deleted?  It is a specific question for you.

>> JÄNIS KARKLINŠ:  Thank you and there is another question from the Danish Federation of Industries if I recall correctly. 

>> HENNING MORTENSEN:  Thank you very much.  It's actually for you.  Could you, when you sum up, just mention what UNESCO is going to do about this topic in the future, if you have any plans?  Thank you.

>> JÄNIS KARKLINŠ:  I promise.

>> RICHARD ALLAN:  For the Facebook user in Cameroon:  there are two procedures, specific to each network.  On Facebook we have one to deactivate an account and another to delete an account.  To be very up front, there is a debate about how prominent those two options are.  I want to explain why we have them and why they are different.
The primary reason for offering the deactivation function is that we found that's what most of our users want.  They typically deactivate, and a high percentage of those users come back to the site later.
If they have deleted everything, then there is no way to recover that account or that data.  So our biggest fear from a user perspective is accidental or erroneous deletion of that data.  Ten years of your photos gone is our biggest concern.
We offer deactivation as a way of stopping using the site temporarily.  When an account is in that state, the data is not publicly accessible to anyone and is securely stored on Facebook servers.
If the user chooses deletion, they actively say they want it deleted, and we delete it from the server.  So the user is choosing between an archiving option and the proper delete function.  I accept there's a debate about the extent to which users are given guidance on those.  If you look up delete or deactivate in the Facebook help centre now, you see a full explanation of that.  We have had people making that complaint, saying you can't delete things, and we have to respond to that by offering more and better information; and people would like us to offer even more and better information and options.

>> JÄNIS KARKLINŠ:  Thank you. 

>> AUDIENCE:  One more question for you with regard to Facebook.  How do you deal with the profiles of people who have passed away?  Has there been any development in that area?
And do you have a lot of problems with homonyms, people with the same names, and their privacy concerns?

>> RICHARD ALLAN:  I'll come back to that.  On people who have passed away, it's an interesting case study of how services evolve.  The site was originally designed for use in colleges, and people were not thinking about death when they built it.  Within those quite small communities, a death occurred and somebody raised that question.
So the response of the technical staff was to come up with a way of making a profile accessible only to the friends, who actually want to leave tributes and find it very comforting to have a conversation within that group, while making it inaccessible to the public or anyone else to comment on.  They did that as a public service, recognizing that it was valuable to the users.
But it wasn't sort of planned in a systematic way.  It was a human response to a tragic situation.
Over time, that has now evolved into a much more formalized process.  So there is a form on the site.  Again, in the help centre, look for deceased or death, words like that, in multiple languages.  Something will come back describing this process and telling you how you can get to the form.  The form asks for your relationship with the person and some kind of proof that you are who you say you are and are not doing this maliciously.  As long as you submit that, the profile can be deleted, if that's the preference of the family, or put into a memorialized state.
That's a human response to a tragic situation.
People with the same name:  that is not generally an issue.  People are defined by their set of connections on Facebook, and they have a unique e-mail address, the identifier that is used internally in the system.
One of the reasons we encourage people to share more data, for example around their relationship to Universities, and why we set the settings that way, is precisely so you can tell one John Smith from another John Smith when you search a common name on the site.
So internally in the systems it is not an issue, because you have an e-mail identifier.  Externally, for the public, it is an issue if the only piece of information available is the name and people want to connect with you; that's one of the reasons we have the site set up as it is.

>> JÄNIS KARKLINŠ:  Okay.  Thank you.  I would now like to give our last two youngsters here in the room the opportunity to ask questions, and then I will make some concluding remarks, since we are approaching the end of our workshop.

>> AUDIENCE:  Okay.  Hello.  This is Haki from Hong Kong.  I have a question for Mr. Richard.  First, I have to be honest:  I'm actually an addicted Facebook user.  My question is, as a service provider, what are you going to do if there is a problem?  Because there are so many stalkers on Facebook, and sometimes this creates privacy problems.

>> JÄNIS KARKLINŠ:  So, please. 

>> AUDIENCE:  My question is in regard to human rights defenders and the power of the password.  Once you have the password, you can open that door:  you have access to conversations, to documents and, most importantly from a human rights and defenders perspective, to an address book.  It's an issue we have been dealing with quite a lot, actually, with people being detained and tortured for their passwords, particularly on Facebook.
I'm wondering if this is an issue that has come to the fore and what sorts of steps you are taking to address that.

>> JÄNIS KARKLINŠ:  Richard, again the floor is yours.

>> RICHARD ALLAN:  So essentially, people getting unauthorized access to Facebook accounts, in the general sense, and I recognize the compelling circumstances you described are exceptional, is something we are investing hugely in trying to tackle, both because it's a potential privacy threat to the user and because of some of the issues around stalking and harassment.  All I can say is we are not 100 percent perfect, but we are industry leading and amongst the best.  An example I'll give:  if any of you log into Facebook and it is not from your home territory, you may get a pop up box asking, can you give us extra security information?  When you log into your account, you can say, I would like to be sent a text message for every new device that associates with my account.  You can ask for the last IP address from which someone accessed your account.  All of those measures, I think, take us in the right direction to give people that kind of security, and we are going to keep adding them.  That is fundamental to getting the security model right.
In terms of stalking, some of those measures are what we are doing to try to tackle that.
We also have an increasing number of automated methods to help us detect anomalous behavior.  If a man joins Facebook and tries to friend a thousand women and they all reject him, that flags his account as suspect.  If people are going to use Facebook like you do and enjoy it, we have to keep it safe.  We have to make sure your experience is not one of being sent unwelcome material.  A key part of that is making sure that only you access your account and that people who have aggressive intentions don't get access to any accounts at all.

>> JÄNIS KARKLINŠ:  So thank you.  I think we had a very interesting and very informative discussion.
I got a question about what UNESCO will be doing with this information.  I can tell you that a number of member states of UNESCO are asking the Secretariat to provide a platform for reflection on whether member states should enter into a very structured debate on the use of the Internet, using UNESCO as the platform for this debate.
And certainly, since the questions we are discussing here are intimately linked with one of the most important tasks of UNESCO, promoting freedom of expression and freedom of the press, while on the other side people ask questions about the ethics of Internet use and questions related to privacy, this debate is extremely useful in informing ourselves and making the discussion we will have with the member states much better informed.
For me, as someone who is not a Facebook user and not a user of social networks, most probably not because I do not want to but simply because I do not have the time, what I retain is that a number of issues that were very acute maybe a couple of years ago are today being addressed.  They are not resolved yet, but a lot of work is going in from the industry side.  There is a dialogue in place between industry, the NonGovernmental Organisations active in the field of privacy protection, and intergovernmental organisations.
So we still need to go this extra mile to make sure that privacy is honoured and everybody can comfortably use social networks.  But as we know, once we reach a certain goal, those who want to prevent us from reaching it have also reached a certain stage.  So the target always moves further away.
So this is sort of an endless story.
And therefore, I think that we will continue to animate this type of discussion, be it during the next IGF, be it in the framework of UNESCO.
In conclusion, I would like to thank all of our Panelists, who shared their expertise, thoughts, and time with us and with the participants, the audience of this workshop.  Equally, I would like to thank all those people who listened to this debate remotely, participated in it through remote participation means, and asked questions.
Particularly my thanks to Richard Allan, who was grilled.  So thank you very much indeed.  Thank you, all of you.  And this workshop is brought to an end.  Thank you.
(The workshop concluded at 1030 Central Time.)