Benchmarking ICT companies on digital rights

11 November 2015 - A workshop in João Pessoa, Brazil

Full Session Transcript

>> NATALIE: Good morning, everyone. Welcome. My name is Natalie. It is my privilege to moderate this discussion on Benchmarking ICT Companies on Digital Rights. We have a great panel of expert practitioners all at the cutting edge of this growing field in their respective countries and industries. I will go ahead and introduce them to you and then we'll get started with a conversation.

We really encourage you to join the conversation, and invite your colleagues wherever they may be to do so as well, using the Twitter hashtag ranking rights, and we have a remote participant who will also be directing questions towards the panel, to be addressed in the latter part of the conversation. To my left, we have Rebecca MacKinnon, author of Consent of the Networked and co‑founder of Global Voices. Over there we have Jeremy Malcolm, senior analyst at the Electronic Frontier Foundation in San Francisco. Carolina Aguerre. Kelly Kim is general counsel at Open Net, which is South Korea's leading digital rights NGO. Resil joins us from a digital rights NGO based in the Philippines. Peter Wright is to my right; his organization has a transparency reporting index that has been a really influential and useful tool. Ankhi Das is public policy director for Facebook in India and South and Central Asia. We have a policy specialist from the Swedish International Development Agency. We're here to talk about how civil society and other actors can induce the private sector to be more respectful of digital rights and to promote them. The digital age has a lot of really great benefits for the defense of human rights. Human rights, civil rights, civil liberties ‑‑ all these things mean essentially the same thing: the right for human beings to live and prosper in a productive way without all kinds of really bad things happening, to put it in really plain terms. But digital tools can be used by governments, by corporations, by individuals whose aims are not so pure and whose methods aren't so pure either. So we're here today to discuss the diversity of ways that civil society can help corporations figure out how to support and promote human rights. There are a lot of organizations that are trying to do this. 
The Telecom Industry Dialogue and the Global Network Initiative in particular have been really useful forums for these kinds of conversations and for improving corporate practice, but, of course, those are only applicable to the companies that are members and already want to be doing the right thing, which is far from being the case for all of the entities we're dealing with here today. We're here today to talk in part about how we can reach corporate actors who are not yet sold on doing so. So, I want to start with the Electronic Frontier Foundation's Who Has Your Back report, which is the first, or at least one of the first, reports to evaluate ICT companies on human rights criteria in a domestic context. Jeremy, can you tell us about this project and how it's evolved since 2011?

>> JEREMY MALCOLM: Yes. It's been going for five years now, and we had 24 countries ‑‑ I'm sorry, 24 companies in the last report, ranking them mainly on data requests. But we did also have a separate Who Has Your Back report last year on IP‑related requests. So for a while there, we had these two reports going in tandem. We haven't continued the Who Has Your Back IP report this year for reasons I will explain. Even without it, this has been very successful, at least in prompting these companies to make surface reforms and hopefully some deeper reforms. They do care about the stars. You can see on the slide here the companies that have got between 0 and 5 stars; sometimes there are companies with 0 stars and sometimes they can get up to 5 stars. And they really hate getting 1 or 0 stars. They are very competitive at trying to increase their star ranking. Often, we'll give them the opportunity to fix the problems that we found in advance so they don't complain at us afterwards. Many of them do that. So in the lead‑up, even in the few weeks prior to the release of Who Has Your Back, they will improve their terms of service and improve their compliance in certain ways. And even following the release of the report, sometimes that will send them into a complete panic because of the press and they will suddenly comply. I think that happened with Tumblr, which went from zero stars to 5 stars overnight in response to the report. It is effective. If there's a problem with it, it's that sometimes the corrections are cosmetic: they make changes to terms of service, but in terms of the actual experience of users, the change may not be as dramatic as the stars suggest. Each year we do amend the criteria in various ways to raise the bar and make it harder for them to get the five stars. 
So the way we did that between last year and this year, we condensed three stars into a single star which we call industry best practices. So if they got three stars before, they would only get one star for that this year. The kinds of criteria we have included are whether they publish transparency reports, whether they tell users when governments request their data or request removal of content, whether they publish policies on data retention, and whether they have some kind of pro‑user public policy activities such as protecting users in court or protecting users in Congress. And that pro‑user public policy star will also change every year. They don't always get the star for the same thing, and they can't always predict what they will have to do to get the star; otherwise, they will game the system. This year, it was opposing back doors. Last year, it was fighting for privacy in Congress. And ‑‑

>> MODERATOR: This project is focused on the United States mainly.

>> JEREMY MALCOLM: So the core project is U.S.‑focused, as alluded to, but we have started to extend this with partners in other countries to make international versions of Who Has Your Back. We're really just giving them guidance; the legwork is done by the partner organizations. So we're just helping them with our experience of Who Has Your Back in the U.S. So we've had similar reports. They're not called Who Has Your Back because that doesn't translate very well into other languages like Spanish and Portuguese. We have partners in (inaudible) and Colombia. There are even countries that have adopted the report as a template for their own without a formal partnership with us. So in Hong Kong, there's a Who Has Your Back report that's been put out without any close involvement with us. But it is great to see that model being adopted elsewhere. So it certainly is evolving over time. As I said, we dropped the IP report just because it was getting too hard to work out how to not give these companies stars, because they were actually covering everything that we picked on them for, and the only things that were left were really difficult to quantify in terms of a star. So it's a challenge every year to come up with a challenging set of star ratings that really does lift the bar and encourage the companies to improve their practices.

>> MODERATOR: That's a really great point and one that I think we'll come back to in the discussion. As you said, one of the organizations that you worked with ‑‑ Carolina, could you tell us about your project: the challenges, the successes, the lessons learned that other organizations could benefit from, and your plans moving forward?

>> CAROLINA AGUERRE: It is called Dónde Están Mis Datos. The first thing we looked at was data protection. What we found is that they publish and comply with the law by publishing their data protection policy, but they only do a copy/paste of the law, just putting out the relevant part of the law in a PDF format. We were wondering whether they were doing well in publishing their data protection policy. The second part is regarding surveillance. We have in Colombia a very broad legal framework around the interception of communications, both in the surveillance and in the intelligence parts of the law. So we drew up a couple of questions on how the companies provide information on their users to the government. And the third set of questions is regarding control of content: how do they block content, how do they block users, and trying to find out if it was good or not. As you can see, the stars ‑‑ is that the word in English? ‑‑ the star chart was very bad. There was only low compliance, because they all published something, but on the rest, we even had to give a quarter of a star so that we could show that some of them had thought a little bit about something, and to try to encourage them to do better. The surprise in how this went is that the companies were at the beginning very frustrated, very mad, even though we advised them and told them and showed them the evaluation, and a couple of them replied and asked us to improve some things. They were really mad when we published the report, and they went to the meeting where we did the launch, and through one of the associations that got them together, they were ready to tell us that we were doing something illegal ‑‑ that we were claiming they were acting illegally even though they were complying with the law. They were ready. They were very mad. It was interesting to see how their faces changed through the meeting as we explained what they were doing, and we realized Google, Twitter and everyone was there. 
And they went from saying, you can't tell us we're illegal, to saying, we understand this is something different, so let us look at it, because we didn't understand it. I've been following up, and they were aware, at least at the high level, that these kinds of reports exist in the world ‑‑ there are transparency reports in the companies ‑‑ but what really struck me was that somebody said that even though they knew it, they never thought that in Colombia this was going to happen so soon. So they were just not ready and not willing to do it. A next assessment will be made; we are doing our preparation work for this, and probably my main advice would be to identify better points of contact in the companies and to try to have a good relationship with them, so that this doesn't become ‑‑ there's going to be bad press, but it doesn't have to be worse than it is. Something like that.

>> MODERATOR: Thank you so much. So first, Luca, and then Peter. You have some experiences in this field. How does what Jeremy Malcolm and Carolina describe compare with your experiences, Luca?

>> (inaudible).

>> LUCA BELLI: Yeah. Okay. Good morning to everyone. Thanks for the invitation. We at CTS have been working on a project called Terms of Service and Human Rights. We have been assessing the compatibility of the terms of service of the 50 most popular platforms with human rights standards. As criteria we have used the human rights stated in the Council of Europe recommendation on a Guide to Human Rights for Internet Users, because it is the most comprehensive recommendation that has been elaborated so far, and also because the results of our project will nurture the implementation of the Guide: they will be directly conveyed to the Council of Europe, which is one of our partners. The work that we have done was not really ranking; it was more identifying the best practices and the worst practices, and identifying the percentages of platforms that do respect the criteria we have defined and the platforms that do not. The goal of the project is twofold. First of all, identify what could be the best practices that could be replicated by other platforms. And second, identify what could be model contractual provisions ‑‑ defined, like Creative Commons licenses, so that they can be interoperable and adopted by platforms ‑‑ so that the same degree, the same level of human rights protection is adopted by the platforms through these model contractual provisions. And the model contractual provisions can be signaled by specific labels. This will be the second part of the project. So far, we have done a first assessment. We have preliminary results, and we have analyzed three main rights: privacy, freedom of expression and due process. The preliminary results are quite interesting because, for instance, more than 60% of the platforms we have analyzed collect more data than what is necessary for the service. This would actually be contrary to the data minimization principle used at the European level, and it would be enshrined into law in Brazil by the forthcoming data protection bill. 
They also retain data for longer than necessary; more than 70% of platforms allow third parties to monitor users, and more than 70% share data with third parties. Another element which is very interesting with regard to due process is that more than ‑‑ let's say 20% of the analyzed platforms require a waiver of class actions. Speaking about due process, this is extremely important: this is actually a waiver of one of the most important tools that users could have to have their rights respected. So, the first part of this project is now coming to its end. Another element which I consider important to mention is that we have tried to develop this project in synergy with the work of a dynamic coalition of the IGF. So we will present this afternoon at 2:00 the set of recommendations on terms of service and human rights, which provides some guidelines on minimum standards to protect privacy, freedom of expression and due process, and the next phase will deal with the development of the model provisions and with the definitive data that we will release at the end of the project.

>> MODERATOR: Great. Thank you. Peter, can you tell us about your project and how it fits into the broader environment of this type of work?

>> PETER WRIGHT: Sure. Access Now created the Transparency Reporting Index. A couple years ago, we built the index near the beginning of this groundswell that has risen to make transparency reporting the norm, essentially. If you're a company without a transparency report at this point, showing at least statistics on requests for user data and your compliance rates, and statistics on content removal and freedom of expression, you are lagging behind your peers. And our index rose up both to map this norm as it was becoming established and to promote transparency reporting. Our index is more of an information clearing house. It's not a ranking. It's a floor, not a ceiling. There's a threshold you have to reach: releasing statistics on government requests for user data, as I said, compliance rates and, where possible, content removal, though we don't require that. And we don't require the narrative impacts to be discussed, but that's definitely a trend we're seeing, where now companies will pick out major events like an Internet shutdown and discuss it in some detail. There's much work to be done on qualitative reporting, but to be listed on our index, you need to release a report. There is gaming in the system. We've seen telcos say, hey, here's our transparency report, and there are no statistics. They don't have the data, and some of the reasons they give us are amazing. There are large telcos ‑‑ some of the biggest companies ‑‑ that say, we don't know how much data we give over to governments; we don't get that data from our local subsidiaries in a number of different countries. I think the telecoms, where I focus, have a long way to go to even internally, you know, calculate, gather and be able to report on this data with some confidence. What we've seen this index do, besides, I would hope to say, build this trend, is result in some groundbreaking reports from companies like Vodafone. But it also opens a window. 
If the average advocacy campaign is kind of like a protester holding a sign outside the armed secure gates, I think this report gets the CEO to open the window as their limousine rolls in. It's an advocacy window we need to follow up with very considered and thoughtful and, you know, peer‑reviewed rankings.

>> MODERATOR: Great. Thanks, Peter. As you can see in the audience, and if you're following us online, the website for this ‑‑ you can see this is a really useful resource for a lot of different reports from a really, really wide range of companies, and I have to really applaud Access Now and Peter for this.

>> PETER WRIGHT: Yes. It is a global ranking; around 60 companies currently release the reports, and more each day. And I can say there's some competition among the companies, not only to tell us to list the new data that the reports show ‑‑ sorry, the new categories of data the reports show. The caveat is we don't put the actual statistics up on the index to compare. That's definitely an end stage that we'd like to reach, but at this point, the reports are very diverse in the categories of information and the laws they are released under. But at the least, we do get a very global picture of this trend. Thanks.

>> MODERATOR: Great. It's interesting that Carolina mentioned that the Colombian telcos that she and her colleagues evaluated were surprised this was happening so quickly in Colombia. That's something that we hear from different kinds of actors: it's great this happens in that country, but it could never really happen here. That's something that Rebecca and her team, which for full disclosure I am a member of off and on, have grappled with. Each country does, of course, come with a specific set of laws and economic and cultural factors, but the premise of the Corporate Accountability Index that Ranking Digital Rights released last week is that there are some standards that all ICT companies should meet regardless of the context they're evaluated in. Can you tell us what some of the standards are?

>> REBECCA MacKINNON: Sure. Thanks so much, and first of all, it is great to hear about this entire ecosystem of projects that's been evolving, with EFF really being the first mover and lots of different ideas and approaches evolving around that. It's really exciting to see. So the Ranking Digital Rights project takes a global approach. We're basically trying to apply international human rights standards related to freedom of expression and privacy: how are those applied in practice, and how should they be applied in practice to respecting user rights by the world's largest Internet and telecommunications companies? I'll talk more about the results later, but the standards we're applying really fall into three categories. These were developed in consultation with a lot of different stakeholder groups, building on the UN Guiding Principles and on the Global Network Initiative standards around how companies should handle government requests regarding user privacy. We have a flyer on people's tables that shows more information about the ranking and where you can download it. The first category is commitment. And so this has to do not only with the company making a public commitment to respect users' freedom of expression and privacy, but with evidence they are showing that they're actually walking the walk, that they're actually doing something to implement that commitment. So that section includes things like: does the company have training to make sure that employees across the board understand the impact they're having on freedom of expression and privacy? Is there whistleblowing in case employees think abuse is taking place? Is the company assessing the potential risks to users' freedom of expression and privacy that may occur in the course of its business practices and operations and design choices, and working to try to mitigate those harms and maximize the benefits? Are they being mindful about the impacts of their products? That requires an assessment. 
Is that assessment being verified by an external party to ensure that it is actually real and not just something they're putting on their website? Is the company engaging with stakeholders, and does it have grievance mechanisms? Are there meaningful mechanisms for users and other stakeholders to notify the company that their rights have been violated, and for those people to then obtain some redress, to have the situation addressed in some manner? Peter can probably talk a bit more about some of the work they've been doing on remedy. Then there are two other categories, freedom of expression and privacy, that look very specifically at companies' policies and practices affecting those two rights, and this overlaps much more with both your terms of service project and with the EFF project, but taking a more global approach. So if they're doing some really great thing in secret that nobody knows about, it's not counted in our methodology. The point being that on freedom of expression, the company not only needs to be very clear about what its rules are and what users can and cannot do on the platform; it needs to be transparent about its process for handling third‑party requests to restrict content or restrict access or accounts. It needs to notify users clearly when content has been restricted ‑‑ either users who posted the content or users who are trying to access that content ‑‑ and why. Companies need to be transparent about government requests, and there's growing transparency reporting about government requests to restrict content, but it's not just government requests: increasingly in the freedom of expression category you have a lot of private organizations sending lists to companies. We also look at companies being transparent about the nature and volume of content they're restricting or taking down to enforce their terms of service, even if nobody is requesting these takedowns. On the privacy side, it's about enabling users to know what is going on. 
What are you collecting? What user information are you collecting? Who are you sharing it with, and under what circumstances? Are you giving users any control over what is collected and shared? If it's information that is core to your business, are you being clear about what is central to your business and what are the things that are perhaps more optional? Are you being clear about how long you retain data? Are you informing users clearly about whether you're collecting information about them from other websites? Then again on transparency reporting: are you reporting data about government requests for user information? Are you clarifying what types of third‑party requests for user information you accept? Because there are a lot of cases where it is not just government requests happening ‑‑ it is either through courts or not through courts ‑‑ and what is going on there is not clear. And also security practices: does the company provide evidence that it is engaging in industry‑standard security practices? Are they offering end‑to‑end encryption for private communications, enabling users to encrypt the content of their communications? So these are all the sorts of standards we're looking at, and I guess I can jump a little bit to results. There's a score card here where you can see the overarching results, and there are certain companies that score higher and others lower; I'm not going to read the list out, you can read it. But the point is, even the highest‑scoring company would get what you might call a D, sort of, you know, if not an F, in a lot of grading systems. There's a real lack of clarity to users about what's happening both to their access to information as well as to their personal information and how it's being used and shared. And there needs to be a great deal more clarity. 
And particularly in the freedom of expression category, we're seeing a lot less transparency reporting around takedown and speech restriction than we are around government requests for user data. We're seeing a lot less transparency around private requests and terms of service enforcement than we are around government requests. There are companies that report government requests, but not requests coming through any other mechanisms, be it for user data or content takedown. There's no transparency around terms of service enforcement, and this is an issue where a lot of stakeholders have some serious problems. Grievance mechanisms are very weak. This is an area where we need to have a broad conversation between the industry and stakeholders about how we improve grievance mechanisms. One other thing, and then I will stop and let other people talk and we can talk more about it in discussion, is that it relates to what our colleagues at Karisma found: some of the companies, particularly some of the telecommunications companies that we ranked and that we engaged with about the ranking, said, we're complying with the law, so why do we have to disclose that we're complying with the law? Everybody knows we're complying with the law. One of the things we found, especially with telcos, is that in their disclosures it quickly becomes apparent that their audience is telecommunications regulators and not actually their users, and they're not communicating the information users need to know in terms of what's happening to their information: in what contexts their information can be accessed, and who is controlling that. Telcos aren't communicating that information. So I think part of what many of us are doing is helping to shift the focus and sending a message to companies, in particular telcos, that have less of a user‑centric orientation. 
I would say: you need to talk to your users, not just to government regulators. Maybe government regulators know what you're complying with and what that means for users, but the users don't know. They're not all telecommunications lawyers, right? So there needs to be clarity about this, so people can make their choices and so that NGOs and advocates and policy makers can be clear about how to create an online environment that's more compatible with people's rights.

>> MODERATOR: I should add that companies were told in advance that they would be included and what the standards would be, giving them an opportunity to respond to the initial findings before they were made public. Any researcher can miss some disclosures on a website, and if companies were able to point out that there is this thing the researchers did not take into account, the project was happy to include that, Rebecca.

>> REBECCA MacKINNON: If it fit the methodology.

>> MODERATOR: Of course. There is dialogue throughout. So Ankhi, would you mind telling us what you see as the contribution of these types of projects, and giving your feedback on this type of endeavor?

>> ANKHI DAS: I think these are sensitizing tools for companies to engage with and to build sort of deliberative processes: to figure out a mechanism of working together and create a landing plan in terms of translating a lot of these standards. So what we've done at Facebook, and you all know this, is publish a transparency report. I think the discussion there is at what level of detail it needs to go. You would have seen, successively, the improvements Facebook has been making in terms of the TDRs, which are the takedown requests, and the user data requests which come from governments and their agencies from time to time across the world. And, of course, there is very detailed data out there in the Facebook transparency report that does point to the fact that there is huge pushback ‑‑ there is sort of a percentage of demands and pushback. Talking about India itself, there is a pushback of 59% on requests coming to us from various government agencies. Same goes for South Asia; you look at countries like Pakistan and all that. There is a standard that is applied when demands are made on Facebook, and that's one. The second thing that is important to note is that ‑‑ sorry, I have a bad throat ‑‑ we are members of GNI, and again, that is another third‑party mechanism as well. Rebecca, you pointed to the oversight: there are third parties which have this kind of standard and monitor implementation, and we participate in those processes actively, including the processes which GNI is putting in place. This goes to the highest level in the company. The entire supervision of the engagement with GNI and compliance with the principles is something which is reported to management and boards, which shows how seriously this is taken in terms of how you are operating. Really, it's in the level of detail, and I will give you a very practical example. In India, there are 22 languages. Our service is available in 13 languages. How do you put user rights in front of users? 
We have terms available in 13 languages, and that matters a lot for us. In a developing environment, you have to think about how this translates: really having that information available in local languages for users to understand what their rights are, and then engaging in a very dynamic environment with communities as the service is used in these countries. It is also a very effective tool for public education and continuing the dialogue there. I think there are ways and means in which standards are applied at sort of a company level, but there is a lot happening in the field in terms of actual user experience, and how you actually put this information in the hands of real users is something which a lot of people miss out on. I would recommend, as you think about broadening your own research, starting to look at how these platforms are being used to drive real change and how you are reflecting cultural diversity, et cetera; all those factors you have to look at. And also looking at how the local language requirement is being met: are platforms making their service available in local languages? There are so many non‑English‑speaking users, and I want to make sure that local language enablement is a crucial part of the rights framework which you are developing.

>> MODERATOR: Absolutely. So one major category of actor that we've barely mentioned so far is governments. Governments, of course, continue to bear primary responsibility for respecting and promoting human rights. So we are glad to have a representative from the Swedish development agency in particular. Could you tell us, from your perspective, from where you sit, what does this kind of effort look like?

>> Of course. Thank you. First, it should be noted that I don't have a mandate to speak for the Swedish government, but I can speak for SIDA, and I will explain how we use this information and how we work in the field of human rights and business. So I'm happy to share that with you. SIDA is the Swedish international development agency, and we mainly have partnerships with civil society organizations and initiatives, but we do have an extensive dialogue with partners from the private sector. We are discussing with private sector actors in order to try to develop their practices and their policies in terms of human rights. This is something that's coming to us ever more. The base of our dialogue is always human rights, which makes this work, of course, very crucial. We are working on many levels, both with the rights of workers, but also at policy levels, and we're always trying to have a human rights based approach to this. As a matter of fact, I am always sharing my papers ‑‑ SIDA is an agency working with papers ‑‑ and this one is on the human rights based approach in contacts with private sector actors. I will be happy to share this. This is something we have been dealing with internally, but as from tomorrow, actually, we are also publishing this externally on the public website. It is not about ICT companies in particular, but it is something we will try to develop in terms of following procedures and activities as well. More and more often, we get the possibility to discuss human rights and also participate in processes, for example commenting on human rights impact assessments, and be part of transparency processes that the Industry Dialogue partners have been working with. But we are also trying to keep the good relations that we have with Ranking Digital Rights, but also with GNI and the Industry Dialogue. This is something very important for us in order to maintain our central role in the discussion. 
What is interesting is that, as a development cooperation agency, we have been discussing privacy issues and security issues for human rights defenders for many years. And I think that cooperation with private sector companies is something that is crucial in order to maintain the same safe (inaudible) levels for the common users. Allen was a part of this, as well as the Industry Dialogue, and this showed that the interest in these issues is very high in SIDA, but also within our partners. Access Now, which is a great partner of ours, deals with these issues tremendously. When we met, there was a question on how to bring this forward, and I think we have tried to do this by raising the issue in this document, but we have had many requests to organize events in more secluded spaces, because this is really important, but sometimes it is difficult for actors to get down to the really difficult issues in a public space. So I think that is something that we as SIDA can offer: a platform for discussions on ICT and human rights, and trying to invest more time in digging into the deeper and more sustainable issues. To do that, we need to be well prepared, and I think this initiative is one of the key initiatives that we need to follow. In the discussions that we are a part of, systematics and methodology are really lacking, because it is hard to compare different companies, it is hard to compare different practices, and in this field it is difficult to compare indicators and different kinds of analysis of the situation. So I think this is really great. I'm looking forward to the following discussions, not only because of your initiative but also the other initiatives that are going on, and I also wanted to clarify that SIDA is happy to be part of that discussion. Thank you.

>> MODERATOR: Great. We're thrilled to have you be part of this discussion as well. I am possibly even more thrilled to have people who traveled all the way from Asia to get here, going through many, many travails in order to do so. These are the two people I am going to turn the mic over to next. Her organization is right here. Wait. What is it? There it is. It is based in the Philippines. I understand you are at the beginning of a ranking effort in the Philippines. Can you tell us what challenges you face, how this broader community can help, and what's different about the Philippines compared to some of the other countries we have talked about so far?

>> Okay. I think what's different in the Philippines is that there isn't much awareness of the rights of users in terms of privacy or in terms of how ICT companies comply with their terms of service. There is also no transparency; we don't get information on what data is being given to the government or to third parties. Although, to give some perspective, I can also say that relative to other countries, the internet and social media in the Philippines have been relatively free, in the sense that as of now there has been, to my mind, no blocking of websites and no banning of accounts. We expect that things might change now that the implementing rules for (inaudible) have already been finalized. This will now provide the mechanism for not only the government but also private entities to make use of that law to advance whatever intentions they have. In the Philippines right now, there has been no government action against bloggers or internet content users. What we're seeing is private individuals or private companies taking action against each other. So I think how (inaudible) plays into this will also be important to look at. What we are considering right now is to use the (inaudible) that you set out; the Ranking Digital Rights approach, I think, is a good model to use. It will also translate easily when (inaudible) see it, and the public will be able to understand it more easily. Right now we're not considering a ranking, because we're planning to begin by looking into telcos, and there are only three major providers, so it doesn't make sense for us to.

>> MODERATOR: There's a project in Canada called IXmaps. Their focus is on mapping packet routes in Canada and whether or not traffic goes through the U.S., which has been a major concern for a lot of Canadian consumers since the revelations. But in what I would consider a very Canadian trait, Professor Clement feels that rank ordering can cause more problems than it's actually worth. And that's something that may make a lot of sense in a lot of places, including the Philippines perhaps.

>> I mentioned earlier about providing a best practices framework. That would also work for us, since there is not much awareness of digital rights. To be fair to the telcos, they don't know what they need to do. So something along the lines of what you're doing would also be useful in terms of giving them a guide.

>> MODERATOR: Thank you. Kelly, your organization is based in Korea, and it's also in the process of launching a ranking or ICT evaluation project. So what's going on with you?

>> Kelly: At this point, actually, we are supporting a Korean internet transparency report project. In our project, we are not really ranking or evaluating the companies, but we collect and evaluate data from our government and from private sector companies in terms of free speech, online censorship, and surveillance issues. We have the data, it's been going well, and I think it has helped promote KAKAO. And we put (inaudible) on our website, which is transparency.KR.

>> MODERATOR: So you bring up a really good point: you have to have companies doing transparency reports before you can start evaluating the contents of those transparency reports. You have to start somewhere, and as Rebecca has said in a number of different forums, it's a question of figuring out where the floor is and then gradually raising it, rather than setting a bar so incredibly high that it's just not going to happen.

>> If I can throw another question over to Kelly. One of the things that I think is really important about the work her organization is doing is that they're also looking at government requests. So this transparency report is also looking at what requests the government is making, and at the extent to which the government allows transparency, because there are some places where the government permits transparency by companies and other countries where the government actually prevents transparency reporting because the law forbids reporting on these things. It would be interesting to hear your observations about that.

>> Kelly: All ministries of Korea have an obligation to report to the members of the parliament. So these members have great access to that government data. The government itself doesn't really publish a transparency report, but because they have to gather the data and report it to the National Assembly, they have this data. We get the statistics from those MPs, the members of the National Assembly, and we analyze them and make tables so they can be easily accessed by the public. Before our transparency report, there was no organization that analyzed this (inaudible) from the government to show the public how the (inaudible) carried out by the government works and how they are censoring information. We have censorship laws. For example, we have a (inaudible) called the Korea Communications Standards Commission. They can just censor online information on the standard of nurturing sound communication ethics. So that means they can censor any information they think (inaudible). The standard is very dubious. The agency made takedown requests to private sector intermediaries, over 200,000 requests last year. Although intermediaries don't have a legal obligation to follow those requests, of course, you know, the expectation is that the intermediaries are the ones to comply with those requests. It has been going on like that, and we publish that data on our website, we analyze it, and we sometimes write blog posts about very illegitimate, inappropriate takedown requests. That works as a kind of pressure against the KCSC, the Korea Communications Standards Commission. Now they're trying to be more, how do I say, more considerate about making those content takedown requests. That's one good example of how our work has influenced the government sector.

>> MODERATOR: Great. Thank you. Yeah. Rebecca, absolutely.

>> REBECCA MacKINNON: If I can make another footnote on Korea. I think the Korean case is fascinating, so I want to jump in and make another note. How many people here have heard of KAKAO? Okay, a number of people. It emerged from a merger between two other companies that became KAKAO. They did very well in the Ranking Digital Rights Corporate Accountability Index. There are some of our privacy indicators where we looked at, you know, how much you disclose about what user information you are collecting and with whom you are sharing it. KAKAO got a top score on a couple of the privacy indicators. They were also among the very few companies that do transparency reporting about non‑governmental, private takedown requests. And their transparency reporting on data handover requests was also pretty thorough. So in a lot of ways, I think it is an interesting example of how a particular policy environment, NGO action, and interaction with the companies end up having a positive result. It was interesting.

>> Kelly: I do think KAKAO did very well. But there was no media coverage in Korea about this great result. And even KAKAO has been silent about it, maybe because they didn't top the ranking. But KAKAO placed fifth, and the four companies above it were U.S. companies. So KAKAO did very well relative to companies from every region except the U.S. and England. It would have been really nice if media (inaudible) had been thought out before the launch of the ranking; we could have found ways to give KAKAO credit. That might have triggered other internet companies' interest and enlightened them about how important it is to show that they care about protecting users' rights. So maybe next time, if you are reviewing other Korean companies, it would be great if you had a media strategy planned for the launch events. But anyway, it was great that KAKAO was included, and now we are sure to use this ranking to promote user privacy and freedom of expression in Korea.

>> MODERATOR: Absolutely. And Rebecca was saying maybe we can do that. It feels like it's been out forever, but it's only been eight days, which I am told is scarcely more than a week. So I'm going to let (inaudible) make one last comment. I would also like to invite the remote audience to tweet their questions, and for people who are here, to come up and offer your questions, comments, and feedback. Perhaps Allen can help circulate the microphone. Merchan, you get the last word for this side, and then we go to questions from the audience.

>> Hoping to get the last word. Something that (inaudible) mentioned on Monday, Rebecca, and something we have seen in Sweden as well, is that dialogue helps. We have a network of companies that we're working with called the Swedish Leadership network; Ericsson is part of that, on our side. Part of that work is also to engage in discussions with other companies. So I think what is interesting with the Ranking Digital Rights report is that the highest scoring companies have all engaged in some kind of dialogue on these issues. I think that shows clearly that it might be time to step away from the blaming and shaming games and engage instead in discussions with companies, which also raises a (inaudible) issue: maybe you should think about rebranding so that there are no "winners," to "dialogue helps" or something like that. More positive?

>> REBECCA MacKINNON: Just to add to that, we have to emphasize that this is a diagnostic. This is not a certification or a stamp of approval just because somebody got an overall high score.

>> MODERATOR: Jetty, thank you for taking the first question. Or are you passing to someone else?

>> Audience member: She was first.

>> I commend the index that has been developed. I'm just curious whether gender has also been, I mean, whether you thought of including gender in this index, because it's actually really important to ensure that gender is part of the rights struggle over these services. And number two, just an additional point from the Philippines: we actually have this Data Privacy Act that has not been implemented yet. If it is ever implemented, we can also try to check whether the telcos are complying with it. I agree that this index is very helpful not just for advocating, but also for increasing awareness among telephone companies in our country.

>> PETER WRIGHT: Thanks. I can say that we've looked at some of the numbers that companies are releasing on the diversity of their staff, which can definitely include gender, and, you know, it has revealed an immense gender imbalance in the staff of many of the internet firms, in particular in Silicon Valley. It is heavily weighted towards males. We're talking about privacy and expression, so maybe it is not directly related, but then we see many companies making decisions that seem to be, and sometimes they say are, based on the identities, backgrounds, and allegiances of their staff. That's a great point we're considering.

>> Audience member: Also, the companies could try to have a focal point looking into gender. In the Philippines, there is a rule that there has to be a gender focal point in government organizations. Why not do it in ICT companies?

>> REBECCA MacKINNON: We had to limit our scope; if we don't limit our scope, we never actually produce anything. So that was one of our challenges, and we kept fairly focused on free expression and privacy. However, I was in a session before this on violence against women, which also dealt with hate speech and other things. A lot of people have asked me, and probably some of you have gotten similar questions: what about violence against women? What about the children? What about terrorists? You mention none of these things in your materials, or in your criteria, your questions, your indicators. The basic response is that the way we've designed our particular index, we expect companies will devise rules; no right is absolute, and they're going to have to deal with violence against women, and they're going to have approaches to dealing with violence and the exploitation of children. We didn't list these in our index as specific issues, but we assume that companies will be taking on these issues as well, and that they will do it in a context where they're being transparent about what their rules are and how they're enforcing them, with grievance mechanisms, and engaging with stakeholders, which would naturally include stakeholders who are concerned with gender issues, stakeholders who are concerned with violence against women and so on, and child rights and everything else. This would be part of their human rights assessment and their general engagement. So there's a bit of that built in to the broader grievance and due diligence framework, and also just an assumption that if you are going to be enforcing rules, there are rights we don't address directly that you are dealing with, and the whole point is that you need to be transparent and accountable about how you're enforcing whatever it is your stakeholders feel you need to enforce, so that it's done in a holistic way. So that's kind of how we approached it.
So, you know, I would also point out that APC has done a study evaluating a number of internet companies specifically on gender. I guess the other point is that no one project, no one index, is going to satisfy everybody's concerns or lenses, which is why we need a proliferation of these projects so that they can be layered upon one another. There can be collaboration. If we put all our data out in the open, stakeholders can mix and match from different projects and put together the picture they need in order to advocate for their particular concerns. So, you know, let a hundred flowers bloom on this. I invite people to adapt these methodologies, and there are a variety of them here, to shine a light on the policies and practices that you feel are most important.

>> MODERATOR: Great. I will jump in with a question from Twitter: have you thought about, or do you have any plans, to offer certification or recognition for privacy auditors? If anyone has any plans, please jump in.

>> REBECCA MacKINNON: That would be interesting. For Ranking Digital Rights, as I said before, we're a diagnostic; we're not a certification. We have a process, and we don't have that many companies. Trying to treat it as a certification would not be appropriate in our case, but I think there are projects that are moving more in that direction.

>> The goal is not to provide a ranking, but to identify best practices and worst practices. So when I was speaking about elaborating model contractual provisions with labels, that was exactly a tool that could be useful without being a ranking. So without blaming (inaudible) or the terms of service of specific corporations, we provide a concrete tool that cannot only protect human rights but can also be recognizable. When we decided to go in that direction, we simply observed that one of the main successes of the Creative Commons licenses is being recognizable. If we can replicate that in the next phase; we also have some partners, such as Common Terms, working in that direction, which is also a member of the platform responsibility coalition. So the goal so far is to provide evidence on which these model contractual provisions can be built, and also guidelines and guidance on how to do it.

>> Good afternoon. My question is for (inaudible). It is very interesting that you said, with respect to this, that about 55% of requests are pushed back. Is Facebook contemplating changes to its community standards? I think it becomes problematic when a lot of people get behind reporting a certain type of content and sheer numbers drive what is restricted. And should other problems you face or contemplate be accounted for in such ranking mechanisms, and possibly in community standards?

>> It happens either through government casework, because you have agencies, or through users reporting content, and reporting can be done on every piece of content that is there on Facebook. I think something we have explained time and again is that it is not a numbers game. It is not a numbers game. There is this misconception that if you somehow organize and get a lot of people, and we typically see this in the gender domain or around minority speech, where you have the ability to organize people and get them to report a certain kind of content, that will lead to a takedown action. That is not a valid assumption, because what really happens is that a piece of reported content is looked at based ‑‑

>> Can you get a bit closer to your microphone?

>> The community standards disallow certain kinds of speech. Where there is graphic harm or an actual threat to life, et cetera, nobody can argue with that; it's a (inaudible) standard. If it's reported by users, obviously it will be taken down. It is an evaluation against community standards. But just because a community has organized and has been reporting a certain kind of speech, content that is otherwise not a violation is not going to be taken down. That's not how the system works.

>> MODERATOR: I think we had a question in the back. The woman with the blonde hair over there. Yes.

>> Audience speaker: Thank you. I think it is really, really nice the way both the data and what you're doing are available online. It truly makes it possible for organizations, researchers, and companies to engage with the findings. That's the whole purpose of it, I guess: you actually want to improve action in this field. So I have two questions related to that. The first one is rather straightforward: how have the companies responded to this? Has there been a lot of response from the companies, and if so, what kind of response? The other one is similar to previous questions. One of the findings is the disconnect in transparency between the practices related to external pressure on companies and the practices that are self-defined by the companies. Are there any indications that the companies are moving towards transparency in this area? Because I haven't really found any myself. You mentioned the Korean example, but other than that. I would also like to ask Facebook specifically, if you are there, whether there are plans to reveal information about takedowns: the internal processes leading to the takedown of content that's decided by community standards and not by external pressure.

>> On how the process works, there's a graphic that was released more than 18 months ago, definitely. It's basically a reporting guide with graphics there on our site, and you can just look it up on Facebook, or I can send you the link or post it. Essentially, it lays out what happens when you report a piece of content: how is it looked at? There are reviewers working 24/7 who are trained in the community standards, terms of service, and policies, and they're constantly refreshed. They also have local language capabilities, because that is a very important factor when you are assessing reported content, as well as local context awareness, because a lot of this is looked at contextually as well. It is laid out very clearly, which takes away the types of biases that people normally get worried about from a human rights perspective when that piece of content is looked at. So there are fairly detailed processes in terms of applying the standards, and we've made public disclosures of how that work flows. I'm happy to share that with you.

>> REBECCA MacKINNON: On the other part of your question: with some of the companies we had been in dialogue in any case, but when we started the research, we let companies know they'd been selected. We provided them with preliminary results, and then we had a conversation before we finalized the results, and that type of information was provided to us. There hasn't been a whole lot of public reaction from companies, more a continuation of some conversations we were having before. I think the only company that has issued a public statement about this was Yahoo; they were welcoming a conversation about some of these questions. In terms of transparency around takedowns, we had different questions: one about your process, and one about the volume and nature of the material you are removing, and it's the latter where we get no transparency. Several of the companies, and I will let them speak for themselves about who they were, argued that it's counterproductive to issue too much raw data about what they're taking down. However, it may very well be that the way the statistics are released, and the kind of statistics that are released, need to take a different form than government transparency reporting on government requests. But there does seem to be a very widespread feeling, which we hear from all over the world from multiple groups dealing with lots of different types of issues, that the process is not sufficiently accountable or transparent and that there's a need for greater transparency. So I think it will require conversation. Maybe SIDA can help provide that platform, and others, the Danish Institute for Human Rights, et cetera. There will need to be a lot more conversations about what best practice release should look like, and around grievance mechanisms as well.
There's no really good model yet for what that needs to look like, so that if mistakes are rampant, or abuse is rampant, or problems are rampant, people know how to move things in a better direction. I think there are a lot of conversations that do need to be had; what best practice looks like has yet to be worked through, and it's going to require dialogue among stakeholders to get there.

>> MODERATOR: I think we have time for one more question from the audience and then we can continue the conversation in person, online and through other various means of communication. Was there one more hand in the audience? Yes?

>> Audience speaker: Thank you. I have a question for Ankhi from Facebook. Why are companies not clear about what data they collect and what it's used for? Why isn't it clear what collected data is shared with (inaudible) authorities, governments, and other companies, and whether more data is collected than necessary? Yes, that was my question.

>> So in terms of how data is collected and how data is handled, Facebook makes disclosures through its privacy policy. Handing over data to government is always done only following due process, and we have the Government Requests Report, which is brought out every six months and covers literally every demand that's been made of Facebook by governments all over the world. We cover every country; it doesn't matter whether a country has made a single request or thousands. All countries and all data are cataloged and reported in that. So there are detailed disclosure systems and processes which are out there. In terms of how data is handled and stored, and what users are signing up to, this is provided both in the terms of service and in the privacy policy. We also introduced what we call a do-it-yourself privacy check-up for users. As a user, you are reminded to do your check-up: how do you want to set up your controls? How do you want to control your data, right? And then it's all about making sure that users are aware, through the privacy policy when they sign up for Facebook, of how their data is going to be used. You either agree to that or you don't; you have that option. And the second thing, on your question about what kind of data is shared with governments: it is always very strictly based on legitimate demands, which we quantify in our transparency report.

>> Thank you.

>> MODERATOR: Great. We are out of time here, and we want to end on time so the next session can get started. But thank you very much for coming. A special thanks to our visitors from all over the world, which I guess includes me. I don't know why I said visitors. Luca wants to have the last word.

>> LUCA BELLI: I want to extend a friendly invitation to those who are interested, and I think everyone here is interested in terms of service: we'll have a workshop on terms of service and cyber regulation at 4:00. That is in room number 6, and the other workshop is in room number 2.

>> If somebody wants a report in Spanish, there are some here.