The following is the output of the real‑time captioning taken during the IGF 2014 Istanbul, Turkey, meetings. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
>> JACQUELINE BEAUCHERE: Good morning, everyone. We are going to give it a few minutes. We are still waiting on one of our primary panelists.
Good morning, everyone. We will get started if that's all right. I want to welcome you to the first IGF Best Practices Forum on Child Online Protection. My name is Jacqueline Beauchere and I am the Chief Online Safety Officer at Microsoft Corporation, based in Redmond, Washington, in the U.S. My co‑Chair for this Forum and co‑facilitator is Hannah Broadbent. And we have a third member of our team, Carla, who is joining us today as a remote participant.
The work of the Best Practices Forum has been in progress for the past several weeks and continues through sessions such as this one. So you may have been asked to participate early on in the process by responding to eight general questions or items about child online protection. Know that these were the same items asked of all five Best Practices Forums, and the responses were culled into a draft document by our consulting scholar, John Laprise. Our goal in the next 90 minutes is to recap what we have learned to date from that process, and we want to gather further input so that John can prepare the next draft of that report.
We will hear this morning from five more formal discussants. We have someone representing a global view and we have someone representing a more regional and country specific view. And we are privileged to have three youths here with us this morning as well. We are then going to call on you, the audience, a couple of times during this interaction and we would like to hear your interventions on what is going on in terms of Best Practices in your organisations or your regions. We definitely know that there is a lot of very interesting and very compelling work going on.
So we want as many examples as possible to feed into this process. After we have your interventions and contributions, we will then open the floor to formal questions, either for the panelists or just amongst the group in general. And then we are going to reserve some time at the end for some feedback about the process in general. That is because the lead‑up to IGF took place at a time that is obviously dedicated to holidays and travel, and it was a quick turnaround time for the formal submissions. So there is still time if you would like to present a formal submission; we are able to take those, and John will include them in the next draft of the report.
Later this afternoon the co‑Chairs of the Best Practices Forums will report out to the main session. This format is a trial balloon of sorts for the IGF. We need your input and feedback to be able to improve upon it and to advance it going forward. So without further ado we will have Hannah introduce the next panelist.
>> HANNAH BROADBENT: I am very happy to be here. We do education and awareness work and also work in the policy space, both with schools and with industry. A key part of this is involving young people in our work. Without further ado I would like to introduce our first panelist, Susie Hargreaves.
>> SUSIE HARGREAVES: Thank you very much. And thanks very much for inviting me to speak today. I'm going to try to give a global perspective on the issue. I'm Chief Executive of the Internet Watch Foundation, which is the UK hotline for reporting child sexual abuse content. In terms of child sexual abuse content globally, we are part of a network, so we also work very closely with the other hotlines around the world. And we are a member of the UK Safer Internet Centre with Childnet.
I am going to talk about the work we do as a member of the ITU's Child Online Protection (COP) initiative. We need to acknowledge that protecting children online is a global issue, and it brings with it many challenges. And it is something that is being done, and done very successfully so far, by the ITU through the COP initiative. The Child Online Protection initiative of the ITU has identified five areas that need to be addressed on a global, country‑by‑country basis. Legal measures need to be in place, and one of the key issues is that legislation is different in every single country. In fact, child sexual abuse is probably one of the few areas where legislation is quite standardized across the world, which means we can take action on content that is hosted anywhere in the world. Organizational structures need to be addressed, capacity building is an issue, and we need international cooperation.
One of the things that I want to bring to your attention is the COP initiative, of which many people in this room are members. It is multi‑stakeholder and brings in people from all different sectors; I have been involved in it, as have people around this table. And it has four sets of guidelines that are useful. If you haven't come across these, you should. One is for children, one is for parents and educators, one is for industry, and the final one is for policymakers. I think we need to make sure that the work of the COP initiative isn't just replicated in this Forum; it is really important that we build on what has gone before.
The other document I wanted to bring to your attention is a UNODC study of the effects of new information technologies on the abuse and exploitation of children. It brings together incredible research from around the world. It is a great document and available online from UNODC. Looking at these two pieces of work, it is important that we build on them. So that's where I think we are; those are the issues as the ITU has already framed them. And I am happy to expand on that later on. Thank you.
>> HANNAH BROADBENT: Now I would like to go over to Patrick Burton, who is the Executive Director of the Centre for Justice and Crime Prevention in South Africa.
>> PATRICK BURTON: Thank you. Thank you again for letting me be a speaker. Perhaps I should just start by explaining a little bit about the centre that I work with. The Centre for Justice and Crime Prevention is a non‑profit organisation based in Cape Town, and we do have a bit of a regional footprint as well. We weren't established and set up to tackle online issues; we were set up to look at violence prevention more generally, with a very strong focus on children and youth.
I think through the work that we were doing around 2007, 2008, particularly around school‑based violence, cyberbullying started to emerge. And up to that point no research had been done on the subject. So we picked it up alongside a lot of the work that we were doing, and obviously it has now emerged much more in the conversation in South Africa, and we work closely with the Department of Communications and the Department of Basic Education in South Africa on various strategies and approaches.
So that's really where I come from in terms of comments. I think, perhaps because of the approach that we take as an organisation towards child safety more generally, we approach child online protection in one of two ways. The first is looking at how we equip children and young people, in schools and agencies, to deal with the adverse situations they might face online. That's one aspect of it; the other is looking at mitigating those threats, inappropriate behavior and inappropriate content, and making sure that when children are cyberbullied, or come across inappropriate material online, there are appropriate response mechanisms in place. In terms of a broad categorization, and I won't speak on behalf of the Government, but generally in South Africa, from the Civil Society, academic and policymaking perspectives, we are approaching child online protection in three ways. The first is prevention: looking at the content that's available online; looking at things like safety, privacy and screening measures that are in place online, or that need to be in place online; and, from a prevention perspective, also looking at resilience among kids. So that's how people are generally approaching prevention in South Africa.
The second area is looking at response. And this is one of the major gaps that I think people are encountering, because even though Government is taking steps, as a member of the ITU's COP initiative, to develop child empowerment and online safety policies, at the moment in terms of response, the response of NGOs, of the justice system, of the National Prosecuting Authority and the police, there is a huge gap in how cases are handled and what the response is to child online safety issues.
And I think at the moment it is quite an interesting space in South Africa, because we have some of the most progressive child protection policies in the world, without a doubt, policies that are very well grounded in keeping children out of the criminal justice system where appropriate. Unfortunately, when it comes to online issues, those policies at times now work against what is in the best interests of the child. We are seeing cases where 15‑year‑olds and 16‑year‑olds are being prosecuted under the criminal justice system, to the point that in some cases their names are being placed on the National Sex Offenders Registry, and that's not the route we want to take in South Africa in terms of what is in the best interests of the child.
And then the third area is around the development of appropriate policies, and the Government is being quite proactive there. They are looking at how to incorporate online protection into, for example, school safety strategies and education policies, as well as broader policies for the Department of Education.
So, beyond these areas, in terms of good practice and promising practice: unfortunately, because we have so little research in South Africa to date, and we are only starting to develop momentum, there is very little evidence from which we can identify good practice. We are only now starting to develop the baselines we need in order to identify what is good practice; I don't want to use the phrase best practice. But certainly there are some early indications of good effects where the focus has been on integrating safety strategies into the curriculum for children, or where initiatives have focused on communication between parents and children. Without a doubt, though, there is a big gap in terms of being able to identify what is actually good practice. We recognize this and, as a research institution, are starting to put much more attention on developing sound, rigorous research to start identifying what is good.
>> HANNAH BROADBENT: That is definitely a good lead‑in to the question of what is good practice and what is working. The young people here today can shed light on some things that they feel have worked well in their communities and in their families. I'd like to introduce Eleanor Lee, Harriet Kempson and Zach DaSilva. They are all 15 years old and from the UK, and they are going to share their perspectives on the things they feel are Best Practices in their school or community.
>> HARRIET KEMPSON: Thank you, Hannah. We are from England, from the organisation Childnet and the Youth IGF Project. We are bringing our experiences and our friends' experiences to this discussion today.
>> ELEANOR LEE: Through our discussions we found there are three main risks. These are social aspects like cyberbullying, explicit content, and the protection of our data from our contacts and from companies.
>> ZACHARY DASILVA: There have been many things done to support and promote child safety, but obviously this is only the beginning, and we will continue to push forward, focusing on industry, children and schools. I'll bring it back over to Eleanor to talk about industry.
>> ELEANOR LEE: We feel the industry is doing great work. They are building all these filters to block bad sites, and organisations such as the IWF filter out bad sites and block children from going on them. However, this doesn't filter out abusive language on social media, and we found through experience that children don't actively go searching for these abusive sites; it comes to them through social media and those types of companies. We would like to promote blocking tools on social networks, because when you are in that pressurized and difficult situation where you are being abused online, it is difficult to find the reporting buttons and blocking tools that stop people from seeing that abusive language and that make children more confident in what they are saying online.
The same goes for other companies' privacy settings, which nowadays are very difficult to find. Sometimes it is a very long process, and for children who maybe don't have the technical knowledge to find those pages, especially in a context where they are being pressurized because of that abusive online language, it is very important that we make sure those reporting buttons and blocking tools are very prominent. So that's why I believe this is something we should focus on in industry. I'd like to pass on to Zach to talk about schools.
>> ZACHARY DASILVA: In schools I feel there is one big point that we should be focusing on, and that's centered around educating children earlier about the Internet, the risks, and how to be safe online. And I feel the best way to do this is to focus on varied teaching methods, because sometimes when you have children sitting in an assembly with someone coming in and speaking at them, they don't take in the information that they should. Instead of people telling you "don't do this, do this", make it a more interactive presentation and share anecdotes and things that have happened to people, and that may go and help them.
Schools are doing a lot of good things actually. Some of the things they do include blocking games and social media sites in classes, classes like IT and history, where you do research online during school hours. Obviously kids are curious and they like to go through and try all these different things and games online in school, so schools try to come from another angle and shut that down so we can focus on our education. Sometimes, though, I find in schools that the filtering and blocking can get a bit too broad and a bit too much. Personally, at our school we have our personal e‑mails and YouTube blocked. And I feel that having personal e‑mail in school, like Google accounts where we can actually pull up our documents and have Google Drive and shared files, and also using YouTube for educational purposes, such as looking up specific videos that we may need, could help in many different aspects of school. So maybe schools should not be too harsh on those, and focus more on filtering the correct items instead. That's mostly about school. So now we can move over to Harriet to talk about issues with parents.
>> HARRIET KEMPSON: As Zach said, I am going to approach the issue from the side of what parents and families can do to try to ensure that young people are safe. And I think that in some ways this is one of the most important groups of people. I don't mean just parents; I mean any trusted adult, so maybe aunts or family friends, or even teachers sometimes, depending on the relationships.
So there are three main things I think that this group of people could do, and they are about discussion, trust, and the technology side of it. I think it is very important that children have this kind of responsible adult in the family who they feel confident to talk to, and to talk to a lot, and this promotes the trust. For example, my parents set up my Facebook and e‑mail accounts with me, and they went through all the privacy settings so they knew, and I knew, how to use them to keep me safe. So this sort of discussing and doing things on the Internet together, especially with social media, means there is a stronger understanding between the two of you. And I now feel safe to go to my parents if I need help with something, and this is the trust which I was talking about.
Two‑way discussion brings the trust, and this trust goes both ways. So they aren't constantly looking at your Facebook page and checking your phone every day, but if they feel something might not be going quite right, my parents will ask me if they can look through my Facebook page with me, and then it is a two‑way trust going that way. And with the discussions and the trust, it means that the adult figure can tailor their actions and change how they are monitoring the child to suit them. Because there is also a difference: some people my age know loads about technology and everything, and some people don't know so much. It is important to know each other very well.
The third point for parents is the use of technology. There is loads of brilliant security software and parental control software that can be used, and my parents aren't very good at installing it; they always get me and my sister to help them. This software can take a while to install, but once it is finally on the computer it is useful in blocking sites and helping young people stay safe.
So in conclusion, we think that the Internet can be made a better and safer place, especially for young people, if these three groups of people, and everyone else, work together; those are just some ideas that we think would help. We think industries and organisations such as the IWF do fantastic work in taking down this content. And then for schools, organisations like Childnet do brilliant work in going around schools, educating them and running sessions. There are so many things which already happen; it is just about making them more interesting for young people, so that the message is better absorbed, and working together to get these final things done. Thanks.
>> JACQUELINE BEAUCHERE: Thank you very much to the young people and to all the panelists. I believe that Susie wanted to weigh in. There are a lot of issues and threads of issues under this broad umbrella of child online protection, and she wants to make sure that we are keeping the illegal material and content definitely separate from some of the other issues that surface. So, Susie.
>> SUSIE HARGREAVES: Okay. Yeah, I just think it is really important to clarify for the discussion that often the issues around blocking criminal content and around access to inappropriate content get conflated, and that's not particularly useful. So I wanted to really clarify that what my organisation does is block criminal content. This is content that nobody should ever see; nobody is allowed to see it, and it is illegal. And that's a completely different thing from parental filters and from looking at the ways young people have access to inappropriate content.
For instance, we don't block legal pornography, and what happens quite often in the UK is that politicians like to conflate the two, and we have spent quite a lot of time saying no, these are very different. There has been a lot of activity in the UK over the last two years, and we are in a different position from many countries; there has been so much focus on this issue, and it has been taken up at the highest level by the Prime Minister. So we have a number of huge initiatives in the UK which haven't been replicated around the world, which John and other people will talk about later, and which include parental filters implemented on all new devices. If you buy a new device now, you have parental filters and you have questions that you have to answer about whether you want the filters on or off, and that's different from other countries around the world. That separates the criminal from the inappropriate. Thank you.
>> JACQUELINE BEAUCHERE: We would like to open it up to the floor. We know we are going to have lots of participation. We want to open it up at this point for continued interventions about Best Practices: how are you framing the issues, and so forth, a lot of the questions that were brought up in the initial question set. So Larry, do you want to kick us off?
>> AUDIENCE: I am going to do something that is normally considered risky, and I will tell you in a minute why I consider it risky. I want to thank Hannah for bringing over these extraordinary young people, and I want to thank Eleanor, Harriet and Zach for a strong presentation that helps articulate some of the issues. The risky part: generally speaking, as an adult, one tends to patronize young people, say "that's nice" and move on. But I'm going to treat you like I would treat anybody in this room. You are professionals, you are here as professional speakers, and you are here making a point. And I frankly disagree with some of your points. One time I disagreed with a young person's point and the mentor went ballistic about it, but I trust that's not going to be the case here.
I am pleased to see the folks from Denmark just came in. And I am sorry they didn't hear your presentation, but based on what I heard some of those folks say yesterday, I think had they spoken they would have given a different perspective. So I think it is also important, as a measure of respect: just as no single African American represents every black person in America, just as no single woman represents every woman in the world, and just as no single Muslim represents the entire Muslim population, the same is true of you. Youth come in all shapes and sizes and all opinions. And I happen to think that if we were to ask other youth, and perhaps our friends from the Nordic region can weigh in, we would get a different response on the issue of blocking and filtering.
I speak to a lot of young people, and generally speaking, when I speak to young people, what I hear is: blocking hasn't really helped me learn more. It hasn't helped me enjoy the Internet more. It hasn't helped me thrive as much as education, as much as building up a sense of resilience, as much as having a support system, as much as learning how to use the technology in ways that will last me a lifetime. And I say that not because I think blocking has no role. I think it has a tremendous role with very young children who might accidentally stumble on things. As I have said in other workshops, the strongest filter is not the one that runs on your device but the one that runs between your ears. We need to teach young people how to handle the fact that on the Internet, just as in real life, just as in the physical world, there will be things that are unpleasant. The world is unpleasant. You guys are old enough to know that we had a horrendous situation happen in the world where a journalist was beheaded by some extremists. In no way, shape or form do I want you to watch that video, or me to watch that video.
There are even sexual abuse images. I am not going to show them to you. But we live in a world where one of the things we have to do as we mature, and I am actually still maturing, it doesn't end when you turn 18, is learn to cope with and deal with these things. So I would say blocking has a role, but I also think that we really need to think about other ways in which young people can deal with this. And when it comes to older kids, I really think that filters and blocking are often counterproductive, and kids often get around them; but when it comes to very young children, I think you guys have an excellent point. I want to thank you, and I want to thank you for appreciating the fact that adult participants and young adult participants in this Forum can have a frank, open and spirited discussion, sometimes disagreeing with each other.
>> JACQUELINE BEAUCHERE: And I think we have a response from Harriet.
>> HARRIET KEMPSON: I completely understand everything you are saying, and we don't try to say that we are speaking for everyone. I think right at the beginning I said that we were going to bring what we have experienced and things which we have heard. I am trying not to speak for everyone, or for the Danes, but just from what we have experienced. So, you are talking a lot about how blocking doesn't help a lot, and Zach was saying that with schools, we think the filters are sometimes overprotective with the games. So we did make that sort of point. But when Eleanor was talking about blocking among companies, I'm not sure if it was made perfectly clear, but, and we discussed this before, she meant more the blocking of individual people.
So when she is promoting the report buttons: if I didn't want Zach to talk to me anymore because he had sent me some abuse or something, then I would like a prominent button so I can stop him from contacting me ever again on that social media site. And also, with my point about the parents, it is the trust and the talking between them. We are not saying that we don't want anyone to know what is going on, but having this two‑way discussion and mutual trust allows you to have the confidence to talk to your parents, and allows them to address the sensitive issues. So I understand everything you are saying, but I think what we said was a bit broader than perhaps came across.
>> AUDIENCE: I want to thank you for that clarification. And I completely agree that the ability to block people who harass you online or annoy you online is important. Thank you very much. That's excellent clarification.
>> JACQUELINE BEAUCHERE: And Zach wants to follow‑up and then we will go to Janice.
>> ZACHARY DASILVA: From my personal experience, at home my Internet is not filtered, and I have that trust with my parents. They know I don't go on the Internet and search specifically for things I shouldn't be looking at. So I have that trust at home. But I find in school that the filters really are necessary, because I know people who actually get their phones out and will text or go on social media in school. That may sometimes be because they are bored with the lessons or whatever, but that's really no excuse to be going on social media.
So in my particular IT classroom, everyone has a computer with access to the Internet. But if you leave people long enough, or if the lessons aren't interactive enough and the teacher is just speaking at you the whole time, there are areas in the classroom where the teacher can't see what's on your screen. So I know personally I see people playing games all the time on the Internet, and they just find them through whatever. So I am saying, if we didn't have filters on social media in school, I am nearly 100% sure that most students whose screens couldn't be seen by the teacher would be on it or playing games. So, filtering at home I don't really agree with, because I feel that at this age I am mature enough not to need it; I know what I should and shouldn't be on. But in school I feel filtering is a necessary thing.
>> JACQUELINE BEAUCHERE: We will go to Janice. One second, Janice. I have a point of order from the Secretariat.
>> JOHN LAPRISE: Hi. John Laprise for the record. We are talking about perspectives on these issues and looking around the room the Secretariat would like to have a quick show of hands by continent. If you are from Africa, please raise your hand. Asia? Europe? Australia? South America? North America? Antarctica? Thank you.
>> AUDIENCE: Sorry, but you have missed one area.
>> JOHN LAPRISE: Middle East? Nina.
>> JACQUELINE BEAUCHERE: All right. Now without further ado we will go to Janice.
>> JANICE RICHARDSON: Thank you. Janice Richardson. Something that we have done in Europe is move from a safer Internet to a better Internet. This has given us the opportunity to bring together various Directorates‑General and look at how to use these tools in a positive way. And I have really missed this word "positive" so far this morning. Looking at positive content: we ran a competition this last year and actually got very few entries. Young people of all ages, from age 3 through to age 20, are telling us that this is what's lacking: really positive, reliable, trustworthy content. We believe at European Schoolnet that you don't teach about being safe with the tools in the abstract; you take the tools into the classroom, you use them, and you give good role models. And in this regard we, for example, run an annual webinar and an annual month‑long course on games in schools, and we have had fabulous results from this. We also run a project called SMILE, social media in learning and education, for which we were runner‑up in the European Diversity Awards for the most outstanding use of technology in the year.
I think that it is really important that we look at these positive things. I was a teacher for the first decade of my career, and if young people are using their mobile phones and doing other things in class, then the problem is that they are not learning in a way that's appealing, that uses their curiosity, because they should want to learn and use these tools, and from my experience of them, young people do.
Secondly, standard setting. Many teachers shy away from these tools in school because there is a very ambiguous vocabulary, and it is very difficult to get Government‑validated content, or to know which are the Government‑validated agencies. And thanks to industry we've set up a standard‑setting system for schools called the eSafety Label, which is bearing fruit, because schools share information on the best way to handle incidents, for example, in the school.
Lastly, INSAFE is all about the holistic approach being evidence‑based, because how can we be awareness raisers and educators if we are not informed by young people in youth panels, youth groups, et cetera, but also by young people in trouble, young people who are contacting helplines, who are trying to get help? So this is the way that we try to inform ourselves, to stay not ahead of the field but at least up with the field.
I'm very interested in the active choice in the UK, and maybe I have been reading the wrong reports, but it seems to me that only 8 or 9%, according to OFCOM, actually applied these filtering tools even when they went through this whole process of looking at the tools. So that's my take on it. That's what Safer Internet Day is about also: celebrating, looking at the really great things, these great technologies that are opening up opportunities and that really are our future. Thank you.
>> JACQUELINE BEAUCHERE: So John, I believe you want to weigh in on active choice in the UK, and Susie wanted to talk about that as well.
>> AUDIENCE: The OFCOM report that you are referring to predates the current policy. I will explain what the current policy is in the UK as an example of good practice. Nobody would argue it is necessarily best practice, because we have to acknowledge this is an experiment; nobody has done it like this before anywhere in the democratic world, and everyone acknowledges and accepts that filters are not a perfect solution for everyone in every situation. Bear in mind something like 28% of kids under the age of 10 are now accessing the Internet from a tablet in the UK, and that number is going up as the price of tablets comes down.
And there are very definitely, as we have all accepted, issues about the risks associated with particularly very young children being exposed to some completely horrible and potentially traumatizing content. And I don't think in that instance that anybody would deny that filters are a good thing. No company knows in advance who is going to be using the device. So we've taken the view in the UK that these filters should be made available by default but, of course, they can be removed, and this is the way it works. This has really only been in place since the 1st of January of this year. From the 1st of January this year, the four major Internet service providers, who between them cover about 95, 96% of all domestic broadband consumers in the United Kingdom, provide by default a set of filtering tools. Now nobody is obliged to use them. The only thing people are obliged to do is say whether they want to use them or not. So it is yes or no. And that position is already in place for every new account.
For the existing base of users, they will have been put in the same position by the end of this year. By the 31st of December this year, every existing customer will also be asked if they want to make use of the filters, which the companies are making available free, as I say. So that's the position with ISPs providing broadband connectivity in to people's homes. With mobile phone companies it is different, and this has been in place since 2005 or 2006. Since 2005, 2006, every time you open up a new mobile phone account, and that essentially means when you get a new SIM card (it is nothing to do with the device, it is to do with the SIM card), by default they assume you are a child, right? And that means all access to legal adult content, gambling, violence and stuff like that will be blocked. You can, if you want, and many people do, but equally many people don't, go through an age verification process with your mobile phone company, prove that you are over 18 and get the adult content filter lifted. If you want to gamble online or you want to order booze online, it is very simple to do it. But you must go through the age verification process.
What the mobile phone companies also do, and by the way the ISPs do it as well, is use the IWF list to block access to child abuse images. That filter obviously cannot be lifted by anybody.
The third and final point, and I am coming to an end now: with WiFi provision in public places, all access to child abuse material is blocked. You can't get that lifted. And access to pornographic sites is blocked, and you can't get that lifted either. A couple of WiFi companies also do the same as the mobile companies and block access to other types of adult content, but all the WiFi companies block access to pornography and child abuse material in public places. If the WiFi access, by the way, is being made available in a nightclub or a casino, only the IWF list would apply because it is an adult environment. So that's what we do in the UK. I think it is an example of good practice, and we'll see in time how well it works.
>> JACQUELINE BEAUCHERE: Thank you, John. We have one comment from a remote participant and then we will go to Anne Collier.
>> REMOTE MODERATOR: Carla, our co‑Chair remotely would like to talk now.
>> REMOTE PARTICIPANT: Hello, good morning. Can you hear me? I am speaking on child online protection at the International Telecommunication Union. Thank you again for this good workshop and good exercise.
>> JACQUELINE BEAUCHERE: Carla? Excuse us. If you might just speak a little bit more slowly and maybe step a little bit away from the microphone, you are coming in very muffled.
>> REMOTE PARTICIPANT: Okay. Okay. Sorry. A bit better now?
>> JACQUELINE BEAUCHERE: This is better, yes. Thank you.
>> REMOTE PARTICIPANT: Thank you. So yes, thank you again for this exercise from the Best Practice Forum on Child Online Protection. (Muffled speaker). ITU launched the Child Online Protection initiative in 2008 ‑‑
>> JACQUELINE BEAUCHERE: Carla, I am sorry to interrupt you but you are coming across so muffled. Perhaps you could type in your comments and we can read them out loud here in the room. It is just very difficult to understand you.
>> REMOTE PARTICIPANT: Okay. Okay. Good.
>> AUDIENCE: Maybe in the meantime I will help my colleague. I am the Director for European Coordination as well as a senior advisor in the ITU. What we wanted to convey is that the ITU is maintaining the Child Online Protection platform, which is multi‑stakeholder, involving all partners. Thank you very much for the recognition of the work done at the global level, where everyone counts. For those who are still not involved in this initiative and those who are running their own initiatives, there is still a place for everyone. But most important is that those initiatives are getting ‑‑ to save resources, and to provide good results for all stakeholders. The ITU has just concluded the work of the World Telecommunication Development Conference in Dubai, where child online protection was one of the top priorities for the next round of the implementation work we carry out in different regions, and some of the regions have put child online protection as one of their top priority initiatives. In Europe there is still a tremendous request for the possibility of cooperation ‑‑ between the different partners. Therefore, as I mentioned, there is still a lot of room for cooperation and for improving this ongoing debate on how to make the Internet better or safer.
Just in addition, what we also wanted to highlight is a slight change of approach, going towards a more accountable process, and this requires your engagement, where we can really make sure that what we are doing is not top‑down but that we have accountable measurement, targets and the dream which we want to fulfill, and we agree on this dream. Therefore we invite you to take a look at the free ‑‑ coinitiative of the ITU, which, let me underline one more time, is open and multi‑stakeholder, and no one is barred from entering. This platform is for everyone to cooperate but also to provide services to the community and to work with us on this new project. Of course, this gives better hope that child online protection will become one of the significant components of the bigger frameworks on cybersecurity. We are also teaming up with several partners on building the global cybersecurity index, where ‑‑ this aspect should be seen as one of the keys.
So therefore please feel invited. In case I missed some of the messages, Carla will type them, because she is our top expert in this field. Thank you very much for your attention.
>> JACQUELINE BEAUCHERE: Thank you. And Anne, did you want to weigh in now?
>> ANNE COLLIER: Thank you so much everyone for your comments so far. I thought maybe I would zoom out just a little bit and look at child protection in context. I have two key points to highlight that have emerged as we have followed the youth online risk and social media research in North America and the EU over the past decade or so. So it is a lot to sum up. One is protection in context. And again, you know, we set child abuse imagery off to the side; that's a law enforcement and crime issue. We are talking about all the other stuff, and that's a lot. It is kind of all of human life. But it is child protection in the context of the three categories of rights under the UNCRC: protection is one of three categories, and the other two are provision and participation.
And then, within that protection framework, one of the things that emerged in a couple of national task forces that my co‑director Larry and I served on was promoting the public health field's levels of prevention. Because, yes, Patrick, I think you referred to this the other day, not all youth are equally at risk online. So just to briefly describe the levels of prevention, think of a triangle. The primary, universal level is universal risk prevention education. It is prevention work with all youth, pre‑K through 12 as we say in the United States, all the way up to University level.
Secondary is specific and situational risk prevention education. Again, just prevention, and that would fall in to categories such as bullying prevention and sexual health: established, evidence‑based programmes in risk prevention that we've had all along and that need to have digital elements folded in to them. The tertiary level is prevention and intervention for at‑risk youth. At the primary, universal level, true so‑called Internet safety education is literacy education. We often refer to digital literacy; it is almost a default. It comes up a lot and it is very, very important, especially for cybersecurity and privacy, but it is only one of three literacies that are needed today in the social media environment. The second is the well‑established media literacy field, now embracing social media that is produced and uploaded as much as it is consumed and downloaded.
And the third one is equally important; like a three‑legged stool that will not stand without each leg: social literacy. Skills such as self‑management, self‑awareness, social awareness, responsible decision making skills and relationship skills. In the work in preparation for Lady Gaga's bullying prevention programme at Harvard back in 2012, in doing the research and exploratory work to develop a base curriculum for the establishment of that foundation, we had a psychologist, a researcher from the U.S.'s Crimes Against Children Research Center and an expert in social emotional learning. And what we found when we reviewed the literature is that social emotional learning really is the lion's share of bullying prevention. So if we could give all children everywhere these social skills, this social literacy, we would go far in putting down, for lack of a better word, bullying online and offline.
These are skills that will equip children and protect and safeguard them in all environments, including digital environments. And it is kind of a dream, the idea of bringing SEL in to primary and secondary education. It is an academic standard in only one state in the United States, the state of Illinois, and it needs to grow. But it is in at least 25,000 elementary schools in the United States. There are details on the three literacies in the report released this past June by the Aspen Task Force on Learning and the Internet, about enabling connected learning, the kind of learning that has been referred to before that brings robust digital tools in to the school environment. These literacies are necessary for advancing connected learning and giving children the skills to move in to a highly digital, networked world.
>> JACQUELINE BEAUCHERE: Thank you, Anne. We have one comment over here and then a remote participant and then over here. Thank you.
>> AUDIENCE: Hello. My name is Agnes and I am a youth ambassador from Hong Kong. From what I heard just now, most of you focused on blocking and filtering and whether there should be blocking tools. I think these kinds of protection are not diversified enough. A child may not be harmed by information they access on the Internet but by other Internet users. From my experience, a photo was put on the Internet without my permission and people commented about my body and face, and this made me feel uncomfortable. Nothing could be done by laws or censorship. People neglect the negative impacts that other Internet users can bring to a child. Blocking has its advantages, but not having contact with the virus does not mean they are safe from the disease. Education is always fundamental, but it is a long‑term effort, so a child in this generation cannot be protected by it alone. So may I ask what you all think can or should be done to protect children from these kinds of passive threats, beyond education and law enforcement, which are not effective enough? Thank you.
>> JACQUELINE BEAUCHERE: Does someone want to respond to Agnes? We have lots of people. How about ‑‑ let's start with the youth first, youth responding to youth, and then we will go to Larry; Michael Kaiser also wanted to weigh in.
>> HARRIET KEMPSON: So to answer that, I think one of the ways we can deal with this, from my experience, is positive peer pressure. In my experience I had a friend who was being bullied online, especially via anonymous sites, and one of the things we can do is show the person doing the bullying, that cyberbully, that it is wrong. You stand up for that person even if you don't know them, because it is something that's wrong. And that positive peer pressure can get everyone to work together to get rid of that cyberbully. That's something that, in my experience, works when law enforcement can't do anything.
>> ZACHARY DASILVA: It is kind of a tough issue to tackle, but I've heard someone else, Sonia from Facebook, I think she mentioned something before about what they are doing now if someone posts a photo of you and you don't quite like what they posted. She said Facebook itself encourages peer‑to‑peer conversations. I have had the same thing happen to me. The way it was tackled is: first of all, I went and saw the photo and I didn't quite like it, so I found out what I could do. Since I was tagged in the photo, there was an option that said you can report this photo or you can talk to the person about it, and they will bring you in contact and you write somewhat of a formal complaint. So I went through that method: I contacted the person and said, hey, I don't really quite like this photo that you put up of me, so please do me a favor and take it down. I went through that method and it was taken down. You can also report it, though obviously that's not the main priority they have with all the other things going on. That's one method I found. Also, Eleanor was talking earlier about blocking people. So if there were people commenting and you didn't really like what they were saying, or they were writing somewhat abusive messages, you could block them on the social media so they wouldn't be able to contact you, and you wouldn't have to read the messages they were putting up.
>> JACQUELINE BEAUCHERE: Thank you. Larry or Michael, did you have anything to add here?
>> AUDIENCE: Everything I was going to say was said in great articulate ways.
>> JACQUELINE BEAUCHERE: Excellent. Let's go here. She's been waiting a bit.
>> AUDIENCE: I actually agree with Anne about social emotional development. I think that for our age, blocking is counterproductive. With inappropriate content we need the freedom to learn how to deal with it; otherwise we will never learn any social emotional skills. If we have awareness about it we can learn to deal with it, especially with education in the school environment as well; we can kind of learn from that. And I think a lot of blocking makes us think of the Internet as negative: there is so much blocking that we think the Internet is negative. Focusing on positivity makes us all aware, and we will be more likely to go find positive content rather than negative content. I think that nobody sets out to find negative content. If we do stumble across it and we have the freedom to make our own decisions, then we can develop the skills to deal with it. Blocking inappropriate content at our age can be counterproductive. At younger ages I think that coming across inappropriate content is a bit harder to deal with, so blocking at a young age I agree with. But at our age I don't think blocking inappropriate content should be as high a priority.
>> JACQUELINE BEAUCHERE: Final thoughts here, Harriet, and then we will go over here and then here.
>> HARRIET KEMPSON: So if you have a small child, you wouldn't just let them go across to the shop on the far side by themselves. First they go with their parents, then maybe they are allowed to do the last few bits on their own, and maybe when they are a bit older, 7 or 8, they are allowed to go all the way by themselves. It is kind of little by little: you expose people to risk gradually so they can learn and experience and learn these social things. So as Lena was saying, it is not just nothing now and everything all at once later. It has to be very transitionary, is that a word? Yeah, it has to be like that, and the protection should decrease as someone grows older.
>> JACQUELINE BEAUCHERE: Is it Mandy?
>> AUDIENCE: Hi. My name is Mandy and I am running a regional programme across five countries in Eastern Europe and the Middle East. We have heard a lot about blocking and filtering today, but one of the biggest gaps we have found is a lack of parental knowledge, and I know that all of you will agree. So one of the lessons learned, or perhaps best practices, that we've implemented, after speaking and working a lot with youth, was to have the youth themselves teach, train and work with parents. We also found that in the countries where we work it is very hard to invite parents to come to trainings, specifically fathers. So what the youth themselves encouraged was for us to include a bit more IT technical training, and that was a way to entice fathers and parents to come, and then we could move in to the safer Internet practices. I felt that was missing in the Best Practices report and maybe something we could add.
And I also would like to note that, specifically in Palestine, as we were working on safer Internet empowerment, awareness raising and capacity building, a lot of child protection points came up. So we used this almost as a vehicle to talk about other child protection issues. And again, I think that's a gap that was missing: this is not only about a safer Internet but perhaps could be opened up and expanded to broader child protection issues such as sexual violence, family violence, et cetera. Thank you.
>> JACQUELINE BEAUCHERE: Thank you, Mandy. We had a comment from the Nordic section.
>> AUDIENCE: Hi. I am John from Denmark. We are talking a lot about the children and the parents, and sometimes the teachers also have an issue in dealing, for instance, with cyberbullying or conflict management. But what we found out in Denmark was that the moderators working on the Social Networking Sites where children are allowed should also be educated. When we did that, teaching them how to deal with cyberbullying and bullying, the numbers came down, especially when we linked like that ‑‑ in the ‑‑ we linked the effort with 10% of the ‑‑ it is just a business model that really works.
>> JACQUELINE BEAUCHERE: Michael and then we will come right here.
>> MICHAEL KAISER: Hi. I am Michael Kaiser from the National Cybersecurity Alliance in the United States. Best practice: I use the word "best" in quotes because I think the concept of best practices in and of itself is something that should be looked at and defined, because it should usually be evidence‑based and research‑based. First off, the multi‑stakeholder process is actually an important part of this, that people come together and find places where they can agree, even if we might disagree about other things, and I think that's really important for the rest of the community. We are all experts, and we argue about things that are sometimes personal and sometimes evidence‑based. We need to work together much more closely and collaboratively, and that leads me to something we have done in the United States. When we looked at security, we felt that it starts at a very young age. I think one of the young people pointed out the example of crossing the street. That's a safety issue, and that kind of education can be taught by anyone. It doesn't require a lot of technological expertise.
We shouldn't get wrapped around the axle thinking it is too technical and people can't teach it. In the United States, when we want people to be safe and secure, we have very simple, harmonized messaging that anybody can teach to anybody else in the community to make them safer and more secure. We ran a multi‑stakeholder process with 25 companies and agencies that led to the creation of a very simple cultural message: stop, think, connect. Stop and take security precautions, think about the consequences of your actions online, and connect and enjoy the Internet. This is the kind of message that can be taught by anyone to anyone, from the youngest age right on through to older folks as well, and that's a very classic model in the United States.
So stop, think, connect, a cultural message for everyone, is pretty tried and true in the United States. We have "friends don't let friends drive drunk", we have "stop, drop and roll" for fires, and we have "stop, look and listen" at railroad tracks. These are cultural messages that help people be secure. It teaches critical thinking and helps people think before they place a picture online that somebody might be offended by later on. It starts really at that element of cultural agreement about what people should do. I am happy to talk to anybody more about it later if they are interested.
>> JACQUELINE BEAUCHERE: Thank you. Right here and then we have remote participant still we have to get in and then we will go over here.
>> GRACE KELLY: My name is Grace Kelly and I am a youth Ambassador here with INSAFE. We called on youth from across the world, with opinions from absolutely everywhere, to give their ideas on what was needed to develop a better Internet. When all the suggestions were in, we held a vote to get the final 30 points, and 14 out of the 30 were related to awareness, education, respect and freedom. As youth, we are not asking for the Internet to be blocked. We don't want to be sheltered from the world. It is something that we have to experience without our parents standing beside us.
So if parents are blocking everything that could be challenging to their children, the children are left oblivious instead of developing their skills for how to deal with things, and they are not developing what they need to deal with the Internet when they are older. We are calling for education. Over the weekend I was talking to Anne quite a lot about an Internet where we all seem to be more of a danger to each other than the technology seems to be a danger to us. I don't think there is anything wrong with the technology, and various content will always be there. And I know that illegal content is a separate issue; that doesn't change. But I think we should spend our time learning how to use and deal with all of this and how to work with each other, instead of trying to change and block everything, because what we have can be amazing for us.
>> JACQUELINE BEAUCHERE: Thank you, Grace. Now to the remote participant, please.
>> REMOTE MODERATOR: Thank you. We have a remote participant named IO Salassi. She did want to speak by audio, but I will read her comment out. My concern is that we have barely scratched the surface on the need for corporate responsibility regarding safety online. The incredible engineering power behind the corporations that have monetized the Internet is an integral part of addressing this challenge. I work at salesforce.com. I believe, and many of the parents and children we work with in North America believe, that a more powerful solution rests on the power of recommendations. We need a mandated partnership and collaboration between the businesses that generate and host the content that attracts young people. In short, if they can figure out which ad to serve to a user based on the highest bidder, they can also ensure they are recommending healthy resources and information when the audience is young. I would like someone on the panel to carefully address the need for companies to be more responsible with their recommendations for young people.
>> JACQUELINE BEAUCHERE: I will say something but we will have Susie go first.
>> SUSIE HARGREAVES: I think this is a really good question, and I think it is something that we have really got right in the UK. We have a group called the UK Council for Child Internet Safety, which brings together not just three Government Ministers but the police, ourselves, key NGOs and key industry representatives as well, and we all work together on child online protection. We've also seen a huge amount of activity from the industry in the UK. Not only are the four ISPs doing filtering, but they have also put 25 million pounds, which is unprecedented, in to a campaign called Internet Matters. And you will find that every single big Internet company working in the UK is putting money in to child safety. And I can tell you, from the IWF's side, we are funded by the Internet industry: 80% of our money comes from the Internet industry and all the players. Jacqueline here is from Microsoft, and Microsoft not only fund our work but also help us by giving us access to free technology, and they are currently funding a research project. And that's the same for many of our organisations. We are doing a pilot project with Bing and Google on blocking sites, and they are continually funding this work. I have to say, and certainly I am not saying it is the same across the world, it is not fair to say that industry doesn't do its bit. It is doing an awful lot. I am not saying it couldn't do more, but they have stepped up and really taken action. Thank you.
>> JACQUELINE BEAUCHERE: Thank you, Susie. I will start with one of my basic points and philosophies on life: the biggest room in the world is the room for improvement. Everyone can always do more. I would like to tell you what Microsoft does in this space. I have left some cards on the table, though that certainly doesn't help the remote participant. Our focus in the online safety space is very much three‑pronged. We are a technology company, so we focus on technology: we provide parental controls and family safety settings, and they are available in our flagship consumer products.
The primary reason that parents don't turn them on is because they trust their children, and we try to encourage them to trust but verify. So you can take this monitor‑first approach with our technology, find out what kids are doing online, and then set parameters and boundaries. The second aspect of our work comes in the form of partnerships. We have some other pilot and technology projects under way, and we serve as the U.S. hub for Safer Internet Day. We work with a number of other organisations around the table and around the globe. We feel that no one entity or organisation can tackle these issues on its own, so it is best that we band together, pool our resources, and come together and collaborate.
And then finally, we spend an awful lot of money, time and energy on creating educational resources and public awareness raising materials. I am the current board Chair of the National Cybersecurity Alliance, which Michael Kaiser is representing today, and we have been part of stop, think, connect from the very beginning, but we also create our own materials. We have a Website, microsoft.com/safety, with a wealth of materials; you will be inundated if you happen to look at it. But as I said, the biggest room in the world is the room for improvement. We can always all do more: more resources and more collaboration, working together. Right here, please. I remember you from previous IGFs but I don't remember your name.
>> AUDIENCE: It is Bianca. As children grow from needing protection towards being teenagers, you probably need to take more of a guidance role. We are glad to be working on a child‑friendly Internet guideline. We are applying for dot kids, the domain, and we have not got it yet, but I think it is very important to set up a guideline with everyone here, and we sincerely call for your comments. We are working with a few parties already. It kind of has those elemental cultural agreements, and it could be enforced on the Internet. I think first with dot kids, or perhaps as a watchdog, we would be in a position to work on the dot kids specific domain name, but eventually extend to the entire Internet to apply age categorization to different websites. We also have a baseline policy that addresses specifically illegal content that we would potentially take down. The other thing I think is important is best practice. It goes back to the recommendations that a lot of you suggested, and that the remote participant from Salesforce suggested: the recommendations that a lot of these websites can make to you. I think those would be very useful resources. So again, we would love to hear your comments on the children and family Internet guideline.
>> JACQUELINE BEAUCHERE: Any response for Bianca or ‑‑
>> AUDIENCE: I have a question. First of all, my name is Nora. I am from Turkey and I would like to thank all the participants and panelists, because I have learned a lot on this issue. Yesterday and today I learned many things about the IT companies, like Microsoft and Google, and I appreciate the efforts of these companies in developing mechanisms for protecting our children online.
However, I would like to learn whether these filtering mechanisms account for ‑‑ in terms of the social and cultural aspects of the countries, since we all know that countries may differ in their social and cultural characteristics. So I would like to learn whether there are these kinds of differences in the filtering mechanisms.
>> JACQUELINE BEAUCHERE: Does someone want to respond, or I can respond generally from our perspective? At Microsoft we offer a number of technical tools, features and functionality in our products and services. We offer the ability to monitor first, as I said in my previous comment: the ability to see how much time the child is spending online, what kind of websites they are visiting, their search terms and so forth. When the parent gets an idea, from the weekly activity report, as to what the child is doing, then the parent can set the parameters and guidelines.
In terms of cultural or social boundaries, those are really to be set by the parent. What we try to do with our services is provide the information so that the parent can make the decision as to what should follow. And we hope, first and foremost, that what will follow is a conversation. Hopefully the conversation starts, as was suggested by the UK youth here, before the child ever gets online, whatever the age, and then that dialogue is created, that dialogue is started, and that dialogue continues throughout. Whatever cultural or social or other guidelines you may have in your own family, we want those to be acknowledged first and foremost. And we want the parents to receive information so that they can make the appropriate decisions based on the child's age, the child's maturity level and the family's own unique set of values. As we discussed here today, what's going to work for one family is not going to work for another family, and what's going to work for one child in a family might not work for his or her own twin. These are very unique situations and we need to treat them uniquely. We had a comment here and over here, and then Susie as well.
>> AUDIENCE: Thanks a lot. Nicholas from the Internet Society. I'm not an expert in child online protection, and this was incredibly insightful. When I came here I was wondering: is this a security issue? Is it an education issue? Are the solutions technological? Are the solutions empowerment, positive peer pressure and education? It seems to me it is a mixture of all of that, and that's your point as well: there is no one solution or one approach. And I think that should be reflected in the outcome of this Best Practice Forum. It is not only security, not only education, not only technology. It is a mix of all of that.
And my question, as we get closer to the end of this session: what would be your recommendation for this Best Practice Forum? Do you consider the work done? Do you feel that there is margin for this group to continue beyond this IGF and to continue bringing good practices across?
>> JACQUELINE BEAUCHERE: And I would open that question to the group. We only have a few minutes left and we have two more comments here. And I would like to know if there are any comments from some of the others on the periphery, not at the table. So we want to make sure we are acknowledging some of the other areas as well. But from my own perspective I think this has been a wonderful dialogue for the past 90 minutes. We had a number of submissions made early on in the process. This is a start. This is by no means complete. We uncovered issues that need further exploration and we have some topics that are open to debate. And we know there is no one size fits all. It is all based on cultural and social norms and different family and individual values. There are many things to consider. That's my feeling, but I certainly open it up to the group as well.
The gentleman first and then the lady and then we have another comment in the back here.
>> AUDIENCE: Thank you very much. Bone from Indonesia. Actually we did several activities with families and a programme on ‑‑ problems on the Internet. In terms of overcoming this, for the child and not only the child but also sometimes ‑‑ in our case we have several activities that can be considered as best practice, because we are also doing the regulatory approach, the technical approach, as you already mentioned, and the social and education approach. But one thing that we are also doing is making the ICT Forum here, which is coming from the (inaudible) itself and some from an educational background, an academic background. So what they are doing is preparing all the material, the approach framework, and also collaborating with the Government institutions. So it represents the multi‑stakeholders: the Government sector and the societies. So the ICT Forum was formed this way, and now it has been spreading over the provinces. And we are also working together with the international organisations. So I'm very proud of this, and we also want to collaborate. We are also preparing what we call ICT healthy and secure Internet Ambassadors from the kids. So they are promoting how to use the Internet safely by themselves, but we from the Non‑Governmental Organisation prepare the content material. This is our emphasis and hopefully we can work together. Thank you very much.
>> JACQUELINE BEAUCHERE: Thank you. Next please.
>> AUDIENCE: Thank you for that. My name is Amy. We support the establishment of Internet hotlines around the world, specifically outside of the EU, North America and the more developed markets. One of the things that we have seen, and I would echo a lot of what I have heard on the panel today, is the importance of contextualizing child online protection concerns. We deal with criminal content, but we have to contextualize it always within the wider child protection context and also, importantly, within what youth feel in a particular country, to really understand how young people feel about the negative but also the positive aspects of Internet safety around the world. So I would really echo and reinforce this point: we talk a lot about successes in more developed markets, but we have to be careful not to transpose that. Also, in the work I am doing, I will be brief, I would agree that industry has done a huge amount. But I don't think it is being transposed equally on to all the new markets. And I think there is a huge amount of work and room for improvement in that sense, to see policies that are being developed in particular markets being effectively transposed on to other markets around the world, because there is huge diversity. And I would echo quickly, as we are here talking about multi‑stakeholderism, one of the things I have seen in several countries with huge success is developing multi‑stakeholder approaches to this, and it can start with multi‑stakeholder Round Tables that address the serious issues within the particular context. So those are some points that I think have been made but are really, really key.
>> JACQUELINE BEAUCHERE: So we just have a few minutes remaining. And we do want to set aside some time ‑‑ yes, I think we have got Nordic. Someone will have to come to the microphone, please. That's not working. Maybe you want to come up to the front here and use one of these microphones.
>> RACHEL O'CONNELL: Okay. Can you hear me now? Hi. My name is Rachel O'Connell from a company called GroovyFuture. I would like to say that the conversation today has been really interesting, and it is good to hear all of the different perspectives. One of the points I would like to raise: it is really useful to have tools in place that parents may or may not choose to use with respect to understanding what their child is doing online. But one of the issues that doesn't seem to have been addressed very well is the fact that teachers are often the people who are at the front line and have the opportunity to actually educate young people, and what we are talking about is well‑being: how do we ensure the well‑being of young people both online and offline? In Europe and also in the U.S. we have been very focused on a harm reduction approach and very freaked out about the negatives, rather than putting measures in place with the understanding that we can proactively address those issues and educate young people: how do you define positive behavior, how do you manage maladaptive behavior, how do you handle a situation where someone has been bullied and then shamed by peers? That's the sort of information that people need, and teachers are also a conduit that can pass that on to parents. But I thought I would raise that question and reflect a little bit on some of the points that Anne raised about social and emotional learning. Thank you.
>> JACQUELINE BEAUCHERE: Thank you. We really are just about out of time. We are actually a little bit over time, and we did want to set aside a couple of minutes to talk about the process that's been conducted to date. So if anyone has any feedback, particularly about the process, John Laprise, our consulting scholar, is going to be giving documentation to the IGF as well about potentially improving the process. Moving forward, should this be replicated in the future? Any additional comments there? At the back. If you would like to come to the front or use one of these microphones, it might be a little bit quicker.
>> AUDIENCE: Thank you very much. My name is Jonathan Simbad from Uganda. I have a comment. When we are talking about bringing on board parents and teachers, providing a safe Internet or even one made for children, let us also consider some markets. For example, in Africa many children access the Internet from cafes, and something that may be refused to a child at school or at home, that child goes to an Internet cafe and is able to access. And remember most of the cafes are profit making, so they cannot refuse a child who is going to pay money to access whatever he or she wants. So we should also think about bringing Internet cafe operators on board. Thank you so much.
>> JACQUELINE BEAUCHERE: Thank you Jonathan. And John.
>> AUDIENCE: Yeah, I think the process has been just about the most appalling one that I have ever encountered in my many years in this space. Not only the timing: was it July when the first e‑mail appeared telling us that this process was beginning? I appreciate not all parts of the world go on holiday in July or August, but big parts of it do. So that's one point.
Second point, the length of time. July to today is what, seven weeks, eight weeks? And I apologize for being so old, for having been around in this space for getting on for 20 years, but the notion that you can compress 20 years of learning into a seven‑week period over July and August is close to insulting. And secondly ‑‑ sorry, thirdly and finally, it is very much duplicating things that other people have been doing for a long time. The ITU and UNICEF, with Microsoft's involvement and Google's and Facebook's involvement and many other companies, have been working on more or less exactly the same thing for about two years. There is a book that's coming out tomorrow, but yet in this process there was no indication that the people who decided to do this were even aware of that fact. And we were simply asked to write in and pass on some ideas. So I don't know what the future of the process is, but I certainly think the past is nothing to be proud of.
>> JACQUELINE BEAUCHERE: Final thoughts, yes.
>> AUDIENCE: Thank you very much. Thank you very much for the comment. On the process, the ITU would be happy to bring the results of these discussions forward to the other Forums. These were very enlightening discussions, and I am very happy that cyberbullying was underlined during this session. Work is going on at the regional level, in Europe, the U.S., Canada, Australia, but we need much more focus at the global level, and in particular looking at the collection of more data and research. For those who would be eager to be engaged in this exercise: it is really a challenge at the global level in particular, less so at the national level, but at the global level it is something we need to address on an urgent basis, and we invite all of you.
The other point, which is my last message, is about industry. For us, child online protection is the responsibility of all of us sitting here in the room. Therefore statements that some stakeholders should be more responsible than others are maybe not the way forward. Everyone should feel in charge. And the good thing is that everyone is becoming responsive to the calls, and we are very thankful for the very strong engagement of different partners: GSA, Microsoft, IWF, as well as other partners like UNESCO, who teamed up with us to organize the open Forum tomorrow at 9 o'clock. Therefore please be reassured that we will be bringing this forward, and those stakeholders who are willing to engage more on real work, with real outcomes and hopefully impact on decision makers at different levels, are welcome to join the global initiative of the ITU and other partners, which is a multi‑stakeholder partnership. Thank you.
>> JACQUELINE BEAUCHERE: So thank you very much. Nicholas, one more comment?
>> AUDIENCE: Very short. Well, I think the comment on the time needed for the work is well taken. This was a new initiative, and next time we definitely need some more lead time. On the duplication, I don't think that's the goal at all. The goal of this Forum is to raise awareness and pull different people together to work on different initiatives. If that brings more people to the ITU initiative, that's great. If that brings more people to other initiatives, that's great. The goal is not to negotiate something that we will have to agree on; it is to pool together what's happening in different regions. So if we achieve that and continue, there is value in that, it is useful, and it should not duplicate.
>> JACQUELINE BEAUCHERE: I think that was a great way to sum it up. That's exactly what's expected. So if there are no further thoughts we are actually a little bit over time and I thank everyone for participating this morning and for your comments and your ongoing work going forward. Thank you very much.
>> JOHN LAPRISE: Just one addendum. I am John Laprise. I am the consulting scholar responsible for the document and supporting the lead efforts. I can take questions afterwards, but I do want to invite everyone to please participate. There is a document online that you can contribute to. It will be available for comment through at least the 6th, and it is my understanding that they are looking to extend it to the 15th.
Now I also want to emphasize that the document that will be produced out of this process is not a final document; it is more of a waypoint. I encourage everyone who did not speak today: please contribute. Your voices are important. If you are hearing us online or reading this transcript at a later date, we want you to contribute. More people at the table brings richness to this effort. Thank you.
>> JACQUELINE BEAUCHERE: And anybody who didn't speak today, and everyone who did, please come back here at 2:30 for a wonderful, highly interactive, no‑panel session on digital citizenship and youth.