Protecting the Most Vulnerable Users in Society: The Roles of Different Actors in Helping the New User Survive in an On-Line World Nominet

27 September 2011 - A Workshop on Security in Nairobi, Kenya

Full Session Transcript

September 27, 2011 - 11:00AM


The following is the output of the real-time captioning taken during the Sixth Meeting of the IGF, in Nairobi, Kenya. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.



   >> LIESYL FRANZ:  Good morning, everybody.  Welcome to Kenya and the IGF.  My name is Liesyl Franz.  And I'm with TechAmerica, which is a trade association that represents the technology companies in the U.S. and globally.  I'm delighted to be here today to moderate this panel on addressing the needs of new and vulnerable users on the Internet. 

       I'm going to take a cue from the guide that I had on my safari this week.  And if you haven't had a chance to go on one, please try to do so.  It's really wonderful. 

       I'll try to stay out of the way.  I'll try to guide the truck and take you to places you want to go.  But I'll mostly try to keep out of the way and keep quiet.  But I would like to introduce our panelists just briefly today and allow them to say a bit more about themselves and their organisation. 

       We have Dr. Vicki Nash to my immediate right with the Oxford Internet Institute in the UK. 

       To her right we have Marjolijn Bonthuis with the Public-Private Partnership for the Development of the Dutch Internet Society in the Netherlands. 

       And we have the Honourable Alun Michael, member of the UK Parliament with us today. 

       We may have a couple others join.  But I understand that they have pressing duties.  So Alice Munyua from Kenya might join us if she's not trying to run this whole show on her own shoulders. 

       So with that, Vicki, if I can turn it over to you. 

   >> DR. VICKI NASH:  Thank you very much indeed for that introduction.  So I've been given the rather difficult task of setting the stage, if you like, for this debate.  And I'm going to present you just a little bit of research from our department in Oxford.  And I should say up front that this isn't my survey research so please don't ask me anything heavily statistical or I will blush and look very awkward.  Next slide, please. 

       The first thing I really wanted to point out is that the new users we're actively trying to get online are likely to be from, if you like, the hardest-to-reach, most vulnerable groups.  This slide shows -- you don't need to necessarily understand the figures -- that, for example, of those in the lowest income groups in the UK in 2011, only 43% use the Internet, compared to 99% of those in households with an income of over 40,000 pounds. 

       So income I'm afraid, you know, continues to be a big predictor of whether or not you're going to use the Internet.  Next slide, please. 

       Similarly, education.  Again, those who are largely not online, whom we are trying to get online and make into new users, are likely to have lower levels of education.  Again, looking at 2011 figures, you can see that 95% of people with a higher education are online, compared to 54% of those with basic and secondary education. 

       Next slide, please.  Thank you. 

       The same goes for other factors.  I'm not going to go through all of them.  Another one we often look at is disability, for example.  The reason I really wanted to mention this, I suppose, is because when we talk about vulnerable users, there's a tendency to regard them, or new users, as a single aggregate group who all have the same characteristics.  One of the things we need to know an awful lot more about is the specific needs of new users, so that we can make sure we direct policies towards those specific needs. 

       In this context, for example, we might find that new users with lower levels of education might need different sorts of media literacy support than those who are perhaps highly educated but have some sort of disability and might need physical forms of access directed towards them. 

       So that's the first point I really wanted to make:  you need to know who your new users are.  The second point -- and don't worry about the fact that the text is very small -- is that if not all users are equal, then not all risks are equal, either.  These figures are all taken, by the way, from the survey we do every two years.  It's a representative survey of users and non-users of the Internet in the UK. 

       As far as I know, the patterns seem to hold elsewhere.  Certainly from my other partners in these projects across Europe, there is quite a list of reasons to think that the patterns are similar worldwide. 

       This is about risks, this particular slide here.  Again, I wanted to point out to you there are different sorts of risks that different users will face. 

       Last year -- not last year.  2009 -- the 2011 data is not here yet.  In 2009 the most common risk people faced was basically getting a virus on their computer.  Other risks may be much less common, like, for example, having your credit card details stolen.  That's the second figure from the bottom.  Maybe only 3% had their credit card details stolen online, but the potential harm from that is really great. 

       Less common risks can be more serious, so we want to know which ones we're talking about and which ones we want to protect people from.  The other thing to bear in mind when thinking about risks for new users is that, just as there are different groups of new users, different sorts of risks will be greater or smaller for different types of people. 

       This is a -- excuse me. 

       A table here which tries to separate out the different sorts of risks people might come across.  Some of these, for example phishing -- you know, people seeking your bank details -- are, as I'll show you in an example, much more common if you're employed:  if you've got bank accounts and are undertaking eCommerce, you're much more likely to be targeted.  Others, in the right-hand column, are risks users generate themselves, for example from the type of information they seek.  Similarly, there are activities people might undertake, like sexting or creating pornographic images.  So again, we need to be aware that there are some risks that seek people out and some that people generate themselves, and again, different risks for different types of users.  Next slide, please. 

       This is really just a little breakdown of what I just said.  Looking at the types of risks people encounter by their life stage, you'll see that, as I said, you're much more likely to be asked for your bank details if you're employed or retired.  However, if you're retired, you're less likely to have received abusive e-mails.  So we do know the patterns of activity for different groups.  I could break this down further, but I won't here; I just wanted to make a very basic point. 

       So really the two things that I'm going to leave you with I suppose are some discussion about trust and the Internet and the policy implications of what I'm saying. 

       Now, the research also shows very strongly, across all our last four surveys dating back to 2003, that the Internet is what we call an experience technology.  What we mean by that is basically that once people get online, their level of trust in the Internet seems to go up dramatically, and in particular non-users show the least trust in the Internet.  This has in fact become much more pronounced since 2003, so this is a worsening trend.  If we are thinking about getting new users online, we have to overcome the hurdle of getting people to trust the Internet enough to try it.  Once they get online, when they realise it's perhaps not as scary, or that there is no immediate danger or contact by strangers, the trust level goes up dramatically.  Despite that, since 2003 we have also noted that the number of negative experiences people seem to have suffered has gone up.  For example, stolen credit card details have gone up from 1% of users to 3% of users, along with misrepresented purchases, where what you bought online is not what you thought you bought.  But it hasn't decreased trust dramatically for new users.  So there's an interesting story about what happens when people get online:  even if they have negative experiences, their trust levels increase dramatically.  People are more educated before they get online; they see these as prices they have to pay for getting on Facebook, or shopping online, or seeing pictures of their grandchildren.  However, for non-users, those who have the lowest trust levels, there's a Catch-22 situation:  you can only raise their trust levels if you can get them online, but if they don't trust the Internet in the first place, how do you do that?  So really, one of the things I want to say today -- okay, the next slide -- is that we need to look at the reasons people stop using the Internet, and we need to focus on all of these in the round.  One thing we're focusing on in this session is risks and the experience of negative aspects of the Internet.

I would argue that we actually need to look at this in conjunction with the other factors that make people stop using the Internet in their first or second year.  There our data shows us quite clearly that it is largely about cost:  for example, kids who started using the Internet at school leave school, can't afford a PC, or don't have a job that requires them to have one.  Similarly, lack of interest:  somebody helped you get online, but then it's complicated and you can't see how it's relevant.  So really what I think we need to consider quite carefully is what will keep people online once they have gotten online.  Risk is certainly a significant factor:  concerns about privacy, for example, were raised in 2009 by 12% of people who had stopped using the Internet, and that had gone up since the previous survey.


       So I think if we're thinking about the policy challenges:  we're all at the moment very used to the idea that we need to make using the Internet more affordable, get more PCs out there, and find more places where people can use the Internet.  We're maybe also used to the idea that when we're showing people how to get online, we need to make it more interesting.  Arguably, the session today will suggest, quite importantly, that we need to focus on the risks that people are open to once they get online, and I would say we need to focus on them in the round.  When people are online we need to do several things.  We need to show them the full set of applications directed to them:  is this a young user who has already used the Internet at school, dropped off while at home, but knows what it's about?  Is this an older user who frankly can't understand what it can do for them?  They need to know the tools that can protect them from risks.  There are debates about whether PCs should come with spam filters installed or not, but again, people helping users get online need to flag up the systems for protecting themselves.  Then we need specific training on the particular risks that will affect particular individuals.  So again, you know, if I was working with, say, people of my age to get them online, I might want to focus on things like phishing, for example, and how to identify phishing requests and how to respond to them. 

       The last point I'll make is simply that follow-up care is essential here.  I don't know what you've experienced in your countries, but certainly in the UK there's a tendency to think you get people online and that's it; the care stops.  So really, what can we do in terms of providing follow-up care, to ensure that people are using the Internet safely, that their queries are addressed when they arise, and finally that we're helping them find the full range of benefits that are appropriate for them?  So that's really the jumping-off point that I'll set up, hopefully, for others to speak, and hopefully they will say more about specifics.

   >> LIESYL FRANZ:  Thank you, Vicki.  I think data like this is very difficult to find, so thank you for the work your survey has done.  Perhaps we can turn to Marjolijn, and she can talk about one way that they are looking to address some of the trends that Vicki just outlined.

   >> MARJOLIJN BONTHUIS:  Thank you.  I would first like to say that in the Netherlands we have a rather unique project, because we work with the business community, public organisations and the Government together to build trust, but also to get new people online.  And of course we started, as well, with some research.  I won't go into that here; I was asked just to point out one of our projects, because Vicki talked about some groups.  But you also have a very small group:  people with learning difficulties.  They can be young people but also elderly people, or people with Asperger's.  Sometimes they trust the Internet completely.  They click on everything.  But it's very hard to train them to use the Internet safely, because a lot of programmes are started up for young people at primary schools.  For these people you have to use another tone of voice, other characters.

But because it's a small target group, it's not easy to get money from companies that feel responsible, or for other training programmes.  So we support, from this programme, a special training programme for just these people.  So this is just something I want to give back to the audience.  Do you recognize that in your country, as well?  Are there small groups that get little attention, because the focus is most of the time on young people, who are seen as very vulnerable, or on elderly people, whom you want to encourage to use the Internet and to skill up?  So I wanted to ask you:  do you see the same thing in your country?  And what programmes have started over there to reach them? 

   >> LIESYL FRANZ:  Would anybody like to volunteer their experience now, or perhaps later in the discussion?  Kieren, is there any activity from our remote participants?  Okay. 

       Okay.  Keep going, Marjolijn, perhaps it will spur somebody.

   >> MARJOLIJN BONTHUIS:  I wanted to say as well that I brought two students from the Netherlands today, because they are in their 20s and they are the first real digital generation.  So they are the experts.  So I would like to invite you to ask them your questions, and to ask the students to join the discussion and to speak up.  Because, well, at the IGF we have said for years now that it's very important that youth speak up.  So this is why they are here.  So use them. 

   >> LIESYL FRANZ:  Thank you.  And now if we may turn to Alun Michael, if you want to share the viewpoint of policymakers on this kind of phenomenon of new and potentially vulnerable users.

   >> ALUN MICHAEL: If I approach it as a legislator, you have to ask first:  who are vulnerable people?  And I think, just picking up on the comment here, there are sometimes people who are highly intelligent but who are extremely vulnerable, and those are some of the most difficult people to help in terms of protections on the Internet.  For instance, a young man who is attracted by the things he sees on the Internet and ends up sending a lot of money to a woman in Russia because he believes he's creating a relationship.  Most people would look at that and sort of say:  no, this isn't real, click, delete. 

       But -- and many people, of course, in that situation may not have money.  If you've got somebody with money and that vulnerability, after the event, thousands of pounds have disappeared.  The reaction of many people would be:  well, there should be something to stop that happening; that person should be protected.  But how do you do that, in an environment where it's also very dangerous, if you look at it in a different context, to put those things in place?  I think everybody approaches this from their own experience.  In the session that we have just finished, for instance, we were hearing from people with experience of the use of social networks in protest in Egypt, saying there should be no controls, and you can understand why they are saying that. 

       The risk of the controller doing more damage than good, or of authoritarian control, is enormous.  

       But at the same time the question is:  How do you place protections in place for those people who have a real need for protection? 

       But the other big issue of course is this tension between wanting to get people online.  And knowing that they are likely to worry. 

       There was a friend of mine who started a computer class.  And she called it Computers for the Terrified.  Which I thought was really a good title.  I never understood why it didn't catch on.  It's saying it's a good thing to worry.  It's a good thing to ask questions you don't have to pretend you have enormous confidence.  And on the other hand you refer to the digital generation, the digital natives.  And you talk about issues of being safe online.  And many young people will just say to use the technical term:  Yeah, whatever.  Nothing has gone wrong yet.  So there's a feeling of confidence. 

       You know, it's like flying without doing a training course.  If you haven't realised that there's a need to land, you may not have gone through that part of the necessary training. 

       So I think there are big issues about how we talk about these things.  How we provide opportunities, education, training.  We do in the UK have something called Get Safe Online which is meant to give access to safety for those who have the sense to go looking for it. 

       But the point of looking at vulnerable individuals, I think, is asking the question:  does the wider society, whether you're talking about a domestic society or the Internet society, have responsibilities to those who are themselves vulnerable, whether or not they would accept that? 

       Now that raises a whole series of issues. 

       There's also a need to engage this -- particularly I think looking at the statistics about who is online and who isn't.  Public services are increasingly being encouraged to provide information and services online. 

       That's great for those who are online.  I can go online and renew my tax disc in five minutes, or do my registration to be able to vote.  Things, you know, things of some significance. 

       And they can be done instantly. 

       That, of course, is virtually no cost -- not no cost, but much lower cost -- to those providing the services.  And at a time of cuts, such as we are having at the present time, the encouragement is for public services to look at new ways of delivering those services.  That's fine for those who are online.  But does the costing of the savings you make by doing things online take into account providing an alternative means of access for those who are never going to be online?  The simple answer is no, it doesn't.  That's left out of the calculation.  So you have to sort of backfill by finding ways of providing services later. 

       And I can see a really big problem rushing up against us in two, three, four years time from the acceleration of the provision of services online. 

       So something that is in itself good may have those sorts of bad outcomes. 

       That also means that there are people you want to go online not because they will be using it every day and therefore becoming familiar with it, but merely in order to access specific services that they need. 

       How do you make sure that there are adequate facilities and protections for them? 

       I just want to touch on one of the big issues that has been debated over the years in the UK, which is how you deal with child abuse.  I'm not going to talk about child pornography, because we're talking about abuse of actual children, and that is the term that we've increasingly used.  For a long time there was a discussion:  can you protect people from the possibility of finding sites by taking them down?  Or is that in itself a denial of service? 

       And the debate was well should we have legislation to deal with this?  Well the simple answer of course is that child abuse is illegal anyway. 

       Would somebody want to take action in relation to denial of service? 

       Well, maybe.  There are one or two pretty peculiar people around, but it would be obvious what they were trying to do, so it's just not going to happen.  It's a theoretical obstacle to providing an answer to a problem. 

       So the Internet Watch Foundation provides the service of taking down sites, in co-operation with service providers, as soon as notification of a child abuse site comes in. 

       Did that require legislation?  No.  Is there a likelihood of legislation?  Every now and then somebody says there ought to be legislation:  if we're only dealing with 95% of users, what about the other 5%?  Should there be legislation in place?  So far we have managed to persuade people not to do that. 

       But if you're talking to people in the street or in the pub, or wherever you talk to people, the answer to any problem they identify would be:  there should be a law against it. 

       Now, as a legislator, I would say that's the place to finish, not the place to start. 

       Firstly, if something is particularly bad there usually is a law against it if you look for it carefully. 

       Secondly, laws rarely prevent what they forbid.  And that's a Victorian statement. 

       And thirdly, if you legislate and get it wrong, you end up with things like the Dangerous Dogs Act in the UK, which legislated about dangerous dogs but didn't actually prevent the acts it was meant to prevent.  There's a severe danger of a dangerous computers act that seeks to resolve the problem but doesn't do so.  And again, the sort of debate that the IGF provides has been a better way of doing things than lurching into legislation. 

       And I think that's my final point. 

       These are difficult issues, which need a more sophisticated offer to vulnerable users, specifically designed to deal with the requirements of particular individuals, whether they be children, or highly sophisticated, or people with Asperger's, as you mentioned, or anything like that.  We need to think it through and find ways of making sure that people can actually get to the offer that they, or that particular group, need.  But it seems to me that legislation is not the answer to it.  The answer is actually a more sophisticated approach to our responsibility, as a whole community, to vulnerable individuals. 

   >> LIESYL FRANZ:  Thank you very much, Alun.  In the U.S. we call that trying to avoid taking a hammer to an ant. 

       Alice Munyua from Kenya has joined us, taking a brief reprieve from managing this whole event.  So thank you for joining us, Alice, for as long as you can.  Perhaps you would like to make a few remarks before we open it up for discussion.

   >> ALICE MUNYUA:  Well, thank you very much.  I would like to thank Nominet for inviting me to participate in this workshop, and again I welcome you all to Kenya. 

       For us in Kenya, we have been conducting research on women and cybercrime, because we feel women are a particularly vulnerable group.  Yes, Child Online Protection is also a very important issue for us, and yes, most women are mothers, as well.  But I think we feel that women have been marginalized for quite a while in terms of access to ICTs generally, and their entrance into using ICTs and the mobile Internet has already been marginalized from that perspective.  So while the use of cyberspace and some of the tools, both mobile and computers, provides increased freedom of expression and communication for women, there's also a double-edged sword aspect to it, in that it also increases abuse. 

       And the research we conducted as the Kenya ICT Action Network in partnership with the business community and the Government has actually brought up very interesting research results. 

       The first one being that the same domestic violence we see conducted in the offline world is actually now being perpetrated in the online world, particularly using mobile, because most Kenyans who access the Internet do so using mobiles. 

       And then there are the new emerging crimes to do with fraud.  It is more difficult for women to address them because they are new entrants:  they have been marginalized for quite a while and are beginning to use the Internet for the first time.  Being newbies does, of course, create new concerns for their empowerment and development, due to their obvious safety concerns. 

       From a legislative perspective, I think one of the challenges or limitations is that the laws, especially the Communications Act, while acknowledging cybercrime, look at it from a technology perspective and not from a human perspective.  So there's still no law yet that addresses abuse, that addresses violence against women and children using ICT tools. 

       So the Government has been dealing with this from the abuse perspective, taking into consideration our experiences in 2008 with the use of the Internet and mobile telephony for hate speech.  But we acknowledge the fact that the tools, both mobile and the Internet as an infrastructure, do not really cause crimes.  Crime really comes from our own values and ethics, our own, you know, political processes, our own cultural processes. 

       You know, the gender relations, the power relations between men and women. 

       So the research study itself recommends that women and cybercrime have to be looked at in a broader context.  It goes beyond just coming up with technological solutions, to addressing policy, social skills and development, and also creating increased access for women at a very young age, so they begin to use it and understand how to use it, and also understanding how to ensure --

       (Audio lost).

   >> ALUN MICHAEL:  Promoting hatred by talking to you or if I use a megaphone or microphone to address people at the other side of the room or if I use a mobile phone.  Surely the offense is neutral as to the technology.  So why do you need separate legislation for that? 

   >> ALICE MUNYUA:  In fact, that's the challenge.  We don't really feel there's a need for separate legislation; rather, we need to mainstream the use of ICT tools to perpetuate hate speech into existing legislation.  And I think, again coming from my experience in 2008, where mobile phones were used, it's the impact, the number of people that it reaches -- I think it's the reach that's more of a concern.  And also considering, when that happens, you know, what do stakeholders do?  Because while we acknowledge that the tools are used for the positive, we also have to acknowledge the fact that they are used for the negative.  What we do from a legal perspective, I think, is the challenge.  Because while there are so many technological solutions to it, you know, there must be a legal way of dealing with it, as well.

   >> ALUN MICHAEL:  Yeah, I think the case I'm making is for what is done to be the offense, and for it to be neutral as to technology.  The way one person put it to me when we were debating these issues was:  if somebody uses a footpath to get to my house to burgle it, we don't call it a footpath crime; it's a burglary, it's a theft.  So why should we do that with the technology?  If somebody is doing something to damage or threaten another individual, then whether they are face to face, or whether their threat is conveyed by a computer or mobile phone, doesn't affect the nature of the crime, does it?

   >> ALICE MUNYUA:  No, it doesn't affect the nature of the crime.  I think we're in complete agreement, from my perspective and from the research perspective.  I think the challenge is how to deal with that.  I'll give you an example that I think was used yesterday by my peers. 

       When we are dealing with, for example, Intellectual Property, under our laws we are still not able to deal with this.  When you go to court and say Alun Michael has taken my piece of work and used it -- plagiarism, for example, and we've had a case like that recently -- then you are asked:  but do you still have your document with you?  Because we are still dealing with the break-in model.  Do you still have the document with you?  Yes, I do.  So how do we address that, when, as far as the court is concerned, you still physically have your work with you?  And that's a very simplistic way of looking at it. 

   >> LIESYL FRANZ:  I think one thing we have experienced in the U.S. is that the laws that existed didn't account for the reach that Alice was talking about, the nature of the perpetration of the crime. 

       For example, cybercrime was perhaps not considered prosecutable because the previously existing law assumed the crime couldn't cross state borders.  Right? 

       So in the world of cybercrime, of course, crimes are going to cross state borders.  So you had to allow for what the ICT networks allowed for.  So we wanted to update the law in order to make the crime an actual crime. 

       So while you don't need a different law, a different type of legislation, for a crime perpetrated offline versus online, you might need to update the law to allow for that crossing of borders, or the sheer number of people that could be affected, or the dollar amount of the crime that would take place. 

So while I totally agree that you don't need a separate or new law for online versus offline crimes, you might need to update the ones you have.

   >> ALUN MICHAEL:  This is a problem with the American Constitution and the law.

   >> LIESYL FRANZ:  I'll take it. 

   >> ALUN MICHAEL: That's where you ended up with the problem with online gambling sites being a problem for America but not for anybody else.  Sorry. 

   >> MARJOLIJN BONTHUIS:  Just to inspire the audience, to take it to a children's level:  Cyber Hotel.  I don't know if anybody knows it.  But if you go to the police station and say, "My furniture has been stolen in Cyber Hotel," they will look at you like:  what are you saying?  Your furniture?  It's virtual.  So they don't take you seriously.  But actually you put a lot of time, effort and money into your furniture in your virtual world.  So even in the Netherlands the law is not up to date, and the police and law enforcement aren't either.

   >> ALICE MUNYUA:  I think I'll take you back again to women and cybercrime, because that's where this is coming from.  One of the examples we have as a case study is from when we started dealing with women and cybercrime:  many women would go and report abuse via SMS.  They would go to the police station.  And in fact I'm even one of the victims, in terms of a mailing list discussion where you end up with a tirade of abuse.  But the police really don't get it, and the first thing they will ask you is:  Are you hurt?  Are you physically hurt?  Did the person assault you?  Did you get a doctor's report? 

       And so I think that's the point we are trying to make here:  yes, you feel violated, abused.  Of course you're abused and violated.  So what we are saying is that our domestic violence laws need to actually reflect the fact that the same crimes are committed online.  How do we deal with that?  And also, how do we build the capacity of the law enforcement officers to be able to deal with this new level of abuse?  Oh, sorry. 

   >> ALUN MICHAEL: I was just going to say, but that is a problem then with educating the police and the criminal justice system, and not with the legislation.  It illustrates to me the instinct to say, if there's a problem, legislate, and that will solve the problem.  The example you just gave, and you're quite right, was of police officers not understanding the nature of mental abuse rather than physical abuse, for example.

   >> ALICE MUNYUA:  I think it goes beyond the police understanding.  There's no legal basis.  And in fact the police will tell you that, and even a lawyer will tell you that it would be very, very difficult to have any recourse, at least from a Kenyan perspective.  Even currently, if that happened to me, I would still have difficulties taking a case through to get recourse for that, or having that particular case dealt with from that perspective. 

       So it's still -- and that's why we are saying that while there are all of these other solutions, there must also be, you know, a legal way of dealing with it, in terms of protecting women as a vulnerable group, as well. 

   >> LIESYL FRANZ:  Okay.  I think this is a very robust conversation up here.  But I would like to turn it into a robust conversation with all of you. 

       Permit me to start by taking our own little informal, non-scientific poll, if you don't mind. 

       How many people in the audience have been online for over ten years?  Very many.  Almost every -- just for those on remote participation, it's almost everybody. 

       How many have been online for less than one year?  Nobody.  That doesn't matter. 

       How many have been online for less than five years?  Well, there you are. 

       So we are not talking necessarily to the new users in the room.  But many of you certainly have either constituencies or current concerns about them.  So we would like to hear about that from you. 

       And I saw one hand in the middle of the back there. 

       If you could identify yourself and your organisation, that would be helpful, thank you.

   >> Sure.  Martin, from the local ISOC chapter in Poland. 

       I think the problem you're addressing -- well, one thing is about police and enforcement and how they deal with technology, and I think that's clear.  But I see a different story which is related to legislation.  It's about the definition of verbal abuse versus physical assault. 

       And I think it's the same everywhere.  I remember discussions about the definition of rape, which was also, you know, pretty much straight down to the physical facts, and there was discussion about it being broadened.  So I think that's the point here.  And the Internet, as an information-transforming forum, is actually exposing all of those verbal and non-physical acts.  Good question where it leads later. 

   >> LIESYL FRANZ:  Does anybody want to address that, either on the panel or in the room?  Okay.  I think -- is there a question?  Please.  I'm sorry; can you go to a mic perhaps?  Because you're not very far but the acoustics are difficult.  Maybe -- or right here in front.  Perfect. 

   >> Thank you very much.  And thank you to the panelists for introducing the topics. 

       I just wanted to share -- my name is Bernadette Lewis, I'm from the Caribbean Telecommunications Union -- some of the experiences we have had in terms of raising awareness and educating on the power and the potential of the Internet, and also on the dangers that exist; some of these things were mentioned by the opening speaker. 

       We have embarked on a systematic programme of education and public awareness across our 20 Member Countries in the Caribbean, because as practitioners we tend to speak in our own terms and we assume that everyone understands.  And we have, you know, come to the realisation that there are huge swaths of our societies that really do not have the information.  And this information needs to be made available to all of our citizens. 

       And it is a collective responsibility.  It isn't just the public services.  It's the operators.  It's the regulators.  It is a collaborative, concerted, strategic undertaking in terms of bringing the necessary information to the vulnerable, to the children.  We have programmes for parents, because within the Caribbean we have introduced One Laptop Per Child, so children are going home with their laptops, they have unfettered access to the Internet, and their parents -- or many parents -- are clueless as to what's happening. 

       So we have targeted programmes for parents, we deal with dangers in cyberspace for children, we look at the elderly, and we actually go out into the communities. 

       A couple of weeks ago we were in Saint Vincent and the Grenadines.  We met with the fishermen.  We met with the farmers.  We explained the technology in the context of what they do. 

       We warned them and gave them guidelines on how to remain safe in cyberspace. 

       And we recognize more and more that this is an exercise that has to be undertaken. 

       (Audio lost).

   >> It's because there are a lot of people out there who have a lot to gain from people being online, whether it's governments with public services or telecoms companies who are going to get fees, and so on.  And it seems to me that it's all very well paying for initiatives which are going to get people online, but if they are not accompanied by programmes to support those people once they have got online, then there's a degree of social irresponsibility there which I think is quite shocking.  But I just wanted to follow that up:  if we are going to ask more of Government, of telecoms, of companies like Facebook and Twitter and so on, then we also have to ask how to frame those messages in a way that doesn't scare people off, rather than using a sledgehammer to crack a nut.  In the UK, big campaigns saying beware of cybercrime might scare people off more than help.  So it's a fine line for us to tread, whether we're thinking of Government campaigns, corporate campaigns, or just the buddy network that so many of us use when we're trying to help people. 

   >> LIESYL FRANZ:  Thank you.  In the back.  Well, we can.  But I'm not sure -- since we're transcribing, it's much better.  Sorry.  And actually it's quite difficult across the room. 

   >> All right.  Okay.  Andy Phippen from Plymouth University in the UK.  A question for Mr. Michael, really.  You talked about education of law enforcement agencies, but what about education of legislators as well, given some of the discussion around the Digital Economy Act and also the somewhat knee-jerk reaction to the riots as well?

   >> ALUN MICHAEL: Absolutely, and that's why myself and Eric Joyce are here, and that's why we have ministers here from the UK, as well.  The point I was making is as much addressed to the legislators as anybody else, and the all-party group which Eric and I are leading is about trying to get an informed debate in the House of Commons on these issues, rather than just saying there's a problem, there ought to be a law against it, which would take us to the Dangerous Computers Act I referred to.  And actually, one minor point in relation to the question of education, of the way the police respond to non-physical violence. 

       Actually, that may not require legislation.  It may require, depending on the way the country's laws are run, things like changing the Judges' Rules so it's clear that examples of verbal violence and other forms of intimidation can be admitted as evidence to the court.  Once they can be admitted as evidence to the court, then the police are in a position to take them seriously, rather than saying:  Well, the Court wouldn't take it seriously if I were to start taking notes and produce evidence on that basis. 

       So it's that which I'm getting at, which I think is absolutely where Vicki was taking us:  the need for the whole of society to judge what the problems are, how we want to deal with them, whether we can avoid producing legislation which will have unintended consequences, and instead develop a much more sophisticated approach of the sort that you exemplified, where the way in which you deal with vulnerabilities requires the service providers, legislators, Government agencies, businesses and Civil Society to be looking at this. 

       Again we come back to that's what the IGF is about.  And that's why the debate is right in this sort of environment with the people who are here rather than just ministers at a UN assembly discussing how to deal with these issues. 

   >> LIESYL FRANZ:  Thank you. 

   >> Could I just pass on one other thing to Vicki.  BT did a big study in 2006, a large qualitative study I helped with, on risk and trust, and there was a very clear link between behavior and the fact that trust goes out the window when you can get money off people.  So there was a big study that correlates with your survey, as well.

   >> LIESYL FRANZ:  If you have a link to that study as well, perhaps we can capture it for the transcript and give it to people.

   >> The final report is on that site, and it was a large study, about 30 Focus Groups of UK citizens.

   >> LIESYL FRANZ:  Perhaps if I could ask a question of Bernadette or Marjolijn.  I imagine, in pulling together the collective that you mentioned, for those that need to address the shared responsibility, can you give an example of a time where there was an obstacle that you ran into that you had to overcome in order to pull that collective together?  It might be useful for others. 

   >> Fortunately for us, the organisation that I represent, although it is an intergovernmental organisation, we have Private Sector members.  But one of the things we have found exceedingly useful is actually spending time with the Government ministers and explaining the issues in terms, in language, that they understand. 

       So reductions of expenditures, that's the language that they understand. 

       And once we have our ministers on board, they work tremendously in removing obstacles and in bringing the stakeholders together.  They will mandate that the operators work with us in a particular area.  They have done it. 

       But the key is getting the political elements on your side.  Understanding the implications and the significance of what we're trying to do. 

       So that has worked extremely well for us. 

   >> MARJOLIJN BONTHUIS:  I like how you say it, because I think that's where you have to start:  at the top, and make them understand the problems.  But I think the other way, as well, because this is what you mentioned before:  to start bottom up, to go to the small target groups and help them with their specific problems.  You know, listen to them.  What are their fears?  What are their problems?  And start there.  We have a very nice project in the Netherlands where we help the unemployed who come to the office to get some money, and when it appears they are digitally illiterate, they are sent to the library, where there's a great programme in which students actually help them with their first steps online, and then the fear is less.  And in between, the students help them with, you know, how to use it safely.  But also to go to the library, meet other people, have a chat with others, all with the same fears and difficulties.

       So I think both ways start at the top.  Make them understand.  And start at the bottom.  Just make it small. 

       Go to the special target groups.  I think -- and together of course we say like this collaborative action is absolutely necessary. 

   >> LIESYL FRANZ:  Alun, go ahead.

   >> ALUN MICHAEL: Just one thing about educating ministers; as a former minister I can agree with you.  I think it actually complements exactly what you have said.  If you are going to a minister and saying in simple terms, this is what's needed, it's very important that you're right.  And being right involves the building up of consensus, the understanding of the vulnerabilities and so on.  So building the consensus, so that the message to ministers is a clear one from people within the business who say:  there are these vulnerabilities, and we can help to solve this problem if it's done in this way.  And therefore being able to trust that sort of alliance, that's very important. 

       So it has to be a full circle. 

   >> LIESYL FRANZ:  I think we had a question right here.

   >> Thank you so much to the panel for all of the interesting topics you bring up.  I'm Sami Mustafa.  I'm from the Technology Centre.  We are a non-profit organisation that helps people empower their lives through ICTs.  And talking about interacting with the community, and starting from the top and the bottom as well:  we talk to both the community and the policymakers. 

       One of the most significant, or largest growing, communities now is the disabled community.  And we're talking about reporting issues of concern; that's an issue for them.  So, for example, let's not talk about policies; let's talk about the technicality of reporting an issue.  If I'm disabled and online -- sometimes disabled people don't even know about cybercrimes.  So education for disabled people is not just done in the normal form of a Web site or a social media page; sometimes it has to be in a special format for them to read.  If those formats, those online accessibility issues for People with Disabilities, aren't kept in mind, then first of all they won't get educated about these issues, and second of all, if the tools weren't built to be accessible, they won't be able to report those issues.  So here they are completely blanked out of the whole process.  And even if you're talking about education processes where you have to go to people and tell them face to face, some people have learning disabilities, and some people even read backwards -- people who have dyslexia and so forth.  All of these things should be kept in mind where policies and approaches are concerned with these communities.

       And there are many, many things that have to do with disability and accessibility, especially at the policy making stage.  For example, if you talk about accessibility and people's rights on the Internet, again going back to the topic, one of the main and growing sectors is People with Disabilities.  If you don't have a policy in place to make Web sites accessible, then you don't have a business case to actually make them accessible, so here again they are left out of the process.  So policy, again, is very important.  I think we should be setting ground rules for Government entities, for companies, for Private Sector people, to actually embed accessibility in their systems as a mandate. 

   >> ALUN MICHAEL: May I ask a question, though again there.  So there's a need to decide which things need to be universal and which things need to be specific for a target group, isn't there?  You know to take one example, visual accessibility.  Because there are many, many people who have failing eyesight or whatever. 

       And actually making that sort of mandatory expectation that all sites will have the right sort of lettering, maximizes the number of people who can have that access but it's actually not bad for everybody else.

   >> Exactly.

   >> ALUN MICHAEL: So that's something that makes sense as being universal, visual requirements and expectations. 

       Other forms of vulnerability, though, might require specific sorts of protections which would be an undue limitation for the generality of the public.  Where do you draw the line on that? 

   >> For setting the level of accessibility, the W3C has already gone ahead and defined accessibility standards, so there's A accessibility, AA and AAA.  A accessibility says your Web site can be read and used by moderately disabled people, and so forth.  If you apply AAA accessibility to a Web site, basically you'll end up with text and links; that's what you end up with.  But AA accessibility is in a lot of ways very flexible in how you can actually design a Web site, and it actually increases the usability of the Web site.  It doesn't limit the usability of the Web site.  If you want to apply AA accessibility to a Web site, you would actually put in more tags, more links, more descriptive text and more descriptive headers.  And I think those fall in the same line of increasing usability and reach for people.  I don't think it will limit the tools or limit the development in any way. 
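       [Editor's note: the point above, that AA conformance largely means richer, more descriptive markup, lends itself to automated checking.  The sketch below is purely illustrative, not a W3C tool; the class name and sample page are invented for the example.  It flags images that lack the alt text accessibility guidelines call for.]

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute,
    one of the most basic accessibility checks."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # An absent or empty alt attribute leaves screen-reader
            # users with no description of the image.
            if not attr_map.get("alt"):
                self.missing_alt.append(attr_map.get("src", "<unknown>"))

# Two images: one with descriptive alt text, one without.
page = '<img src="logo.png" alt="Company logo"><img src="banner.png">'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # the image without alt text is flagged
```

       Real conformance testing covers far more than alt attributes, but even this small check shows how descriptive markup of the kind the speaker describes can be verified mechanically.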

   >> LIESYL FRANZ:  We'll come back to you in the middle here.

   >> I would like to follow up.  ISOC Poland.  What's important here is to not push accessibility into (off microphone) -- so maintaining, keeping open standards like you mentioned, where there are standards, or keeping good HTML practices, helps everybody access information the same way.  And it's only the device or the reader or the browser which needs to be adjusted to the particular person with a disability.  When I speak to some industry representatives, for example in the marketing area, they tend to think in terms of targeting.  Like, you design a Web site for youth, like young people listening to music, and it's a Flash-based thing, which is dubious, but it serves some particular marketing goal.  And this is where we have problems. 

       So marketing goals and targeting are so deeply embedded in some people's mindsets that they don't think their content is for everyone.  And I think it's about keeping the open standards agenda, and making sure that browsers and computers are being adjusted -- because you can't foresee all possible problems when you are designing it. 

       So let's keep the standards open, and just make specific devices or software for the people who need them.  And then we will be able to access the Internet as a whole, and not just a walled garden of something for disabled people.  Thank you.

   >> LIESYL FRANZ:  Do you want to respond before we go to the next --

   >> Yes, a quick addition to the same point.  Backtracking again to the same problem of how many -- how many are the People with Disabilities?  If you don't have the numbers, then you can't make the business case, then they won't buy it, and they won't make the Web site accessible.  So gathering accurate statistics about People with Disabilities does fall in the same line as, you know, convincing entities to actually take on accessibility as a benefit for them, because they will have more audience and a bigger target and all of these things. 

       So it's very important to have very accurate statistics, especially with People with Disabilities.  And it's very hard where there's a social aspect concerned, because a lot of cultures actually treat a person with a disability as someone you should not talk about, rather than actually bringing him out and seeing how he can help. 

   >> LIESYL FRANZ:  Please.

   >> Yes.  I want to add something.  I'm Jutta from the Digital Opportunities Foundation in Germany.  And I do really appreciate what I heard from the Caribbean about the training and education for the newbies on the Internet, because we do the same in Germany.  But also I would stress that you can't allocate the whole responsibility to the users themselves.  We also need the responsibility of the providers of content on the Internet, also of the providers of Web sites, of platforms, of social networking sites.  And we've done some research with disabled people in Germany, for example --

       (Audio lost).

   >> You need the responsibility of the providers of the shops, of the providers of the platforms and so on.  Because it's not only Web accessibility.  It's thinking of the whole thing:  disabled people may have special needs that lie beyond Web accessibility alone, which is well regulated by the Web Content Accessibility Guidelines.  Thank you. 

   >> LIESYL FRANZ:  Right here. 

   >> Thank you for the very interesting discussions.  My name is Mina and I'm from the Supreme Council for Information and Communication Technology. 

       Part of our work was going to schools and doing Focus Groups with the students in order to know how digitally literate they are and how they deal with technology.  I just want to make a small comment on vulnerabilities.  We actually went to schools and discussed with the students how they use technology, and we discovered that students are very aware, to the extent that they can hack Web sites on their own.  They know coding and all of these things.  So I would like to shift the current view we have:  whenever we discuss vulnerabilities, we always say children are the ones that are vulnerable, but I think parents are the most vulnerable group, not children, because they have no idea what their children are doing.  And children are aware of the dangers, and sometimes they even intentionally go on Web sites that could be called harmful.

       So I don't think children in general should be called a vulnerable group; I think we need to change this perception a bit.

   >> DR. VICKI NASH:  I think that's a really excellent comment, actually, and you're right that we shouldn't label people as vulnerable.  But I think we should bear that word in mind, because there may be situations where people's skill outpaces their ability to make judgments about what they should be doing.  I hasten to add I don't take a paternalistic stance on this; I'm against the idea of filtering out all of the potentially harmful content at the school door or at home.  I think people have to make their own decisions.  But I prefer to think about it as I just put it:  these aren't intrinsically vulnerable users, but because their skill opens up opportunities to them, there may be risks that educators and parents are unaware of.

   >> ALUN MICHAEL: Can I give an example of this?  ChildNet, which is a charity in the UK, which actually has some young people here at the IGF, brought a lot of young people to Parliament for a dialogue with Parliamentarians about what their expectations were, which in some ways goes to the education of parliamentarians that was commented on somewhat earlier.  The discussion that we had in the workshop was very interesting, because at the end of it one of the youngsters, on behalf of them, summarized for the plenary session what they wanted.  And he said:  essentially, what we want is to be able to go anywhere on the Internet, without some technician in the school filtering out where we can get access to.  And we want to know that we're safe.  Sorry.  Excuse me? 

       So what we took out of that is that to know we're safe is partly about understanding what's being used, but also that there ought to be responsibility in terms of the wider community.  So in a way, from those young people, it's a challenge back to all of us to think through what we mean by safety online.  It sounds initially like a crude response, but the more you think about it, it's actually what we all want.  You know, it's what kids wanted when they went out into the woods as teenagers.  You want to feel free and you want to feel safe. 

   >> LIESYL FRANZ:  Thank you, go ahead.

   >> Yeah, one last comment on that.  When we went to schools, when the filtering point came up, children were aware that most of the inappropriate content, or whatever you would call it, is blocked.  But they are also aware of software that can break the filtering and give access to it.  Although they said they don't actually use the software, they are aware of the mechanisms.  So I think when children feel like they are not looked upon as equally powerful users as adults, they will be motivated to kind of negate that fact.  So just on that. 

   >> Well, I really would like to hear from Pablo.  He was on the Youth Panel when he was 14 years old, and now he's in his 20s.  What did we do as an awareness programme in the Netherlands, and what was of some use and what wasn't?  Because sometimes, when youth come to Parliament, they give very social answers, because they know what we want to hear.  So I really would like to know what he has to say.

   >> Yeah, thank you.  I'm a student from the Netherlands.  And I think that what we have seen in the last years in the Netherlands is that the main answer to solving most of the Internet problems lies with the vulnerable groups, and I don't necessarily need to name them -- they could be handicapped people, they could be children, they could be parents.  But what we see in all of these groups is that people aren't educated enough.  Even the children who have been using it for years and years might know how the systems work; they might be able to break the filters.  But research shows that in the Netherlands, where about 85% of people are online and have access to broadband Internet, only about 30% of those people can actually make use of the Internet in an optimal and safe way.  They are not able to use Google, for example, in the right way.

       So what we need to do much more is integrate education into society.  And not only in general education at schools but also for parents. 

       And it's a shared responsibility for everyone,  companies, schools, parents and children, to understand the Internet. 

       And it's become an extension of your daily life, and we should treat it as that.  It's not just an extra thing we use.  Treat it as an extension of daily life, and make judgments on it in the same way as we do on the streets.  Just think clearly and thoroughly, and when we do that, I think we can deal with most of the issues that were addressed here today.

   >> LIESYL FRANZ:  Thank you.  I think that's very helpful.  One question I had is, were you scared to get online when you first did? 

   >> Well, no, not at all although when looking back at it --

   >> LIESYL FRANZ:  You should have been.

   >> I really didn't know what I was getting into.  And my parents, they were quite aware of all of the issues around the Internet.  So they educated me very much.  In the beginning, when I was like eight years old or something like that, getting on the Internet, they said:  no, you can't do this, you can't do that, and that's the way it is, deal with it.  And when I got older, they explained to me:  well, you shouldn't do that, because...  And I got to see it myself, just as I did growing up in the streets.  When I was little, my parents told me:  no, you can't get out of the playground, and they didn't need to tell me why; that's just the way it is.  And later on they told me:  you can get out of the playground, but when you cross the street, look left and right.  And that's the way it should be on the Internet.  As a young child you can't assess the dangers yourself.

But when you grow up should get to know more of them and how to deal with them. 

   >> LIESYL FRANZ:  Thank you.  We've got about 15 minutes left, so I think I have a few comments in the room.  One over here, one here, and two more in the back.  If you can keep your comments and responses short, we'll get to them all. 

   >> Hi, thank you.  David Wright from the UK Safer Internet Centre.  And I really picked up on that point, this issue around the term vulnerability, which I think we've long had.  And I say that because I think this debate is really helpful with an informed audience around the wider user groups and what vulnerable means in that context.  I think with an uneducated or uninformed audience, vulnerable actually becomes a bit of a problem.  So vulnerable can mean those that society considers most vulnerable, perhaps those around social care or in social care.  So I think that's one of the issues to do with this term, vulnerable. 

       And a second one is that it doesn't necessarily have to be -- those vulnerable don't necessarily have to be users online.  But we, too, work with schools across the UK.  And we find teachers are particularly vulnerable when they might not actually be users or online users.  Indeed that's the subject of our workshop tomorrow afternoon. 

       So it's just thinking about vulnerabilities in a different sense if we continue to use that term vulnerable. 

   >> LIESYL FRANZ:  Quickly over here. 

   >> Yeah, sure.  I'm Jim Prendergast with the Galway Strategy Group.  My comment and observation is drawn from both my professional experience with ICT companies and the efforts they are making to empower parents, especially of young children, with tools to help protect their children online, such as parental controls and things like that, but also from being a parent of a six-year-old and an eight-year-old that are coming online right now.  I don't own a device in my home right now that doesn't have some sort of assistance to a parent to limit or restrict where my kids can go online, or the amount of time they can spend online.  So it's important to acknowledge that industry is doing things to try to give consumers more resources and more control over what's happening.  But at the same time I think there is still some responsibility as a parent, at least in that situation, to take the initiative and learn about that stuff on your own, so that you can help protect your children in the online world, just as you would in the offline world.

   >> LIESYL FRANZ:  Okay.  From vulnerability to protection, and a gentleman here, and then here, and then perhaps we can aggregate our responses on the panel. 

   >> Sure, I'm Eric Joyce.  I'm a colleague of Alun's in the UK Parliament.  Something Alun, I think, kind of touched on, but maybe to extend it slightly:  do you think there's sometimes a danger, perhaps not in the panel here, but a little bit of a risk, in terms of the way we use language about vulnerability and fear and so forth?  When you think about all of the stuff you like doing in life, you like it because there's an element of risk.  You know.  And sometimes I just wonder if we overegg the vulnerability thing.  And it's worth reflecting on the fact, if I can put this as cautiously as I can, that in the ministerial meetings yesterday there was a kind of a trend, and I put this as diplomatically as I can, a little bit of a trend for some governments to want possibly to use the handle of this kind of risk, and particularly child protection, in some respects as a way for governments themselves to exert far greater control over how people use the Internet, rather than, as it were, stakeholders like people at the IGF deciding amongst themselves.  I don't know if that's something that the panel would like to have a view on.

   >> LIESYL FRANZ:  Okay, one last comment, and then perhaps our panelists can respond to any and all of the questions we've heard.

   >> I'm Tim Davies from Practical Participation.  I'm hosting a workshop on Thursday.  But I just want to reflect on another term.  There's protection and vulnerability, the fact that some responses will want to be protective; another term we want to focus on is empowerment, and finding a range of responses.  Particularly in the work I've been doing, that is to take the Convention on the Rights of the Child, and the division of that into protection rights, provision rights and participation rights, and say that responses need to address all of those.  One of the things young people often say is:  we want positive spaces to go online.  And in response to "we want to go anywhere and be safe":  are we showing people the positive spaces that are there for them?  Are we investing in them participating in and creating positive online spaces, rather than people just going to the mainstream Facebooks, YouTubes, other spaces where they are not necessarily getting the online experience?  So we need to have provision, participation and protection as part of empowering people to benefit from the net, even when they have those vulnerabilities. 

   >> LIESYL FRANZ:  Great.  If I can ask each of you to respond to those last four comments, or make any other closing remarks you would like.  I think we'll hopefully wrap up on time.  Please, Vicki, go first. 

   >> DR. VICKI NASH:  That's fine.  I'm happy to respond.  The reason those are all such great comments is that there's actually a common theme amongst them, and one we haven't consciously addressed:  rights, and communication rights in particular.  A lot of the work we have been doing at the OII is trying to frame issues around vulnerability, risk and harm within the context of a rights framework, because we feel that too often these debates ignore the difficult questions of rights, and they ignore responsibility; in some cases the values involved are complex.  Sometimes, as you say, Eric, some governments unfortunately do use the language of protecting against harm and protecting against risk as an excuse to censor or filter out more than is needed.  So the overall comment I would make is that we can't really talk about vulnerable users, or users in need of protection, without working out what people's communication rights are and what we want to provide them with.  In particular, I would want to emphasize that the reason we educate children, for example, about the dangers is so that they can autonomously learn to use the Internet for themselves as a tool for free expression and communication.  So the idea that you simply lock things down and never show them how to get outside of that is, I think, very much a mistake.  And on the first point, about whether or not it's helpful to talk about vulnerable users as if those two words always went together:  something we ought to emphasize more is that parents are often more vulnerable than the children themselves, and educators as well.  So yes, the word vulnerable itself could be perceived as being a bit patronizing, and we need to make sure we're talking not just about users but about all the people who need help understanding and using the Internet.  I could say more, but I think we're short of time, so I'll move on.

   >> MARJOLIJN BONTHUIS:  I will be very brief.  I have just one comment.  To be honest, it's nice what everybody is saying.  But have we heard anything new this morning?  Because we are working in this field of vulnerable people.  And we know a lot.  And we know what should be done.  But how?  What are the best ways? 

       Of course we heard something.  But I think that's the main difficulty with this problem.  Where do you get the money?  We know that concerted action results in greater achievements, but how do we get that concerted action? 

       And I just want to end with listen to your target groups.  Learn from the newbies.  What are their triggers?  What are their barriers?  And start from there together. 

       But still, it's -- you know, we know what to do.  But how, and when?

   >> ALUN MICHAEL:  Yes.  Firstly, can I strongly agree with what Eric said:  we need to be very careful about the use of language.  We need to be more conscious and more careful about our use of language, especially when it's being used in a narrow or specialised sense.  It's very easy for a group to get used to using language in a particular way, as a code, and that's dangerous when you come into wider discussion. 

       So I think this sort of debate needs clarity of definition, common language, and a capacity for challenging views and conclusions.  And, very importantly, a need to develop consensus.  That means talking with the groups you're looking at as being vulnerable in one way or another.  When we talk about child protection, for instance, we have people like John Carr representing the European network that has worked on issues of children's vulnerability and child protection for many, many years, so that specialisation is there.  There are other people here who have clearly worked very much on those issues.  So making sure there's a consensus of the well informed and the representative is absolutely crucial.  And the use of language is absolutely central to that. 

       The second point is that two words that have been used are reassurance and empowerment.  If we're going to get people online, it's very important firstly that they are reassured that there are ways in which they can discover what they need in order to be safe online, and secondly that they are empowered and enabled to go as far as they want online. 

       But the final point I want to make is that we also have a responsibility for those who don't want to go online, or only want to do limited things online.  I think the most recent research showed that some 40% of the people who are not online would not go online even if you gave them a free computer and a free broadband service. 

       Now, it's very tempting to be patronizing and say, "Well, you're opting out of the 21st Century," but that's a choice that people are going to make.  Those who provide public services have a responsibility to ask:  how do we deal with that group, as well as those who are quick to embrace new technology and new ways of doing things? 

       The development of a digital divide which is not just about failing to give access or failing to enable people to go online, but is actually about those who are absolutely determined to refuse, is something that has to be taken into account in public policy, and I think it's something we all need to think about.

   >> LIESYL FRANZ:  Great.  I would just like to close, if I might take the prerogative of the microphone, by summing up.  I came up with four themes that we've heard today. 

       I think the one that was most resounding for me was the need for any effort to be collaborative, in whatever community you might be addressing. 

       One that came up in the latter half of our discussion was not to be alarmist about the risk or vulnerability that we see, but to be pragmatic about what the problems might actually be, to hear from the users themselves and potential users, and to address those problems specifically. 

       Look at the issue from both a top-down and a bottom-up perspective:  getting the buy-in and understanding of political leaders, but also understanding the needs, challenges and opportunities at a very grassroots level, among the users themselves or the potential users. 

       And lastly, the Internet user community is hugely diverse.  Therefore, you can't look at it all as one block of one kind of people. 

       So you need to find ways to address the communities that might have a commonality, and look for ways to empower them and enable them to demand from providers what they need. 

       I'm probably missing a few.  But I hope that encapsulates some of the conversation. 

       The session will be transcribed.  And I believe we've captured some of the links as well, to the research that has been provided and some of the sites that people mentioned. 

       If you would like to add any, I think we could certainly take that on and put it in our session summary.  So please bring your card or the name to me or to Kieren McCarthy, our remote moderator -- thank you very much, Kieren, for handling that.  And with that, I'll invite you all to thank our expert panelists, and applaud yourselves as well for joining us today.  Thank you very much. 



(Session ended at 12:36)