
BBC News — May 8, 2024, 9:00am-9:31am BST

9:00 am
Social media companies are being told they need to change their algorithms to protect children and stop children seeing harmful content, or else. What exactly is "or else"? That's a very good question. Ofcom say failure to do this could see apps banned for under-18s. They could do that. Another excellent question, you are on fire. And they are talking about rigorous age checks being on the agenda. I know what you are going to ask: how do they do that? Otherwise, massive fines will be imposed. That will be interesting to find out how that will happen. Is this the companies' responsibility, or is it parental responsibility? Should parents be doing more, or should companies be doing more? And what about the benefits of social media as well, teenagers communicating through social media? It isn't all bad. But there is real damage done to young people as well.
9:01 am
Ministers say, get this right, do this right, and we can prevent future tragedies as well. We have the digital economy minister Saqib Bhatti live on this programme taking your calls, and I know you'll have lots to say and lots to ask him as well. Get your calls and texts in. We have some news headlines for you with Richard Foster. Good morning. Age checks will be toughened under the new rules to protect children online set out by the media regulator, but critics say
9:02 am
they haven't gone far enough. The head of Ofcom, Melanie Dawes, admits social media companies could contest the new rules every step of the way. E-gates at passport control are working again after a nationwide fault caused huge delays at airports across the UK last night. Some passengers waited more than two hours to have their passports checked manually. One man said he spent longer in the queue than he did on his flight from Lisbon. Rail passengers are warned not to travel due to strikes by train drivers from the Aslef union. They are on their second day of industrial action as part of a long-running dispute over better pay and conditions. Tributes are being paid to the Game of Thrones actor Ian Gelder, who has died aged 74. He played Kevan Lannister in the fantasy series and also had roles in Torchwood, Doctor Who and The Bill. His husband, Ben Daniels, described him as a wonderful actor and an absolute rock to him. BBC Radio 5 Live.
9:03 am
There are terrifying figures about how many kids under 13 have seen violent imagery and pornography as well. We have so much to talk about, and I want you to talk about it, and I want you to talk to the minister as well, who joins us now: Saqib Bhatti, the digital economy minister. Good morning. Good morning, thank you for having me. It is really important you are here and I'm pleased that you are. I think the details of what you want to do and are trying to do will emerge as we discuss this over the next half an hour. If you will allow me, I'd like to go straight to our callers, and there are a lot who want to talk to you. Sarah Jane in Portsmouth, good morning. Good morning, thank you for having me on. It is a pleasure. And Lucy in Bristol as well, and Katie in Bournemouth, hi. Katie, are you there?
9:04 am
Hi, I'm here, yes. How are you? I'm fine. I want you to go first. We'll get as many of your questions and voices on. Katie, speak to the minister.
Hi, my name is Katie, I'm a single parent and have a 12-year-old son who insists on looking at TikTok and YouTube despite the fact that I've given him many warnings that he shouldn't be doing this.
How will Ofcom regulate the use by minors? Because there are so many dangerous things out there. He is 12 and shouldn't be on it, because the terms and conditions say 13. I will take another one, Katie, but stay there and you can come back on what the minister says. Sarah Jane in Portsmouth, what is your point? The same. I have three kids, a 13-year-old, a 12-year-old and a nine-year-old. They are on social media, they do have phones, but for all the bad stuff
9:05 am
out there on the internet, there is a lot of good out there, inspiring content from entrepreneurs. I'm an entrepreneur myself, and I love the restrictions and potential tightening up on the bad stuff, but if platforms like TikTok, Instagram and Facebook are banned completely, there will be a whole generation that will not get the good motivational and inspiring content. What is your viewpoint on that? Grey areas. Saqib, if I can call you by your first name, if you don't mind. Please do. Age verification, how are you going to do that? Thank you for that
question, Katie. This is an issue that so many parents speak to me about. The first thing I just want to say is I am a parent myself and I don't want parents to be on their own on this. Last year we passed the Online Safety Act. It is a world
9:06 am
leading piece of legislation, and it is designed to make sure social media companies and big tech companies, search engines, etc, are held to their terms and conditions. Frankly, the regulator is Ofcom, and what has happened today, significantly, as part of the implementation of the legislation, is that they have published a draft code of practice. Fundamentally, what that means is there is a consultation starting today: Ofcom have given 40 recommendations laying out their expectations. It is important parents get a chance to talk to you about this, and there is a specific one from Katie there. Under these recommendations, there are recommendations on age verification: how will Katie's 12-year-old be prevented from going online and seeing harmful content? How will that work? One of the recommendations, or a number of those recommendations, are around how
9:07 am
age verification basically works: how social media companies are now going to be expected to verify those people, and those children in particular, who access their online services. What they have suggested is it might be things like facial ID, it might be credit checks, it might be a number of things. They are telling social media companies they must put in mechanisms to verify who accesses social media. I can tell you, as the minister who speaks to social media companies, and TikTok was mentioned, I'm always very clear they must get on with the job and protect. And if they don't? One of the sanctions I read about this morning is a massive fine, a tenth of global revenue. Just give me a very quick answer and then we will come back to Katie and Sarah and other callers. Give me a very quick example of the kind of transgression that would lead to a
9:08 am
momentous fine. Two things I will say on that. First, with great reward comes great responsibility, and I'm clear to the social media companies that they must comply with the act and Ofcom will hold them to account. You're right, it is a minimum of £18 million or 10% of global turnover, and there is criminal liability for senior managers, and there are also mechanisms for business interruption. So look, this is... Criminal liability for senior managers, did you say? Absolutely. This is why it is a leading piece of legislation. We are putting teeth into this and we are saying get on with the job. I've always felt the social media companies need to get on with this. They shouldn't have waited for the legislation, frankly. I know we will get onto Sarah Jane's question, but fundamentally there are lots of good things on the internet, lots of good things that social media can do, but it has to be done in a safe way. If children are able to access content
9:09 am
around eating disorders or suicide, pornography or self-harm content, clearly that's not OK. That is why we want to be more robust. Criminal responsibility for senior managers, that's interesting; that will surely concentrate minds. Katie, by the time the legislation comes through, or whatever legislation gets through, your son will be 13. What are you afraid of him seeing? I think, I mean, there is so much. The school he is at at the moment, there is bullying, not necessarily for him, but I'm just worried that as he gets older things are going to not improve.
There is a huge worry for me that he might see something. Something came up on Facebook the other day for me, these two random naked ladies, and I don't look at naked ladies, so I've got no idea why that came up, and I'm just worried. You have every entitlement to do that if you want to do that, because you are an adult
9:10 am
and he is not. That's the point on this, isn't it? As I said early on, a frightening number of young children are seeing stuff like that. Thank you so much, Katie. Lucy in Bristol, what effect has social media had on your children? I've got two teenagers, one is nearly 18 and one is 16, and both of them have given up their smartphones. My 18-year-old gave up her smartphone when she was 15 because she saw and knew that it was damaging and affecting her mental health. It made her depressed, it fuelled... And now at 18 she couldn't be happier without that tech in her world. And my 16-year-old, he has literally just given up his phone and said to take it away, he doesn't want it.
He calls it his burner phone, so he has a burner phone now. He is coming to
9:11 am
terms with not having a smartphone. This is all their choice; this is not my choice. Do they not feel excluded? Do they feel left out? My eldest, she doesn't. She is quite unique in lots of different ways. She wasn't at mainstream school, so she had the opportunity to do that, and it's not a factor in the same way that it might be with a teenager that is at school and all their friends are on Snapchat. My 16-year-old, all his friends are on Snapchat. He has said, I've got it on my computer.
So he's still on the computer, still on tablets and things in that way, but they just don't want a smartphone in their world. OK. Saqib Bhatti, digital economy minister, there is a mental health crisis for our young people, various factors playing into that, obviously. But social media at its
9:12 am
worst is fuelling that pernicious fire, isn't it? As I said, there is a balance to be struck here. Social media can do many good things. But actually, Nicky, when I speak to academics around this, they often say to me there is no obvious causal link. Personally, I don't accept that. We are looking at ways of trying to understand that causal link better. Clearly we understand there is a risk analysis. I don't want parents to be on their own, and it's great to see what Sarah Jane's children are doing around smartphones. They have obviously recognised that. It is a hugely difficult place for parents. My eldest is three at the moment, and I dread the time they have to go to school and that pressure comes on me and I have to make that judgment and decision. It is honestly really difficult. The Department for Education recently set out guidance to schools around banning smartphones in schools, and some
9:13 am
schools are doing that. I have a personal view on that: I think it's a very good thing to do, because it means children can stay focused. By the way, I also get parents talking to me about children having access to phones for safety purposes. I had a friend of mine talk to me about it the other day, and there has to be a balance struck there. This is why the Online Safety Act really is important, and I really appreciate the opportunity to talk about it today, because I want parents to be reassured that there is a robust piece of legislation we are working at pace with Ofcom to implement. And you are talking to the most important people: you're talking to parents, you are talking to listeners, you are talking to voters, and you need those! Ofcom does list some of the ways, and you have referenced facial recognition, that social media companies could check the age of users: photo ID such as passports, facial age estimation, reusable digital ID
9:14 am
services where an external company provides age verification. Loads of stuff to get into there. Possibly haven't got time for it, but it is food for thought. Good morning. I've missed most of the conversation, but I have snippets of what I've seen on the text. It starts with us at home: educating the children as to the safety element of it and regulating the devices they are given, i.e. tablets and not phones at such a young age. My son is seven going on eight: half an hour in the morning. My son is ten, the same. They have their tablets at other times, and they know the limits. After school, when everything is done. But it is all regulated by us strictly.
So for example, if they watch YouTube they are not allowed to watch short videos at all, because it comes with a 24-hour ban from electronic devices; they can only play with their toys. What do you mean by short videos on YouTube? For people who might not be tech-savvy. Short videos are, in my opinion, extremely nefarious in the way the
9:15 am
algorithms work. They drive content that is unsuitable for young children, and there is no way of obstructing the content that comes through, so they're completely banned. You can't turn them off either. You are right. You physically can't turn them off from coming up in the app. So they have no access to social media. They can literally use YouTube, Disney, Netflix, the kids' section is there, and that's it. They know they are not allowed a mobile phone.
If we go out to dinner, they eat their dinner, they play, they can sit down while we are out, while we are eating, and have the phone then, but it is all restricted and it comes from us first. When the right behaviour is there first from us, it then falls on the regulators, who unfortunately are over a decade behind. They are too slow to react, too slow to implement things. And unfortunately, you have lost a whole
9:16 am
generation of people to poor regulation and fear of upsetting these companies that generate billions of pounds of profit a year and continue to cut moderators, and you end up with more harmful content, which is driving an entire generation of young people with poor mental health. And you've got a generation of people who interact through a screen. Whether it be driving it or causing it, that's interesting. Both. We have heard from the minister the concerns. I need to move it on, but you have made some very important points. Thank you so much. The algorithms: on Instagram, I'm flooded with Beatles content, so I kind of get that. Who have we got? Let's see, Anna in north London. Good morning. Fire away and speak to the minister. My daughter is on a psychiatric unit. I would say
9:17 am
that she is completely addicted to her smartphone. I am begging the psychiatric unit to take smartphones away from not only my daughter but the entire population of young people who are on the ward. I'm not saying take away all means of communication at all; that is not what I'm saying. I'm saying very clearly: please replace smartphones with brick phones, so that these people, these children, can communicate with people who they trust, and aren't communicating with an outside world that is actively encouraging them to kill themselves, and actively saying to them on psychiatric wards that they will never get better. There are so many different hashtags here, but one of them is hashtag psych ward, and what is happening is that all these children, in desperately vulnerable situations, are actually sharing information, photographs of themselves, and guided tours of the ward, dancing to music when they are
9:18 am
stick thin. _ ward, dancing to music when they are stick thin, touching themselves. i'm going _ stick thin, touching themselves. i'm going to _ stick thin, touching themselves. i'm going to have to be graphic because i going to have to be graphic because i don't _ going to have to be graphic because i don't think people understand how serious _ i don't think people understand how serious this — i don't think people understand how serious this is. what is also going on is _ serious this is. what is also going on is that— serious this is. what is also going on is that actually is a parent i have _ on is that actually is a parent i have logged on and i am tracking my daughter— have logged on and i am tracking my daughter as — have logged on and i am tracking my daughter as a 15—year—old. and as a daughter as a15—year—old. and as a result— daughter as a15—year—old. and as a result of— daughter as a 15—year—old. and as a result of that, i am seeing some of the content — result of that, i am seeing some of the content that the girls on her unit are — the content that the girls on her unit are sharing. and what they are saying _ unit are sharing. and what they are saying is— unit are sharing. and what they are saying is that they are terrified about — saying is that they are terrified about going into adult services. why am i telling you this? because i know— am i telling you this? because i know as — am i telling you this? because i know as much as the professional staff on— know as much as the professional staff on the ward about this girl's situation — staff on the ward about this girl's situation. this is meant to be about confidentiality. we have all these different— confidentiality. we have all these different hoops we have to jump through— different hoops we have to jump through when it comes to the nhs and getting _ through when it comes to the nhs and getting information out. and yet believe — getting information out. and yet believe me, if! 
getting information out. and yet believe me, if i want to find anything _ believe me, if i want to find anything out about what's going on with my— anything out about what's going on with my daughter i look online on tiktok_ with my daughter i look online on tiktok and — with my daughter i look online on tiktok and i see it. i have recently gone _ tiktok and i see it. i have recently gone on _ tiktok and i see it. i have recently gone on tiktok as well and literally
9:19 am
it is wall-to-wall, as I log on as a 15-year-old, and I'm talking about today, yesterday. When TikTok says that it is doing everything it can to take things down, it isn't. One of my daughter's friends told me about one of the other girls who was on the unit and isn't there any more, and she said this girl had blood running down her face because she had scarred her face. My daughter's friend asked TikTok to take this post down five times. They didn't.

OK, listen. Anna, I have a response from TikTok, which it behoves me to give. But you have made some really telling points. You have also pointed out a particular scenario that is not uncommon. I was really disturbed hearing that, but it's down to algorithms, isn't it?
9:20 am
These are incredibly sophisticated bits of software that have been developed to provide you with the content that they think you want. How are you going to deal with that?

Can I first say, Anna, thank you for sharing that. I've been hearing stories from parents this morning, and the Secretary of State was on BBC earlier, and these are emotional stories. Some parents are going through some of the most unimaginable, harrowing experiences. The reassurance I would like to give is that, as part of Ofcom's announcement today, one of the requirements we are putting on social media companies is exactly that: to look at their algorithms and take away harmful content, whether it is around eating disorders or self-harm, or suicide-inducing content. Take that away from those people, or children under 18, and they have to do that. And if
9:21 am
they don't do that, they will be held to their sanctions. Because it's just completely unacceptable. There is no excuse for it. They need to do it. Frankly, they should have done it before. When I was a backbencher I talked about it in Parliament; I talked about racism on platforms, especially after the Euros a few years ago. And my argument has been the same: there is no reason why you need to wait for the legislation to do the right thing. Businesses have to be a force for good in society.

They are on watch. These big companies have been put on watch, that's interesting. Hello, Nick in Harpenden, you are very welcome to the nation's phone-in. Speak to the minister.

Hello. The ministers have done their job, so thankfully it is now going to be down to Ofcom to hopefully fine some of these large tech firms. We had an experience of how quickly these algorithms work. We sat down with the boys and did the let's-talk
9:22 am
book about girls, boys and babies, and a couple of days later one of my boys was on the sofa having a look, and I disappeared, and he looked up the word sex, a perfectly natural, curious nine-year-old. And then that very quickly led on to a pop-up. We think he then kind of came out of that, and a couple of minutes later he searched for people having sex, which then took him to an X-rated site. That was within two minutes. And the tech companies know what these algorithms do. They have the ability to stop them today. They do not have to wait 18 months. I was amused to hear the minister say earlier on she "squoze and squoze" them down, which is incredibly good use of language, in terms of the time frame. My wife is probably far more aware than I was, but this is
9:23 am
the first time I realised just how evil these algorithms and this technology are.

Evil. A nine-year-old, looking up sex perfectly innocently, trying to find out what it means, and then being led into this cavernous world that they should not be in. Minister, I hope you have time to address that. I know you are limited, and it is really appreciated you are there. That is a great example. Speak to Nick.

Nick, thank you for sharing that. I think, when I was listening to you, I was reflecting on how social media has changed, certainly since I came on to Facebook; I was in my early 20s. As you said, your child went on there very innocently to look for information in a safe way. It is just unacceptable. I welcome Ofcom's focus on the algorithm. As a backbencher I focused on this as well. I think the algorithms absolutely need to be looked at. It is not OK for the social media companies to say they can't do it
9:24 am
when they have the financial resource and technical know-how to do that.

There is a lot of focus on TikTok and Instagram and those ones. I mean, this is Google, just a search engine.

The search engines are covered by the Online Safety Act as well. And by the way, I should just say this, because, Nicky, you have artificial intelligence coming in as well, moving at pace. The way we did the legislation is basically technology agnostic. So even as technology develops, the requirements, the legal requirements, on those companies are going to be exactly the same, whether it is on search services or social media. But you are absolutely right: your child should be able to do it in a safe way. You should be able to have those conversations in a safe way, and that's why today is really important, because for me, as a parent, Ofcom's approach really gives parents support. You talked about... I also heard the section
9:25 am
where they used the words "squoze and squoze". The truth is we are pushing hard, even on Ofcom. There is a balance to be struck, because the challenge is that those social media companies have big pockets and legal teams, and if we do this too quickly, or if we leave loopholes, they could mount legal challenges. That's why there is a little bit of time before the implementation.

Thank you very much. Another very important caller, Joe in Southampton. A quick one: you are with us until 9:30am, which is good news. Facial recognition, lots of people asking about that. We will get back to that in a second. Facial recognition, you mentioned credit checks. How does that square? Credit checks, minors, how does that work?

Ofcom have set it out: there is a requirement on age verification on social media companies. How they go about that will be on them. But it is on them to uphold their terms and conditions. So that's the key
9:26 am
bit here, and how they go about that. When they talked about credit checks, I suspect what they are talking about is parental consent around that. Around age verification and facial ID, all those different aspects, these technologies are there, and Ofcom's approach is that, if you are under 18, they should apply uniformly, and I think that's really important.

Thank you. Secretary of State Michelle Donelan, your boss, describing it as a wild west, not an inaccurate description in many people's minds. But then there is much beauty and benefit in that wild west as well. Joe in Southampton, hello.

Good morning.

Good morning. The minister is listening, which is all good. What would you like to say? Back to algorithms, is it?

It is, kind of, yeah. And just the general kind of control over social media. My daughter is 11; she is due to start
9:27 am
secondary school in September, and we have really held out on letting her have a phone, despite most of her friends having them, because we are really conscious of how dangerous and bad they can be. But obviously she's going to be catching a bus; she is going to have a little bit more independence. We want to know where she is. She will not have social media, but she will have things like WhatsApp. It is more about what her friends see. We can impose all the parental restrictions we want to, but if she is sat next to a friend on the bus whose parents don't really care and just let them have free rein over social media, because it is so easy for children to access social media without these checks in place... She has already disclosed to her teacher that she feels that she is fat. She is absolutely not; she's beautiful, she is a completely healthy weight.

That is heartbreaking.

It really is. It has shattered us. My
9:28 am
concern is the algorithms, and while I appreciate what an earlier caller said, that there are so many positive things on social media, online and on Google, things she can look for that are aspirational and fabulous, unfortunately it is so easily one click and you are down a rabbit hole, and how do you get back out of that algorithm? It terrifies us. She wouldn't have a mobile phone until she is 18 if I had my way. But equally, you don't want to exclude your children and for her to be pushed aside. It is so difficult to know what to do as parents. It is only because we care.

Of course. I know the minister has to go. But again, another superb call. Keep them coming. We are staying on with this and we are going to talk about it further. I know you have to go, obviously it is a busy day for you, Saqib Bhatti. But you are one click away from going into the rabbit hole, as I said earlier on. But are we at a point of no
9:29 am
return? It is out there; is it too late?

Thanks for a really good session; I really appreciated talking to parents. I don't think we are at the point of no return. That is why the legislation has been put in. It took almost six years to put it in, and we are absolutely determined, in this department and in this government, to make sure that parents have the support they need to deal with the challenges of today. It was very emotional, hearing the child say anything of that sort. That is why the algorithm aspect of this is really important. We have also set aside £2.5 million for literacy and education, because that also plays a huge part. We are working robustly on this; we want to make sure. My final message to the social media
9:30 am
companies and tech companies: if you don't do this, we will come for you and Ofcom will come for you. That's really important. We want to make sure the online world is safe for children, and we want our children to have a bright future. Fundamentally, that is what this comes down to.

Thank you for coming on; it is an important day. The digital economy minister, Saqib Bhatti, you are welcome anytime. Taking calls, speaking to parents. Anna, do you have confidence this will be effective?

The problem we have is that with every day that passes we lose more children. I'm very worried my daughter is going to be one of them. I would urge all parents to absolutely pay attention to the rules and the controls you can have around smartphones and things. Ultimately, the only way in which you can really make a difference is to choose not to give our children smartphones until they are mature enough to really deal
