
tv   BBC News  BBC News  May 8, 2024 9:30am-10:01am BST

9:30 am
...if you don't do this, we will come for you and Ofcom will come for you. That's really important. We want to make sure the online world is safe for children and we want our children to have a bright future. Fundamentally, that is what this comes down to. Thank you for coming on, it is an important day. The digital economy minister, Saqib Bhatti, you are welcome any time. Taking calls, speaking to parents. Anna, do you have confidence this will be effective? The problem we have is that with every day that passes we lose more children. I'm very worried my daughter is going to be one of them. I would urge all parents to absolutely pay attention to the rules, to the controls you can have around smartphones and things. Ultimately, the only way in which you can really make a difference is to choose not to give our children smartphones until they are mature enough to really deal
9:31 am
with what they are seeing. I'm not saying take away all communications. Don't give them smartphones; give them a brick and a tracker. It allows you to do the most important things, which parents need to do. But we can't wait until Ofcom gets its act together. We can't wait for Ofcom, we need to do our own bit to make sure we do what we can now. Some children are more vulnerable and more fragile than others. All children are different, and mature at different rates. At what age would it be acceptable for a young person to have a smartphone? I think in a perfect world I would say 16. But we are not living in a perfect world. And I think that part of the problem will be that if we try to put it to 16, we will get a very big pushback from big tech. I think
9:32 am
14 is probably more realistic, but I would like to go for 16 if I could. On your point of vulnerability, that is true. There is a particular group in society which is already in psychiatric units, and they are being deeply let down. The health service, the health department, needs to take a look at this, and honestly admit that smartphones are a problem. At the moment, they are completely in denial about the impact that smartphones are having. They will give any excuse other than talking about social media. That has to change. I don't know what they are worrying about, because this is so fundamental to saving the children who are on the brink. Actually, next time you have someone on your programme about this, please include
9:33 am
the Department of Health. It has so many questions to answer. The children on the edge are really in trouble, and the Department of Health is not helping them. Amazing call, so powerful. So much to think about. I wish you well and I wish your daughter well. Please talk to us again some time; we will be discussing this. We'll discuss it again, obviously, because it is such a massive issue for parents and our society as well. The individual well-being, the collective well-being. We are here until ten o'clock. Let's get a word, before we take more calls, from a technology journalist who has worked at Facebook and
9:34 am
Instagram. Will, hi. I have some rights of reply to get to from the companies. Is it feasible? Change the algorithms, facial recognition, or face the consequences? What do you make of this? There is a lot to unpack, an awful lot. I know they have given the right to reply, but you should have somebody from Meta or Snapchat here talking through the complexities of this, not me, somebody who used to work there. I'm happy to be here to talk about it, by the way. There is a huge set of challenges. I want to start by saying that nobody disputes the fact that more needs to be done to protect children on the internet. I think that is the clearest place I want to start. How you go about age verifying, how you go about controlling this, how you go about the process, it is actually very hard, despite what anybody is
9:35 am
currently saying. There is a mishmash of suggested technologies and ideas, but short of getting young people to scan in a passport, which, given our government has had all sorts of fears about state ID etc... We are talking about facial recognition. Facial recognition does not work properly. I did a facial recognition test the other day which told me I was 22, which was very flattering to me. It simply doesn't work properly at the moment. You are in a situation where, less than five years ago, when the government was looking realistically at how to tackle a different issue online, legal pornography for over-18s, they were looking at scratchcards. We would have had to traipse down to the local newsagent and get a scratchcard to prove we were of legal age to view it. That would have caused problems because it would have pushed it underground and people would have started finding more illegal content. It didn't happen, thankfully. We are in a situation
9:36 am
where, let's break down the numbers. Last year, the people who run .uk on the internet did a survey: 56% of 8–16-year-olds had a WhatsApp account last year. Now, you should have been over 16 to even be on it. So, some of that is the problem of the tech company for not properly age verifying. But Mark Zuckerberg does not drive the phone to your house and install WhatsApp on your device. That is still the choice of the parent. Don't get me wrong, I am a parent, I know the complexities of this. But to suddenly go into a situation where you say it is the fault of the tech companies, that is parents washing their hands of a tricky set of conversations they have to have. Ofcom have said, I will quote them, in some cases it will mean preventing children from accessing the entire site or app. It might mean age restricting part of their
9:37 am
site to adults-only access. You would say that WhatsApp would be an example of that? WhatsApp is now 13. The age limit has just changed. 56% of 8–16-year-olds in the UK said they had an account, when you should have been over 16. What about family groups, what are you doing, where you are holding the entire family accountable? You and I get on well, I don't dispute that with you. What I am saying is that there was a 16 age limit on a service, and parents, in many cases, have chosen to ignore it and allow their children on it. The number of parents I speak to say, yes, my kid is on Snapchat, they put in a false age. Getting into a position where you can age verify successfully is going to be a challenge. Also, the obsession with algorithms is a bit buzzword and irrelevant, because the issue is around age verification. If content
9:38 am
is age suitable, anybody should be able to view it. You find yourself in a situation where many people are just saying, actually, I don't want to do the policing of my kids, I don't want to have these conversations, let's get the government to do it. I understand the pain of parents who think more should be done and tech companies need to do more, but there is a much more complex and nuanced argument. Well, you know stuff on this for sure. Let's get back to the callers. It's interesting that Michelle Donelan says it may go against the grain of her conservatism, small government, but she says sometimes intervention is absolutely necessary; she says it is all about doing the right thing. Gloria in Milton Keynes, Vicky in Cambridge, David in St Neots, thank you all for calling. Gloria, you work in digital marketing. So, you have knowledge of this area. An appropriate
9:39 am
hinterland. Is it down to parents or companies? It is a bit of both, to be honest. Social media, I know, I agree with the last caller, buzzwords about algorithms. Social media is made to be addictive, to make people stay on as long as possible for advertising purposes. The longer you stay on social media, the better it is for social media companies and their advertising perspectives. Even if we took away the algorithms, the harmful nature of targeting certain content, it is about how long some people stay on there and what they are going to be accessing along the way. Parents do have screen... what is it? Sorry... they can stipulate screen time for their children. I
9:40 am
think a lot of parents don't even know what screen time is. I have a daughter, and she gets around one hour of social media per day, if that. How old is she? She is 15. She's always telling me there are other children that are allowed social media for a long time. And I know the reasons why: it is completely addictive. Even if you flush away the harmful nature of social media, it is all about isolating the children. They are going out less, they are doing less social things because they are on social media. They are addicted to content that comes in, and the scrolling effect, where one video comes after another, after another, something that Facebook have adopted and especially Instagram; it keeps people on there for a long time. When you see one video, another will come and another will come. And this is messing up the minds, the cognitive minds, of children as well.
9:41 am
It's not just about the algorithms and harmful content, it's about how long you are staying on social media. Things like screen time, that is so important, because it tells you where they are going. And I also have the passwords to all of her social media accounts, which is limited to WhatsApp and Snapchat; those are the only two she is allowed on. It tells me how long she has been on, the amount of time. She has learned now to spread her time, the one hour that she gets, between those two social medias. But, yes, it is a mixture. You know, the thing coming out about up to 13 years old: most social media platforms are ones where under-13s are not supposed to be on there. I want to pick up on
9:42 am
that point as well. Parents having passwords: there are two sides to that coin. It could be kids who need somebody to talk to. It has to be consensual. I do talk to my daughter. Obviously, as your mum, I need to know where you are and what you are accessing. It needs to be a trust thing and a conversation. It's not about demanding, it's about having the conversation as early as possible, so they get used to the fact that you do have their passwords. Listen... you have got their passwords, I've got it. Gloria, thank you. Vicky in Cambridge, good morning. Welcome to the nation's phone-in. What are your thoughts? My mind was racing as I have been listening to your last couple of guests speaking. What are my thoughts?
I think that, regardless of anything anyone else has said, the key point to me
9:43 am
is why are there creators putting content on platforms that coerces children to harm themselves, and why is this being left there to be viewed? Why is it there in the first place? You could tell me you have every password to your children's social media accounts, you can control how long they are watching it, but if they spend ten minutes looking at a video which coerces them into self-harming, how do you control that? How do you control that, when their child is in the bedroom with their hour's screen time, that they are not in their bedroom with 1,000 strangers that want to sexually exploit them, that want to coerce them into self-harming? That says, why not try this choking challenge? Put "choking challenge" into TikTok and there are thousands of videos. Why are they there in the first place?
9:44 am
It beggars belief. That is where we come to. I think I mentioned to your colleague, without going into too much detail, it is very easy for the professionals to say it is not the algorithms, parents want to wash their hands. But our family had a beautiful little girl that was full of life. She lived very happily, she went outside, she played and spent her summers in Ireland. I remember the case. She had everything to live for. She decided to end her life. While we are fighting for access to know exactly what she was looking at, because this is the other thing: if your child is scrolling on TikTok, you don't know what they have looked at. There is no way of accessing the information. If you have a child whose life has been lost and you
9:45 am
suspect that there was coercion, that they have been watching things that have programmed them to want to try out a challenge, then you can't access and see where that came from, who created it, why did they create it? How do you hold them responsible in a legal capacity? They have blood on their hands, the creators. And, for me, the social media companies that are not stopping this content being put in front of our children. Most platforms are happy for 13-year-olds to be on the platform. So why is it there? There is a libertarian free speech argument against that, isn't there? Well, is it freedom of speech for somebody to be a paedophile? For somebody to harm another child? No. Absolutely, absolutely. Some people would argue there are grey areas, and who is to police what is acceptable and what isn't? There is a text here, thank you for getting
9:46 am
in touch; there is a text that really goes to back you up. If any food company knowingly distributed food which contained a poison or something that would damage children physically, they would be prosecuted. Surely it should be the same if social media companies on the internet allow the distribution of anything which will poison or in any way damage children's mental health? That is what you are saying. That is what I am saying. There are studies out there. There was a study last year, you will have to forgive me, I don't know where, and there was another one conducted in Ireland. It is proven; it is almost like psychological warfare. It is being rammed down children's throats the second they have a device. That could be as simple as a child on an iPad, watching YouTube at the dinner table with their family. They are now there with thousands of strangers. For me, tech
9:47 am
companies make millions and millions of pounds per year; some of that needs to be reinvested into making sure that anybody, children and adults, is safe. There will be adults out there as well who also fall into this trap. OK. Will wants to come in, and I have the TikTok statement. It's powerful, what you say. TikTok says it is deeply saddened by the family's experience and this tragic loss, referring to that case, Vicky, which you mentioned, of Maia. Our deepest sympathies are with the family. The safety of our community is our priority; we do not allow content that promotes or glorifies suicide or self-harm on our platform. We continue to prioritise protecting our community, working with expert partners and providing safety
9:48 am
resources to those that need them. Hertfordshire Police say the investigating officer's thoughts are with Maia's family and friends, and it would not be appropriate to comment further until the inquest has concluded and all evidence has been presented. Will, her central point: why is some of this stuff even on there in the first place? What is the justification? It's a very good question. When you get a case as tragic as that, it is hard for me to sit here and say everything is fine, because it isn't. However, the number of posts that get made to these platforms on any day, millions and millions of posts... In my time at Instagram, speaking of 2014, it was in relative infancy; it only had around 300 million users. The people involved in trust and safety take it incredibly seriously. Whenever it
9:49 am
was made aware of content that shouldn't be there, it worked to remove it. However, in the case of content around anorexia, you removed one hashtag and it was like whack-a-mole: the content would appear elsewhere within moments, when they realised a certain hashtag had been blocked. Social networks need to do more to keep this content off the platforms. There is no dispute. How do they do that? Artificial intelligence will play a huge role. There are already successful examples. We talk about child sexual abuse material on the internet; a lot of that is being removed, thankfully, at source, as soon as it is identified and labelled by some of the incredible charities that work in this space. Notice I say the word charities: it has not been done by tech companies but by charities. Those images are then blocked at source. Those images are then taken
9:50 am
off the internet. All of the tech companies are collaborating with the charities, and many have given money to fund this software, or fund these ways of doing this stuff. But you can't help but feel: why on earth can't Meta, Snapchat, Google, ByteDance all get together and work on this? I feel it is important. People say there is a cast-iron case that social media is polluting kids' minds; all of the research at the moment, none of it really holds up. That is one of the things that people don't like hearing. I'm not saying that these things don't happen. I would like some properly funded research to take place into the harms to young people, and then we can definitively talk about those dangers and the harms. And, maybe, the government, I'm not here to talk government policy, could start
9:51 am
taxing the companies that operate in the UK, given that Meta and other social networks use perfectly legal tax avoidance techniques and do not pay tax in the UK. But they can make huge money in this country and they could do more to protect people. Vicky used a powerful phrase: blood on their hands. The minister earlier on said the companies should be held to criminal responsibility. Vicky, I think you want to come back. Do you have the fear that Pandora's box is opened and will never be closed again? Definitely. That is quite a good analogy. The main point I would like to make is that, regardless of what is or isn't on the platform, you will never have access to that data. I think that is something I really want to emphasise. For us, to never understand the full content of
9:52 am
what Maia was looking at, to be able to know if it was the driving factor in making the decision. TikTok's speech is great, saying that they take the videos off, but if you watch the last video that Maia ever watched, it is a little girl talking to somebody about how you must be an angel because you are self-harming yourself and you have cut yourself, and you are an angel like me now as well. And there are Roblox people running around, to attract children. There has to be more done to make sure that the content is not there in the first place, but also, when these tragic circumstances happen, and there are so many cases out there now, we need to read that book to understand what actually happened
9:53 am
and understand that, if somebody does have blood on their hands, they are held responsible in a criminal capacity. Thank you so much. On the radio after ten o'clock we have three MPs. We do that on a Wednesday; they take your questions and they answer your questions, hopefully. If you have any questions on this for MPs, stay with us on the radio and you can ask away and we will see what they have to say. Some very relevant and important points have come up over the last hour. Someone who has been waiting for most of that last hour is David in St Neots. I'm so sorry; it is your time, your time has come. What would you like to say? I run a charity for young burns survivors, and I've been doing that for over 25 years now. We
9:54 am
noticed a big change in the use of phones at camp; we do residential camps. Eight or nine years ago we instigated a rule of no phones or technical devices allowed at the camp during the week. A bit of a kickback to start with, but most of the children now appreciate the fact that they have not got their phones. They have not got the pressure of going on and talking to their friends at home, or going on social media, because they were disengaging by having those smartphones rather than just phones at camp. Even to the point where a lot of the children, up to 17 years of age, let us know that they use their smartphones less now because of the rules we implemented. As hard as it is to take it away, eventually they get used to not having it. It is not as
9:55 am
if they turn into the Famous Five all of a sudden, but they feel it is liberating? Very much so; that is one of the words that a couple of children have used. They now feel they don't have to have their phone all the time. Because we have done one week without it, they understand they can get through time and do things, and engage with other people around them. That is one of the biggest things, that they engage more with other burns survivors around them, which is what the camp is all about. It sounds great what you do, thank you for getting in touch. Develin, in Bourton-on-the-Water. Hi.
hello, how are you? i'm fine, how are you, and what do you want to say? from my time, coming to the uk as a south african, it was such a nuance for my daughters to be able to use their cell phones 24/7, having the capacity to charge their phones, walking
9:56 am
around in the streets holding their cellular devices, the networks being active 24/7. for my girls, it was an incredible change to their lifestyle. we noticed they were quite technologically behind the other children in the uk, because obviously of the amount of exposure that they had to cellular devices. how old? not you, the children! a 14-year-old, a 15-year-old and a 21-year-old. ok, 14 and 15, prime area. correct, correct. automatically, there was a certain amount of bullying that was happening at school because they were not on snapchat, they were not on certain social media apps etc. they felt they were already being
9:57 am
bullied in that environment. a lot of school work is conducted by cellular mobile. i would say, put your cell phone down, you have to do your schoolwork, and they say, but, man, we are busy doing our homework on the app on the cell phone. i doubt we can stop children using these applications. they will put in different years of birth. how do we protect them? how do we protect them when it is happening in a school environment, where the pressure is starting to build, and it is starting at school and transferring into social media? what is the answer to the question? how do we protect them? i honestly don't have the answer. we are fortunate enough that our daughters have experienced life in south
9:58 am
africa, they are very intuitive about what to believe. to an average child that has not had that upbringing, you're caught between a rock and a hard place. i have to do some housekeeping. caught between a rock and a hard place is where a lot of parents are. thank you for watching on bbc news.
9:59 am
live from london, this is bbc news. the us reveals it paused a shipment of bombs for israel over concerns it was going ahead with a major ground operation in rafah. russia launches another large-scale missile and drone attack across ukraine, damaging 13 residential buildings in kyiv. the world's oceans have broken temperature records every single day over the past year, according to bbc analysis. # i'm down, down in my doomsday blues... # and ireland breaks its recent eurovision "curse", qualifying for the grand final of the song contest,
10:00 am
for the first time in six years. hello, i'm azadeh moshiri, welcome to the programme. we start with the war in gaza and let's bring you these live pictures from the israel-gaza border, where the israeli bombardment has been particularly intense around rafah. more than 1 million palestinian refugees are sheltering in the city in the south of the strip. it's emerged that the us paused a shipment of bombs to israel last week, over fears they might be used in an assault on rafah, according to a senior administration official who has been speaking to cbs news, our media partner in the us. israel says it will deepen its attack on rafah until hamas is eliminated, or the first of the remaining hostages are released. israeli forces went into rafah on monday night in defiance
10:01 am
of international pressure, seizing the border crossing with egypt.
