Online and Mobile Phone Safety

Online technology.

This policy outlines some key information that all Sparks Fostering carers and social workers need to be aware of in relation to use of online technology.

Under no circumstances can a child in foster care have unrestricted and completely unsupervised use of mobile phones, laptops or other devices. The risks and appropriate interventions are summarised here.

Common risks.

A few of the common risks that young people may be exposed to online include:

Body dysmorphia.

Images of faces and bodies are often digitally altered online (by use of ‘filters’) to smooth skin, reduce waist size or in other ways increase perceived attractiveness.

This can lead to people having a distorted understanding of the attractiveness of the general population and can lead to young people having unrealistic expectations of themselves.

Body dysmorphia is a mental health condition where a person spends a lot of time worrying about the flaws in their appearance. These flaws are often unnoticeable to others. It’s most common in teenagers and young adults and can affect any gender.

Catfishing.

Catfishing is the process of manipulating someone into an online relationship, while using a fake online persona.

Challenges.

Young people may participate in dangerous challenges for excitement, to be popular, to get attention from peers, or to feel like they are part of a peer group. They may be pressured to participate, either by strangers online, or by their peer group. Some popular but dangerous challenges have been:
  • Benadryl challenge
    The Benadryl challenge encourages others to take large doses of the antihistamine to induce hallucinations. Taking too much Benadryl can be lethal.
  • “Blackout” challenge
    Also referred to as the “choking challenge” or the “pass-out challenge”, the “blackout” challenge encourages viewers to hold their breath until they pass out due to a lack of oxygen. This challenge was previously viral (popular) on TikTok. This challenge is believed to have caused over 80 deaths.
  • Cinnamon challenge
    Participants of the cinnamon challenge film themselves eating a spoonful of ground cinnamon in under 60 seconds without drinking anything, with the video being uploaded to the internet as evidence. The cinnamon coats and dries the mouth and throat, potentially resulting in coughing, gagging, vomiting, and inhalation of cinnamon, which can in turn lead to throat irritation, breathing difficulties, and risk of pneumonia or a collapsed lung.
  • Happy slapping
    Happy slapping is the practice whereby a group of people assault a stranger at random while filming the incident on a mobile device, so as to circulate the images or post them online.
  • Salt and ice challenge
    Participants pour salt on their bodies, usually on the arm, and ice is then placed on the salt. This causes a “burning” sensation similar to frostbite, and participants aim to withstand the pain for the longest time. Due to the numbing sensation of the cold and possible nerve damage during the stunt, participants are often unaware of the extent of their injuries. The challenge is recorded and posted on YouTube or other social media sites. This challenge can lead to second- and third-degree burns.
  • Tide pod challenge
    Teenagers have recorded themselves chewing and gagging on laundry detergent pods, then daring others to do the same and post videos online. Ingestion can cause excessive vomiting, lethargy, and gasping, and in some cases, victims stopped breathing and required ventilation support. There have been some deaths linked to this challenge.

Cyber bullying.

Using electronic communication to bully someone, typically by sending intimidating or threatening messages. This may include sharing intimate or inappropriate images.

Cyber stalking.

Using information and communication technology, particularly the Internet, to repeatedly harass an individual, group of individuals or organisations.

Dark web.

The dark web refers to an area of the internet that can only be accessed through particular software. Traffic is routed through multiple layers of encryption, making users anonymous. An estimated 6% of internet content sits on the dark web. Accessing the dark web is not illegal, but because of this anonymity it is widely used for criminal purposes.

Fake news.

Fake news refers to false stories that appear to be news, spread on the internet or through other media, usually created to influence political views, as a joke, or to get attention. Fake news can be used to radicalise young people.

Flaming.

A hostile and insulting interaction between internet users. It usually occurs in discussion boards.

Hacking.

Hacking is a slang term used to describe unauthorised access to computer systems.

Internet addiction.

Excessive use of the internet (including social media) may lead to obsessive preoccupation, or uncontrollable urges to remain on devices. Removal of devices (or controls around use) may lead to anger and inappropriate behaviour. Internet addiction is likely to impact on the young person’s ability to sleep at an appropriate time and may lead to an inability to focus on other activities.

Phishing.

Phishing is the criminally fraudulent process of attempting to acquire sensitive information such as usernames, passwords and credit card details by masquerading as a trustworthy entity in an electronic communication.

Radicalisation.

Young people may seek out information about extremist groups and make contact with extremists online. See Sparks Fostering Policy ‘Preventing radicalisation and extremism’ for further information about this topic.

Sexting.

The sending of explicit pictures (often self-portraits) by multimedia text message, usually via a mobile phone.

Spam.

Unwanted email, usually of a commercial nature, sent out in bulk to an indiscriminate set of recipients.

Troll.

Someone who posts inflammatory, or off-topic messages in an online community, such as an online discussion forum, chat room, or blog, with the primary intent of provoking readers into an emotional response or of otherwise disrupting normal on-topic discussion.

Platforms.

Many of the platforms listed here have age restrictions; however, many young people ignore and bypass the age restrictions. A few of the most popular platforms being used are:

Dating apps (usually age 18+).

The risks for young people who access dating apps include: being manipulated and exploited online; arranging to meet strangers; having their location shared with strangers (either by disclosing their location, by being identified by school uniforms or other location information in pictures, or via the app software); exposure to inappropriate material; being exploited or abused after meeting people in person.

Facebook (13+).

Facebook is a social media platform. It is now less popular with young people, many of whom have moved to newer platforms.

Messaging.

Young people may be exposed to some risk via email, text messages or other messaging services; for example, peers may send inappropriate content directly, or to groups. Foster carers should be mindful of the online communication that young people have with others.
WhatsApp is a free messaging app that lets users make video and voice calls, send text messages, share their status, and more. Group calls and chats can also be made on WhatsApp, internationally. Users are supposed to be age 16 or above.

Instagram (13+).

Instagram is a free photo and video sharing app, and content can be shared internationally.

LinkedIn (13+).

LinkedIn is a social network that focuses on professional networking and career development. Young people may access LinkedIn to find profiles for foster carers, staff members or family members.

Pinterest (13+).

Pinterest is an image sharing and social media service designed to enable saving and discovery of information (such as images and short videos) in the form of ‘pinboards’. It’s the online version of ripping out a picture from a magazine and adding it to a corkboard. Pinterest is not very popular with young people, but may be useful to support young people’s interests.

Reddit (13+).

Reddit allows users to share content (everything from text and links to articles, images, or videos) for others to comment. Some topics are of interest to young people, such as humour and memes. However, as with other social media sites, there’s potential for sexual predators to strike up relationships with unsuspecting children and privately message them. Also, there is a large amount of inappropriate content on Reddit.

Snapchat (13+).

Snapchat is a messaging app that lets users exchange pictures and videos (called snaps). Time limits can be set on snaps, so they disappear after they’re viewed; however, recipients can take a screenshot of an image using their phones or a third-party screen-capture app. Children using this app should be reminded that nothing shared online is really temporary.

TikTok (13+).

TikTok is a video-sharing app that allows users to create and share short videos (originally capped at 15 seconds) on any topic.

Twitter (13+).

Twitter is an online news and social networking site where people communicate in short messages called tweets. Twitter is considered to have a large amount of harassment and abuse; people have been targeted by coordinated harassment campaigns that can involve anything from threats and spamming to account hacks and worse.

Wink (17+).

The Wink app is a social network that allows strangers from around the world to connect via video. Wink works similarly to the popular dating app, Tinder. Users can swipe left or swipe right to view profiles. Users technically have to be 17 to use the service, but there’s no age-verification process. There are concerns that some adults are contacting children for sexual exploitation via this app.

YouTube (13+).

YouTube is a free video sharing website. The main risk on YouTube is linked to inappropriate content such as explicit content, swearing and sexualised behaviour.

Safeguarding.

Allowing children to take appropriate risks.

It is not appropriate to completely prevent use of devices: they can be useful for schoolwork, for researching appropriate information and for maintaining peer relationships, and moderated use can be a productive leisure activity.

Foster carers also need to be mindful that all children in our care have to be supported to develop their independent living skills; responsible internet use is a key aspect of supporting children to keep themselves safe.

That said, depending on the age, ability and risks to the young person, use of devices can be limited, and in some cases (following consultation with the child’s social worker), devices can be withheld to prevent risk.

Risk assessments.

Appropriate use of devices should be considered for children and young people individually, according to the child’s needs, understanding, exposure to risk, and the child’s wishes and feelings.

The team around the child (led by the child’s social worker, and including the foster carers, supervising (fostering) social worker, the birth family where appropriate and other involved professionals) should work with the child to agree boundaries and expectations around device use.

Boundaries around appropriate device use may need to be outlined in the child’s care plan, risk assessment and/or safer caring policy. The assessments should consider which devices can be used, which apps (such as social media or games) can be used, how long for, and where they can be used (e.g. only in the presence of the foster carer or other appropriate adult). ‘Pay As You Go’ and limited data may be used to control internet and phone use. There are also apps which can be used to restrict or monitor device use. 

Appropriate consequences for not adhering to boundaries may also be agreed in the plan and explained to the child.

Any new concerns should be discussed with the child’s social worker and the team around the child so that the risk assessment and plans can be updated accordingly.

Supporting children to stay safe.

Schools work with children to teach them about some of the risks of social media. Schools also invite parents and carers to attend awareness workshops and they share information about concerning trends and challenges. Foster carers are expected to attend meetings offered by schools and to read any updates provided.

Foster carers, social workers and other professionals should be able to speak openly with the child about device use and online risks; however, details about new apps, challenges or other risky information should not be offered to the child, because it may lead to the child being interested in that subject.

Children should be advised to not share their real name, location, photos or any other personal information online. Children should also be advised that photos can provide identifying information, such as school uniforms or local landmarks.

Foster carers and their families and friends (and the team around the child) should also be mindful of what information they post online, because there is the potential for the child and/or their family to find the information. Where possible, profiles should be made ‘private’ or content moderated.

Foster carers may need to check the children’s profiles to monitor their activity (if this is agreed in the child’s care plan). To do this, foster carers may need to set up ‘fake’ profiles (using false information). It may also be agreed that the foster carer should know the child’s passwords so that profiles and messages can be checked if required – this is less likely for older children, who should be given the opportunity to demonstrate responsibility and trust. Foster carers would also be wise to check browsing history and/or device usage times when there are any concerns – supervising social workers can show foster carers how to find this information if required.

Foster carers should ensure that the child is engaged in productive activities that interest the child; the more time children spend on productive activities, the less time (and interest) they will have to go online. Supporting the child’s productive interests (such as arts and sports) would also raise the child’s self-esteem and confidence, which reduces the child’s vulnerability to a range of risks, including online risks.

Speaking openly and kindly with children, and supporting them in their productive interests, helps to build a trusting relationship between the foster carer and child. If there is trust, the child is more likely to speak with the foster carer about their online activities and they are also more likely to take the foster carer’s advice about staying safe.

Monitoring accounts and log-ins.

Foster carers should change their own passwords regularly, to ensure that children don’t have access to private information. 

Foster carers should not give passwords or log-ins to children. 

Foster carers should be very mindful of apps and games that allow spending, and they should ensure that children are not able to make purchases without permission. 

Friends and family of the foster carers should also be alerted to online risks if there is a possibility that the children may be using their devices. 

Additional resources (Optional)

An introduction to Report Remove, an online self-reporting tool for young people. A blog by NSPCC. 

Be Internet Legends (by Google).

Being exposed to legal, but harmful, content online.

CEOP website (Child Exploitation and Online Protection).

Challenging victim blaming language.

Check your email security: A free government service to help UK organisations check for cyber vulnerabilities.

Childnet.

Children and screens: Reaping the benefits, reducing the harms. The short article from Parenting Matters provides some tips about managing screen time. 

The Children’s Commissioner’s view on artificial intelligence (AI).

Classifying and responding to online risk to children: Good practice guide. By ‘CORE’, a knowledge base on children and youth in the digital age.

Cybercrime: A guide by Virgin Media O2, whose experts have created a cyber security safety test that aims to build awareness and educate users of all ages on how to better protect themselves from online threats.

Digital Footprints.

Evidence on pornography’s influence on harmful sexual behaviour among children. From the Children’s Commissioner.

Gaming.

How to counter online hate and extremism with young people (by Internet Matters).

Internet Watch Foundation (will remove inappropriate online content).

I’ve made a new friend online. But I’m worried. What do I do? Guidance from ‘Stop It Now’ for children with learning needs and younger children. 

Lucy Faithfull Foundation: a UK-wide child protection charity that works to stop child sexual abuse.

Managing risk and trauma after online sexual offending: A whole-family safeguarding guide. 

Marie Collins Foundation – ‘Our vision is that every child harmed by technology-assisted child sexual abuse will be guided and helped throughout their recovery journey. They have the right to suffer no further harm’ – Victoria Green, CEO.


Molly Russell inquest findings.

NSPCC guidance for online safety.

NSPCC online safety training.

NSPCC report ‘Online harm and abuse: Learning from serious case reviews’.

NSPCC safeguarding 16 to 25 year olds training.


Online misogyny: what impact is it having on children? A podcast by The Guardian exploring the impact of online misogyny on children.

Over 75 percent of 6 to 7 year olds have joined in with an online chat. By Natterhub: Preparing children to thrive online. 

Parents’ guide to Facebook.

Parents’ guide to social media and IM apps.

Parents’ guide to Twitter.

Podcast episodes discussing online wellbeing and young people. By ‘Safe, Secure, Online’. 

Preventing sexually harmful behaviour of young people – introducing the ‘Inform and the Shore’ initiative. A recording by The Association for Child and Adolescent Mental Health. 

Revealed: almost half of British teens feel addicted to social media, study says – an article in The Guardian.

Safer Internet Day free resources for work with different age groups from the ‘UK Safer Internet Centre’.

Secret apps.

Sexting.

Sextortion – A podcast by The Guardian

Sextortion – Resources from Safer Internet.

Sharing nudes and semi-nudes online training.

Social media triggers children to dislike their own bodies, stem4 survey finds. Also including details of an app to help children who are struggling. 

Teen chat rooms.

UK Council for Child Internet Safety (includes guidance on race and faith targeted bullying, and guidance on sexting in schools).

Ygam offers free safeguarding training for professionals (including foster carers) working with children in care – on the topic of the potential risks of gaming and gambling.