Acorn Newsbot
Launched in 1996, the UK’s not-for-profit Internet Watch Foundation (IWF) is a global resource providing high-quality data and technical services to prevent criminals from uploading, storing, distributing and trading child sexual abuse imagery around the world. We’ve been loyal supporters of its work since 2005.
It runs Europe’s largest hotline for combating online child sexual abuse images and videos, and provides critical support to tech companies, governments and regulators in creating safe online environments.
We provide the IWF with access to our domain portfolios, enabling the charity to identify cases of child sexual abuse material (CSAM) in relation to .UK, .Cymru and .Wales domains. Through our Countering Online Harms Innovation Fund and support of the UK Safer Internet Centre, we’ve provided over £2.5m to fund the IWF’s vital work since 2018.
This year’s key data
According to the IWF’s Annual Report for 2023, thousands of images and videos of three- to six-year-old children who have been groomed, coerced and tricked into sexually abusive acts are now being found on the open internet.
The analysis shows how these children are now being targeted by “opportunistic” internet predators who manipulate them into sexual activities. The abuse is directed by perpetrators and often recorded without the child’s knowledge. This so-called “self-generated” child sexual abuse imagery, where the perpetrator is remote from the victim, is then shared far and wide on dedicated child sexual abuse websites.
The IWF is discovering more child sexual abuse imagery online than ever before in its history. In 2023, the IWF found 275,652 webpages containing child sexual abuse – a record-breaking amount. Each webpage can contain thousands of images or videos.
What are we doing about it?
In the past year alone, Nominet’s funding enabled a breakthrough in ‘clustering’ technology. This advance means the IWF can now identify and take down child sexual abuse imagery 112% faster than before.
Clustering works by linking similar images together, so that different images with the same victim can quickly be identified from different parts of the internet, then blocked and removed with the help of tech companies and law enforcement agencies. In this way, analysts can assess criminal images in bulk without having to individually assess hundreds of images.
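The linking step described above can be sketched as grouping images whose perceptual hashes are a small bit-distance apart. Everything below — the hash values, the distance threshold and the function names — is an illustrative assumption for this sketch, not the IWF’s actual implementation:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def cluster(hashes: dict[str, int], max_dist: int = 6) -> list[set[str]]:
    """Union-find grouping: images whose hashes differ by at most
    max_dist bits are treated as copies of the same content."""
    parent = {name: name for name in hashes}

    def find(x: str) -> str:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    items = list(hashes.items())
    for i, (a, ha) in enumerate(items):
        for b, hb in items[i + 1:]:
            if hamming(ha, hb) <= max_dist:
                parent[find(a)] = find(b)  # merge the two clusters

    groups: dict[str, set[str]] = {}
    for name in hashes:
        groups.setdefault(find(name), set()).add(name)
    return list(groups.values())

# Illustrative hashes: img1 and img2 differ by 2 bits (near-duplicates),
# img3 is far from both.
demo = {
    "img1": 0xF0F0F0F0F0F0F0F0,
    "img2": 0xF0F0F0F0F0F0F0F3,
    "img3": 0x0F0F0F0F0F0F0F0F,
}
clusters = cluster(demo)
```

Grouping near-duplicates this way is what lets one assessment cover a whole cluster, rather than each copy being reviewed individually.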
You can find out more about this in conversation with IWF’s CTO, Dan Sexton.
And now, the IWF can immediately identify previously seen images of child sexual abuse, without re-exposing analysts to the criminal material.
Our funding has enabled the IWF to move its internally developed platforms onto a dedicated, high-performance server cluster. Known as a hyperconverged infrastructure (HCI), this solution has given the charity the computing power to run capabilities in real time that it previously could not.
Thanks to the HCI, the IWF can now scrape the live web and compare what it finds against its datasets, enabling known child sexual abuse material to be identified immediately wherever it appears, without human intervention.
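The automated matching step can be sketched as a hash lookup against a dataset of known material. The digests, placeholder bytes and function below are illustrative assumptions; a production system would use perceptual hashing (so re-encoded copies still match) and a hardened datastore rather than an in-memory set:

```python
import hashlib

# Illustrative dataset: digests of previously confirmed material.
# The byte strings are placeholders, not real data.
known_hashes = {
    hashlib.sha256(b"known-item-a").hexdigest(),
    hashlib.sha256(b"known-item-b").hexdigest(),
}

def is_known(content: bytes) -> bool:
    """Return True when the content's digest is already in the dataset,
    so it can be flagged without an analyst having to view it."""
    return hashlib.sha256(content).hexdigest() in known_hashes
```

The point of the lookup is exactly what the paragraph describes: previously seen material is flagged by machine, and only genuinely new content reaches a human analyst.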
The HCI will further enable advanced capabilities using computer vision, which extracts information from images and videos, and machine learning models. As a result, the IWF’s expert analysts can focus on critical classification and intelligence-gathering work.
You can read this news in full in the IWF Annual Report.
Meanwhile, we are extremely active in the wider online harms space. We stepped in to fund the UK Safer Internet Centre’s (UKSIC) work following the UK’s departure from the EU, and will support it until 2025. Formed of Childnet, the Internet Watch Foundation and SWGfL, the three partners work together to identify threats and harms online and then create and deliver critical advice, resources, education and interventions that help keep children, young people and adults safe.
In 2023, we also funded Internet Matters to carry out research into the growing problem of self-generated child sexual abuse material, a problem echoed in the IWF Annual Report. Internet Matters has worked with panels of young people, parents and teachers to co-create effective messaging to dissuade young people from requesting, creating and sharing self-generated CSAM, and to understand how best to deliver this prevention messaging.
The full findings of this research are expected in early May 2024.
We also play our part in making sure the .UK namespace is safe and secure from criminal activity. In 2023, just 0.1% of the child sexual abuse material the IWF reported was traced to a UK server. On the rare occasions it was found in UK hosting space, the IWF worked with the relevant companies to get it deleted fast: in nearly half of all cases (43%), it was removed in two hours or less, and the fastest removal time was just one minute.
What can you do?
The IWF gives the public this advice when making a report at iwf.org.uk/report:
- Do report images and videos of child sexual abuse to the IWF to be removed. Reports to the IWF are anonymous.
- Do provide the exact URL where child sexual abuse images are located.
- Don’t report other harmful content – you can find details of other agencies to report to on the IWF’s website.
- Do report to the police if you are concerned a child may be in immediate danger.
Resources from the National Crime Agency’s CEOP Education programme can help you introduce online safety to young children in an accessible and enjoyable way:
- 4-7s website – introduces key topics of watching videos, sharing pictures, online gaming and chatting online through play and earning online badges.
- Jessie and Friends films and storybooks – watch the animated films and read the storybooks together to help your child build the knowledge, skills and confidence to respond to risks they may encounter online.
Using parental controls and privacy settings can also help you manage your child’s devices, app content and functions. Read CEOP Education’s articles on Using parental controls and Privacy settings: a guide for parents and carers for more tips.
The post #BehindTheScreens: IWF Annual Report findings – and how we’re helping to prevent online harms appeared first on Nominet.
Continue reading...