Safeguarding and Counter Terrorism Conference Report
We recently attended a conference held by the Police Service's Counter Terrorism Internet Referral Unit (CTIRU). The conference was intended to provide an opportunity for a wide variety of organisations involved in online safety to get together and discuss topical issues. There were delegates from online safety suppliers, such as ourselves, as well as from the Home Office, SO15 Counter Terrorism Command, CTIRU, internet service providers, the Internet Watch Foundation (IWF) and the UK Safer Internet Centre (UKSIC) to name a few.
Whilst the event was organised by the counter terrorism branch of the Police Service, all aspects of online safety were covered. We have written this report to keep schools in the loop regarding ongoing developments in the world of online safety.
Opendium provides sector-leading internet filtering and monitoring solutions that enable schools and colleges to implement a truly effective safeguarding strategy. For more information or to book a demonstration, please visit opendium.com/demo
Counter Terrorism
All schools in the UK are required to block access to websites which appear on "the police assessed list of unlawful terrorist content, produced on behalf of the Home Office". This is the block list produced by CTIRU, and the officers of SO15 gave us some good context around it as the event kicked off. These officers are on the front line of the UK's domestic anti-terror operations, and they explained that the block list contains only terrorism-related websites which have been determined to be unlawful but have not yet been taken down at source. CTIRU works to have such content removed entirely and, if successful, the website is then also removed from the block list.
CTIRU announced a number of upcoming improvements to the block list, which is integrated into our Radicalisation filtering category. These included an intention to inject known verification addresses into the list, providing a safe way to verify that a school's systems are correctly blocking access to these harmful websites. In a quick Q&A session we raised some concerns about the inclusion in the block list of certain types of web address which cannot be effectively filtered, and CTIRU committed to following up with us after the event to discuss these concerns more fully.
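As a rough, hypothetical sketch of how a verification address could be used once published: a school or supplier might periodically request the address and confirm that the filter intervenes. The URL, status codes and block-page marker below are placeholders of our own, not anything CTIRU has announced.

```python
# Hypothetical check that a filter is blocking a verification address.
# The URL below is a placeholder - CTIRU had not published real
# verification addresses at the time of writing.
import requests

VERIFICATION_URL = "http://blocklist-check.example/ctiru-test"  # placeholder


def verification_address_is_blocked(url: str, timeout: float = 10.0) -> bool:
    """Return True if the request appears to have been blocked by the filter."""
    try:
        response = requests.get(url, timeout=timeout)
    except requests.RequestException:
        # A dropped or reset connection is one common way filters block traffic.
        return True
    # Many filters serve their own block page with a 4xx status or a
    # recognisable marker in the body; adjust to suit the filter in use.
    return response.status_code in (403, 451) or "blocked" in response.text.lower()


if __name__ == "__main__":
    print("Blocked" if verification_address_is_blocked(VERIFICATION_URL) else "NOT blocked")
```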
Throughout the day we had a number of breakout discussion sessions, with someone from the Home Office or the Police taking notes and moderating the discussion at each table. Alongside ourselves, our table had delegates from E2BN, Impero, Zen Internet, the IWF, CTIRU and the Home Office. As most of us were mainly involved with safeguarding in schools, the discussions were very productive and focussed on this aspect.
We explained that safeguarding children is all about monitoring for warning signs and providing education when necessary, rather than just blocking content. We discussed with CTIRU the possibility of having more access to the information they hold on extremist content which, whilst not illegal, is still a safeguarding concern. Although blocking such legal content may be undesirable, this kind of information is still invaluable for profiling users. This would allow online safety systems to produce earlier warnings about concerning behaviour and thus allow earlier intervention.
We were all in agreement that if a child ever intentionally accesses a website that is on CTIRU's block list, that is a sign that the school's safeguarding is already failing. The more opportunities there are for earlier intervention, the better. Indeed, CTIRU announced that they were already putting together a keyword list to help suppliers to proactively identify extremist content [Note: as of 11th July 2023, this keyword list has not yet come to fruition and the CTIRU have asked us to clarify that this is the current position]. We immediately volunteered to beta test the list, as we will be able to determine important factors such as false positive rates. We also explained that it is important to retain the websites which were once on the block list but have since been removed, as they provide key information about web pages that still contain dead links to the removed content.
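Since the keyword list itself has not materialised, the following is only a minimal sketch, with placeholder keywords of our own, of how a supplier might evaluate such a list against a labelled sample of pages to estimate its false positive rate:

```python
# Minimal sketch: evaluating a hypothetical keyword list against a labelled
# sample of pages to estimate the false positive rate.
from typing import Iterable, List, Tuple

# Placeholder keywords - CTIRU's proposed list had not been published when
# this report was written.
KEYWORDS = ["example keyword one", "example keyword two"]


def flags_page(text: str, keywords: Iterable[str] = KEYWORDS) -> bool:
    """True if any keyword appears in the page text (case-insensitive)."""
    lowered = text.lower()
    return any(keyword.lower() in lowered for keyword in keywords)


def false_positive_rate(samples: List[Tuple[str, bool]]) -> float:
    """samples: (page_text, is_actually_of_concern) pairs from a labelled corpus."""
    benign = [text for text, concerning in samples if not concerning]
    if not benign:
        return 0.0
    false_positives = sum(1 for text in benign if flags_page(text))
    return false_positives / len(benign)
```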
CTIRU is also considering publishing a list on their website of the providers that receive their block list. This is envisaged to help schools find alternative filtering providers, and it certainly can't do any harm, although the provider responses[1] published on the UKSIC website largely provide this information already.
Encryption
Another delegate raised concerns about the possibility of the (currently experimental) DNS-over-HTTPS protocol becoming widespread. This would encrypt DNS traffic and prevent monitoring and filtering systems from inspecting it. For the most part, filtering systems do not need to inspect DNS traffic, so we don't feel that this is a serious concern at the moment.
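To illustrate why DNS-over-HTTPS is opaque to conventional DNS monitoring, here is a short sketch of a lookup via Cloudflare's public DNS-over-HTTPS JSON endpoint (used purely as an example); on the wire it looks like any other HTTPS request to that server, so a filter that relies on watching plain DNS queries sees nothing of the name being resolved:

```python
# A sketch of a DNS-over-HTTPS lookup using Cloudflare's public JSON API.
# To the school network this is indistinguishable from any other HTTPS
# request to cloudflare-dns.com - the queried name never appears in plain
# DNS traffic that a monitoring system could inspect.
import requests


def doh_lookup(name: str, record_type: str = "A") -> list:
    response = requests.get(
        "https://cloudflare-dns.com/dns-query",
        params={"name": name, "type": record_type},
        headers={"Accept": "application/dns-json"},
        timeout=10,
    )
    response.raise_for_status()
    answers = response.json().get("Answer", [])
    return [answer["data"] for answer in answers]


if __name__ == "__main__":
    print(doh_lookup("example.com"))
```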
However, HTTPS is becoming a big problem for schools and we raised this as an important point for discussion. Schools usually install a certificate onto their devices in order to decrypt and examine HTTPS traffic. In Bring Your Own Device networks, users are usually expected to install such a certificate on their own device in order to use the network. Without this capability, there are serious safeguarding problems, such as:
- concerning web searches cannot be reported and alerted upon;
- dynamic content filtering based on the content being served rather than just the website's address becomes impossible;
- only the hostname of the website is visible to online safety systems, so it becomes impossible to allow one part of a website whilst blocking another (e.g. allowing the BBC website whilst specifically blocking https://www.bbc.co.uk/cbbc/games during lessons) - see the sketch after this list;
- auditing capabilities become reduced - e.g. if the school is banned from Wikipedia because a user is defacing articles, it may be impossible to determine which user is responsible and which users were legitimately using Wikipedia for research purposes;
- it becomes impossible to filter or audit access to some websites due to HTTP/2.0's connection reuse capabilities;
- virus and malware filtering at the network border becomes impossible;
- around 79% of the content on the CTIRU filtering list cannot be blocked;[2] and
- up to 92% of the content on the IWF Child Abuse Image Content list may not be blocked.[3]
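As a toy illustration of the hostname-only problem above: assuming a filter that, without decryption, can see only the server name presented during the TLS handshake (the SNI field), a per-path rule such as the CBBC games example simply cannot be expressed.

```python
# Toy illustration of the visibility problem: without HTTPS decryption a
# filter only sees the server name (e.g. from the TLS SNI field), not the
# full URL, so per-path rules cannot be applied.
from urllib.parse import urlsplit

# Hypothetical lesson-time rule: allow the BBC site, block CBBC games.
BLOCKED_PATH_PREFIXES = {"www.bbc.co.uk": ["/cbbc/games"]}

# Blocking www.bbc.co.uk entirely would be too broad, so nothing goes here.
BLOCKED_HOSTS = set()


def decision_with_decryption(url: str) -> str:
    parts = urlsplit(url)
    for prefix in BLOCKED_PATH_PREFIXES.get(parts.hostname, []):
        if parts.path.startswith(prefix):
            return "block"
    return "allow"


def decision_without_decryption(sni_hostname: str) -> str:
    # Only the hostname is visible, so the choice is all-or-nothing per site;
    # the per-path rule above cannot be expressed.
    return "block" if sni_hostname in BLOCKED_HOSTS else "allow"


print(decision_with_decryption("https://www.bbc.co.uk/cbbc/games"))  # block
print(decision_with_decryption("https://www.bbc.co.uk/news"))        # allow
print(decision_without_decryption("www.bbc.co.uk"))                  # allow - games slip through
```

Real filtering products are of course far more involved than this, but the underlying limitation is the same: without decryption, everything after the hostname is invisible.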
Unfortunately, many popular providers such as Facebook and Twitter have made a conscious decision to disallow this kind of inspection of their mobile apps, and Google have designed recent Android devices to be actively hostile to these technologies.[4]
All online safety suppliers are being affected by these problems, but the problems aren't being widely discussed in public. The feeling at the conference was that no one wanted to be the first vendor to stand up in public and say "we have a big problem". Well, that's what we're doing - we're standing up and announcing that all of the online safety vendors, us included, are facing a huge challenge brought about by some of the biggest online businesses.
The Home Office themselves, as well as big suppliers such as RM, echoed our own experience with Google: discussions are fairly constructive with the UK side of Google, and then it all goes quiet as soon as the US side gets involved.
To be clear, Google's actions offer only a marginal increase in the security of home users, whilst being tremendously harmful to the security and safety of students, schools and businesses.
The IWF suggested that they may be able to act as a facilitator to reopen discussions with Google, and we will be exploring that opportunity. However, in our view, it is likely that legislation will eventually be required to force the issue - at the very least to ensure that schools do not purchase devices that cannot be adequately safeguarded.
Guidance
A big concern that repeatedly came up in the breakout discussions was the rather woolly guidance from the Department for Education when it comes to online safety. Indeed, Keeping Children Safe in Education[5] distils the huge subject of online safety down to just three pages: the infamous "Annex C".
The UK Safer Internet Centre has added some additional guidance, but fundamentally, phrases such as "to what extent does the filter system block inappropriate content via mobile and app technologies"[6] and "the school should carefully consider how [3G and 4G internet access] is managed on their premises"[7] are not terribly helpful. We pushed the point that schools really do need better advice.
There are myriad apps that simply cannot be monitored or filtered by any system. How should a school handle those? Block them and watch the students simply use them anyway over the mobile network (losing the ability to monitor their other traffic), or allow them and worry about the school's liability were a safeguarding issue to occur over its Wi-Fi?
Fundamentally, the current guidance asks schools to assess the risks in order to determine what is appropriate in terms of filtering and monitoring. A school that has a big bullying problem may need to invest in thorough electronic monitoring systems, whilst another school may decide that the risk is low and rely on staff visually monitoring students' computer use. This risk-based approach seems right to us, but without improved guidance we feel that schools often lack the knowledge needed to make informed decisions.
There was also a consensus that online safety in schools is often offloaded to an ICT department that is ill-equipped to handle it, instead of being handled by the trained safeguarding staff. David Wright of the Safer Internet Centre made the point perfectly: having the ICT department handle online safety "because they understand the internet" is like sending a child who brings drugs to school to the chemistry department "because they understand chemical compounds". We felt that we could help schools an awful lot better if we were directly involved with their safeguarding staff on online safety matters, rather than ICT technicians being expected to make the decisions.
Thank You
Finally, we'd like to say a big thank you to the CTIRU and the Home Office for organising the conference, to everyone who gave talks - the Police, Sky, the IWF and UKSIC - and to all the delegates who turned up and participated in very enlightening and frank discussions.
Footnotes
1. UK Safer Internet Centre, Appropriate Filtering and Monitoring, Provider Responses - https://www.saferinternet.org.uk/advice-centre/teachers-and-professionals/appropriate-filtering-and-monitoring/provider-responses-0
2. The July 2018 edition of the CTIRU filtering list contains 12931 URIs. Of these, 2654 are HTTP, 10257 are HTTPS and 20 do not include a scheme, so could be either HTTP or HTTPS. Of the 20 URIs without schemes, 18 include a path or query string. Of the 10257 HTTPS URIs, 10235 include a path, query string or fragment. Therefore, HTTPS decryption is required to block access to between 10235 and 10253 of the URIs, or just over 79% of the content in the CTIRU filtering list.
3. On 30th July 2018, the IWF CAIC list contained 7345 URIs. The IWF don't include a scheme, so we don't know how many of the URIs are HTTPS; however, 6783 of them include a path. Therefore, HTTPS decryption may be required to block access to up to 6783 of the URIs, or just over 92% of the content in the IWF CAIC list. Additionally, HTTPS decryption is required in order to perform real time content filtering of HTTPS websites using the IWF's keyword list.
4. Android Developers Blog, Changes to Trusted Certificate Authorities in Android Nougat, 7 July 2016 - https://android-developers.googleblog.com/2016/07/changes-to-trusted-certificate.html
5. Department for Education, Keeping Children Safe in Education: for Schools and Colleges, 19 September 2018 - https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/741314/Keeping_Children_Safe_in_Education__3_September_2018_14.09.18.pdf
6. UK Safer Internet Centre, Appropriate Filtering, Filtering System Features - https://www.saferinternet.org.uk/advice-centre/teachers-and-school-staff/appropriate-filtering-and-monitoring/appropriate-filtering
7. Department for Education, Keeping Children Safe in Education: for Schools and Colleges, 19 September 2018, Annex C, Filters and Monitoring - https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/741314/Keeping_Children_Safe_in_Education__3_September_2018_14.09.18.pdf