World War Internet - The codes of practice and laws that are hitting back at the tech giants
In my last blog I discussed the Online Harms White Paper from the UK government, which seeks to make the UK the safest place to be online. But this is not the only legislation being tabled to take some control back from the tech giants who have been accused of creating the "Wild West" of the internet. So what are the other proposals and pieces of legislation?
The ICO's Age Appropriate Design: a code of practice for online services
When the EU's General Data Protection Regulation was enshrined into UK law, government officials also requested that some specific areas be addressed. One of those areas was to provide online services with a code of practice to be followed when designing services that are aimed at children or could be accessed by children.
The proposal is that any online service that provides a service to a child, or could be accessed by a child, will need to meet the code of practice. It also explicitly addresses online services that claim they are not designed for children but are in reality used by them - think of the social media platforms that often say their services are for over-13s or over-16s only, yet can easily be accessed by younger children simply lying about their age.
These services will need to prove how they prevent children from accessing their services, or comply with the new code of practice.
The code of practice outlines 16 standards of age-appropriate design for online services likely to be accessed by children. The full code is here: https://ico.org.uk/media/about-the-ico/consultations/2614762/age-appropriate-design-code-for-public-consultation.pdf but in summary the main areas are:
- Children should have a high level of privacy by default (a minimal sketch follows after this list). This means turning geolocation off by default and, where there are different privacy settings, having the highest level of privacy on by default. Profiling should also be off by default unless the company can prove that it can protect the child from the potential harms of profiling. Only data that is needed to provide the service should be collected - this should be general practice under the UK Data Protection Act anyway.
- Everything should be in the best interests of the child; this must be the primary consideration when collecting and processing children's data. This includes ensuring the child understands when they may be monitored by parents.
- All privacy information, agreements and explanations should be age appropriate for all the different ages of children that may access the service. It must be transparent to the child what data is being collected about them and how it will be used.
- Services must always take into account the wellbeing of the child, whether that means ensuring the child's data is not used to their detriment or shared without a compelling reason. Services must also not use nudge techniques that encourage a child to reduce their privacy settings.
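To make "high privacy by default" concrete, here is a minimal sketch in Python. The setting names are hypothetical illustrations, not anything taken from the code of practice itself; a real service would map these defaults onto its own controls.

```python
# A minimal sketch of "high privacy by default" for a child's account.
# The setting names here are hypothetical; real services would map these
# defaults onto their own privacy controls.
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    geolocation_enabled: bool = False    # off by default, per the code
    profiling_enabled: bool = False      # off unless harms can be mitigated
    profile_visibility: str = "private"  # highest privacy level by default

# A new child account starts with the most protective settings;
# any relaxation must be an explicit, informed choice later on.
defaults = ChildAccountSettings()
print(defaults)
```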
Whilst the majority of these requirements are just general good practice in data protection, strengthened to take into account the age of the child and their understanding of online privacy, the nudge techniques have become part and parcel of today's popular social media platforms, with children constantly searching for more 'likes'. This may require a fundamental change to the way these platforms are designed, or they will need to enforce some kind of age verification to ensure that children do not use their services. The latter is likely to be an unpopular option for platforms that rely on younger users to maintain user numbers and the associated advertising revenue.
Once again Instagram could lead the way in this area, having experimented with posts where 'likes' remain hidden from everyone but the poster.
Are we likely to see an approach similar to that of some US companies when GDPR came into effect, where EU citizens were effectively barred from accessing those companies' services? EU citizens were identified by their IP address and told in no uncertain terms that the company couldn't guarantee compliance with GDPR and that they therefore weren't welcome.
This could be one approach, although for tech companies looking for global reach, the UK market is not an insignificant one that can easily be ignored.
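For illustration, here is a minimal sketch of how that kind of IP-based blocking works. The country lookup is a stand-in for a real GeoIP database (such as MaxMind's), and the blocked-country list is purely hypothetical.

```python
# A minimal sketch of IP-based geo-blocking, the approach some US sites
# took under GDPR. country_for_ip is a placeholder for a real GeoIP
# lookup; the blocked-country set is illustrative only.
BLOCKED_COUNTRIES = {"GB", "FR", "DE"}  # hypothetical: UK/EU visitors

def country_for_ip(ip: str) -> str:
    """Stand-in for a real GeoIP database lookup."""
    demo_table = {"81.2.69.142": "GB", "8.8.8.8": "US"}
    return demo_table.get(ip, "??")

def handle_request(ip: str) -> str:
    """Refuse service to visitors from blocked countries."""
    if country_for_ip(ip) in BLOCKED_COUNTRIES:
        return "451 Unavailable For Legal Reasons"
    return "200 OK"

print(handle_request("81.2.69.142"))  # blocked
print(handle_request("8.8.8.8"))      # served
```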
Pornography Age Verification
The UK's age verification for pornography aims to prevent under-18s from being able to access commercial pornography. The main aim is to prevent children from 'stumbling' across this kind of material.
The Children's Commissioner's report into online pornography identified that most young people under 11 have never seen porn, and where they have, it has been by accident - through a pop-up advertisement, for example.
But as children get older, they are more likely to have seen pornography: 3 in 10 surveyed children aged 11-12 had seen pornography, rising to 7 in 10 among 15-16 year olds.
The report highlighted concerns that pornography gave young people a false sense of reality and may also encourage them to generate their own content in the form of naked selfies or sexting.
One of the recommendations of the report was to enforce age verification before porn can be viewed online, to prevent young children from accessing it accidentally.
Privacy campaigners have also raised concerns about this system, as adults who want to access online pornography will need to prove that they are over 18. This could lead to commercial providers of pornography storing more private information about the individuals that use their services. Any data breach in this area could be highly embarrassing for those individuals, and potentially harmful if the breach also included details of their browsing habits within the service, potentially revealing sexual preferences. Those who remember the Ashley Madison data breach may think twice about allowing their data to be stored by one of these platforms.
There will, however, be alternative ways of proving age, with tokens purchasable from shops that will need to verify the age of the person buying the token, just as they would for any other age-restricted purchase.
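Here is a minimal sketch of how such a token scheme could work, under the assumption that a shop issues an opaque, signed code after an in-person age check, and the site only verifies the code, never the buyer's identity. The token format and the shared signing key are hypothetical illustrations; a real scheme would more likely use asymmetric signatures so that sites cannot mint tokens themselves.

```python
# A minimal sketch of an anonymous age-verification token, assuming a
# shared signing key between issuer and verifier for simplicity. A real
# scheme would likely use asymmetric signatures. No personal data is
# embedded in the token, which is the privacy benefit of this approach.
import hashlib
import hmac
import secrets

SECRET = b"issuer-signing-key"  # hypothetical key held by the issuer

def issue_token() -> str:
    """Shop side: mint a random code and sign it after an in-person age check."""
    code = secrets.token_hex(8)
    sig = hmac.new(SECRET, code.encode(), hashlib.sha256).hexdigest()[:16]
    return f"{code}.{sig}"

def verify_token(token: str) -> bool:
    """Site side: check the signature; nothing about the buyer is stored."""
    try:
        code, sig = token.split(".")
    except ValueError:
        return False
    expected = hmac.new(SECRET, code.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(sig, expected)

token = issue_token()
print(verify_token(token))        # True: valid token
print(verify_token("bogus.code")) # False: rejected
```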
This system will be regulated by the British Board of Film Classification, which will have the power to demand that UK ISPs block access to platforms that don't enforce age verification.
There have also been concerns that age verification on commercial sites will drive those young people determined to access pornography to the unregulated world of the dark net or to image hosting sites - potentially stumbling across illegal content in their quest to view pornography.
Article 13 of the EU's new copyright rules
This new EU legislation is not about protecting users from harm, so it does not fall within the scope of the UK government's goal of making the UK the safest place to be online. It was, however, fully supported by the UK government and is another way that the world's biggest tech companies are being challenged to make changes to their platforms.
These rules are there to protect copyrighted material and place the onus on the platform to detect and remove it. In the context of a platform like YouTube this could be a huge task, and may result in uploaded material going through some kind of filter to check for copyrighted material - a human moderation approach certainly wouldn't be possible at that scale.
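As a rough illustration of what such an upload filter might look like, here is a minimal sketch using exact fingerprint matching. The fingerprint database is hypothetical, and real systems (YouTube's Content ID, for instance) use far more robust perceptual matching that survives re-encoding and editing.

```python
# A minimal sketch of automated upload filtering via fingerprint matching.
# The database contents are hypothetical; production systems use perceptual
# fingerprints rather than exact hashes, which any edit would defeat.
import hashlib

# Hypothetical fingerprints registered by rights holders.
COPYRIGHT_DB = {hashlib.sha256(b"protected-song-bytes").hexdigest()}

def allow_upload(content: bytes) -> bool:
    """Block the upload if its fingerprint matches a known copyrighted work."""
    fingerprint = hashlib.sha256(content).hexdigest()
    return fingerprint not in COPYRIGHT_DB

print(allow_upload(b"protected-song-bytes"))  # False: filtered out
print(allow_upload(b"original home video"))   # True: allowed
```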
This could also result in EU internet users once again getting different search results from the rest of the world, with only content that is known to be non-copyrighted appearing and viewable. This would follow a similar approach to the one some US firms took over the new EU data protection regulations, seeing it as easier to block EU users from accessing their services than to comply with the new laws.