This repository has been archived by the owner on May 6, 2020. It is now read-only.

Can we enable <16s to use matrix.org with parental consent? #190

Open
lampholder opened this issue Jun 1, 2018 · 0 comments

Comments

@lampholder
Member

GDPR says (Article 8, section 2):

The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.

As to what 'reasonable efforts' are, the ICO has this to say:

This varies depending upon the risks inherent in the processing and the
technology that is available.... if your ISS (online service) allows individuals to post personal data via an unmonitored chat room, it becomes more risky to allow a child to participate. You therefore need to adopt more stringent means to verify the consent you’ve obtained. For example, you may decide to use a third party verification service - to verify that the child is old enough to provide their own consent, or to check the identity of the person claiming parental responsibility and confirm the relationship between them and the child.

In terms of reasonable efforts, and in light of the ICO's description above, it seems that a simple checkbox asserting that a <16 user has parental consent might not be sufficient.

If we want <16s to be able to use matrix.org, it's important that they and their 'holder of parental responsibility' understand how matrix.org will process their data. The ICO recommends having two versions of your privacy policy - one for adults and one for children.

What are other companies doing?

  • Microsoft charges adults a small fee to verify themselves, after which they can provide consent for child accounts within their 'Microsoft family'.
  • WhatsApp has a blanket policy like our current one, disallowing <16-year-olds from using the service.

There are three choices:

  1. Ask children to indicate that they have parental consent (either via a checkbox, or by providing contact details for said parent) and make a case for that being 'reasonable' based on the available technology and risk profile
  2. Implement or integrate with a third party service establishing parental consent in a robust, defensible manner
  3. Leave the situation as it is

Options 1 and 2 above would also require us to write another, child-friendly privacy policy (and get legal sign-off on it).
