We often say in our texts that we will do things “without compromising privacy” or the like. So what is the idea behind that?
The truth is: it is probably impossible to create a system that serves both public and personal benefit and is 100% privacy-preserving. But we are aiming for the highest standards and will make sure that it
- gives users control over who accesses their personal information,
- provides transparency about what personal data exists,
- ensures that only data that cannot be de-anonymised leaves the user’s property,
- allows trusted services to use their users’ personal data, but under the rules of privact,
- is a trusted partner with no conflict of interest in governing the system.
This requires both technological and organisational measures. Perhaps the easiest way to understand this is to look at the two main use cases: services that have an individual agreement with the user, and surveys run against the data pool.
The term “service” is used here in a broad sense: it could be an online shop, your personal health coach, or a website that wants to serve you an ad. Each service must apply to privact to join our ecosystem. In doing so, the service must accept our terms and conditions. In return, it will be allowed and enabled to store and process data in the personal database.
It is important to recognise that there is an individual agreement between the service provider and the user. The user has granted the service exclusive rights to some or all of the personal data, e.g. as part of that company’s terms and conditions. This would typically happen when the user installs an application that requires certain data to function.
There are few restrictions on what a service can do with the data, but because the service is privact-compliant, everything is governed by privact’s rules. These rules include not storing any of the user’s data within the organisation unless it is necessary for the service to function, such as transaction data or the delivery address. They also prohibit combining data from the privact data pool with other data sources, to prevent fingerprinting and similar techniques. In some cases the service may transfer data to the service provider or even to third parties; such transfers are also covered by the privacy rules and require explicit user consent for each transfer.
In essence, someone needs to check that services are compliant and do so on an ongoing basis. Policies also need to evolve and adapt as new services, innovations and best practices emerge.
This auditing process involves much more than just reviewing the software code. It also means auditing how legitimately held data is handled and ensuring that it is not improperly combined with data from other sources in ways that could compromise privacy.
None of this is 100% secure. The system can be abused and privacy can be violated; some trust is still required. But we have turned the tables: services have to comply with privact’s terms and conditions, and if something looks suspicious, we have the right to investigate, ban the service, or even take it to court.
Research and innovation often require the analysis of large amounts of data. In our framework, there will be a very large pool of high quality personal data that can bring many benefits to society.
It is important to recognise that there is no individual agreement between the survey issuer and the user. Therefore, access to personal data has to be managed differently.
Organisations that comply with privact are given limited access to personal data. privact calls this a “donation of personal data”, although no personal data is ever transferred.
To be compliant, a survey must respect the privacy settings of all users (e.g. if someone does not want to disclose their religion or sexual orientation, this must be respected) as well as all other privact rules, such as disclosing carbon-neutrality plans (for entities other than individuals). The results are only available via a proxy server provided by privact. privact in turn ensures that the results of the survey are fully anonymised and only used for statistical purposes, and it does not store any personal data beyond the time needed to compile the required statistics. Once the data is anonymised in a way that cannot be reversed, it is no longer personal data (even under the GDPR) and can be published.
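One common way to guarantee that published statistics cannot be traced back to individuals is to suppress any group that is too small. The sketch below shows a minimal, hypothetical check of this kind that a proxy could apply before releasing counts; the threshold value and the function names are illustrative assumptions, not privact’s actual specification.

```python
from collections import Counter

# Assumed minimum group size: counts below this are suppressed so that
# no individual can be singled out from the published statistics.
K = 5

def publishable_counts(responses, k=K):
    """Aggregate raw responses into counts, dropping groups smaller than k."""
    counts = Counter(responses)
    return {group: n for group, n in counts.items() if n >= k}

responses = ["EU"] * 7 + ["US"] * 3  # 7 answers from EU users, 3 from US users
print(publishable_counts(responses))  # {'EU': 7} — the US group (3 < 5) is suppressed
```

Once the raw `responses` list is deleted after aggregation, only the suppressed, aggregate counts remain, which is what allows the results to be published.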
How will this work? Organisations can register with the NGO to get access to the data pool. In return, they have to disclose certain information, such as when they plan to become carbon neutral. Again, it is up to the NGO (and our partner NGOs) to verify this information.
Once registered, the organisation can create a survey. It must do so through a service that we provide: issuers cannot freely code against the data pool, but are limited to what privact allows. Depending on the confidentiality of the data accessed, the survey will either be published immediately or will first have to be reviewed.
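Limiting issuers to what privact allows could be done by making surveys purely declarative: the issuer describes what data and which aggregation it wants, and a validator rejects anything outside the allowed vocabulary. The sketch below is a hypothetical illustration of this idea; every field name and operation in it is an assumption, not privact’s real survey format.

```python
# Hypothetical declarative survey definition: the issuer never runs code
# against the data pool, only submits a description like this.
survey_definition = {
    "id": "commute-2024",
    "title": "Commuting habits by region",
    "fields": ["region", "commute_mode"],  # data requested from users
    "aggregation": "count_by",             # must be a whitelisted operation
}

# Assumed whitelist of aggregations privact might permit.
ALLOWED_AGGREGATIONS = {"count_by", "mean", "histogram"}

def validate(definition):
    """Accept a survey only if it uses a whitelisted aggregation."""
    return definition["aggregation"] in ALLOWED_AGGREGATIONS

print(validate(survey_definition))  # True
print(validate({"aggregation": "raw_export"}))  # False: not whitelisted
```

A declarative format like this also makes the review step tractable: a reviewer inspects a small description rather than arbitrary code.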
Publishing the survey means that we host a public list of running surveys. Each user’s device periodically checks whether a survey is available that matches the user’s privacy settings. If so, it adds the necessary user data to a database hosted by privact. This has to be done as securely as possible, ideally using zero-knowledge techniques. The results will only be published when they are statistical in nature and can no longer be de-anonymised. Only the results will be published, never the dataset itself; the dataset will be destroyed when the results are published.
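The device-side matching step can be sketched in a few lines: fetch the public survey list, keep only surveys that request no field the user has blocked, and copy out exactly the requested fields for those surveys. The function names, field names, and data shapes below are illustrative assumptions about how such a client could work, not privact’s actual API.

```python
def collect_responses(surveys, settings, personal_data):
    """Build responses for compatible surveys; blocked fields never leave the device."""
    responses = []
    for survey in surveys:
        requested = set(survey["fields"])
        if requested & settings["blocked_fields"]:
            continue  # survey asks for something the user opted out of — skip it
        responses.append({
            "survey_id": survey["id"],
            # only the requested, permitted fields are ever copied out
            "data": {k: personal_data[k] for k in requested if k in personal_data},
        })
    return responses

surveys = [
    {"id": "s1", "fields": ["age_bracket", "region"]},
    {"id": "s2", "fields": ["age_bracket", "religion"]},
]
settings = {"blocked_fields": {"religion"}}
personal_data = {"age_bracket": "30-39", "region": "EU", "religion": "none"}

responses = collect_responses(surveys, settings, personal_data)
# Only survey s1 is answered; s2 requests a blocked field and is ignored,
# and the "religion" value never appears in any response.
```

Because the filter runs on the user’s device, a blocked field is never transmitted at all, rather than being removed after the fact.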
This will not offer researchers the same analytical flexibility they currently have with their own datasets: if they want to look at a sub-population in more detail, they have to create a new survey. This is acceptable, because our asset is the large, high-quality and easily accessible data pool, and the restriction allows us to guarantee that no survey can be de-anonymised. We will encourage research into new evaluation methods to extend the possibilities for researchers, but the privacy of our data owners must always come first.
At the end of the day, it takes some trust in privact to do things right.
privact’s role is to provide a trusted marketplace for personal data. As the system relies on privact to manage the compliance of market participants, it is crucial that trust in privact is rock solid, and that this trust is well founded. Transparency and auditing of privact itself will be essential to build this trust.