Canada is one of many nations using (or attempting to use) contact tracing as a way to combat the COVID-19 pandemic. Earlier in the year, Ontario launched a health data platform called the PANTHR system, an AI system intended to collate de-identified medical data to help track the pandemic.
At the time, many wondered whether this focus on contact tracing would expand into an app similar to the one used in South Korea; it now seems those predictions were correct. The question on everyone’s lips, then, is: where does the tracing stop?
Contact Tracing App and the Privacy of Canadian Citizens
Contact tracing apps are designed to map the spread of COVID-19 and catch cases before a person (often unknowingly) spreads the virus further.
They work by tracking symptoms through questionnaires in the app, then checking whether mobile devices belonging to people with symptoms have come into close proximity with other devices running the app, giving the owners of those devices a picture of their risk of exposure.
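The proximity-matching step above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical model — real apps use rotating Bluetooth identifiers and on-device matching, and the class names and 15-minute contact threshold here are illustrative assumptions, not the actual app’s design:

```python
from datetime import datetime, timedelta

# Assumed threshold for what counts as a "close contact" (illustrative only).
CONTACT_WINDOW = timedelta(minutes=15)

class Device:
    """A phone running the contact tracing app (simplified model)."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        # Each sighting records another app-running device seen nearby:
        # (other_device_id, start_time, end_time)
        self.sightings = []

    def record_sighting(self, other_id: str, start: datetime, end: datetime):
        self.sightings.append((other_id, start, end))

    def exposure_risk(self, symptomatic_ids: set) -> set:
        """Return IDs of symptomatic devices this device was near long enough."""
        exposed_to = set()
        for other_id, start, end in self.sightings:
            if other_id in symptomatic_ids and (end - start) >= CONTACT_WINDOW:
                exposed_to.add(other_id)
        return exposed_to
```

In this sketch, a device only learns which reported contacts exceeded the duration threshold; brief passing encounters are ignored, which is one way such apps limit how much movement data is meaningful.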
Naturally, this requires access to ambient personal data in the form of tracking users’ locations and movements. Given the medical urgency of COVID-19, many are happy to help the health sector in its colossal task by sharing this information, but there has already been an undercurrent of concern about who could access the data and how they might use it.
Canada’s Commitments to Privacy Limits
Though the concern is merited, the government isn’t blind to the danger and, if anything, sounds concerned itself, stating: “Some of these measures will have significant implications for privacy and other fundamental rights”.
In response, the Office of the Privacy Commissioner released a joint statement with its provincial and territorial counterparts, outlining the use, limits, and precautions taken during the rollout of the contact tracing app.
The statement is titled ‘Supporting public health, building public trust: Privacy principles for contact tracing and similar apps’, so let’s delve into it and see how they intend to balance the needs of the COVID-19 emergency with privacy.
Both technically and legally, the commission promised strong, appropriate safeguard standards. These will be imposed on both the government itself and any outside developers, ensuring they are legally liable if they mishandle the data.
The only appropriate use is for the current medical emergency; anything beyond that is strictly forbidden and made technically difficult. On top of this, the authorities handling the data have been given guidelines so they are aware of the threats to the data in their hands.
With both the risk and the public hesitation, constant open dialogue is important to allay fears and ensure transparency. To this end the privacy commission has promised to be accountable, publicly publishing a regular monitoring and evaluation plan that covers both the medical efficacy and the usage of data.
Added to this is oversight by a third party, likely from the privacy commission itself, which monitors the program but has no influence over its workings.
These oversight activities could take a number of forms, such as independent audits, but all ensure a high degree of accountability and the immediate highlighting of issues.
Furthermore, the privacy commission recommended that, should any failures of the app be noticed, it should be “decommissioned and any personal information collected should be destroyed”.
Not only will the app’s monthly audits be published, but the inner workings of the third-party auditors will themselves be open to public scrutiny and freely available.
This includes info on storage, use, retention, and destruction of data; so that the public can be informed from start to finish of the app’s inner workings.
To make sense of the data, plain-language, understandable digests called Privacy Impact Assessments (PIAs) will be published frequently.
So that data can’t be kept or monitored indefinitely, the commission has instructed that all information collected by the app be destroyed at the end of the COVID-19 crisis, and the app itself decommissioned.
Much like the preceding and concurrent PANTHR system, wherever possible the app will use de-identified data, meaning it cannot (easily) be correlated to the person it comes from.
That said, the commission went on to say it will put systems in place to lessen the risk of re-identifying de-identified data, implying users should still be wary and look for ways to strengthen their online privacy.
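To see why de-identified data still carries re-identification risk, consider this minimal sketch of one common de-identification technique: replacing a direct identifier with a keyed (salted) hash. This is an illustrative assumption, not the PANTHR system’s actual method, and the field names are hypothetical:

```python
import hashlib
import secrets

# Secret salt held by the data custodian; without it, the original
# identifier cannot be recovered from the pseudonym by brute force.
SALT = secrets.token_bytes(16)

def de_identify(health_card_number: str) -> str:
    """Replace a direct identifier with a stable pseudonym (hex digest)."""
    return hashlib.sha256(SALT + health_card_number.encode()).hexdigest()

# A de-identified record: the direct identifier is gone, but
# quasi-identifiers remain for public-health analysis.
record = {
    "patient": de_identify("1234-567-890"),  # hypothetical identifier
    "postal_area": "M5V",                    # coarse location
    "symptom_onset": "2020-05-01",
}
# The pseudonym is stable, so one person's records can be linked over time;
# that is useful for tracking the pandemic, but the remaining combination of
# quasi-identifiers (area, dates) is exactly what enables re-identification.
```

The design tension is visible in the comments: a stable pseudonym makes the data medically useful, while the surviving quasi-identifiers are what a determined adversary could cross-reference with other datasets — hence the commission’s caution.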
The data gathered has a limited remit, only being used for the public health sector, and then only within the context of COVID-19.
The amount of data collected should be strictly limited to only what is absolutely necessary, said the privacy commissioner.
All data must be chosen not only with the intent of minimal intrusion, but alongside a report that publicly communicates the government’s rationale for the type of information being gathered.
Necessity and Proportionality
As the previous points make very clear, the intention is to make sure the data gathered is strictly that which is necessary, with nothing gathered that isn’t relevant to the crisis.
Everything the government uses has to go through a rigorous qualification process; the commission uses phrases such as “science-based”, “evidence-based”, and “rationally connected”.
One of the main focuses of this is to always have a pre-planned reason as to why data needs to be gathered. The commission offers a series of rhetorical questions for the app managers to always ask when they decide to collect data, such as:
- “Is the purpose to notify users and advise them to take certain actions?”
- “Is it to assist public health authorities to better understand local conditions for resource allocation purposes?”
- “Is it for another purpose?”
If the data gathered is pre-planned to be usable only for a specific purpose, then the privacy commission hopes it cannot be misused and will have built-in proportionality. Without the ability even to attempt misuse, the focus will remain squarely on effectiveness.
Everything proposed must not only be carried out at the action level but enshrined legally and contractually, says the commission.
This would lay out the specific remit of the app’s usage, and theoretically allow legal recourse if these boundaries were overstepped.
Consent and Trust
The final point is consent. The app is not intended to be a ‘for your own good’ instance of a nanny state, but rather a cooperative effort between citizens and government.
Whether this will hinder the effectiveness of the app remains to be seen: in the UK, such voluntary measures have met with obstacles, whereas in South Korea the effort has been a major success.
It will rely on people’s trust in government to work, and therefore on the track record of the government itself; the privacy commission, at least, has proven its worth in fighting Facebook over privacy violations.
So far, the app and its corresponding Privacy Impact Assessments have only been fully rolled out in Alberta. How effective it will be and how it will affect the privacy of citizens will depend on how well the privacy commissioner’s guidelines are followed.
There is an obvious reason for caution, and now might be as good a time as any to get a good VPN, but the privacy commissioner’s guidelines are forward-thinking and cautious, always mindful of the online security of Canadian citizens.
If properly implemented, this app can combat COVID-19 without sacrificing the nation’s digital autonomy.
Hi, I’m Ludovic. I created this site as a consumer resource to help fellow Canadians better understand the changing world of cybersecurity. Before creating this resource I saw two fundamental problems with the B2B consumer privacy industry. First, education – the majority of people don’t realize the importance of their own data. Second, nefarious marketing practices – there are a wide array of self-proclaimed security solutions that are doing nothing other than brokering user data without consent.