Online privacy and security — whose responsibility?

Early in the Pokémon Go craze, several articles warned that the game’s Terms & Conditions offered poor protection for the private details of those who downloaded the app, as it granted full access to users’ Google accounts. Clearly the majority of players did not read this policy (or did not care if they did). At last check, the game had over 75 million downloads.

This persistent lack of concern about privacy among digital users worries some experts and regulators, who fear the consequences of corporations (and other unseen entities) being able to track the locations, interests, relationships, purchases and political ideologies of their online communities.

As Michael Geist recently noted, “The use of augmented reality is at a very early stage, but given the massive popularity of Pokémon Go, there is every reason to believe that the technology – and the legal issues that come with it – are here to stay.”

And it’s not just on social media and in video games that people are being lax with their privacy and security. For example: how secure is the average password? (Not very.) Do people store their passwords in secure places? (Not really.) Are we getting better at spotting and rejecting spam and phishing emails? (No.) Are we taking every step to store confidential information securely, rather than transmitting it over public email or open networks? (No.)
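To make the password point concrete, here is a minimal sketch of the naive entropy heuristic many password strength meters start from: multiply the password’s length by the bits contributed by each character class it uses. The `estimate_entropy` function and its thresholds are illustrative assumptions, not any particular product’s algorithm; real checkers also penalize dictionary words and common patterns.

```python
import math
import string

def estimate_entropy(password: str) -> float:
    """Rough strength estimate in bits: length * log2(character pool size).

    This is a deliberately naive heuristic -- it overstates the strength
    of dictionary words like "dragon", which attackers guess first.
    """
    pool = 0
    if any(c.islower() for c in password):
        pool += 26  # lowercase letters
    if any(c.isupper() for c in password):
        pool += 26  # uppercase letters
    if any(c.isdigit() for c in password):
        pool += 10  # digits
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)  # 32 ASCII symbols
    return len(password) * math.log2(pool) if pool else 0.0

# A short all-lowercase password vs. a longer mixed-class one:
print(round(estimate_entropy("dragon")))       # ~28 bits: trivially crackable
print(round(estimate_entropy("Tr0ub4dor&3")))  # ~72 bits: far more work to brute-force
```

Even this crude arithmetic shows why length and variety matter: each extra character multiplies the search space, which is why short, common passwords fall so quickly to automated guessing.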

So, what can we do to make people safer online and still protect their privacy?

One suggestion: stop expecting end users to act as the front line of defence.

We are now seeing a movement away from putting the onus on end users to secure their own online activity, and towards assigning that ethical (and legally enforceable) duty to corporations and employers.

A German consumer advocacy group recently launched legal action against the makers of Pokémon Go, calling on the company to change its Terms & Conditions and privacy policy to better protect consumers. This follows similar actions against internet giants such as Google, Facebook and Amazon to improve their privacy settings.

Companies and post-secondary institutions are also starting to realize that they are fighting a losing battle trying to enforce security rules (don’t click on phishing emails, don’t plug in a USB stick you find on the ground, don’t use your personal devices for work, don’t send files over public services such as Google Drive or Dropbox, and so on). Rather, they are now focusing on building up their back-end security to deal with the inevitability of these things happening.

So now the question is becoming: what steps should organizations take to ensure the safety and privacy of their stakeholders and employees? Is it enough to simply comply with (often outdated) government legislation, or should organizations take the initiative to build new policies and procedures that reflect the actual needs and habits of their communities?

This is an ongoing discussion that will be covered from multiple perspectives at the 2016 Cyber Summit in Banff, AB, on October 27–28. The Summit will also look at the ethical implications of new technologies (including the Internet of Things and Artificial Intelligence), and at how we can prepare for what’s to come.


For more details on the event and how to register to attend, visit: cybera.ca/cybersummit.