Peter Coroneos: Cambridge Analytica, Facebook and Who Controls Your Data

CyAN

By Peter Coroneos*, CyAN Head of APAC Region

– The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of CyAN –

We should not be surprised by the revelation that Facebook facilitated massive privacy breaches through harvesting users’ data or by turning a blind eye to the practices of the app developers they supported. Their model is based on encouraging users to reveal vast amounts of personal data then mining that data for profit. They talk about their mission of promoting more transparency in society through the sharing of information, but there is very little transparency in how they capture and monetise our information.

Recent shock admissions by former senior Facebook executives that they knowingly exploited the dopamine response, using the Facebook “Like” feature to trigger and addict users’ neurological reward systems, represent one of the most pernicious examples of applying psychology for mass exploitation. It’s getting very close to the drug dealer’s modus operandi.

The ethics of these practices are yet to be accounted for, but I imagine more will soon come to light as we once again grapple with the perils of the Information Age, and with whether the utility of social media is worth the trade-off: the loss of sovereignty over our personal information.

The EU General Data Protection Regulation (GDPR), which takes effect in May, comes on the heels of Australia’s Mandatory Data Breach Reporting regime, which started a month ago. Some say these measures are too little, too late. Others complain they represent a massive impost on business and slant the playing field against legitimate access to information.

I certainly believe that in an age of cybercrime, it’s not good enough to hide behind purely commercial arguments to justify mass data collection. The breaches we’ve seen over the last few years should put everyone on notice that whatever you collect is a honeypot for the bad guys, even if your own intentions are relatively benign.

Back in 1998, as the industry association for the Internet, we wrote to the then Prime Minister urging him to consider extending the Privacy Act to cover the private sector. We did so in the belief that abuse of privacy would undermine confidence in the nascent digital economy. We saw an opportunity for Australia to be better than other nations, to be reputationally known as a safer place to do digital business, the “Swiss bank” of the international e-commerce community.

The subsequent reforms to the Act did indeed occur, once businesses came to realise that good privacy is good for business.

Regrettably, not all cultures are so respectful of privacy, and the US example stands out. This is not to say that people are less concerned there, but the powerful interests that stand behind the doctrine of “commercial free speech” are unrelenting in the assertion of their right to extract commercial advantage where they can.

The dilemma for Australians is that the large social media and search platforms we have come to rely upon are largely US based and follow US law. For many practical reasons, they argue they cannot comply with everyone’s privacy laws, so we are left with the lowest common denominator.

The impacts go beyond economic exploitation. We have seen the manipulation of the democratic process itself by mining and aggregating vast amounts of personal data using advanced machine learning to profile our thoughts, opinions and attitudes. Campaigns based on misinformation are skewing democratic outcomes. We should be very concerned about this.

Political parties do this and they hire very skilled technologists to help them. The Cambridge Analytica debacle is the most high profile example, but there will be more.

As a technology industry leader, I was often asked to comment on the interplay between technology and society. Observing the impacts, I became more and more convinced that just because technology can enable certain processes to be done does not mean that they should be done.

We are seeing a parallel in the use of artificial intelligence for autonomous warfare with grave concerns raised by many of the researchers whose work has given rise to the use and misuse of drone technology. It seems like we are entering a new era of the genie escaping the bottle.

However this dilemma is not new. The early creators of nuclear technology voiced similar misgivings about the future application of their work, and were horrified to see the consequences of the first atomic bombs dropped on civilian populations.

So where do we go from here? Should we get off all social media? Should we demand more rights and more transparency in how our personal information is collected and used by foreign-based organisations? Should we examine our own practices as companies to build a bridge of trust to our customers?

I recommend we take a hard look at our own values and begin in earnest the discussion about what kind of society we want to become. It’s hard to understand the trade-offs we are unconsciously making precisely because no one, not even the creators of technology, can predict future applications. So instead we need to set parameters on collection and use. The Privacy Act does that. How well do you understand the law and the principles it is based on? Are you having that discussion internally and with your customers? Are you seeing trust as fundamental not only to business, but to society? Are you prepared to step up and proclaim a commitment to respecting privacy as a basic human right? Are you prepared to be held accountable to that standard?

It’s time for us all to evaluate these questions and stake out a position. Our customers and our citizens will demand that of us. It’s time to examine our own consciences. Perhaps it’s not too late to bring a better alignment between human values and corporate practices. Technology is, after all, neutral. It’s what we do with it that matters.


* As CEO of the Internet Industry Association from 1997-2011, Peter Coroneos championed best practice across a range of issues, from privacy to cyber security to child protection. He served six years alongside the current privacy commissioner, Timothy Pilgrim, on the Privacy Advisory Committee, a ministerial panel which advised government and industry on emerging social and technological threats to privacy. Peter was a prime mover in the passage of Australia’s anti-spam laws, heralded as the strongest in the world, which removed us from the top 20 list of spamming nations. He also helped secure changes to the Privacy Act which brought the private sector under its remit. He twice represented Australian industry at APEC on standards for privacy protection throughout the Asia Pacific. Based in Sydney, he is now CEO of Icon Cyber (iconcyber.com) and Regional Head for the global Cybersecurity Advisors Network (CyAN).