Last week, 90,000 Mastercard loyalty programme customers in Germany and Austria had their names, addresses and credit card numbers compromised after a possible data breach – but the news isn’t unusual.
In fact, it is “peanuts,” said Thomas Lammer, Principal Market Infrastructure Expert, European Central Bank.
“I think we have become, to some extent, already numb to all the headlines which we are seeing all over the place; every other week or so there’s a small or a bigger leakage of data or a data breach,” Lammer said at this week’s ITU Workshop on Fintech Security.
The workshop discussed privacy threats in Fintech services and best practices for mitigating such risks. Speakers also shared insight into global policy initiatives relevant to Fintech security and efforts to harmonize related regulations across borders.
Recent examples of ‘data spills’ include Equifax and Capital One, in which 147 million and 100 million customers were affected, respectively. But neither made it into the top five biggest data breaches worldwide, according to the BBC.
Trust in financial institutions is key to market stability – a data spill or data breach of a large financial service provider can have real economic effects and potentially cause major market disruption, said Lammer.
“As a Dutch proverb says, ‘trust comes by foot and leaves on horseback’. So once trust has been lost in a payment or electronic payments, or in the worst case in the currency, it’s hard to be regained,” said Lammer.
For the financial services sector, the protection of privacy is a challenging balancing act. The sector requires high cyber resilience but also the flexibility to innovate payment systems and pursue greater financial inclusion. With more data points to identify people and ‘Know Your Customer’, more people could be granted access to formal financial services. But at what cost to privacy?
Experts agreed that discussions around privacy must give equal weight to security: the technical mechanisms required to protect that privacy in practice.
“Privacy rules protect data from abuse from those who collect the data, but security is equally or more important because if you don’t have security, then it just negates all of the other protections. It doesn’t matter if you protected the data from abuse by the person who collects it or the people who process it if it’s hacked,” said Virginia Cram-Martos, CEO of Triangularity Sarl.
Around 120 jurisdictions now have some form of data protection legislation in place. Although the EU’s General Data Protection Regulation (GDPR) is often cited as the strictest privacy legislation internationally, Cram-Martos pointed out that aspects of privacy legislation in the Republic of Korea, for instance, are stricter.
However, these national and regional regulations can be difficult to follow, even for companies operating in the country or region in question, said Cram-Martos.
“There’s not a lot of jurisprudence, even in individual countries, about how different aspects will be interpreted and how to reconcile conflicting or similar but not quite the same requirements across countries,” Cram-Martos said.
But the privacy debate has to go further than data control and mitigating the consequences of a data breach, said Cram-Martos.
“How can we ensure that privacy protection does not become a barrier to trade and services from developing countries that want to sell to countries that have relatively heavy privacy protection rules?” she asked.
These regional differences in privacy regulations can make it difficult for enterprises – especially small and medium-sized enterprises – that want to do business internationally to assess their compliance with the varying rules.
Big Data poses some big questions for regulators, said Cram-Martos.
“What is the impact of privacy legislation on biases in big data? Because it can be shown that if you ask permission to use data, the profile of people who will say ‘no’ is different from the profile of the people who say ‘yes’,” Cram-Martos said. “This introduces kinds of biases that I don’t think anyone has looked at or analyzed yet into big data, which is used for machine learning and artificial intelligence.”
Security experts may be able to quantify such effects, suggested Cram-Martos.
“Organizations like ITU and groups like SG17 are full of experts, and these are the experts who are needed to look for answers,” said Cram-Martos. “And so I’d encourage you to be one of those people looking for answers to some of the questions that I’ve posed.”