Artificial Intelligence (AI) has been termed a “Digital Quake”, a phrase that powerfully illustrates the impact of AI on our lives.
People tend to think that AI does not affect them, yet anyone who uses a smartphone – or even just e-mail – is touched by AI several times a day.
‘Everyone must exercise a duty of care when using digital technologies.’ – Moira de Roche, Chair of IFIP IP3
In South Africa alone, an estimated 20.3 million smartphones are in use – approximately one-third of the population. It is safe to say, then, that a significant portion of any population, in a developing or developed country, is using AI to some extent.
The Fourth Industrial Revolution brings us cyber-physical systems, driven by AI and robotics – these systems connect the physical and the digital.
Not many years ago, if an average user’s or consumer’s device was infected by a virus or breached in some way, only that one device was affected. The likelihood of essential information – such as bank account details – being compromised was small, unless the user handed those details over, after a phishing attack for example.
Not so today, when our homes and devices are interconnected – and it is not just data that “black hat” hackers can steal. It is increasingly common for alarm systems to be connected to a digital device, giving the user the convenience of activating or deactivating alarms remotely. If that system is hacked, however, thieves can do the same.
‘Privacy and security is about more than just strong passwords and virus protection.’
Everyone must exercise a duty of care when using digital technologies. Understanding privacy and security is an essential skill in the 21st century and has become even more critical in the ‘Fourth Industrial Revolution.’
It is astounding that many – probably most – people still use very insecure passwords and lack even basic virus protection on their devices. Perhaps people have become accustomed, at work, to someone else being responsible for the security of information.
But no one else takes care of our personal digital security. I always liken it to keeping yourself and your family safe in the physical world: you do all you can to protect yourself, and only call on the police when things go wrong. You don’t say, “Oh, the police force has a duty of care to keep me safe, so I don’t have to do anything.”
Privacy and security is about more than just strong passwords and virus protection, though. Users should find ways to ensure that their service providers are trustworthy. They can and should expect their governments to pass laws that ensure compliance and accountability; but again, the final duty of care rests with the individual.
IFIP IP3 launched the iDOCED (ifip Duty of Care in Everything Digital) campaign at the end of 2016, and we are passionate about spreading this message. We believe that a duty of care, together with the trustworthy computing expected of IT professionals, will deliver the significant advantages of AI – including its economic benefits – while mitigating any potential harm.
I look forward very much to exploring this further in the discussion on “Artificial Intelligence: impact and ownership” at ITU Telecom World 2018 in Durban this September.