It’s no surprise that in recent years artificial intelligence has become quite the powerhouse. We have a little robot for everything: from waking us up to reminding us to go to sleep, AI is everywhere, and rumor has it, it’s always listening. Privacy, on the other hand, is a disputed right in many countries, but the fact remains that our entire lives are interconnected with what we do and don’t do on the internet.
The closer AI gets to performing complicated tasks without human supervision, the more powerful it becomes. This raises the question: what is the future of artificial intelligence and privacy?
The Relationship Between AI & Privacy
There is a reason these two have a special relationship: AI has made gathering information remarkably easy. AI collects data quickly and efficiently, and once the data is collected, it can compute over it far faster than human beings can. It can also sort through enormous volumes of data in record time. This efficiency in data collection and processing allows AI to threaten privacy in several ways:
- Exploitation of data
- Voice Recognition and Face Recognition
- Probability Factors
- Tracking and Profiling
- Lack of consent
Let’s discuss these factors in detail.
1. The exploitation of data
Many smart devices come with AI features, and these features compromise the data of the individuals who own the devices. Beyond that, users are generally unaware of how these devices eavesdrop on them, which is another reason it is so hard to trust AI.
2. Voice & Facial Recognition
This is a grave risk to a person’s identity, and we can see how it is slowly becoming an imminent threat. Identity theft will get much easier once voice and facial recognition become tools used by the masses. The fact that governments and companies are already using these technologies, often without the consent of ordinary people, is daunting too. Some countries use facial recognition to identify protestors and further crush public anger towards the government.
3. Probability Factors
AI algorithms can learn to predict information, sensitive or otherwise. The fact that keyboard typing patterns can reveal a person’s nervousness, stress, or confidence is threat enough. If that’s not enough, keep in mind that algorithms can also predict a person’s sexual orientation, ethnic identity, and political views, and can obtain personal data such as location and activity logs.
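To make the typing-pattern claim concrete, here is a minimal sketch of how inter-key timings could be turned into a crude behavioral signal. Everything here is illustrative and assumed: the feature names, the timing values, and the threshold rule are made up for demonstration, and real systems use trained statistical models rather than a hand-picked cutoff.

```python
# Toy sketch of trait inference from typing patterns.
# All numbers, feature names, and the threshold below are
# illustrative assumptions, not a real or validated model.

from statistics import mean, stdev

def keystroke_features(intervals_ms):
    """Summarize inter-key timings (in milliseconds) into simple features."""
    return {
        "mean_interval": mean(intervals_ms),
        "jitter": stdev(intervals_ms),  # irregularity of the typing rhythm
    }

def infer_state(features, jitter_threshold=80.0):
    """Crude rule: highly irregular timing is labeled 'stressed'.
    The threshold is a made-up illustration, not a validated value."""
    return "stressed" if features["jitter"] > jitter_threshold else "calm"

calm_typing = [120, 125, 118, 122, 119, 121]    # steady rhythm
erratic_typing = [90, 300, 60, 410, 80, 350]    # irregular rhythm

print(infer_state(keystroke_features(calm_typing)))     # calm
print(infer_state(keystroke_features(erratic_typing)))  # stressed
```

The unsettling part is how little input this kind of inference needs: timing metadata alone, with no access to what was actually typed.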
4. Tracking and Profiling
AI can also track devices across locations, be it home, work, or a public setting. Even if your data is anonymized, once it becomes part of a large data set AI can easily de-anonymize it. Once that happens, your data is no longer personal: you can be tracked and identified based on it, and in the wrong hands it can lead to very dangerous outcomes. AI also classifies and ranks people based on their data. China’s government uses such a system to limit citizens’ access to credit, housing, and employment, and the citizens have no control over the outcomes. Moreover, China’s scoring system is built on the idea of control of, and submission to, the government.
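To illustrate how de-anonymization can work, here is a minimal sketch of a classic linkage attack: an “anonymized” dataset is joined with a public one on quasi-identifiers such as ZIP code, birth year, and sex. All records, names, and fields below are fabricated for the example.

```python
# Toy linkage attack: re-identifying "anonymized" records by joining
# them with a public dataset on shared quasi-identifiers.
# All data below is fabricated for illustration.

anonymized_health_records = [
    {"zip": "02139", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"zip": "10001", "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
]

public_voter_roll = [
    {"name": "A. Example", "zip": "02139", "birth_year": 1984, "sex": "F"},
    {"name": "B. Sample", "zip": "94105", "birth_year": 1975, "sex": "M"},
]

def deanonymize(records, auxiliary):
    """Match nameless records to named people when quasi-identifiers line up."""
    matches = []
    for rec in records:
        for person in auxiliary:
            if all(rec[k] == person[k] for k in ("zip", "birth_year", "sex")):
                matches.append((person["name"], rec["diagnosis"]))
    return matches

print(deanonymize(anonymized_health_records, public_voter_roll))
# [('A. Example', 'asthma')]
```

Neither dataset contains anything sensitive on its own; the privacy failure emerges only when the two are combined, which is exactly why large aggregated data sets are so dangerous.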
5. Lack of consent
The Cambridge Analytica scandal was a clear indication of how little the giants care about the privacy of their users. If a company can use the data of 87 million people without their consent, only to manipulate the outcome of a country’s election, then very little hope is left. Disregard for consent is a trend many companies follow; users never really know what they are consenting to when they grant access to an app or a device until it’s too late.
Sure, the fast, exponential growth of technology is interesting to witness, and yes, in most cases it is doing mankind a lot of good. However, AI’s relationship with privacy still needs discussion: there are almost no rules and no oversight bodies that can decide where technology should draw a line. There are numerous examples of this. Yes, such technology can be used to demand more accountability from the powerful, but it can just as easily be used to persecute and manipulate the ones demanding accountability.