Thursday / March 23.

Can an algorithm judge your character?

Algocracy: governance through algorithms

By Muhammad Aurangzeb Ahmad

In the late 17th century, Gottfried Leibniz conceived of a machine that could be used to settle arguments: instead of arguing, people would simply resolve their disputes by saying “let us calculate.” On closer inspection, this idea bears an uncanny resemblance to deciding disputes by delegating the decisions to algorithms. This is no longer the realm of science fiction, as algorithms not only already make decisions on our behalf but also make biased decisions on our behalf.

Welcome to the world of Algocracy, which refers to a system of governance based on rule by algorithms.

Where does the allure of Algocracy come from? What Algocracy offers us is an “opportunity” to absolve ourselves of moral responsibility by outsourcing it to machines, a point raised multiple times by the philosopher Evan Selinger.

The problem of Algocracy was brought to the fore recently when reporters from ProPublica conducted an investigative analysis of a prisoner risk-scoring program and determined that it was biased against black people. Consider two people with the same criminal record, one black and one white: COMPAS, a commercial tool employed by law-enforcement agencies, would assign the black person a higher risk score, resulting in tougher convictions and longer sentences for black people. ProPublica found a large number of cases in which a non-black person with a lower risk score went on to commit more crimes while the black person did not commit any. Even Eric Holder weighed in on this debate, cautioning that such scoring systems bias the system against certain minority groups. One implication is that algorithms already have much say in how our society is run. Given the proliferation of big data, the role of algorithmic governance is only going to get bigger, not smaller. We are already living under an Algocracy; it is just not evenly distributed yet.
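The mechanics behind ProPublica’s finding can be illustrated with a deliberately simplified, hypothetical scorer (the weights, features, and ZIP codes below are invented; COMPAS’s actual model is proprietary). The point is that once any input correlates with race, such as neighborhood in a segregated city, two people with identical criminal records can receive different scores:

```python
def risk_score(priors, age, zip_code, high_risk_zips):
    """Toy linear risk score (invented weights; NOT the real COMPAS model).

    Prior offenses and age look like legitimate inputs, but the ZIP-code
    feature can act as a proxy for race, so two people with identical
    criminal records end up with different scores.
    """
    score = 2 * priors + (1 if age < 25 else 0)
    if zip_code in high_risk_zips:  # proxy feature correlated with race
        score += 3
    return score

# Identical criminal records, different neighborhoods:
high_risk = {"60624"}  # hypothetical "high-crime" ZIP codes
a = risk_score(priors=1, age=30, zip_code="60624", high_risk_zips=high_risk)
b = risk_score(priors=1, age=30, zip_code="60614", high_risk_zips=high_risk)
print(a, b)  # prints "5 2": a > b despite identical records
```

Nothing in the code mentions race, which is exactly what makes it easy to point to the “impartiality” of the formula while it reproduces a biased outcome.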


The use of algorithms to determine preconceptions

While much social progress has been made in the US since the end of the Jim Crow era and the civil rights movement, institutional racism is much harder to eradicate. With laws on the books, one can point to the individual people or groups who drafted them, but with algorithms one can absolve oneself of responsibility and point to the algorithms’ alleged impartiality. Even if we assume that the algorithms themselves can be unbiased, at the very least the data fed to them can introduce bias. I have argued elsewhere that the data fed into algorithms can make them take a certain political ‘stance.’

To drive home this point, consider what Google’s suggest function returns when one searches for information about different ethnic and religious groups. Notice that the terms associated with white people are neutral, but that is not the case when searching for black people or Muslims.

Now, it is not that Google or other search engines are biased against certain groups; rather, the suggestions are based on what users search for. The bias shown by the search algorithms is actually the bias of the people using the search engine.
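The mechanism is easy to sketch. A minimal, hypothetical autocomplete that ranks completions purely by how often users have typed them will faithfully reproduce whatever queries dominate the log (the query log below is invented for illustration; real autocomplete systems are far more sophisticated, but popularity remains a core signal):

```python
from collections import Counter

def build_suggester(query_log):
    """Return a function mapping a prefix to its most frequent completions.

    Suggestions are ranked purely by query popularity, so whatever bias
    exists in the user population is reproduced verbatim.
    """
    counts = Counter(query_log)

    def suggest(prefix, k=3):
        matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda qn: -qn[1])  # most popular first
        return [q for q, _ in matches[:k]]

    return suggest

# A tiny invented query log: the loaded query dominates by sheer volume.
log = (["white people are tall"] * 2
       + ["white people are nice"] * 3
       + ["black people are lazy"] * 5
       + ["black people are athletes"] * 2)

suggest = build_suggester(log)
print(suggest("black people are"))  # the most-typed (biased) query ranks first
```

The algorithm is a neutral frequency counter; the output is biased because the input is.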

Google Search - 'White people are'

A Google search for ‘white people are’ | Photo courtesy: 3quarksdaily

Google image search: 'Black people are'

A Google image search: ‘Black people are’ with algorithmic autocomplete. | Photo courtesy: 3quarksdaily

Google image search: 'Why do white people'

A Google image search: ‘Why do white people’ with algorithmic autocomplete. | Photo courtesy: 3quarksdaily

Google image search: 'Why do black people'

A Google image search: ‘Why do black people’ with algorithmic autocomplete. | Photo courtesy: 3quarksdaily

Google image search: 'Why do Muslims'

A Google image search: ‘Why do Muslims’ with algorithmic autocomplete. | Photo courtesy: 3quarksdaily

What this small example shows is that claiming algorithms are always unbiased is problematic at best. If usage data can bias suggestions, imagine what would happen if the data were ‘hand-crafted’ to bias the system. Thus there is one thing that Leibniz did not anticipate: algorithms are designed by people who may themselves be biased. While Leibniz may have been the grandfather of this concept, its latest incarnation was brought to the fore by A. Aneesh in his 2006 book Virtual Migration, which observed the threat that computer-based systems pose of constraining human decision making.

Algorithms penetrating infrastructure and institutions


Largely hidden from the public consciousness, Algocracy has already penetrated large parts of our social infrastructure. Consider that advocacy organizations and lobbying groups routinely rank congressmen on favorability based on their past voting records. Software can be buggy, and we already have instances where bugs or mistakes in code were responsible for wrongful foreclosures. Reversing such decisions, however, was difficult, because nobody expects a computer to be wrong. The problem is that one can pick and choose data to ‘prove’ any point, and the same goes for the data that goes into the machine. Moreover, the data being collected may be biased to begin with, and the people collecting it may not be aware of it.

Even if one argues that bias cannot be eliminated, we can at least agree that it can be greatly reduced. A case in point is Google’s automatic image-tagging fiasco from last year, which mistakenly tagged black people as gorillas. While it may be that none of the programmers who worked on the system was racist, the net effect of using biased data was a situation with strong racist undertones.

Governance through machines

The use of algorithms in our everyday lives | Photo courtesy: Physiotherapymatters

Human and machine governance

Let’s not lull ourselves into the misconception that it will be only governments availing themselves of the opportunity to automate. Corporations already have ample resources, in some cases more computing resources than most governments; the infamous episode in which Target was able to predict that a teenager was pregnant is a case in point. Now imagine the Saudi Department of Propagation of Virtue and Elimination of Vice run by a “virtuous” computer: by monitoring all of your activities, one could even devise a virtue score analogous to a credit rating. An even worse scenario would be an Algocracy modeled after the mind of the Dear Leader of North Korea, where punishment for thought crimes would indeed become a reality. These scenarios may sound far-fetched, but one only has to look at the People’s Republic of China, which is already rolling out such a system to keep its people in line.

Once we start down this route, it may be next to impossible to draw a line of demarcation between human and machine governance. Why even have human lawmakers make decisions? After all, algorithms can always be more efficient than people. So let’s have each congressional district code its own lawmaker bot and then let the bots decide laws on our behalf. Of course, reality will be different: just as Wikipedia spawned Conservapedia as its counterpoint, one can imagine liberal and conservative versions of Algocracy.


Muhammad Aurangzeb Ahmad is Senior Data Scientist at Groupon and Researcher at the Department of Computer Science at University of Minnesota.
This article was originally published on 3quarksdaily
Featured image courtesy: Redorbit 