Having watched with interest since season 1, I have to say Person Of Interest is one of my favourite TV shows. It combines a level of geekiness, SEAL-esque violence and gritty reality not seen since the days of Edward Woodward's The Equalizer.
Talking of Edwards, did you know that a season 1 episode of POI actually involved an NSA whistleblower who was trying to get the word out that "your Government is spying on you"? Pursued by assassins, he sought the assistance of a journalist to help him expose the truth, and in the end he was helped by Harold and John. That was EIGHTEEN months before the real Edward Snowden exposed the NSA for who they really are.
The most recent episode was an interesting one. Season 4 Episode 6, "Pretenders", presented to us an alter-ego supercop by the name of 'Detective Jack Forge'. This was an amusing side plot in and of itself; however, the bigger picture was being played out behind the scenes. Samaritan seems to want to grow and build, showing a specific interest in some new and clever algorithms developed by the show's newest character, Beth Bridges, the head of Geospatial Predictive Analytics LLC. Beth and Harold meet in Hong Kong, where Harold is due to deliver a paper titled "Precautionary Principles in Neuro-Evolutionary Decision Making Systems". It seems Beth disagrees with his precautionary approach and with Harold's statement that "ethics requires that we proceed slowly in predictive analytics".
Harold is talking from experience, of course: his Machine did try to take a measure of self-control during its initial programming, and in most of those cases he had to cut the hard lines and remove power.
As a more relevant application in real life, the use of raw data by computational systems to make predictive decisions is flawed from the outset. Humans are distinctly unpredictable by nature; simply having all the information available to you is not enough to predict their next course of action. Even if you know someone's routine, their medications, their job role & title, the car they drive, their health statistics, and all their personal and social information (including that of the people they associate with), you are always guessing their next move. A computational algorithm is still taking raw data and making a raw guess. Nine out of ten times it may be right.
But what about when it is wrong? The reason it WILL get it wrong at times is simply that it cannot bring human emotions into the picture during its number crunching. Emotion cannot be reduced to levels of serotonin in your brain.
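To make that "nine out of ten" point concrete, here is a minimal sketch (entirely my own toy, not anything from the show or a real analytics product) of the crudest possible behavioural predictor: guess that a person will do tomorrow whatever they have done most often before. It scores well on routine and fails precisely on the one day emotion breaks the pattern.

```python
from collections import Counter

def predict_next(history):
    """Naively predict the next action as the most frequent past action.

    This is raw-data prediction at its crudest: frequency counting,
    with no model of mood, stress, or intent.
    """
    return Counter(history).most_common(1)[0][0]

# Hypothetical log: nine days of routine, then one emotional deviation.
days = ["drive to work"] * 9 + ["call in sick"]

hits = 0
for i in range(1, len(days)):
    if predict_next(days[:i]) == days[i]:
        hits += 1

print(f"accuracy: {hits}/{len(days) - 1}")  # prints "accuracy: 8/9"
```

Real predictive-analytics systems are vastly more sophisticated than this, but the structural weakness is the same: the model only ever sees the data trail, never the state of mind that is about to break it.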
It starts now, here. In London. Replace Decima with Accenture and the world will be theirs: Software to Predict London Gang Activity