The Jurassic Park Problem is one of my favourite go-to explanations. Now, with the imminent release of Jurassic World, it finally seems slightly topical rather than horribly out of date. I most often employ it to question HR’s positions on analytics and big data – although it probably applies to most people’s use of technology.
It’s actually a two-part issue – the first part being the context and the second part being the problem.
John Hammond builds a theme park on an island full of dinosaurs (he nabbed their DNA from mosquitoes preserved in amber). For the sake of simplicity we’ll call this Jurassic Park. He invites a select group of people to visit the island in advance of its opening. These include (quite sensibly) a hunter and some experts in dinosaurs. He also invites Ian Malcolm, a rock star mathematician who is an expert in chaos theory. This, I will concede, is a less obvious choice – although if you read the book of Jurassic Park, chaos theory is actually a central theme.
In the film Malcolm is played by Jeff Goldblum and is all ‘charismatic’. When asked to comment on the park he says this:
“Um, I’ll tell you the problem with the scientific power that you’re using here, it didn’t require any discipline to attain it. You read what others had done and you took the next step. You didn’t earn the knowledge for yourselves, so you don’t take any responsibility for it. You stood on the shoulders of geniuses to accomplish something as fast as you could, and before you even knew what you had, you patented it, and packaged it, and slapped it on a plastic lunchbox.”
The rapid commercialisation of new technology, combined with a lack of understanding of the hard work and principles it is built upon, produces a flurry of ill-thought-out activity. Code is built on code and then given a nice front end. It isn’t necessarily that it is wrong… more that the lack of reflection and effort involved means that we accept an easy solution without understanding the workings of it.
We stand apart from, yet rely upon, complex systems. That means that when those systems fail we may not even notice. Matt Buckland wrote an excellent piece on the fad of the magical algorithm here. If you don’t believe in magic then be enquiring enough to understand how the magician pulls off his tricks.
“Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.”
The problem with our pursuit of the new is that we often pursue functionality over ethics. For instance, our rush towards the quantification of individuals (heart rate, steps, productivity, performance) ignores the research we already have about people and their desire i) not to be reduced to numbers, ii) to have a sense of identity and agency outside of a system and iii) to feel as though they have a high level of self-determination/autonomy.
There is a sacrifice and a benefit to every piece of information we give to a system – individuals and organisations will have different views on how justifiable that sacrifice is and how beneficial it is. If I offer a leader more information about his team he may naturally think that a good thing, but for the individuals it may be a rather different dynamic.
As organisations rush to map as much of their employees’ lives and interactions as possible (through wearable tech, social network monitoring and other means) they see an opportunity for control and insight that is at once beguiling through one lens and frightening through another.
It doesn’t mean that we shouldn’t do this activity – but pausing long enough to genuinely consider whether we should be doing it seems the lowest possible ethical requirement of a profession that is supposed to be all about people.