A book about Wikipedia, Google, “Web 2.0” and virtual reality in a leadership and organisational development blog? Well, yes. Jaron Lanier’s book will be a demanding read for anyone who thinks Silicon Valley is a euphemism for part of a celebrity, or who glazes over at the first mention of ‘cloud computing’. But it will also be a challenging read – with powerful reasons – not just for the many millions who Google answers from Wikipedia and collect ‘friends’ on Facebook, but for the many industries that derive their livelihood from human creativity (embracing not just the arts, but skills such as journalism) or from the value of proprietary information. And it’s an important read for anyone who uses systems and databases to arrive at judgements or evaluations. It’s not the best written or argued book in human history, but – crucially – Lanier has been a significant player in the ‘digital revolution’ since its early days. Though often sharply critical of the way our relationship with technology is evolving, he is no Luddite: this is the rant of an IEEE Lifetime Achievement Award winner and one of Encyclopaedia Britannica’s 300 greatest inventors.

For the minority who devour Bill Gates’ latest book or follow Steve Jobs on Twitter, this is a controversial book, and knowingly so. Lanier’s concern is with the impact of a nexus of technologies (the Internet generally, the web, social networks, and so on) that we have embraced with incredible speed and with far-reaching consequences. His perception is that we are engineering ourselves – and software development (which enables these technologies to function) is a form of engineering – out of our own equation. Computerised systems start with engineering decisions and a model – like any model, a simplification of reality – into which their users must then squeeze themselves and their lives. Technologies have to, by and large, be used on their own terms, a point Lanier emphasises in considering the music technology MIDI:

Before MIDI, a musical note was a bottomless idea that transcended absolute definition. It was a way for a musician to think, or a way to teach and document music. It was a mental tool distinguishable from the music itself. Different people could make transcriptions of the same musical recording, for instance, and come up with slightly different scores.

After MIDI, a musical note was no longer just an idea, but a rigid, mandatory structure you couldn’t avoid in the aspects of life that had gone digital. The process of lock-in is like a wave gradually washing over the rulebook of life, culling the ambiguities of flexible thoughts as more and more thought structures are solidified into effectively permanent reality.”

As Lanier then points out, MIDI is now effectively ‘the lattice on which almost all the popular music you hear is built’. His perception as a practising professional musician as well as a music fan is that the growing effect is stultification: much popular music is now made within the confines of MIDI, within the ways of working that MIDI allows. As MIDI is an artificial, digital protocol, this is inherently a subset of all possible musical expression. Moreover, as other digital products proliferate (such as hard-disk recorders and samplers, all of which can interact using MIDI), originality in music is similarly in retreat: the ‘mash-up’ (ie the rehashing of existing music, often literally) is taking precedence over the original composition, performance or production, as the mash-up is the way of working that MIDI encourages. As a technology – not just MIDI, but the web itself, or Facebook formats – gains popularity, it achieves ‘lock-in’. And, as Lanier points out:

Lock-in removes ideas that do not fit into the winning digital representation scheme, but it also reduces or narrows the ideas it immortalizes, by cutting away the unfathomable penumbra of meaning that distinguishes a word in natural language from a command in a computer program.”
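To make that ‘rigid, mandatory structure’ concrete, here is a minimal illustrative sketch (in Python; Lanier offers no code, and the helper function is my own) of MIDI’s basic ‘note on’ message. A note becomes exactly three bytes: a status byte, a pitch locked to one of 128 equal-tempered semitones, and a loudness (‘velocity’) locked to an integer between 0 and 127.

```python
def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a raw MIDI 'note on' message: status byte, pitch, velocity."""
    # Pitch and velocity must each fit in 7 bits -- there is no way to say
    # 'slightly sharper than middle C' or 'swelling from soft to loud'
    # within this single message.
    assert 0 <= note <= 127 and 0 <= velocity <= 127 and 0 <= channel <= 15
    return bytes([0x90 | channel, note, velocity])

middle_c = note_on(note=60, velocity=96)   # 60 is middle C, by definition
print(middle_c.hex(' '))                   # -> '90 3c 60'
```

Everything else a performer might mean by a note – its attack, its colour, the way it bends – has to be squeezed into, or left out of, that structure; which is precisely Lanier’s point about what digitisation culls.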

As technologies approach ubiquity – and Lanier makes the point that ‘there can be only one player occupying Google’s persistent niche’ – the knock-on effects begin. The legacy of ‘free information’ that came with the early days of the web (partly because it was designed for physicists to enable them to share research, partly because of the ideologies of some of the key players, and partly because the intention was to evolve a universal system for what we now call ‘user-generated content’) has started to have serious impacts on the media industries. With news available freely online, newspapers across much of the world are struggling. The recorded music industry has faced similar ‘upheaval’ (although ‘devastation’ is probably closer to the mark). These industries’ suppliers – journalists, freelance writers, composers, musicians and performers – are suffering too: the web expects their work to be free, and their former paymasters struggle to afford them. We are moving to an era of unpaid, amateur content, and we’re seeing only its early impact so far. (The web as we know it has, after all, existed for less than 20 years.)

Lanier – like the third episode of the BBC’s current TV series, The Virtual Revolution – is deeply worried by this: as he sees it, those who seek to make a living by mental activity (other than software engineering or the like, of course) will eventually need to find a means of sponsorship to be able to afford to continue. Consider the fate of the film industry and of cinemas once higher-speed file sharing is a widespread reality:

Once file sharing shrinks Hollywood as it is now shrinking the music companies, the option of selling a script for enough money to make a living will be gone.”

The implication also remains that patronage or sponsorship will shape or determine their output: a patron or sponsor will back only what they approve of or like – which isn’t too difficult to see as a form of censorship. What we need is an alternative – a pay-as-you-go approach to web content – that enables creators to be rewarded (does ‘reward and recognition’ ring any bells for those HR types who have read this far?), and encourages them to continue to innovate and create. To date, however, we’ve shown little interest in paying for anything other than physical goods over the web, and even less interest in ‘micro-payments’, one of the web’s many ‘great ideas’ that never came about.
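To picture what a micro-payments scheme might have meant, here is a purely hypothetical sketch – the per-view price, names and ledger design are invented for illustration, since Lanier endorses no specific mechanism. The idea is simply that each view of a piece of content debits the reader a fraction of a penny and credits its creator directly.

```python
from collections import defaultdict
from decimal import Decimal

PRICE_PER_VIEW = Decimal("0.002")  # a fifth of a penny per article (invented figure)

# Running balances for readers and creators, all starting from zero.
balances: dict[str, Decimal] = defaultdict(Decimal)

def view(reader: str, creator: str, price: Decimal = PRICE_PER_VIEW) -> None:
    """Record one paid view: debit the reader, credit the creator."""
    balances[reader] -= price
    balances[creator] += price

# A reader consuming 500 articles in a year pays each creator as they go:
for _ in range(500):
    view("reader@example.com", "freelance-journalist")
print(balances["freelance-journalist"])  # -> 1.000, i.e. £1 spread over 500 views
```

The sums per view are trivially small; the unsolved problems were never the arithmetic, but trust, transaction costs and our collective unwillingness to pay at all.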

(Lanier misses the BBC series’ more worrying point: the amount of information Google accumulates on those using it to search the web – so that it can sell advertising to fund itself – would stun even the most dedicated former Warsaw Pact secret-services informant, yet we give this information about ourselves away willingly, with little or no thought to the use that may be made of it – or to who may own it in twenty or forty years’ time.)

The speed of our embrace of all things technological is extremely rapid – so fast that we rarely catch a glimpse of the speedometer. To give some context, here are some figures from a recent survey by rightmobilephone.co.uk:

  • The average ‘pay as you go’ mobile user in the UK will spend £28,200 over the years in which they actively use a handset – for which they could buy a Lotus Elise
  • 68% of survey respondents felt they spent more time texting and calling their friends than actually spending time with them
  • 41% felt the same about their partner or love interest
  • 54% said their mobile phone was their main means of communication
  • 31% were more used to staying in touch via Facebook
  • Only 4% said their main method of communication was face to face.

Surprised? Shocked? Alarmed? You might find yourself agreeing more with Lanier after those figures, especially his concern that Facebook – like the ‘teaching to the test’ that he sees as the impact of an over-reliance on educational information systems in the US – is an example of what happens when ‘life is turned into a database’:

Both degradations are based on the same philosophical mistake, which is the belief that computers can presently represent human thought or human relationships. These are things computers cannot currently do.”

For Lanier – and surely for anyone interested in human resources and the development of our individual potential (even when we narrow that definition with the suffix ‘for commercial advantage’) – there is far too much precious value in that ‘unfathomable penumbra’ that escapes the digitisation process. (I don’t recall him mentioning Star Trek, but if a populist example – and populism is another web-enhanced trend he greatly fears – makes this argument more accessible, bear in mind that Spock was only second-in-command. There was value in an approach that led him to repeatedly explain that ‘that would be illogical’ or ‘that does not compute’, but the face-, skin-, ship- and civilisation-saving moments were usually left to messy, illogical, over-emotional Captain Kirk.) Arguing that the network itself is meaningless, and that the true value is in the people it connects, Lanier remains hugely optimistic about us.

In what is often an angry or depressing book, it would be right to finish with an optimistic quote:

… from a technological point of view, it is true that you can’t make a perfect copy-protection scheme. If flawless behavior restraints are the only potential influences on behavior, we might as well not ask anyone to ever pay for music or journalism again. According to this logic, the very idea is a lost cause.

But that’s an unrealistically pessimistic way of thinking about people. We have already demonstrated that we’re better than that. It’s easy to break into physical cars and houses, for instance, and yet few people do so. Locks are only amulets of inconvenience that remind us of a social contract we ultimately benefit from. It is only human choice that makes the human world function. Technology can motivate human choice, but not replace it.”

This is not a perfect book. To review it in its own terms, however, it was written by an individual, not a web-based collective: it has personal opinions, a point of view and a distinctive voice. That is one of its strengths. There is much here that will confound or bewilder (the final chapters, where he expresses his hopes for the future of virtual reality, often lost me, particularly when octopuses made several appearances), and there is much here that will anger many – not least the equation of much of the philosophy underlying web development with ‘digital Maoism’.

And to be fair to the author, he acknowledges as much – and sticks to his (verbal) guns. For anyone who would appreciate – or benefit from – taking a step back to consider how the human relationship with technology is going (and particularly where it is heading), it’s a vital read in every sense. By encouraging us to seek out, hope for and demand a future where technology enhances rather than diminishes our humanity, it’s also a book to applaud.
