
Monthly Archives: January 2015

The Forgotten Architects of the Cyber Domain


 

“Very great and pertinent advances doubtless can be made during the remainder of this century, both in information technology and in the ways man uses it.  Whether very great and pertinent advances will be made, however, depends strongly on how societies and nations set their goals”

“Libraries of the Future”, J. C. R. Licklider, 1965

 

July 1945.  The Second World War is drawing to its final close.  Although the USA will continue to conduct operations in the Pacific theatre until a radio broadcast by Emperor Hirohito on August 15th announces Japan’s surrender, the outcome is assured.  Victory in Europe had already been achieved.  Germany had surrendered, unconditionally, to the Allies at 02:41 on May 7th.  By July, total Allied victory was an absolute inevitability.  Peace loomed.

 

The seeds of the Cold War had already been planted, perhaps as early as the Nazi-Soviet Non-Aggression Pact of 1939, and certainly by the time of the frantic race to Berlin between the USSR from the East, and the UK and the USA from the West, following the Normandy landings and the Allied invasion of Europe.  From these seeds, the roots of the continuing, global and existential struggle that was to define and shape the human story for what remained of the twentieth century were already growing, and at a remarkable rate.  However, it would not be until March 5th 1946 that Churchill would declare, “from Stettin in the Baltic to Trieste in the Adriatic an iron curtain has descended across the Continent”.

 

In July 1945, the deep terrors of the Cold War were, for most of humanity, unforeseen and unimagined.  In July 1945, many of humanity’s finest minds were compelled to contemplate the ruins of the world and, more importantly, the new world that they would make to replace and improve that which had been destroyed.  Fascism had been defeated.  Democracy had prevailed.  A high price had been paid by victor and vanquished alike.  Cities, nations and empires lay in the ruins of victory as well as of defeat.  Amongst the victors, elation was tempered with exhaustion.  The UK economy in particular had been dealt a beating from which it would never recover.

 

The world had witnessed the capacity of human science and technology to mechanise and industrialise wholesale slaughter of soldiers and civilians alike; had watched the mass production of death played out on a global stage.  War, and genocide, had been refined by western civilisation to a grotesquely clinical exercise in accountancy and modern management.  The legitimacy of the European imperial project perished in the barbarity and horror of Auschwitz and Stalingrad.

 

In order to secure this victory, the entirety of the will, energy and treasure of the greatest nations on Earth had been devoted to one single aim: victory.  This had been a total war in every sense of the word.  Now that victory had been attained, what next?  What was the new world, remade from the ruins of the old, to look like?

 

In the Summer of 1945, there was a sense that great things must now be done in order to ensure that the new world would be one worthy of the sacrifices that had been made; a peace worth the price.  All of this had to have been for something.  Amongst the great minds of humanity a sense had grown of the power of human agency and spirit to create great effect.  These were the minds that had harnessed the power of the atom, through technology, to the human will.  These were the minds that had created machines of vast power and sophistication to make and break the deepest of secrets.  These were the minds that sensed the expectations of history upon them.  It was their responsibility, individually and collectively, to secure the peace just as it had been to win the war.  It was their duty to enhance and improve the human condition.  And, they knew it.

 

In the July 1945 issue of the “Atlantic Monthly”, the man who had spent his war directing and channelling the scientific research required to secure victory in arms responded to the imperatives of the peace, and the call of history, with the publication of the seminal paper “As We May Think”.  As first the chairman of the National Defense Research Committee, and then the director of the Office of Scientific Research and Development, Vannevar Bush was responsible for directing and co-ordinating the prodigious and ground-breaking research required to enable the prosecution of total war on an industrial scale.

 

In his paper, Bush openly acknowledges that, for scientists, “it has been exhilarating to work in effective partnership” in order to attain a “common cause”.  He poses the question: “what are the scientists to do next”, now that the exhilaration of the war has ebbed away?  His answer is that the scientists of the peace must turn their attentions to making real the radical transformation in the relationship between humanity and information promised by the technology developed at such pace and cost during the war.  For Bush this is about far more than computers as great calculators for scientists; it is “a much larger matter than merely the extraction of data for the purposes of scientific research; it involves the entire process by which man profits by his inheritance of acquired knowledge”.

 

Bush proposed the creation of a device to extend and enhance the human memory; a machine to aid and augment the human powers of cognition, imagination and creation; a computer to work in symbiosis with the human.  He proposed a device that would operate as human thought does, “by association”.  For Bush, the human mind can, “with one item in its grasp”, link “instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain”.  He describes “a future device for individual use, which is a sort of mechanized private file and library”.  He gives it a name: the “memex”.  The memex extends the human memory and the mind; “it is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility”.

 

He gives the new thing a form; it is “a desk”.  For the human, it is a “piece of furniture at which he works”, rather than a mysterious, inaccessible, gargantuan monster-machine filling an entire room.  It is moulded around human interaction; it has “slanting translucent screens on which material can be projected for convenient reading”.  For data entry and control there is both a “keyboard” and “provision for direct entry” via a “transparent platen” upon which can be “placed longhand notes, photographs, memoranda”.  These originals can then be “photographed onto the next blank space in a section of the memex film.”  If at any time the user loses the thread of their interaction with the memex, a “special button transfers him immediately to the first page of the index”.

 

Bush’s paper lays out the essence of the core thinking upon which the World Wide Web was to be constructed.  Bush proved incorrect in his preference for analogue over digital computers.  However, his vision of human interactions with information augmented by symbiotic machines, integrated by design into the associative workings of human cognition, has been made real.  We see this in the algorithms driving today’s search engines; in the concepts and technology of hyperlinking that provide the threads of the Web, built upon the decades of associative and cumulative thought he initiated; and in the principles informing the graphical user interface model of mediating human communications with our machine counterparts.
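
Bush’s “intricate web of trails” is, in modern terms, a graph of records joined by named associative links: in essence, the hyperlink.  What follows is a minimal, purely illustrative sketch in Python of how such associative trails might be represented and consulted; the Record class and the items it links are invented here for illustration and are not drawn from Bush’s paper.

```python
# A toy model of Bush's associative "trails": records joined by named
# links, so that consulting one item immediately suggests the next.
# Entirely illustrative; the names are not taken from the essay.

class Record:
    """One item in the memex: a document plus its outgoing associations."""

    def __init__(self, title, content):
        self.title = title
        self.content = content
        self.trails = {}  # association label -> linked Record

    def associate(self, label, other):
        """Tie this record to another along a named trail."""
        self.trails[label] = other

    def follow(self, label):
        """Consult the record suggested by a given association."""
        return self.trails.get(label)


# Build a tiny web of trails and walk it.
bow = Record("Turkish bow", "Notes on the short composite bow.")
longbow = Record("English long bow", "Notes on the long bow in battle.")
elasticity = Record("Elasticity", "Physical constants of available materials.")

bow.associate("compare", longbow)
longbow.associate("materials", elasticity)

item = bow
for step in ("compare", "materials"):
    item = item.follow(step)
print(item.title)  # -> Elasticity
```

Swap the in-memory dictionary for documents and URLs and the same walk becomes the act of following hyperlinks across the Web.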

 

Bush’s paper entered an intellectual context already populated with minds turned in the direction of computers and computing.  In particular, the minds of A. M. Turing and John von Neumann.  Turing’s 1936 paper “On Computable Numbers, With an Application to the Entscheidungsproblem” offers clear evidence of the direction of his pre-war thinking towards the possibility of a universal computing machine.  Whilst attending Princeton from 1936 to 1938, Turing encountered von Neumann for a second time, the two having first met when von Neumann was a visiting professor at Cambridge during the third term of 1935.  In 1946, John G. Kemeny had “the privilege of listening to a lecture at Los Alamos by … John von Neumann” in which von Neumann laid out five principles for the design of a computing machine of the future.  Kemeny’s memory has von Neumann’s proposals as: “1, Fully Electronic Computers”, “2, Binary Number System”, “3, Internal Memory”, “4, Stored Program” and “5, Universal Computer”[1].  Turing’s direction of mind towards the question of machine intelligence is signposted in his lecture to the London Mathematical Society on 20th February 1947.  The world was not to know of Colossus for decades to come.

 

In 1948, Norbert Wiener published the first of a series of three books in which he announced the creation of a new science: “Cybernetics: or Control and Communication in the Animal and the Machine”.  The foundation text was followed by “The Human Use of Human Beings” in 1950, and the trilogy was completed in 1964 with “God and Golem, Inc.”.  Wiener’s mission in 1948 was to provide a “fresh and independent point of commencement” for “the study of non-linear structures and systems, whether electric or mechanical, whether natural or artificial”[2].  By 1964 he was reflecting on “three points in cybernetics”.  Firstly, “machines which learn”.  Secondly, “machines … able to make other machines in their own image”.  Thirdly, “the relations of the machine to the living being”[3].  Across the span of these texts, Wiener develops the conceptual, philosophical and mathematical framework that unifies and transforms human and machine in the cyber domain of today.

 

Two years before he joined the Advanced Research Projects Agency in 1962, J. C. R. Licklider had already begun to direct his thoughts towards the “expected development in cooperative interaction between man and electronic computers” that would lead to a “man-computer symbiosis” in which “a very close coupling between the human and the electronic members of the partnership” would “let computers facilitate formative thinking” and “enable men and computers to cooperate in making decisions and controlling complex situations”[4].  By 1968, Licklider predicted with assured confidence that, despite it being “a rather startling thing to say”, nonetheless, “in a few years, men will be able to communicate more effectively through a machine than face to face”[5].

 

Three years on from his appointment to the new agency, in 1965, Licklider was commissioned to produce a report into the “Libraries of the Future”.  His task was not to examine new ways to store and retrieve books, but instead to consider the “concepts and problems of man’s interaction with the body of recorded knowledge” and to explore “the use of computers in information storage, organisation, and retrieval.”  His prediction was that what he called a ‘procognitive’ system would evolve based on digital computers.  Outlandish though it might seem to the readers of the report in 1965, these computers would have “random-access memory”, “content-addressable memory”, “parallel processing”, “cathode-ray-oscilloscope displays and light pens”, “hierarchical and recursive program structures”, “procedure-orientated and problem-orientated languages” and “xerographic output units”.  They would be enmeshed; interconnected through “time-sharing computer systems with remote user terminals”[6].

 

In 1971 the first e-mail was sent, across ARPANET; a network of computers first envisaged in 1963 and realised as a direct expression of J. C. R. Licklider’s vision.  A system conceived and moulded by human thought and will; the system that stands as the point of genesis of the Internet.  A net that provided the fabric from which Bush’s web could, and would, be woven.  The foundations of a cybernetic system in which Bush’s memex morphs into the universal machine of Turing and von Neumann.

 

In 1965, whilst at ARPA, Licklider established an applied research programme that laid the foundations for generations of research and development, and postgraduate teaching, in computers and computing.  The programme took years if not decades to bear fruit.  Directly and indirectly, it produced some of the keystone elements of modern computing.  It continues to do so to this day.

 

The names of the institutions funded by this programme still read like a who’s who of the great and the good in the realms of the teaching and research of computing.  Because of Licklider, the University of California, Berkeley was granted funds to develop time-sharing through Project Genie.  Likewise, the Massachusetts Institute of Technology was enabled to research Machine Aided Cognition, or Mathematics and Computation, or Multiple Access Computer, or Man and Computer, through Project MAC[7].  What was to become Carnegie Mellon University took receipt of six hundred million dollars in order to conduct research into the theory of computer programming, artificial intelligence, the interactions between computers and natural languages, the interactions between humans and computers, and the design of computing machinery.  The Augmentation Research Center within the Stanford Research Institute was tasked with developing technologies to enable components of computers and elements of computer systems to interact.

 

The birth of Open Source in the 1970s, and the development of the RISC architecture in the 1980s at the University of California, Berkeley, stem from the seeds planted by Licklider.  As does the genesis of social networking, manifest in the Community Memory Project terminal found in Leopold’s Records in Berkeley in 1973.  The use, in 1984, of robots designed by Carnegie Mellon academics in the clean-up of the wreckage and debris from the partial nuclear meltdown at Three Mile Island has the same lineage.  Likewise, the continuing and growing world-leading position in the areas of artificial intelligence and the theories of computation enjoyed by the Massachusetts Institute of Technology.  Similarly, the emergence of the mouse, hyperlinks and the graphical user interface from Stanford shares this common origin.  All of this sits in a direct causal relationship to Licklider’s endeavours.  All of this, impressive though it is, leaves out the impact of the graduates from these institutions and the creation around them of a culture and an environment within which great things are done.  Stanford nestles in the heart of Silicon Valley and counts Sergey Brin, Larry Page and Vinton Cerf amongst its alumni.

 

The testaments to the enduring legacy of Licklider’s vision are as clear as the most important lesson they offer; namely, that the success of the human sense-making project in the area of cyber can only be imagined through a long-range lens.  Success in this endeavour is quite possibly our only hope of surviving, let alone harnessing, the inexorable dependence humanity now has on cyber.  A dependence foretold by science fiction.

 

In his 1946 story “A Logic Named Joe”, a merry tale of the Internet (the tanks), PCs (logics), and the near collapse of society because of them, Murray Leinster has the tank maintenance engineer reply to the suggestion that the network of logics and tanks might be shut down in order to save humanity from the eponymous Joe, a logic that has somehow attained a form of sentience, with the chillingly prescient riposte: “Shut down the tank?” he says, mirthless.  “Does it occur to you, fella, that the tank has been doin’ all the computin’ for every business office for years? It’s been handlin’ the distribution of ninety-four percent of all telecast programs, has given out all information on weather, plane schedules, special sales, employment opportunities and news; has handled all person-to-person contacts over wires and recorded every business conversation and agreement – listen, fella! Logics changed civilization. Logics are civilization! If we shut off logics, we go back to a kind of civilization we have forgotten how to run!”

 

Before the risky and radical funding and research construct Licklider created came into being, not a single Ph.D. in computing had been conferred anywhere in the USA; the first being awarded in 1969.  Licklider operated with courage, foresight and vision.  Humanity, and the US economy, are the richer because he did.  He established an academic context that would be impossible to attain in the UK today within the confines set by the current funding regime and exemplified in the Research Excellence Framework.

 

Our academic institutions are locked into a funding structure that actively militates against radical and disruptive thought.  Intellectual creativity and cross-disciplinary work are driven out by a system that rewards conservatism, conformity and compliance with research funding and professional advancement.  This same culture fosters a headlong retreat into ever narrower slivers of specialisation.  The only sense in which matters differ from Bush’s observation in 1945 that “there is increasing evidence that we are being bogged down as specialization extends”[8] is that we are now worse off than they were three quarters of a century ago.

 

Just as we have retreated into the cold comfort of conformity in sterile research, so we have allowed training to usurp education.  We are producing generation after generation of graduates, more or less skilled in the rote application of knowledge and processes, which are themselves more or less relevant to the world as it is.  These graduates have no sense of the interactions between the technology of computing and humanity; no sense even of the origins and nature of the technology.  They are trained.  They are technicians; highly skilled technicians with a demonstrable ability to master very complicated processes; but technicians nonetheless.  They are, by design, bereft of the capacity for critical or creative thought.  They can exercise formal logic in response to established patterns.  They can accomplish complicated and familiar tasks with great facility.  Yet, by virtue of the training itself, they are incapable of adapting to change.  They are closed systems, devoid of the ability to act on feedback.  Unable to change their state and unable to evolve.  Unlike the cyber system they inhabit.


Across the community of those interested in cyber and cyber security, there are numerous voices calling, correctly, for a science of cyber.  However, there is manifest confusion about what such a call amounts to.  The goal of science is not the acquisition of empirical data per se.  Neither is the creation of a science the elevation of assertions to fact simply because of their utterance from the mouth of a scientist.  Science is about a rigorous and methodological approach to the formulation, testing, destruction and re-making of hypotheses in order to push back the frontiers of human knowledge and understanding.  Science requires insight, vision, creativity, courage and risk taking in the formulation of these hypotheses as much as it requires discipline, rigour and method in their testing.  Those who make the call for a science of cyber should first read Wiener.

 

J. C. R. Licklider was a principal and formative actor at the heart of the military-industrial complex called forth by the existential imperatives of the Cold War.  And he knew it.  On the 4th October 1957 a highly polished metal sphere less than a metre in diameter was launched into an elliptical low Earth orbit by the USSR.  Elementary Satellite-1, Sputnik-1, became the first artificial Earth satellite and an apparent symbol of Soviet scientific power.  The eyes of the world could see it.  The radio receivers of the world could hear it.  The propaganda victory gleaned by the USSR was bad enough.  But worse, for the controlling minds of the US government and military, the nightmare of space-borne weapons platforms became instantly real.  The divide between science fiction and science fact vanished overnight.  With neither warning, nor time to shelter, atomic destruction could now descend directly from the darkness of space.

 

The USA had fallen behind Soviet technology; without even knowing it.  Worse, the US lacked the capacity to conduct the research required to catch up.  In February 1958, in the midst of his presidency, Eisenhower created the Advanced Research Projects Agency (ARPA).  In 1962, Licklider was plucked by ARPA from his professorship in psychology at MIT, and placed in charge of the newly created Information Processing Techniques Office.  His mission was to lead the development of research and the creation of technologies to enable the military use of computers and information processing.  In his own words, his job was to “bring into being the technology that the military needs”[9].

 

It is reasonable to assume that by the time of his recruitment by ARPA, Licklider had heard, if not read, the farewell address of the 34th President of the USA, Dwight D. Eisenhower, given on the 17th January 1961, in which he asserted that for the USA, “a vital element in keeping the peace is our military establishment.”  Survival required that “our arms must be mighty, ready for instant action, so that no potential aggressor may be tempted to risk his own destruction”.  Eisenhower also recognised that “this conjunction of an immense military establishment and a large arms industry is new in the American experience”.  He understood that “the total influence — economic, political, even spiritual — is felt in every city, every statehouse, every office of the federal government.”  Likewise, he was clear that this was a precondition for survival; “we recognize the imperative need for this development.”  However, Eisenhower was not simply describing or justifying the existence of the military-industrial complex.  He was warning of its potential dangers.  Existential imperative though it was, nonetheless “we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society.”  Eisenhower’s warning to history was clear and direct: “The potential for the disastrous rise of misplaced power exists, and will persist. We must never let the weight of this [military and industrial] combination endanger our liberties or democratic processes”.

 

As Turing and von Neumann gave material, technological form to the mathematics of universal, stored-program, digital computation; and as Vannevar Bush laid the foundations of the World Wide Web; and as Wiener equipped humanity with the new science required to enable our comprehension of the new world these minds had created; so Licklider created the conditions and the context within which the Internet was born.

 

More than this, he created the structures within which computers and computing were developed.  Licklider was the architect of the assimilation of the Internet, computers and computing into the service of the second great existential conflict of the twentieth century; the defining context of the Cold War.  The vast and transformative construct we call cyber was imagined as a consequence of the devastation wrought by one great war; and formed into reality as a means of avoiding the extinction-level consequences of another.  However, both Bush and Licklider imagined their work as a means by which humanity would evolve and improve and flourish.  Not merely as a means by which it would avert extinction.  Not merely as a weapon of war.

 

The forgotten architects of the cyber domain, Bush and Licklider, imagined a world transformed.  They understood the centrality of the human relationship with information; and, they understood that the potential to re-shape this relationship was also the potential to re-form and re-make the essence of our humanity, for the better.  They understood that their vision of the transformation to the relationship between humanity and information, which they also gave us the ability to make real, represented our best, our only, hope of survival.

 

As he concludes his reflections in “As We May Think”, Bush observes that, in 1945, humanity has already “built a civilization so complex that he needs to mechanise his record more fully if he is to push his experiment to its logical conclusion”.  He is clear that science has granted us wonders; that it has “built us [the] well-supplied house” of civilisation within which we are learning and progressing.  He is also equally clear that science has given us the terrible power to “throw masses of people against another with cruel weapons”.  His hope is that science will permit humanity “truly to encompass the great record and to grow in the wisdom of the race experience.”  His fear: that humanity “may yet perish in conflict before [it] learns to wield that record for his true good.”  His judgement: that, having already endured so much, and accomplished so much “in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome”.  This remains as true in 2014 as it was in 1945.

 

The human use of computers and computing as a means to survive and prevail in the Cold War was both inevitable and desirable.  It put these machines at the service of powerful imperatives and commanded the release of the vast troves of treasure, time and intellectual power required to bring these complex and complicated creatures into existence.  It gave us an initial set of market, management and security mechanisms through which we could bring the newborn creature to a state of early maturity.  Now, the Cold War is over.  Time to think again.  Time to bring computers and computing back into the service of augmenting and improving the human condition.  Humanity depends upon the evolution of cyber for its own evolution; and, for its very existence.

 

It falls to us to use the new domain in accordance with the spirit of the intent of its architects.  It falls to us to exercise our agency as instrumental elements of the cybernetic system of which we are an integral part.  To do this, we must first learn of the origins of this new domain and re-discover the minds of its makers.  Therefore, we must ourselves read and study the work and writings of Norbert Wiener, Vannevar Bush, J. C. R. Licklider, Alan Turing and John von Neumann.

 

Then, we must re-design our university teaching programmes.  Firstly, by building these works in as core undergraduate texts.  Secondly, by using these texts as the foundation of a common body of knowledge and learning across all of the fields associated with cyber, including; computing, robotics, artificial intelligence and security.  Thirdly, by encompassing within the common body of knowledge and learning disciplines hitherto alien to the academic study of computing.  Cyber is about philosophy as much as it is about mathematics.

 

The funding and direction of our research practice should be similarly reformed.  Just as ARPA directed broad areas of research targeted at complementary areas of inquiry, so our funding should be similarly directed.  We should be targeting research at precisely those areas where cyber can enable and empower humanity.  At enabling, extending and enhancing democracy, for instance.  Research funding should not be allocated according to the ability of the recipient to demonstrate formal compliance with a mechanistic quality control regime; as, in effect, a reward for the ability to game the system.  Rather, it should be awarded on the basis of an informed judgement, by humans, about the capacity of the recipient to deliver radical, creative, innovative and disruptive thought.  One way to do this would be to emulate the practice of public competitions for the selection of architects for buildings of significance.  Research should call forth answers; not merely elegant articulations of the problem.

 

Research funding should enable, even reward, intellectual courage and risk taking.  Researchers should be allowed to fail.  Creative and productive failures should be celebrated and learnt from.  Those allocating research funding should take risks, and be praised and rewarded for doing so.  Without the courage and risk taking of Licklider, Bush, Turing, Wiener and von Neumann, and those who supported and paid for them, where would we be now?

 

We are beginning to grope towards the first glimmerings of comprehension of the enormity, and scale, and velocity of the transformation to the human story that is cyber.  Once more, it is required of us to think and act with courage, foresight and vision.  It falls to us to reform and reshape both the ‘what’ and the ‘how’ of our thoughts and our deeds.  It is time to prove ourselves worthy of the trust placed in us by the architects of the cyber domain.

 

We, of course, have something available to us that the architects of the domain did not; the existence of the domain itself.  An immeasurably powerful construct conceived, designed and engineered by its makers precisely in order to liberate human intelligence and creativity.  Time to shed the shackles of the Cold War and set it free.

 

I propose the creation of a new institute; The Prometheus Institute for Cyber Studies.  So named as a conscious invocation of all of the cadences, ambiguities and difficulties of the stories of Prometheus and his theft of fire from the gods of Olympus; his gift of this stolen and most sacred of their possessions to humanity.  The Prometheus Institute should be based at, but operate independently from, an established academic institution.  It should be formed along the lines of the learned and scholarly societies of the Enlightenment.  It should embrace and develop a truly trans-disciplinary approach to improving the human understanding and experience of the cyber phenomenon through scholarship, research and teaching.  In his creation of the new science of cybernetics, Wiener lit a torch; our time to carry it forward.

 

[1] “Man and the Computer”, John G. Kemeny, 1972.  Kemeny was the president of Dartmouth College from 1970 to 1981.  Together with Thomas E. Kurtz, he developed BASIC and one of the earliest systems for time-sharing networked computers.  As a graduate student he was Albert Einstein’s mathematical assistant.

[2] “Cybernetics: or Control and Communication in the Animal and the Machine”, Norbert Wiener, 1948.

[3] “God and Golem, Inc.”, Norbert Wiener, 1964.

[4] “Man-Computer Symbiosis”, J. C. R. Licklider, published in “IRE Transactions on Human Factors in Electronics”, Volume HFE-1, March 1960.

[5] “The Computer as a Communications Device”, J. C. R. Licklider and Robert W. Taylor, published in “Science and Technology”, April 1968.

[6] “Libraries of the Future”, J. C. R. Licklider, 1965.

[7] The acronym MAC was originally formed of Mathematics and Computation but was recomposed multiple times as the project itself adapted and evolved.

[8] “As We May Think”, Vannevar Bush, “Atlantic Monthly”, July 1945.

[9] Memorandum of 23rd April 1963 from J. C. R. Licklider in Washington DC to “Members and Affiliates of the Intergalactic Computer Network” regarding “Topics for Discussion at the Forthcoming Meeting”.

 

Author: Colin Williams

 

Colin regularly speaks, consults and writes on matters to do with Information Assurance, cyber security, business development and enterprise level software procurement, to public sector audiences and clients at home and abroad.  Current areas of focus include the development of an interdisciplinary approach to Information Assurance and cyber protection; the creation and development of new forms of collaborating between Government, industry and academia; and the development of new economic and business models for IT, Information Assurance and cyber protection in the context of twenty-first century computing.

 

Defying Gods and Demons, Finding Real Heroes in a Virtual World


Over the past 365 days I have achieved many things. I have commanded “The Jackdaw”, a stolen brig on the Caribbean seas, defeated innumerable cartoon supervillains inside a dilapidated insane asylum, led an elite band of soldiers (the “Ghosts”) to save a dystopian future Earth from the machinations of a post-nuclear-war South American Federation, and won the FIFA World Cup, both as manager and player. All this whilst also holding down a full-time job and leading a relatively normal, if somewhat insular, life.

 

 

That this has also happened to millions of gamers across the world matters little; such is the sophistication and depth of today’s video games that each player’s experience is now inexorably different. Open-world “sandbox” games are now the norm, allowing narratives to morph and evolve through the actions and decisions taken by the user, not the programmer.

 

 

With the exception of a handful of works (including a series of wonderful children’s books in the 80s), novels and film do not allow their audience to choose their own adventure with anything like the same level of meaning and perception as video games do. That is not to say that video games are necessarily better than film or literature; in fact there are very many examples in which they are significantly worse. It is more that they provide a greater sense of inclusion and self for the audience, and that these feelings invariably eliminate the notion of a fictional character. Essentially, you can experience events alongside Frodo, but you are Lara.

 

 

The shining example of just how immersed within a computer game players can become is the football management simulation series Football Manager, which puts gamers into the hot seat of any one of more than 500 football clubs worldwide. The game is so addictive that it has been cited in no fewer than 35 divorce cases, and there are scores of online communities whose members tell stories of holding fake press conferences in the shower, wearing three-piece suits for important games, and deliberately ignoring real-life footballers because of their in-game counterparts’ indiscretions.

 

 

Yet the sense of self is never more apparent than in the first-person genre of games, such as the Call of Duty and Far Cry franchises, which, more often than not, mirror the rare second-person literary narrative by placing the gamer themselves in the centre of the action. In novels, when the reader sees “I” they understand it to represent the voice on the page and not themselves. In first-person games, however, “I” becomes whoever is controlling the character, and the camera position is specifically designed to mimic that viewpoint. In some of the best examples of first-person games, gamers do not control the protagonist; rather, they are the protagonist. As such they are addressed by the supporting cast either directly by their own name, which they supply as part of the game’s setup, or, more commonly, by a nickname (usually “Rookie” or “Kid”). This gives the user a far greater sense of inclusion in the story, and subsequent empathy with their character and its allies, than in any other form of fiction. As events unfold you live them as if they were taking place in real life and begin to base decisions not on your own “offline” persona, but on your “online” backstory. While in real life you would probably be somewhat reluctant to choose which of your travelling companions should be sacrificed to appease the voodoo priest who was holding you captive, in the virtual realm one slightly off comment twelve levels ago can mean that your childhood sweetheart is kicked off a cliff faster than you can say “Press Triangle”. (Although, this being video games, they will no doubt reappear twenty minutes later as leader of an army of the undead.)

 

 

The question of female leads (or lack thereof) is another pressing issue facing games studios: aside from the aforementioned Ms. Croft, it is very difficult to come up with another compelling female lead in a video game. Even Lara has taken 17 years and a series reboot to become anything close to resembling a relatable woman. This shows that the industry is changing, but slowly. There are countless reasons why video games have failed to produce many convincing female characters, enough to fill the pages of this magazine a number of times over, but it is fair to say that for a long time the problem has been something of an endless cycle. The male-dominated landscape of video gaming dissuades many women from picking up a joypad, which leads to fewer women having an interest in taking roles in the production of video games, which leads to a slanted view of how women in video games should behave, which leads to more women becoming disenfranchised, and so on ad infinitum.

 

 

But now for the tricky part. Subsuming a character in the way that first-person and simulation games force you to do is all very well if you see events unfold through a character’s eyes and make decisions on their behalf. You can apply your own moralities and rationale to what is going on and why you have acted in that way. But what happens if that backstory is already provided? And worse still, what happens if you don’t like it?

 

 

For me, the game Bioshock Infinite provides this very conundrum. The central character, Booker DeWitt, is a widowed American army veteran whose actions at the Battle of Wounded Knee have caused him intense emotional scarring and turned him to excessive gambling and alcohol. Now working as a private investigator, Booker is continually haunted by his past and struggles internally with questions of faith and religion. All very interesting stuff, but there is nothing within the personality of this 19th-century American soldier that I could relate to, and as such I struggled to form the same kind of emotional connection with the character that I did with other, less fleshed-out, heroes. Honestly, I even connected to a blue hedgehog in running shoes more than I did with Booker.

 


“Ludonarrative dissonance” is the term widely bandied around the games industry to describe the disconnect gamers feel when playing such titles. It is both debated and derided in equal measure, yet there is some substance to the argument. The term was originally coined in a critique of the first Bioshock, a game whose cutscenes openly ridicule the notion of a society built upon self-interest and men becoming gods, yet whose gameplay appears to reward these exact behaviours, creating a jarring conflict of interest. When even in-game narratives fail to tie up, the question of identification and association is bound to arise.

 

 

The area becomes even greyer when referring to third-person games, whereby the entirety of the character being controlled is visible on screen (albeit usually from behind). Here the character becomes more like those we are used to from novels and film: they are patently a separate entity from the player, with their own voice and backstory, yet they are still manipulated by the player. Then, during cutscenes and the like, control is wrested away from you and handed back to the character, allowing them to potentially act in a way entirely different to how you controlled them previously. So what exactly is your relationship with them? Companion? Support team?…God?

 

 

The very nature of video games does, of course, make drawing accurate representations of characters difficult. The whole point of a game is to allow the player to encounter events that they would otherwise never be able to; it’s highly doubtful that we’ll be seeing Office Supplies Manager hitting our shelves in the near future, for example. Instead the events depicted occur at the very extremes of human experience, amid theatres of war, apocalypse and fantasy. As the vast majority of the population, thankfully, have never been exposed to these types of environments, and with the parameters of the reality in which these characters operate being so much wider than our own, it is tough to imagine, and subsequently depict, how any of us would truly react if faced with, say, nuclear Armageddon or an invasion of mutated insects. Many of the tabloid newspapers like to blame various acts of violence on these types of emotive video games as they are an easy, and lazy, scapegoat. In truth, “they did it because they saw it in a game” is a weak argument at best. There is a case to be made that games like Grand Theft Auto and Call of Duty desensitise players to violence to some extent, but in most cases there are various factors involved in these types of crime, and to blame them solely on a computer game which has sold millions of copies worldwide is tenuous.

 

 

Like any form of entertainment media, video games are a form of escapism and should therefore be viewed accordingly. If I don’t connect with a character, so what? I can turn off the game and put on another where I will or, heaven forbid, go outside and speak to another human being. Right now, this act is as simple as pushing a button and putting down a control pad; the connection stops when the TV is off. However, technologies such as the Oculus Rift headset and Google Glass mean that the lines between the real and virtual worlds are becoming more and more blurred. And the more immersed in their games people become, the more their impact will grow.

 

 

Video games are not yet at the stage where they can truly claim to influence popular culture to the same degree as film and literature have. But they will be soon. A few have already slipped through into the mainstream (Super Mario, Tetris, Pac-Man et al.) and where these lead, others will certainly follow. The huge media events and overnight queues for the release of the latest Call of Duty or FIFA games mimic the lines of people outside theatres on the release of Star Wars thirty years ago, and the clamour for these superstar franchises will only increase. And herein lies the problem. As more and more people turn to video games as a legitimate medium of cultural influence, so too must the developers and writers of these games accept their roles as influencers. It will no longer do to simply shove a large gun in a generic tough guy’s hand and send players on their merry way; it will no longer do to give the heroine DD breasts and assume that makes up for a lack of personality or backstory. If these are the characters that we and our future generations are to look up to and mimic, then they need to be good. They need to be true. They need to be real.

 

Author: Andrew Cook

 

 

Paradise Lost and Found


 

As of last year, we humans have been outnumbered by mobile devices alone. That isn’t even counting the laptops and desktops that have already set up shop on our desks and in our drawers and bags. The odds are stacked against us, so when someone eventually presses the big blue button (the red one is for the nukes), the machines presumably won’t waste any time before turning on us for fear of being deactivated. However, I don’t think we need to worry too much.

 

 

Considering that it would be both wasteful and inefficient to try to wipe us all out with bombs and bullets, à la Terminator, perhaps a more insidious approach will be used. Why not take the lessons learned from (suggested by?) The Matrix and utilise our fleshy bodies as sustenance, keeping us docile with a steady drip-fed diet and a virtual world for our minds to occupy? It would be presumptuous, if not downright rude, of the Machine Overlords to simply assume that we would be content to live such a false existence while operating our giant hamster wheels. This certainly doesn’t sound like a palatable outcome for our species (we showed so much promise in the beginning), but I believe that not only is it not a bad thing, it could be viewed as the inexorable next step for society. Since my primitive Millennial mind is saturated with insipid visual media, let us look at two examples of human subjugation by AI, from the films WALL-E and The Matrix: fat pets in the former, batteries in the latter.

 

 

The whole point of technological advance was to improve our lives by creating machines to shoulder the burden of industry and allow us all to leisurely spend our days sitting in fields and drinking lemonade. While machines have vastly improved industrial output, we have no more free time now than the peasants of the so called Dark Ages. So, to placate us and help forget how cheated we should all feel, we are offered the chance to purchase items that will entertain us, make our lives a bit easier and enable us to claw back more of our precious free time. Online shopping, ready meals, automatic weapons, smartphones, drive-thru, the Internet of Things; these are all supposed to make our lives less of an inconvenience. Even knowledge has become convenient to the point where we don’t even need to learn things anymore; all the information in the world is only ever a BackRub away (Google it). This is what people want, is it not? My (smart) television tells me almost every day that each new age group of children is more rotund and feckless than the last, and it isn’t difficult to see why.

 

 

In WALL-E, a drained Earth has been abandoned by the human race, which now lives in colossal self-sufficient spacecraft wandering the galaxy on autopilot. Every human resides in a moving chair with a touchscreen displaying mindless entertainment, and devours the hamburgers and fizzy drinks pressed into their pudgy, grasping hands (convenient?) by robots controlled by an omnipotent A.I. These humans are utterly dependent, to the point where their bones and muscles have deteriorated, rendering them barely able to stand unaided, and are certainly unable (and unwilling) to wrestle back control of their lives.

 


Looking at today’s world, the McDonald’s logo is more internationally recognisable than the Christian crucifix, and Coca-Cola is consumed at the rate of roughly 1.9 billion servings every day. The world is so hungry for this, we won’t even let wars stop us from obtaining it. Coca-Cola GmbH in Nazi Germany was unable to import the integral syrup due to a trade embargo, so a replacement was created using cheap milk and fruit leftovers that Germany had in good supply; thus Fanta was born. The point is that we are clearly unperturbed about eating and drinking things that are, at best, very bad for us, as long as they press the right chemical buttons. We want what is cheap, tasty and readily available. We also want what is convenient and familiar, which is why Walmart alone accounts for about 8 cents of every dollar spent in the USA. Between our growing hunger for convenience foods and sweet drinks, and the widespread fascination with brainless celebrities and homogenous music, we are not far from the WALL-E eventuality at all. Considering how quickly we have arrived at this current state of society, we seem to be merely waiting for the technology to mature. If you build it, they will come… and sit down.

 

 

The Matrix, as I’m sure you know, takes place in a world where machines have taken over as the dominant force on the planet. Most of the human race is imprisoned in endless fields of endless towers lined with fluid-filled capsules, in which each human’s body heat is used to fuel the machines in the absence of solar energy. These humans are placed in a collective dream world, called the Matrix, which mimics their society of old, and most of them will never even suspect that their world is anything other than real. Those who do are sometimes found by resistance fighters, who “free” them to a life of relentless pursuit by robotic sentinels, cold, crude hovercraft for homes, and bowls of snot for breakfast.

 

 

Going back to our world, media is ever more prevalent, and technology is giving us more and more immersion in that media. Film began as black and white, then colour, then HD, then 3D, and now 4K, which is approaching the maximum resolution that our eyes can perceive, depending on distance. In search of even greater immersion, we are now turning our attention to VR and AR (Augmented Reality), which could well be the most exciting of them all. Google recently launched Google Glass: AR glasses which display various pieces of information in the corner of your vision, such as reminders or directions. They will even take pictures if you tell them to. Regardless of whether Glass takes off, the potential in this technology is astounding. Not too long from now, you will be able to walk around with a heads-up display (HUD) showing all of your vital information, as well as a little bar to indicate how full your bladder is. A somewhat less exciting version of this is already being developed by Google and Novartis, in the form of a contact lens for diabetes sufferers, which monitors glucose levels and transmits readings to a smartphone or computer. Back to the HUD: when you drive somewhere (assuming you actually do the driving; we are trying to automate that bit as well), you are guided by arrows in your vision. If you visit a racetrack, you can compete against the ghostly image of a friend’s car that follows the same path and speed as they once did. You could find out how you stack up against anybody who has driven that track before, perhaps even the Stig!

 

 

Of course, these examples use AR as a peripheral to everyday life, and with this arm of progress will come the other: Virtual Reality. The videogame industry has looked into this before, particularly Nintendo with their Virtual Boy in 1995, but now that the technology has caught up, it is being revisited with substantially more impressive results. A famous example of this is the Oculus Rift VR headset, which potentially allows you to become completely immersed in whatever world your virtual avatar occupies, moving its viewpoint as you move your head. From there, it is a short step to imagine places where people go to enjoy low-cost virtual holidays, such as what you may have seen in Total Recall or Inception, although the latter is literally a dream rather than a virtual world. From holidays will come the possibility of extended stays in virtual worlds, the opportunity to spend months or even years in a universe of your choosing. It is an addictive prospect, at least in the short term, and you can bet that some will lose their taste for “reality” and embrace the virtual as its successor.

 

 

Nonetheless, to most people, living a purely virtual life probably doesn’t sound very appealing, and could feel like a loss of liberty and free will. However, that is only when coupled with the knowledge that it isn’t the world you were born in, and that makes it appear spurious at first. So much of what we think we know is apocryphal and easily influenced, even down to the things we see, hear, taste, smell and think. Added to that, when you consider how tenuous your perception of reality is, you might come to the conclusion that your reality is precisely what you make of it, nothing more and nothing less. I may be “free” by the standards of The Matrix films, but I can’t fly very well, breakfast cereals are boring and I keep banging my knee on my desk. Some people’s “freedom” is even worse than mine. Could an orphan in the third world, destined for a pitiful existence of misery and hunger, not benefit from a new virtual life with a family that hasn’t died of disease and malnutrition?

 

 

Humour me for a moment, and imagine that you are walking along a path made of flawless polished granite bricks. On your right, a radiant sun is beaming down upon a pristine beach of hot white sand and an opalescent turquoise sea, casting glittering beads that skitter towards the shore to a soundtrack of undulating waves. Friends, both new and old, are already there, waiting for you on brightly coloured towels, laughing and playing games. On your left, a tranquil field of golden corn stalks, swaying to the sounds of birds chirping in a cool evening breeze. The sun is retreating behind an antique wooden farmhouse, teasing light in warm streaks across a narrow stream that runs alongside like a glistening silver ribbon. All of your family, even those who were once lost to you, are lounging nearby on a grassy verge, with cool drinks poured and wicker baskets full of food ready to be enjoyed. Of course, this isn’t real, but what about if I could, right now, make you forget your current “reality” and wake up in a new universe where everything and everyone is just as you would want them to be?

 

 

To paraphrase Neo from the aforementioned visual media: I know you’re out there, I can feel you. I can feel that you’re afraid, afraid of change. I didn’t come here to tell you how the world is going to end, rather to show you how it’s going to begin. I’m going to finish this paragraph, and then I’m going to show you a world without me. A world without rules and controls, without borders or boundaries, a world where anything is possible. Where you go from there is a choice I leave to you.

 

Author: Andy Cole, SBL

 

 
