
The Forgotten Architects of the Cyber Domain


 

“Very great and pertinent advances doubtless can be made during the remainder of this century, both in information technology and in the ways man uses it.  Whether very great and pertinent advances will be made, however, depends strongly on how societies and nations set their goals”

“Libraries of the Future”, J. C. R. Licklider, 1965

 

July 1945.  The Second World War is drawing to its final close.  Although the USA will continue to conduct operations in the Pacific theatre until a radio broadcast by Emperor Hirohito on August 15th announces Japan’s surrender, the outcome is assured.  Victory in Europe had already been achieved.  Germany had surrendered, unconditionally, to the Allies at 02:41 on May 7th.  By July, total Allied victory was an absolute inevitability.  Peace loomed.

 

The seeds of the Cold War had already been planted, perhaps as early as the Nazi-Soviet Non-Aggression Pact of 1939, and certainly by the frantic race to Berlin between the USSR from the East and the UK and the USA from the West that followed the Normandy landings and the Allied invasion of Europe.  From these seeds, the roots of the continuing, global and existential struggle that was to define and shape the human story for what remained of the twentieth century were already growing, and at a remarkable rate.  However, it would not be until March 5th 1946 that Churchill would declare, “from Stettin in the Baltic to Trieste in the Adriatic an iron curtain has descended across the Continent”.

 

In July 1945, the deep terrors of the Cold War were, for most of humanity, unforeseen and unimagined.  In July 1945, many of humanity’s finest minds were compelled to contemplate the ruins of the world and, more importantly, the new world that they would make to replace and improve that which had been destroyed.  Fascism had been defeated.  Democracy had prevailed.  A high price had been paid by victor and vanquished alike.  Cities, nations and empires lay in the ruins of victory as well as of defeat.  Amongst the victors, elation was tempered with exhaustion.  The UK economy in particular had been dealt a beating from which it would never recover.

 

The world had witnessed the capacity of human science and technology to mechanise and industrialise wholesale slaughter of soldiers and civilians alike; had watched the mass production of death played out on a global stage.  War, and genocide, had been refined by western civilisation to a grotesquely clinical exercise in accountancy and modern management.  The legitimacy of the European imperial project perished in the barbarity and horror of Auschwitz and Stalingrad.

 

In order to secure this victory, the entirety of the will, energy and treasure of the greatest nations on Earth had been devoted to one single aim: victory.  This had been a total war in every sense of the word.  Now that victory had been attained, what next?  What was the new world, remade from the ruins of the old, to look like?

 

In the Summer of 1945, there was a sense that great things must now be done in order to ensure that the new world would be one worthy of the sacrifices that had been made; a peace worth the price.  All of this had to have been for something.  Amongst the great minds of humanity a sense had grown of the power of human agency and spirit to create great effect.  These were the minds that had harnessed the power of the atom, through technology, to the human will.  These were the minds that had created machines of vast power and sophistication to make and break the deepest of secrets.  These were the minds that sensed the expectations of history upon them.  It was their responsibility, individually and collectively, to secure the peace just as it had been to win the war.  It was their duty to enhance and improve the human condition.  And, they knew it.

 

In the July 1945 issue of the “Atlantic Monthly”, the man who had spent his war directing and channelling the scientific research required to secure victory in arms responded to the imperatives of the peace, and the call of history, with the publication of the seminal paper “As We May Think”.  As first the chairman of the National Defense Research Committee, and then the director of the Office of Scientific Research and Development, Vannevar Bush was responsible for directing and co-ordinating the prodigious and ground-breaking research required to enable the prosecution of total war on an industrial scale.

 

In his paper, Bush openly acknowledges that, for scientists, “it has been exhilarating to work in effective partnership” in order to attain a “common cause”.  He poses the question: “what are the scientists to do next”, now that the exhilaration of the war has ebbed away?  His answer is that the scientists of the peace must turn their attentions to making real the radical transformation in the relationships between humanity and information promised by the technology developed at such pace and cost during the war.  For Bush this is about far more than computers as great calculators for scientists; it is “a much larger matter than merely the extraction of data for the purposes of scientific research; it involves the entire process by which man profits by his inheritance of acquired knowledge”.

 

Bush proposed the creation of a device to extend and enhance the human memory; a machine to aid and augment the human powers of cognition, imagination and creation; a computer to work in symbiosis with the human.  He proposed a device that would operate as human thought does, “by association”.  For Bush, the human mind can, “with one item in its grasp”, link “instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain”.  He describes “a future device for individual use, which is a sort of mechanized private file and library”.  He gives it a name; the “memex”.  The memex extends the human memory and the mind; “it is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility”.
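Bush’s “selection by association” is easiest to see through the lens of the data structures it anticipated.  The sketch below is a loose, present-day illustration of an associative trail, offered as a minimal sketch only: the class, method and record names are invented for the example, and no claim is made about the memex’s actual microfilm mechanics.  (The bow-and-arrow records echo the worked example Bush himself gives in the paper.)

# A loose, modern illustration of Bush's "trails of association":
# records are stored once, and named trails thread them together so
# that one item leads "instantly to the next that is suggested".

class Memex:
    def __init__(self):
        self.records = {}   # record id -> stored content
        self.trails = {}    # trail name -> ordered list of record ids

    def store(self, record_id, content):
        self.records[record_id] = content

    def tie(self, trail, *record_ids):
        """Join items into a named associative trail, in reading order."""
        self.trails.setdefault(trail, []).extend(record_ids)

    def follow(self, trail):
        """Consult a trail 'with exceeding speed and flexibility'."""
        return [self.records[r] for r in self.trails.get(trail, [])]

# Hypothetical usage: a researcher ties a note, an article and a
# memorandum into one trail and later replays the whole line of thought.
m = Memex()
m.store("note-1", "longhand note on the Turkish short bow")
m.store("article-7", "encyclopaedia entry on the elastic properties of bows")
m.store("memo-3", "memorandum comparing bow and arrow materials")
m.tie("bows", "note-1", "article-7", "memo-3")
print(m.follow("bows"))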

 

He gives the new thing a form; it is “a desk”.  For the human, it is a “piece of furniture at which he works”, rather than a mysterious, inaccessible, gargantuan, monster-machine, filling an entire room.  It is moulded around human interaction; it has “slanting translucent screens on which material can be projected for convenient reading”.  For data entry and control there is both a “keyboard”, and “provision for direct entry” via a “transparent platen” upon which can be “placed longhand notes, photographs, memoranda”.  These originals can then be “photographed onto the next blank space in a section of the memex film”.  If at any time the user loses the thread of their interaction with the memex, a “special button transfers him immediately to the first page of the index”.

 

Bush’s paper lays out the essence of the core thinking upon which the World Wide Web was to be constructed.  Bush proved incorrect in his preference for analogue over digital computers.  However, his vision of human interactions with information augmented by symbiotic machines, integrated by design into the associative workings of human cognition, has been made real.  We see this in the algorithms driving today’s search engines; in the concepts and technology of hyperlinking that provide the threads of the Web, built upon the decades of associative and cumulative thought he initiated; and in the principles informing the graphical user interface model of mediating human communications with our machine counterparts.

 

Bush’s paper entered an intellectual context already populated with minds turned in the direction of computers and computing.  In particular, the minds of A. M. Turing and John von Neumann.  Turing’s 1936 paper “On Computable Numbers, With an Application to the Entscheidungsproblem” offers clear evidence of the direction of his pre-war thinking towards the possibility of a universal computing machine.  Whilst attending Princeton from 1937 to 1938, Turing encountered von Neumann for a second time, the two having met first when von Neumann was a visiting professor at Cambridge during the third term of 1935.  In 1946, John G. Kemeny had “the privilege of listening to a lecture at Los Alamos by … John von Neumann” in which von Neumann laid out five principles for the design of a computing machine of the future.  Kemeny’s memory has von Neumann’s proposals as: “1, Fully Electronic Computers”, “2, Binary Number System”, “3, Internal Memory”, “4, Stored Program” and “5, Universal Computer”[1].  Turing’s direction of mind towards the question of machine intelligence is signposted in his lecture to the London Mathematical Society on 20th February 1947.  The world was not to know of Colossus for decades to come.
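The “stored program” and “universal computer” principles that Kemeny recalls are easiest to grasp in miniature: the program lives in the same memory as the data it manipulates, so one general-purpose machine becomes any particular machine simply by loading different contents.  The toy below is a deliberately simplified sketch with an invented three-instruction set; it is not a reconstruction of any machine von Neumann or Turing described.

# A toy stored-program machine: program and data occupy the same memory,
# and the machine is nothing more than a fetch-decode-execute loop.
# The instruction set (ADD, JUMP_IF_ZERO, HALT) is invented purely for
# illustration and drawn from no historical design.

def run(memory):
    pc = 0                                   # program counter
    while True:
        op, a, b = memory[pc]                # fetch and decode
        if op == "HALT":
            return memory
        if op == "ADD":                      # memory[a] += memory[b]
            memory[a] += memory[b]
            pc += 1
        elif op == "JUMP_IF_ZERO":           # jump to b if memory[a] == 0
            pc = b if memory[a] == 0 else pc + 1

# One memory holds both the program (cells 0-4) and its data (cells 5-9):
# repeatedly add the value at cell 5 into the total at cell 6, counting
# down the counter at cell 7 until it reaches zero.
memory = [
    ("JUMP_IF_ZERO", 7, 4),   # 0: finished when the counter is exhausted
    ("ADD", 6, 5),            # 1: total += value
    ("ADD", 7, 8),            # 2: counter += -1
    ("JUMP_IF_ZERO", 9, 0),   # 3: cell 9 is always 0, so jump back to 0
    ("HALT", 0, 0),           # 4: stop
    3,                        # 5: value to add
    0,                        # 6: running total
    4,                        # 7: counter
    -1,                       # 8: constant -1
    0,                        # 9: constant 0
]

print(run(memory)[6])         # 12: code and result live in the same store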

 

In 1948, Norbert Wiener published the first of a series of three books in which he announced the creation of a new science; “Cybernetics: or Control and Communication in the Animal and the Machine”.  The foundation text was followed by “The Human Use of Human Beings” in 1950, and the trilogy was completed in 1964 with “God and Golem, Inc.”.  Wiener’s mission in 1948 was to provide a “fresh and independent point of commencement” for “the study of non-linear structures and systems, whether electric or mechanical, whether natural or artificial”[2].  By 1964 he was reflecting on “three points in cybernetics”.  Firstly, “machines which learn”.  Secondly, “machines … able to make other machines in their own image”.  Thirdly, “the relations of the machine to the living being”[3].  Across the span of these texts, Wiener develops the conceptual, philosophical and mathematical framework that unifies and transforms human and machine in the cyber domain of today.

 

Two years before he joined the Advanced Research Projects Agency in 1962, J. C. R. Licklider had already begun to direct his thoughts towards the “expected development in cooperative interaction between man and electronic computers” that will lead to a “man-computer symbiosis” in which “a very close coupling between the human and the electronic members of the partnership” will “let computers facilitate formative thinking” and “enable men and computers to cooperate in making decisions and controlling complex situations”[4].  By 1968, Licklider predicted with assured confidence that, despite it being “a rather startling thing to say”, nonetheless, “in a few years, men will be able to communicate more effectively through a machine than face to face”.[5]

 

Three years on from his appointment to the new agency, in 1965, Licklider was commissioned to produce a report into the “Libraries of the Future”.  His task was not to examine new ways to store and retrieve books, but instead to consider the “concepts and problems of man’s interaction with the body of recorded knowledge” and to explore “the use of computers in information storage, organisation, and retrieval.”  His prediction was that what he called a ‘procognitive’ system would evolve based on digital computers.  Outlandish though it might seem to the readers of the report in 1965, these computers would have: “random-access memory”, “content-addressable memory”, “parallel processing”, “cathode-ray-oscilloscope displays and light pens”, “hierarchical and recursive program structures”, “procedure-orientated and problem-orientated languages” and “xerographic output units”.  They would be enmeshed; interconnected through “time-sharing computer systems with remote user terminals”[6].

 

In 1971 the first e-mail was sent; across ARPANET.  A system of networked computers, envisaged in 1963 and brought into operation in 1969 as a direct realisation of J. C. R. Licklider’s vision.  A system conceived and moulded by human thought and will; the system that stands as the point of genesis of the Internet.  A net that provided the fabric from which Bush’s web could, and would, be woven.  The foundations of a cybernetic system in which Bush’s memex morphs into the universal machine of Turing and von Neumann.

 

In 1965, whilst at ARPA, Licklider established an applied research programme that laid the foundations for generations of research and development, and postgraduate teaching, in computers and computing.  The programme took years if not decades to bear fruit.  Directly and indirectly, it produced some of the keystone elements of modern computing.  It continues to do so to this day.

 

The names of the institutions funded by this programme still read like a who’s who of the great and the good in the realms of the teaching and research of computing.  Because of Licklider, the University of California, Berkeley was granted funds to develop time-sharing through Project Genie.  Likewise, the Massachusetts Institute of Technology was enabled to research Machine Aided Cognition, or Mathematics and Computation, or Multiple Access Computer, or Man and Computer, through Project MAC[7].  What was to become Carnegie Mellon University took receipt of six hundred million dollars in order to conduct research into the theory of computer programming, artificial intelligence, the interactions between computers and natural languages, the interactions between humans and computers, and the design of computing machinery.  The Augmentation Research Center within the Stanford Research Institute was tasked with developing technologies to enable components of computers and elements of computer systems to interact.

 

The birth of Open Source in the 1970s, and the development of the RISC architecture in the 1980s at the University of California, Berkeley, stem from the seeds planted by Licklider.  As does the genesis of social networking, manifest in the Community Memory Project terminal found in Leopold’s Records in Berkeley in 1973.  The use, in 1984, of robots designed by Carnegie Mellon academics in the clean-up of the wreckage and debris from the partial nuclear meltdown at Three Mile Island has the same lineage.  Likewise, the continuing and growing world-leading position in the areas of artificial intelligence and the theories of computation enjoyed by the Massachusetts Institute of Technology.  Similarly, the emergence of the mouse, hyperlinks and the graphical user interface from Stanford shares this common origin.  All of this sits in a direct causal relationship to Licklider’s endeavours.  All of this, impressive though it is, leaves out the impact of the graduates from these institutions and the creation around them of a culture and an environment within which great things are done.  Stanford nestles in the heart of Silicon Valley and counts Sergey Brin, Larry Page and Vinton Cerf amongst its alumni.

 

The testaments to the enduring legacy of Licklider’s vision are as clear as the most important lesson they offer; namely that the success of the human sense making project in the area of cyber can only be imagined through a long range lens.  Success in this endeavour quite possibly being our only hope of surviving, let alone harnessing, the inexorable dependence humanity now has on cyber. A dependence foretold by science fiction.

 

In his 1946 story “A Logic Named Joe”, a merry tale of the Internet (the tanks), PCs (logics), and the near collapse of society because of them, Murray Leinster has the tank maintenance engineer reply to the suggestion that the network of logics and tanks might be shut down in order to save humanity from the eponymous Joe, a logic that has somehow attained a form of sentience, with the chillingly prescient riposte: “Shut down the tank?” he says, mirthless.  “Does it occur to you, fella, that the tank has been doin’ all the computin’ for every business office for years? It’s been handlin’ the distribution of ninety-four percent of all telecast programs, has given out all information on weather, plane schedules, special sales, employment opportunities and news; has handled all person-to-person contacts over wires and recorded every business conversation and agreement – listen, fella! Logics changed civilization. Logics are civilization! If we shut off logics, we go back to a kind of civilization we have forgotten how to run!”

 

Before the risky and radical funding and research construct Licklider created came into being, not a single Ph.D. in computing had been conferred anywhere in the USA; the first being awarded in 1969.  Licklider operated with courage, foresight and vision.  Humanity, and the US economy, are the richer because he did.  He established an academic context that would be impossible to attain in the UK today within the confines set by the current funding regime and exemplified in the Research Excellence Framework.

 

Our academic institutions are locked into a funding structure that actively militates against radical and disruptive thought.  Intellectual creativity and cross-disciplinary work are driven out by a system that rewards conservatism, conformity and compliance, with research funding and professional advancement.  This same culture fosters a headlong retreat into ever narrower slivers of specialisation.  The only sense in which matters differ from Bush’s observation in 1945 that “there is increasing evidence that we are being bogged down as specialization extends”[8] is that we are now worse off than they were three quarters of a century ago.

 

Just as we have retreated into the cold comfort of conformity in sterile research, so we have allowed training to usurp education.  We are producing generation after generation of graduates, more or less skilled in the rote application of knowledge and processes which are themselves more or less relevant to the world as it is.  These graduates have no sense of the interactions between the technology of computing and humanity; no sense even of the origins and nature of the technology.  They are trained.  They are technicians; highly skilled technicians with a demonstrable ability to master very complicated processes; but technicians nonetheless.  They are, by design, bereft of the capacity for critical or creative thought.  They can exercise formal logic in response to established patterns.  They can accomplish complicated and familiar tasks with great facility.  Yet, by virtue of the training itself, they are incapable of adapting to change.  They are closed systems, devoid of the ability to act on feedback.  Unable to change their state and unable to evolve.  Unlike the cyber system they inhabit.


Across the community of those interested in cyber and cyber security, there are numerous voices calling, correctly, for a science of cyber.  However, there is manifest confusion about what such a call amounts to.  The goal of science is not the acquisition of empirical data per se.  Neither is the creation of a science the elevation of assertions to fact simply because they are uttered from the mouth of a scientist.  Science is about a rigorous and methodical approach to the formulation, testing, destruction and re-making of hypotheses in order to push back the frontiers of human knowledge and understanding.  Science requires insight, vision, creativity, courage and risk taking in the formulation of these hypotheses as much as it requires discipline, rigour and method in their testing.  Those who make the call for a science of cyber should first read Wiener.

 

J. C. R. Licklider was a principal and formative actor at the heart of the military-industrial complex called forth by the existential imperatives of the Cold War.  And he knew it.  On the 4th October 1957 a highly polished metal sphere less than a metre in diameter was launched into an elliptical low earth orbit by the USSR.  Elementary Satellite-1, Sputnik-1, became the first artificial earth satellite and an apparent symbol of Soviet scientific power.  The eyes of the world could see it.  The radio receivers of the world could hear it.  The propaganda victory gleaned by the USSR was bad enough.  But worse, for the controlling minds of the US government and military, the nightmare of space-borne weapons platforms became instantly real.  The divide between science fiction and science fact vanished overnight.  With neither warning, nor time to shelter, atomic destruction could now descend directly from the darkness of space.

 

The USA had fallen behind Soviet technology; without even knowing it.  Worse, the US lacked the capacity to conduct the research required to catch up.  In February 1958, in the midst of his presidency, Eisenhower created the Advanced Research Projects Agency (ARPA).  In 1962, Licklider was plucked by ARPA from his professorship in psychology at MIT, and placed in charge of the newly created Information Processing Techniques Office.  His mission was to lead the development of research and the creation of technologies to enable the military use of computers and information processing.  In his own words, his job was to “bring into being the technology that the military needs”[9].

 

It is reasonable to assume that by the time of his recruitment by ARPA, Licklider had heard, if not read, the farewell address of the 34th President of the USA, Dwight D. Eisenhower, given on the 17th January 1961, in which he asserted that for the USA, “a vital element in keeping the peace is our military establishment.”  Survival required that “our arms must be mighty, ready for instant action, so that no potential aggressor may be tempted to risk his own destruction”.  Eisenhower also recognised that “this conjunction of an immense military establishment and a large arms industry is new in the American experience”.  He understood that “the total influence — economic, political, even spiritual — is felt in every city, every statehouse, every office of the federal government.”  Likewise, he was clear that this was a precondition for survival; “we recognize the imperative need for this development.”  However, Eisenhower was not simply describing or justifying the existence of the military-industrial complex.  He was warning of its potential dangers.  Existential imperative though it was; nonetheless “we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society.”  Eisenhower’s warning to history was clear and direct; “The potential for the disastrous rise of misplaced power exists, and will persist. We must never let the weight of this [military and industrial] combination endanger our liberties or democratic processes”.

 

As Turing and von Neumann gave material, technological form to the mathematics of universal, stored-program, digital computation; and as Vannevar Bush laid the foundations of the World Wide Web; and as Wiener equipped humanity with the new science required to enable our comprehension of the new world these minds had created; so Licklider created the conditions and the context within which the Internet was born.

 

More than this, he created the structures within which computers and computing were developed.  Licklider was the architect of the assimilation of the Internet, computers and computing into the service of the second great existential conflict of the twentieth century; the defining context of the Cold War.  The vast and transformative construct we call cyber was imagined as a consequence of the devastation wrought by one great war; and formed into reality as a means of avoiding the extinction-level consequences of another.  However, both Bush and Licklider imagined their work as a means by which humanity would evolve and improve and flourish.  Not merely as a means by which it would avert extinction.  Not merely as a weapon of war.

 

The forgotten architects of the cyber domain, Bush and Licklider, imagined a world transformed.  They understood the centrality of the human relationship with information; and, they understood that the potential to re-shape this relationship was also the potential to re-form and re-make the essence of our humanity, for the better.  They understood that their vision of the transformation to the relationship between humanity and information, which they also gave us the ability to make real, represented our best, our only, hope of survival.

 

As he concludes his reflections in “As We May Think”, Bush observes that, in 1945, humanity has already “built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion”.  He is clear that science has granted us wonders; that it has “built us [the] well-supplied house” of civilisation within which we are learning and progressing.  He is also equally clear that science has given us the terrible power to “throw masses of people against one another with cruel weapons”.  His hope is that science will permit humanity “truly to encompass the great record and to grow in the wisdom of the race experience.”  His fear; that humanity “may yet perish in conflict before [it] learns to wield that record for his true good.”  His judgement; that having already endured so much, and having already accomplished so much “in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome”.  This remains as true in 2014 as it was in 1945.

 

The human use of computers and computing as a means to survive and prevail in the Cold War was both inevitable and desirable.  It put these machines at the service of powerful imperatives and commanded the release of the vast troves of treasure, time and intellectual power required to bring these complex and complicated creatures into existence.  It gave us an initial set of market, management and security mechanisms through which we could bring the newborn creature to an initial state of early maturity.  Now, the Cold War is over.  Time to think again.  Time to bring computers and computing back into the service of augmenting and improving the human condition.  Humanity depends upon the evolution of cyber for its own evolution; and for its very existence.

 

It falls to us to use the new domain in accordance with the spirit of the intent of its architects.  It falls to us to exercise our agency as instrumental elements of the cybernetic system of which we are an integral part.  To do this, we must first learn of the origins of this new domain and re-discover the minds of its makers.  Therefore, we must ourselves read and study the work and writings of Norbert Wiener, Vannevar Bush, J. C. R. Licklider, Alan Turing and John von Neumann.

 

Then, we must re-design our university teaching programmes.  Firstly, by building these works in as core undergraduate texts.  Secondly, by using these texts as the foundation of a common body of knowledge and learning across all of the fields associated with cyber, including: computing, robotics, artificial intelligence and security.  Thirdly, by encompassing within the common body of knowledge and learning disciplines hitherto alien to the academic study of computing.  Cyber is about philosophy as much as it is about mathematics.

 

The funding and direction of our research practice should be similarly reformed.  Just as ARPA directed broad areas of research targeted at complementary areas of inquiry, so our funding should be similarly directed.  We should be targeting research at precisely those areas where cyber can enable and empower humanity.  At enabling, extending and enhancing democracy, for instance.  Research funding should not be allocated according to the ability of the recipient to demonstrate formal compliance with a mechanistic quality control regime; as, in effect, a reward for the ability to game the system.  Rather, it should be awarded on the basis of an informed judgement, by humans, about the capacity of the recipient to deliver radical, creative, innovative and disruptive thought.  One way to do this would be to emulate the practice of public competitions for the selection of architects for buildings of significance.  Research should call forth answers; not merely elegant articulations of the problem.

 

Research funding should enable, even reward, intellectual courage and risk taking.  Researchers should be allowed to fail.  Creative and productive failures should be celebrated and learnt from.  Those allocating research funding should take risks, and be praised and rewarded for doing so.  Without the courage and risk taking of Licklider, Bush, Turing, Wiener and von Neumann, and those who supported and paid for them, where would we be now?

 

We are beginning to grope towards the first glimmerings of comprehension of the enormity, and scale, and velocity of the transformation to the human story that is cyber.  Once more, it is required of us to think and act with courage, foresight and vision.  It falls to us to reform and reshape both the ‘what’ and the ‘how’ of our thoughts and our deeds.  It is time to prove ourselves worthy of the trust placed in us by the architects of the cyber domain.

 

We, of course, have something available to us that the architects of the domain did not; the existence of the domain itself.  An immeasurably powerful construct conceived, designed and engineered by its makers precisely in order to liberate human intelligence and creativity.  Time to shed the shackles of the Cold War and set it free.

 

I propose the creation of a new institute: The Prometheus Institute for Cyber Studies.  So named as a conscious invocation of all of the cadences, ambiguities and difficulties of the stories of Prometheus, his theft of fire from the gods of Olympus, and his gift of this stolen and most sacred of their possessions to humanity.  The Prometheus Institute should be based at, but operate independently from, an established academic institution.  It should be formed along the lines of the learned and scholarly societies of the Enlightenment.  It should embrace and develop a truly trans-disciplinary approach to improving the human understanding and experience of the cyber phenomenon through scholarship, research and teaching.  In his creation of the new science of cybernetics, Wiener lit a torch; it is now our time to carry it forward.

 

[1] “Man and the Computer”, John G. Kemeny, 1972.  Kemeny was the president of Dartmouth College from 1970 to 1981.  Together with Thomas E. Kurtz, he developed BASIC and one of the earliest systems for time-sharing networked computers.  As a graduate student he was Albert Einstein’s mathematical assistant.

[2] “Cybernetics: or Control and Communication in the Animal and the Machine”, Norbert Wiener, 1948.

[3] “God and Golem, Inc.”, Norbert Wiener, 1964.

[4] “Man-Computer Symbiosis”, J. C. R. Licklider, published in “IRE Transactions on Human Factors in Electronics”, Volume HFE-1, March 1960.

[5] “The Computer as a Communications Device”, J. C. R. Licklider and Robert W. Taylor, published in “Science and Technology”, April 1968.

[6] “Libraries of the Future”, J. C. R. Licklider, 1965.

[7] The acronym MAC was originally formed of Mathematics and Computation but was recomposed multiple times as the project itself adapted and evolved.

[8] “As We May Think”, Vannevar Bush, “Atlantic Monthly”, July 1945.

[9] Memorandum of 23rd April 1963 from J. C. R. Licklider in Washington DC to “Members and Affiliates of the Intergalactic Computer Network” regarding “Topics for Discussion at the Forthcoming Meeting”.

 

Author: Colin Williams

 

Colin regularly speaks, consults and writes on matters to do with Information Assurance, cyber security, business development and enterprise level software procurement, to public sector audiences and clients at home and abroad.  Current areas of focus include the development of an interdisciplinary approach to Information Assurance and cyber protection; the creation and development of new forms of collaborating between Government, industry and academia; and the development of new economic and business models for IT, Information Assurance and cyber protection in the context of twenty-first century computing.

 
