Category Archives: Encryption

Codebreaker: Alan Turing’s Life and Legacy

British Science Museum, London

 

The Thin White Duke may be stealing the headlines, but as CyberTalk discovers, he’s not the only show in town…


Walking along the South Kensington underpass connecting the Underground station with London’s museum district this summer, you notice the faces of two very different British icons covering the Victorian brick walls. There is no doubt that one dominates the other – his flame-red hair contrasting sharply with his pale, impeccably white skin while his piercing, ice-blue stare follows you up and down the crowded tunnels. This is David Bowie, resplendent in his iconic “Aladdin Sane” attire, driving visitors towards “David Bowie Is…” at the Victoria & Albert Museum.

 

The second is far more reserved, almost apologetic in comparison. A single monochromatic image of a man many of those passing do not even recognise. His hair is immaculately tidy; his eyes gaze off into the distance as if in deep thought. This is Alan Turing, the father of modern computing, promoting “Codebreaker” at the British Science Museum.

 

Whilst both exhibitions cover the lives and work of their protagonists, in many ways they also reflect the personalities of their subjects. Bowie, a unique and ubiquitous influence on popular culture, is bold, radical and enigmatic. Likewise, his exhibition has filled thousands of column inches and crams its vast space with towering video screens, outlandish costumes and intricate, in-depth details of a career spanning almost five decades.

 

By contrast, “Codebreaker” is more condensed and more understated, yet equally absorbing. Centred around the rebuilt Pilot ACE computer, reconstructed precisely to Turing’s original design, it tells the story not only of the man himself but of computing as a whole, in six distinct sections that allow visitors to move around at their leisure. Though this means that objects and stories are not always presented in chronological order, the exhibition still expertly weaves key moments of Turing’s life with the impact they had on his thinking and scientific career, providing a fully rounded view of his personality, his views and the events that shaped them.


In addition to the ACE computer, the exhibition holds a number of artefacts central to the story of computing. Alongside early calculating machines from the late 1930s and the wreckage of a 1950s Comet jet aircraft lie a number of original Enigma machines, including U-boat and early military decoders, as well as one on loan from the private collection of (as the caption below states) “the musician and film producer, Sir Michael Jagger”, proving there are some people whose achievements eclipse even Jumping Jack Flash himself.

 

Unlike his counterpart across the road, Turing left behind very few personal belongings. On display here, however, is a collection of heartbreaking letters from Turing to the mother of his close friend Christopher Morcom, who died tragically from tuberculosis aged 18. In them Turing reveals not only his intense feelings and personal beliefs but also his first thoughts on the nature of thinking, ideas that would prove fundamental to the future development of artificial intelligence.

 

The most striking exhibit of all, however, comes when you venture into section 5, “A Matter of Life and Death”. Here Turing’s work on morphogenesis – growth and patterns in plants and animals – on one side stands in stark contrast with, on the other, a small bottle of oestrogen pills and a solitary piece of paper detailing the findings of Turing’s post-mortem examination and its conclusion of suicide by cyanide poisoning. Together they provide a saddening vision of the abhorrent treatment Turing was subjected to in his final years. It is a powerful display which once again combines his personal life with his scientific insights in a unique way, and lets us fully appreciate just how important his thinking has been.

 

“Codebreaker” has been a critical success for the British Science Museum and its curator David Rooney, and rightly so. The exhibition skilfully pieces together, in a relatively small space, the history of one of Britain’s finest minds and the many areas of research he influenced, providing just enough interest and detail to enthral keen enthusiasts and those with only a passing interest alike.

 

“Codebreaker: Alan Turing’s Life and Legacy” was a free exhibition at the British Science Museum to celebrate the centenary of Alan Turing’s birth.

 

Author:  Andrew Cook, SBL

 

All opinions are those of the Author and not those of SBL as a whole

The Alan Turing Legacy – the cycle of missed computing opportunities

Britain exited the Second World War with a wealth of scientific expertise and an industrial base that, whilst in a slightly dishevelled state cosmetically, could be considered world-beating.  Computing, radar, manufacturing and ordnance in 1945 saw Britain leading the way.  Truth be told, the inheritance was squandered, by and large by a country that was financially broke, obsessed with its declining Empire and tired after the best part of 40 years of war.

 

What’s curious is just how badly the legacy was squandered in computing terms.  Name me a British hardware manufacturer that did well in the latter half of the 20th Century?  Ummmmm.   Amstrad?  Ha Ha! No, seriously? Acorn and Sinclair maybe?

 

Most of us were bought home computers like the ZX Spectrum by our parents. Some of the time was spent playing Jetpac, Horace Goes Skiing and Chuckie Egg, all with a five-minute load time (usually followed by a crash and starting again).  But time was also spent programming in a version of BASIC, writing programs and exploring what you could get the machine to do.  The BBC B Micro went into schools, replete with educational games so dull as to render these things pariah status.  But again, kids programmed and shared their programs.  I actually remember being in the First Year (that’s Year 7 in modern coinage) and being in awe of one of the Sixth Formers who had games PUBLISHED for the ZX Spectrum.  Not quite rock star status, but genuinely an achievement to be proud of in 1984, and an inspiration to me and my nerdy chums.

 

Apologies if you are under 35 and this means nothing to you.  If you are over 35, it represented a golden era of programming and learning about programming.  If you are under 35, it may have meant a Sega Mega Drive.

 

I had a Mega Drive, and can claim with pride to have completed Sonic the Hedgehog without cheating and without losing a life.  Halcyon days at university there.

 

The point was that computing had gone from being an interactive process that promoted programming skills, the sharing of home-written code and a mild rise of the nerds, to a world of closed operating systems and entirely gaming-focused platforms.  Who wants to write code when you could plug a copy of Street Fighter II into your Super Nintendo and spend four consecutive hours kicking butt with Ryu’s special fireball move?

 

Parallels can be drawn with home PCs.  As the computing power/price point made it affordable to have a PC or laptop at home, there was a brief flowering of HTML and JavaScript programming amongst users.  But this has been superseded by smartphones and tablets with closed operating systems.  You could argue that the app has sparked something of a renaissance, but the reality is that you write apps for money, and anyway, you have to get them approved by the OS vendor before they are published.  There’s no joy in programming for the hell of it, no messing about with machine code just to see what you could get the thing to do.  The motivation has changed because of the grip the manufacturers have on your home-written code.

The UK Government has made noises about encouraging schools to teach programming for years, and has recently announced that it’s about to start teaching it (kind of) in schools.  It’s years too late.  There is a generation (or possibly two) that had access to affordable computing and clearly came from a nation with the nous and willingness to design and build great technology, but wasn’t given the opportunities, incentives and encouragement to do so.  People from the engineering sector have moaned about the same thing for years.

 

McAfee is doing its bit to help.  At a local level, we do a lot of outreach activities, including coding days in schools, as well as teaching the kids about online safety.  We’ve also found that following up on these activities by presenting at parents’ evenings generates a positive response, although the focus tends to be on what the parents can do to support the kids, since it’s the kids we’re really concerned about.

 

At a more national level, McAfee is helping the BCS with its efforts to build a computer science curriculum for schools.  This is a critical time for reversing the trend of the last 20 years of teaching (or non-teaching) of this topic, and McAfee is investing heavily in the process.  It’s in everyone’s interest that it succeeds, and McAfee believes it has a good long-term return for all parties involved.

 

But back to my overarching point.  The cycle of technology, from open to closed, repeats itself.  Kids right now don’t get ICT; they consume it.  And it means that the true legacy of Turing, a country of people who played and programmed in equal measure, is currently lost, and we are definitely worse off as a result.

 

Author: Graeme Stewart, McAfee

Bletchley Park’s Forgotten Heroes


Whenever anybody mentions code breaking, Bletchley Park and the Second World War, many of those who know the story instantly think of one man: Alan Turing. The Alan Turing story has become the overarching tale to come out of Bletchley Park since the secrets were finally unlocked in the 1970s. His achievements in breaking the ‘impenetrable’ Enigma code during the Second World War are no doubt outstanding, but what about the achievements of those whose stories have not made it into the mainstream?

 

During World War Two, there were over 6,000 people stationed at “Churchill’s House of Secrets” – Bletchley Park. Known as Station X, the site’s most vital role was as a code-breaking station, receiving messages from interceptors and aiming to crack the code, usually within hours of it being sent by the German Wehrmacht. The site didn’t feature on maps of the area, and for decades afterwards everyone who was there was sworn to strict secrecy, having signed the Official Secrets Act when they commenced work.

 

In total, there were 118 code breakers based at Bletchley Park, performing a variety of functions. The high-level intelligence decrypted from the German machines was called ‘Ultra’, and according to many sources it was thanks to this information that the war ended when it did. Had it not been for Ultra, the war might have continued for up to four more years, at the cost of millions of lives.

 

When details of the work undertaken at Bletchley Park did emerge, only a handful of the people who were there got the recognition they deserved, and some of the others who changed the world disappeared from history.

 

It seems that Alan Turing was only half of the story.

 

Enigma wasn’t the only machine used to encrypt messages during the Second World War. A German cipher system called Lorenz, nicknamed Tunny by the British, was used by the German High Command and carried their most top-level traffic. Tunny was encrypted to an entirely different level, a level about which nothing was known. No one in Britain had even seen a Lorenz machine, let alone tried to crack the codes it produced.


Tunny presented a brand-new challenge to the British code breakers, and it was the 24-year-old mathematician Bill Tutte who was called upon to help solve it. Tutte had been stationed at Bletchley Park early in the war and, after being turned down for a position in Turing’s Enigma group, had been given responsibility for deciphering Italian Navy messages as part of the research section.

 

In the summer of 1941, Tutte was transferred to work on the Tunny code and, despite the multiple levels of encryption and an unseen machine, he managed to work out the logic behind the system using pieces of paper, a pencil and his brain.

 

Once the logic behind the code had been discovered, the nine cryptanalysts in the Testery who were working on Tunny were able to decipher some of the most important messages of the war, some of them from Adolf Hitler himself.

 

In a 2010 interview with Computer Weekly (computerweekly.com), Captain Jerry Roberts, the last surviving member of the Testery, said: “Bill Tutte was an astonishingly brilliant man. He was a 24 year old mathematician, and by sheer iron logic he worked out how the [Tunny] system worked. I was working in the same office as Tutte and I used to see him staring into the middle distance and twiddling his pencil and making endless counts. I used to wonder whether he was getting anything done, but he most emphatically was. When you consider that there were three levels of encryption, it was an extraordinary performance,” he said. “It has even been called the outstanding mental feat of the last century, and if you take into consideration everything that happened in the last century…”

 

Following the discovery made by Bill Tutte, the code breakers needed a machine to help them decipher the Tunny codes at a faster speed. The first machine produced, named Heath Robinson, was deemed too unreliable and slow for the job, so an engineer named Tommy Flowers was called upon to design a replacement.

 

Over the following months, Flowers and his team designed and built a machine at the Post Office Research Station in Dollis Hill. The machine was dubbed Colossus, owing to its gigantic size. It was to become the first all-electronic, digital, programmable computer, and was a remarkable feat of design and engineering for its time.

 

The first Colossus machine became active in January 1944 and proved so useful in deciphering the Tunny codes that ten machines were commissioned by Churchill’s Government for use during the war. It was thanks to the decrypting of German Tunny traffic that vital information was obtained ahead of the D-Day Landings in June 1944.

 

After the war ended in 1945, all but two Colossi were dismantled immediately, and Flowers was forced to destroy the blueprints for the machine. Like Tutte’s, all of Flowers’ war work at Dollis Hill and Bletchley Park was top secret and couldn’t be spoken of until it was declassified.

 

The final two Colossi were shipped to GCHQ and were ordered to be destroyed in 1960. It was at this point that Flowers knew his work was going to be lost to history, a moment recalled in Sinclair McKay’s book The Secret Life of Bletchley Park (2010):

“When interviewed some years ago, Dr Flowers himself recalled with some sadness the moment in 1960 when the orders came through to destroy the last two remaining Colossus machines, which had been shipped to GCHQ. ‘That was a terrible mistake,’ said Flowers. ‘I was instructed to destroy all the records, which I did. I took all the drawings and the plans and all the information about Colossus on paper and put it in the boiler fire. And saw it burn.’”


For all of his efforts during the war, Flowers was awarded £1,000, which didn’t even cover the personal investment he had made in the machine.  Shortly after the war was over, Flowers applied for a loan to build a computer with similar technology to Colossus. He was turned down, as the bank didn’t believe such a machine was possible. Little did they know the hand that Flowers had had in winning the war a few years earlier!

 

 

The release of information about Flowers’ war work came much too late to give him the full recognition he deserved. Even his family had no idea of the extent of his involvement in the war, having known only that he had done some ‘secret and important work’ prior to publication.

 

The recognition Flowers did receive included an honorary doctorate from Newcastle University in 1977 and another from De Montfort University in Leicester. It also became known that he was being considered for a knighthood; however, these plans came too late for Tommy, who sadly died aged 92 in October 1998.

 

Likewise, after the war ended, Bill Tutte made a career for himself in academia that took him to Canada, where he lived for most of his adult life. In 1987 his wartime effort was recognised and he was elected a Fellow of the Royal Society of London; however, this came long after his election as a Fellow of the Royal Society of Canada in 1958. In 2001 Tutte won the CRM-Fields-PIMS Prize, a Canadian award in the mathematical sciences. The recognition for his work came late in his life, as Bill died in May 2002 aged 84.

 

During the course of World War Two, the British code breakers were responsible for breaking up to 4,000 German messages per day, helping to keep the Allied forces one step ahead of the Nazi war machine. Their contribution was decisive in the outcome of the war.

 

Bill Tutte and Tommy Flowers are just two of the people who helped to change the course of World War Two. There are no doubt many more men and women who worked remarkably hard to aid Britain’s war effort and who simply haven’t had the recognition they deserve for their work.

 

Author: Helen Morgan, CyberTalk

Opinions are those of the author and not those of SBL as a whole

The Enigmatic Alan Turing – A Biography


Alan Turing made a rare and remarkably rich contribution to society, for which the human race will be eternally indebted. In the barely 20 years between his university graduation and his death in 1954, he pioneered ground-breaking ideas in the fields of mathematics, philosophy, biology and, most significantly, computer science. On top of all this, he was instrumental in changing the course of World War II, preventing the deaths of millions. Yet he was little known in life and remained relatively unknown in death until 1974, when details about his work at Bletchley Park were finally released.

Alan Turing was born on 23 June 1912 to Julius and Ethel Turing. He showed signs of genius from a very early age and was extraordinarily quick to learn new skills. It’s said that he taught himself to read within just three weeks. Numbers fascinated the young Turing, so much so that he developed a habit of stopping at every street light in order to find its serial number.

Turing had a varied experience of the English school system and, although his aptitude was recognised as “genius” by some teachers, he was uninterested in the classical education of the curriculum, fixated instead on science and mathematics. He also encountered bullies during his school life, and claimed that he learned to run fast in order to “avoid the ball”. Running would later become a dedicated pastime, one which Turing found gave him clarity of thought. He even went on to achieve world-class marathon standards: his best time of 2 hours 46 minutes was only 11 minutes slower than that of the Olympic winner in 1948.

 

Turing began to enjoy his school life more when he moved into the Sixth Form at Sherborne, where he was allowed to specialise in science and mathematics. His fascination with science even led him to demonstrate the Earth’s rotation for himself by building a replica of Foucault’s pendulum in the dormitory stairwell.

 

Awarded a major scholarship to King’s College, Cambridge in 1931, Turing read theoretical mathematics and excelled, obtaining a distinction upon graduation. He was elected a Fellow of King’s College in 1935. Only a year later he presented his first paper to the London Mathematical Society.

 

The following years saw the pinnacle of Turing’s genius and innovation. After graduating, Turing focused his efforts on a famous fundamental problem in mathematics known as the Decidability Problem, a product of the work of the German mathematician David Hilbert. In 1900, Hilbert appealed to his contemporaries to find an algorithmic system capable of answering all mathematical problems – a system which must be complete, decidable and consistent. These three conditions would mean that every mathematical statement could be proven or disproven by clear steps that would reach the correct outcome every time.

 

Kurt Gödel had already dealt Hilbert’s programme a blow in 1931 by showing that consistency and completeness could never coexist in such a system. However, the question of decidability (i.e. whether definite steps could always be followed in order to prove or disprove a statement) was left unanswered.

 

Turing went on to solve the Decidability Problem through his ideas for his famous Turing Machine. He conceived of a simple but powerful computer that would understand only the symbols 0 and 1. The idea involved an infinite tape of these symbols that would be read and written by the machine, with different rule tables leading to a variety of functions and the solving of different problems. The machine’s design was a breakthrough in that it defined and codified what an algorithm is. In developing this heavily mathematical concept, Turing found that there exist problems that no such algorithmic system can solve, thereby proving that Hilbert’s final condition of decidability could not be met.
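To give a flavour of the idea, here is a minimal sketch of such a machine in Python. The states and rule table are illustrative inventions for this article rather than anything drawn from Turing’s 1936 paper; this particular machine simply flips every binary digit on its tape and then halts.

    def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
        """Run a simple one-tape Turing machine until it reaches the 'halt' state."""
        cells = dict(enumerate(tape))      # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            # Each rule maps (state, symbol read) -> (symbol to write, move, next state)
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Rule table for a trivial machine that flips every bit, then halts at the first blank.
    flip_rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run_turing_machine("010011", flip_rules))   # prints 101100_

Different rule tables give different behaviours, which is precisely the sense in which one machine design can compute many different things.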

 

Turing, at the age of just 23, had solved a problem that had baffled experts in the field for over 30 years. His name would go down in the history books, and the title “genius” began to be ascribed to him once again. No one could have predicted, however, that the principle Turing conceived, the Universal Turing Machine, would underpin the development of computing technology for decades to come.

 

After several more years of study in America, Turing was awarded a PhD by Princeton University. On 4 September 1939, the day after the declaration of war by Prime Minister Chamberlain, Turing reported for duty at the wartime headquarters of the Government Code and Cypher School (GC&CS) at Bletchley Park, Buckinghamshire. It was here that Turing took the computing ideas he had already formed and adapted them to the field of cryptanalysis, in order to decipher Enigma, the code used by the German armed forces to protect their radio communications.

 

Before Turing started work at Bletchley Park, progress on decryption had been poor, with no Enigma-coded message being successfully decoded in almost 10 months. Turing contributed crucially to the design of electro-mechanical machines (known as ‘bombes’), the first prototype of which, named Victory, began successfully decoding Enigma in Spring 1940.

 

By 1943, Turing’s machines were cracking an astounding total of 84,000 Enigma messages each month – two messages every minute. These operations were kept absolutely confidential, and Nazi forces were completely unaware that their strategic conversations were being read by the British Admiralty, sometimes within 15 minutes of being transmitted.

 

A pivotal use of this advantage was to stifle the efforts of the U-boats, the prevalence of which was the only thing that Churchill said “ever really frightened” him. U-boats had destroyed 701 Allied ships and 2.3m tons of vital cargo in the first nine months of the conflict alone. Turing and his colleagues were able to significantly weaken the U-boats’ hold over the Atlantic Ocean by intercepting their locations. Allied ships could therefore dodge the U-boats in the vast openness of the Atlantic, allowing the Allies to transport fuel, food, troops and ammunition from America to Britain.

 

If the U-boats had been allowed to halt the movement of this cargo, such an effective attack at Normandy would never have been possible. Without the D-Day Landings, the war could have gone on for two to three more years, which, analysts say, could have meant a further death toll of up to 21 million.

 

Having played a key role in the winning of the war by the Allies, Turing was later awarded the OBE for his wartime services.

 

In 1946, Turing was invited to the National Physical Laboratory, where he produced the design for one of the first stored-program computers, the Automatic Computing Engine (ACE). He went on to work at the University of Manchester from 1948, where he developed software for the Manchester Mark 1, a truly ground-breaking computer.

 

Meanwhile, Turing remained fascinated by the potential of computer technology. His seminal 1950 article “Computing Machinery and Intelligence” set the philosophical world alight with the proposal that machines could, in principle, think in a way comparable to humans. This sparked a debate that still rages today and paved the way for the Artificial Intelligence movement, which works tirelessly to improve the capability of computers with the ultimate goal of machine intelligence to rival our own.

 

In 1952, Turing moved into a new area when he became enamoured with mathematics in the natural world, and pioneered the theory that the Fibonacci sequence is ubiquitous in morphogenesis (the formation of an organism’s shape and patterns). This astounded biologists, and Turing quickly established himself as a highly esteemed, revolutionary thinker in the field.
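As a quick aside for readers unfamiliar with the sequence mentioned above: each Fibonacci number is simply the sum of the two before it, and the spiral counts seen in sunflower heads and pine cones tend to be adjacent numbers from the sequence. The few lines of Python below are purely illustrative and are not drawn from Turing’s own morphogenesis papers.

    def fibonacci(n):
        """Return the first n Fibonacci numbers; each term is the sum of the previous two."""
        sequence = [1, 1]
        while len(sequence) < n:
            sequence.append(sequence[-1] + sequence[-2])
        return sequence[:n]

    print(fibonacci(10))   # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
    # Spiral counts commonly reported in sunflower heads (34 and 55, for example)
    # are adjacent terms of this sequence - the kind of pattern Turing was studying.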

 

Sadly, his ground-breaking work on morphogenesis was cut short. Through most of his academic and professional life, Turing was open with his friends and colleagues about his homosexuality. In 1952, however, Turing began a new relationship, and this became an issue, eventually leading to his social and emotional collapse. Just weeks after his relationship with Arnold Murray began, Turing’s home was burgled. Turing naturally reported the crime to the police, who proceeded to charge Turing and Murray with gross indecency, since homosexual acts were illegal at the time.

 

Turing was convicted in March 1952 and given the choice of imprisonment or probation on the condition that he undergo hormonal treatment to reduce his libido and cause impotence. He chose the latter. Disgraced by the British judicial system and subjected to enforced treatment to reduce his masculinity, he became deeply depressed, and it is widely believed that this led to his suicide in June 1954.

 

When the history of Turing’s legacy and contribution during WW2 became better known, public anger at his treatment grew. After a petition with thousands of signatures in 2009, Gordon Brown apologised on behalf of the Government for the “utterly unfair” way in which Turing was treated.

 

The Government has also recently announced its support for granting a posthumous pardon to Turing for his conviction. In October 2013, the third Parliamentary reading of the Alan Turing (statutory pardon) Bill will take place. If the bill is passed, it will help reflect public regret for the way Turing was treated and the high level of esteem in which he is now held.

 

Modern society is indebted to this remarkable man in more ways than we can imagine. Turing’s contributions to code-breaking in World War II were arguably instrumental in the Nazis’ defeat, preventing the deaths of millions and changing the course of history. Furthermore, his revolutionary work in computing provided the foundation for a vast and ever-growing array of technologies on which the human race will continue to rely for many years to come.

Author: Tom Hook

The dawn of the GPMS – A Brave New World

Not a particularly snappy and exciting subject for a first blog.  Nor is it a subject free from the danger of upsetting anyone; in fact, I would say it’s a political hot potato!  That said, one of my favourite quotes (often attributed to Aristotle) is simply this: “to avoid criticism, say nothing, do nothing, be nothing”, so in that spirit I’ll give it a go and offer some slightly opinionated commentary on the subject.

 

For the uninitiated, and in summary: GCHQ (the National Technical Authority) is in the process of a dramatic overhaul of the existing data classification programme.  The formulation of this programme (GPMS) is still progressing and subject to change before it is officially unleashed upon the public sector next year. However, indications are that the PROTECT (Impact Level 2), RESTRICTED (IL3) and CONFIDENTIAL (IL4) classifications will move across to a single classification called “OFFICIAL”.  The higher levels of classification, SECRET (IL5) and TOP SECRET (IL6), the apex of the security pyramid, will remain.  Or so we believe.  I should caveat that the situation still has a degree of fluidity, and I can only comment upon the detail that has so far been discussed within the various forums.
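To make that shift concrete, here is a minimal, purely illustrative sketch of the mapping as I currently understand it. The tier names and impact levels are the ones quoted above; the code itself is my own invention rather than any official scheme definition, and, as noted, the detail may yet change.

    # Indicative mapping from the current protective markings (with impact levels)
    # to the proposed new tiers, as described in this post. Subject to change.
    OLD_TO_NEW = {
        "PROTECT (IL2)":      "OFFICIAL",
        "RESTRICTED (IL3)":   "OFFICIAL",
        "CONFIDENTIAL (IL4)": "OFFICIAL",
        "SECRET (IL5)":       "SECRET",       # higher tiers expected to remain
        "TOP SECRET (IL6)":   "TOP SECRET",
    }

    def new_classification(old_marking):
        """Look up the proposed new tier for an existing marking."""
        return OLD_TO_NEW.get(old_marking, "UNKNOWN - review locally")

    print(new_classification("RESTRICTED (IL3)"))   # OFFICIAL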

 

There is no doubt that, whatever the final detail, this will have profound and far-reaching implications for security decisions throughout the public sector.  There will be a huge degree of complexity and, with it, market confusion.  That said, I have chosen the subject of encryption because it provides a good lens through which to start understanding these implications: the choice of encryption technology is (or was) fixed and aligned to the current CESG Assisted Products Scheme (CAPS), and the GPMS will act as the proverbial nuke to that system.

 

Common criticisms of the CAPS programme (with maybe one or two of my own thrown in) are:

 

  • It was too expensive, and the costs of accreditation and certification were invariably passed on to customers who had to pay more for certified technology.
  • It was far too slow.  It lagged behind the pace of technology innovation, which meant that customers buying CAPS products were often buying old technology that was not as new, good or capable as that of their potential adversaries, for example.  And the gap was widening at an exponential rate.
  • To attain certification, some functionality might be restricted or removed, which compounds the above: customers often knew they were paying more money for less functionality and less capability.
  • It created a culture whereby IT and security practitioners would use these products in anger, and then defend bad decisions or inappropriate use of technology by producing certificates that would in some way validate their choices.
  • Where there were gaps (areas that had no certified products), nothing was used!  Email encryption is a well-known and oft-cited example.  This culture of defending the decision with a certificate meant that it was deemed safer to ignore a problem than to use a commercially available product that would function perfectly well up to a certain level, albeit uncertified (see the short sketch after this list).  There is a collection of rather embarrassing anecdotes doing the rounds on this specific subject.  It is a whole blog in and of itself, but here is one story to whet your appetite regarding email encryption: http://www.computing.co.uk/ctg/news/2120226/blunkett-france-tapped-uk-government-emails
  • The scheme was not commensurate with the reality of handling RESTRICTED data as a whole.  For example, a Baseline CAPS product would need a strong HMG algorithm, long and complex passwords, and GCHQ-generated key material, yet once data was out of the digital domain and onto paper, to say the controls were somewhat weaker would be an understatement.  The effect was often some bizarre decision-making when it came to the classification of data and data types: decisions that focused on making processes easier rather than on classifying the data appropriately.
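To illustrate that point about perfectly serviceable, if uncertified, off-the-shelf encryption, here is a minimal sketch using the widely available open-source Python ‘cryptography’ package. It is purely illustrative: it shows that strong, authenticated encryption is trivially available off the shelf, and it says nothing about whether such a tool would be appropriate for any particular classification or deployment.

    # Minimal illustration: strong, authenticated symmetric encryption is freely
    # available off the shelf, even though a library like this carries no CAPS
    # certification. Requires the open-source 'cryptography' package.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()     # in practice the key must be managed and protected properly
    cipher = Fernet(key)

    token = cipher.encrypt(b"Draft minutes - not for wider circulation")
    print(token)                    # ciphertext, safe to send by email
    print(cipher.decrypt(token))    # the original message, recovered with the key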

 

There are more, but I think the point is made – a point with which the National Technical Authority actually pretty much agrees, or at least its recent actions would seem to suggest as much: it’s broken, no longer relevant, falling further behind and in need of fixing with some urgency.

 

To compound this further, GCHQ’s customer base is expanding rapidly as we all scramble to get UK PLC ready to defend itself in the brave new interconnected world of the Cyber Domain.

 

In light of the above, then, do I think the GPMS is a good thing?  On balance, and from what I’ve heard so far, the answer is an emphatic yes!  My main reasons for this view are:

 

  • It will move the public sector away from the unhealthy bureaucratic culture of defending decisions based solely upon certification.
  • It will require the public sector to make its own decisions regarding security – decisions that are local and made with local context, taking into account the risks, the threats, the impacts and, crucially, the business requirements of each organisation.
  • It will enable better data classification decisions to be made.  This will open up new choices for customers: can we use COTS technology?  Can we implement a cloud service?  Can we utilise technology we may already own, e.g. BitLocker?  There!  I’ve said it!  Tin hat goes on.
  • Crucially, and above all, accountability moves to the business and data owners – where it should be, locally, where context can be applied.  As with everything in life, context is critical in order to provide rationale and reasoning.

 

Speaking from the industry perspective, another effect of the GPMS will be a rapid reconfiguration of the technology and service market in this space.  This is great!  It has been needed for a long time, and I am convinced it will come as a breath of fresh air to the benefit of us all.  We will all need to up our game; we will all need to innovate; we all have a more consultative role to play, because the National Technical Authority needs to have a wider effect across a much wider customer base, and to do so it needs to complete its evolution into an organisation that provides guidance and support rather than mandates and certification.

 

The GPMS will make customers responsible and accountable for data classification and data protection decisions.  This will enable them to explore new technologies and techniques, give them access to new ideas, and allow them to acquire capability in a much more cost-effective way, something also driven by the economic and budgetary pressures they are under.

 

To succeed in this new environment, they will need our experience and advice.  They will need our help to implement it properly, appropriately and to their best advantage.  Within industry, we all have a duty to make the changes necessary to support these initiatives.  UK PLC will have to change and adapt at some pace to support our electronic safety, our security and our economy in the Cyber Domain.  The GPMS is merely one component in a wider programme of change vital to this sector, and we all now have an instrumental role in making it the success it needs to be.

Author: Scott Cattaneo
