Can machines think? Lessons from Alan Turing
August 5, 2019
By Andrew Peacock, Account Manager
In his seminal 1950 paper ‘Computing Machinery and Intelligence’, Alan Turing wrote: “We can only see a short distance ahead, but we can see plenty there that needs to be done.”
How right he was.
Almost 70 years after publishing the paper, Turing – who is broadly regarded as the father of theoretical computer science and artificial intelligence – will be immortalised as the face of Britain’s latest £50 note.
During Turing’s lifetime (1912–1954), someone hearing the term ‘computer’ would not picture the electronic devices we use today. At the time, a ‘computer’ was a person: a mathematical clerk who carried out calculations by rote. (I imagine them wearing a classic green eyeshade visor.)
Turing’s work set us on a course which led to the transformative computing revolution that has defined the modern age, the evolutionary pace of which shows no sign of slowing.
While investigating the Entscheidungsproblem – which essentially asked whether there could be a formal method that would reduce mathematics, in its entirety, to a set of procedures that human computers could carry out – Turing invented the universal Turing machine.
Although only an abstract idea at the time, this machine captured the central logical principles of today’s digital devices. Essentially, Turing claimed that anything humanly computable could also be computed by his machine. According to the Church–Turing thesis, Turing machines and the lambda calculus (developed by Alonzo Church in the early 1930s) are each capable of computing anything that is computable.
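To make the idea concrete, here is a minimal sketch of a Turing machine in Python. The transition table is a toy example of my own (it is not from Turing’s paper): it scans the tape left to right, inverting each bit, and halts at the first blank. The point is only to show how far a handful of (state, symbol) rules can go.

```python
# A minimal Turing machine simulator: an illustrative sketch only.
# The 'flip_bits' rules below are a hypothetical toy program,
# not anything Turing himself wrote down.

def run(tape, transitions, state="start", blank="_"):
    """Run a Turing machine until it reaches the 'halt' state."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        # Each rule maps (state, symbol) -> (new state, symbol to write, move)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Toy rules: replace 0 with 1 and 1 with 0 while moving right; halt on blank.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run("1011", flip_bits))  # -> 0100
```

Everything the simulator does reduces to reading a symbol, writing a symbol, and moving the head one cell – which is precisely why Turing’s abstraction maps so cleanly onto real digital hardware.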
Turing had a remarkable academic career.
Born in London in June 1912, he studied at King’s College, Cambridge, where, following research into probability theory and the writing of ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, he earned his MA.
From September 1936 to July 1938, he studied under Alonzo Church at Princeton University, where he earned his PhD.
A year later, following the outbreak of war between Britain and Germany, Turing reported to Bletchley Park, the wartime station of the Government Code and Cypher School. (To find out more about Turing’s remarkable wartime career, visit the Bletchley Park website or, better still, visit in person.)
On 8 June 1954, Turing was found dead by his housekeeper; beside him lay a half-eaten apple, which many believe was the vehicle used to deliver the cyanide that cut short his life.
More than 60 years after his death, Turing is still admired and respected by today’s tech world, and the accolades given to him by the community are substantial – from memorials and namesake buildings to, in addition to his face on Britain’s highest-denomination banknote, the ACM’s annual A.M. Turing Award.
Some even hypothesise that Apple’s famous icon is a homage to Turing, though this is now widely disputed.
Beyond the accolades, Turing’s proposal of a machine driven by changeable instructions, encoded as a series of 0s and 1s, underpins humanity’s greatest achievements – such as putting man on the moon – and shapes our day-to-day lives. That’s right: even your fancy coffee Instagram photos depend on this man’s genius.
Looking to the future, it is important to remember Turing’s imitation game. Devised to test whether a human interrogator could distinguish the responses of a human from those of a computer, the game has inspired conversational systems such as Microsoft’s Cortana and Apple’s Siri.
As the ceaseless effort to improve our modern lives soldiers on, Turing’s vision and work will continue to enhance our digital world. He not only deserves to feature on the new £50 note; he deserves instant recognition, as without him, it’s fair to say, today’s world would be unrecognisable.
Pride and prejudice
To find out more about the tragic prejudice and persecution that led to Turing’s death, please see the Touchstone blog here.