Duc Pham, Chance Professor of Engineering, Department of Mechanical Engineering, University of Birmingham
Unless you are a hermit, you cannot escape the fact that artificial intelligence (AI) is now all the rage. Everyone is talking about it; you can hear it discussed ad nauseam on the radio and television, and there are countless articles mentioning it in the print media and on the internet. People from all spheres of life (pauper [1], prince [2], pope [3] and prime minister [4]) have been making various pronouncements on it, and governments and large corporations have been throwing large sums of R&D money at it.
In the UK, as part of its Industrial Strategy, the government has declared ‘AI and data’ one of its Grand Challenges and pledged to put the country at the forefront of the AI and data revolution. Last May, in a speech at Jodrell Bank, in Macclesfield, UK, Prime Minister Theresa May set her government on a mission to “use data, artificial intelligence and innovation to transform the prevention, early diagnosis and treatment of diseases like cancer, diabetes, heart disease and dementia by 2030” [4].
Other European countries have been no less enthusiastic on the subject of AI. For example, Finland, together with Estonia and Sweden, is planning to become “Europe’s number one laboratory for AI trials” [5]. In a strategy report last year, the Finnish government estimated that approximately one million of its citizens would eventually need to update their AI skills. Already, an initiative has educated one per cent of Finns, with no programming knowledge required, to understand AI and the opportunities it offers.
So, what is AI and why is there now such a fuss about it? After all, it has been around a long time. In fact, the American computer scientist John McCarthy coined the term over sixty years ago, during a summer workshop at Dartmouth College, in Hanover, US, where he was working as an assistant professor. AI is now commonly accepted as that branch of computer science concerned with making computers and other machines perform tasks typically requiring human intelligence.
AI has come in waves. In the mid-1950s, the so-called early days, Herbert Simon and Allen Newell developed computer programs for universal problem solving. However, given the computing technology available at the time, these programs could only address toy problems. Meanwhile, other researchers, most notably Frank Rosenblatt, looked to build computers modelled on the human brain. Rosenblatt focused on the development of a single-layer artificial neural network (ANN) called a perceptron that could be trained to perform simple pattern recognition tasks [6]. Yet success was limited, and neural computing suffered a serious setback during the 1970s after Marvin Minsky and Seymour Papert pessimistically observed that Rosenblatt’s perceptron could only learn linearly separable patterns (i.e. data classes separable by a straight line in input space).
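To make the linear-separability point concrete, here is a minimal, purely illustrative sketch in Python (my own, not taken from this article or from Rosenblatt’s work) of a single-layer perceptron. It learns the AND function, whose two output classes can be divided by a straight line, but cannot fully learn XOR, whose classes cannot; the learning rule shown is the standard perceptron rule, and the data, parameter values and function names are assumptions made for illustration.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single-layer perceptron on (inputs, target) pairs."""
    w = [0.0, 0.0]  # weights for the two inputs
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - output
            # Perceptron learning rule: nudge weights towards the target
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # linearly separable
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # not linearly separable

for name, data in [("AND", AND), ("XOR", XOR)]:
    w, b = train_perceptron(data)
    correct = sum(predict(w, b, *x) == t for x, t in data)
    print(f"{name}: {correct}/4 patterns classified correctly")
    # Expect 4/4 for AND but at most 3/4 for XOR, illustrating Minsky and
    # Papert's observation about single-layer perceptrons.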
In the mid-1960s, there was another AI wave. We saw the advent of software for solving problems that were difficult and real but also narrow in scope, for example, medical diagnosis and mineral prospecting. The computer programs developed, also known as expert systems, contained codified human expertise and used that knowledge to reach the required solution. We also saw the birth of fuzzy logic for handling problems characterised by imprecision and nature-inspired algorithms such as genetic and evolutionary algorithms for finding solutions to complex optimisation problems.
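As a purely illustrative sketch of what “codified human expertise” means in practice, the short Python fragment below forward-chains two invented IF-THEN rules to reach a conclusion from a set of facts; the rules, facts and function name are hypothetical and are not drawn from any expert system mentioned in this article.

# Each rule pairs a set of conditions with the conclusion it supports.
rules = [
    ({"fever", "cough"}, "possible_infection"),
    ({"possible_infection", "chest_pain"}, "refer_for_xray"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are all known facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

result = forward_chain({"fever", "cough", "chest_pain"}, rules)
print(result)  # now includes 'possible_infection' and 'refer_for_xray'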
Interest in neural networks and expert systems resurged in the 1980s. The Fifth Generation Computer Systems initiative by the Ministry of International Trade and Industry in Japan triggered other funded research programmes here and elsewhere that focused on information technology and intelligent computing. For example, the UK government had its five-year Alvey programme of collaborative research in information technology. Europe-wide, over the period 1985–1998, there were five consecutive European Strategic Programmes on Research in Information Technology (ESPRIT).
It is not certain what started the current AI fever or when exactly it began. Whereas previous waves were associated with step changes in computer technology (the births of the digital computer, the minicomputer and the microcomputer), there is no single phenomenon to which today’s renewed intense interest in AI can be attributed. Instead, several factors have contributed to it: faster and cheaper hardware, the availability of cloud services, autonomous cars, more powerful machine learning paradigms and spectacular software successes such as IBM’s Watson and Google’s DeepMind.
So, when will the fever subside? To find the answer, I was tempted to apply a theory by a UC Berkeley professor about the duration of research fashions. Cynically, he posited that they generally last 7–8 years or roughly two US election cycles. However, as election cycles differ around the globe and, moreover, the precise beginning of the current AI fever is unknown, that theory cannot predict the end with certainty. On the other hand, without having to be a great futurist, I can confidently say that this AI fever will pass when it is time for another fad to take over.
www.birmingham.ac.uk/schools/mechanical-engineering/index.aspx
References
[1] McCormick, E. (2017). Big Brother on wheels? Fired security robot divides local homeless people. The Guardian, 17 December. Available at: bit.ly/2HkfZbg
[2] Furness, H. (2018). Prince Charles warns of ‘crazy’ AI world of ‘part human, part machine’. The Telegraph, 6 September. Available at: bit.ly/2Mjs5jx
[3] Flashman, G. (2018). Robots must work for the good of humanity, the Pope tells Davos. World Economic Forum, 22 January. Available at: bit.ly/2Ml0Gh2
[4] PM speech on science and modern Industrial Strategy, 21 May 2018. GOV.UK. Available at: bit.ly/2U238vO
[5] Delcker, J. (2019). Finland’s grand AI experiment. POLITICO, 2 January. Available at: politi.co/2DkUDGD
[6] O’Riordan, A. (2005–6). An overview of neural computing (CS5201 lecture notes). University College Cork. Available at: bit.ly/2RGKrRF