In the summer of 2005 a herd of twenty-three driverless cars barrelled across the Nevada desert, watched by scientists, engineers and nervous representatives of military funding agencies. Several hours later the first car crossed the finish line, claiming the $2 million prize for the DARPA Grand Challenge and, naturally, the keen attention of DARPA itself. But it wasn’t just their interest that was piqued – journalists were also waiting to see if the whole field of artificial intelligence might emerge from the wilderness along with the beaten-up cars. John Markoff, writing for the New York Times, began his coverage of the event by describing AI as:
“…a technology field that for decades has overpromised and underdelivered… At its low point some computer scientists and software engineers avoided the term artificial intelligence for fear of being viewed as wild-eyed dreamers.”
It’s safe to say that artificial intelligence as a field has largely beaten off that image today, and is currently enjoying a golden age of investment, growth and discovery. In his 2005 book ‘The Singularity Is Near’, Ray Kurzweil wrote that “the AI winter is long since over” – ‘AI winter’ being the term people use to describe the catastrophic slumps the field experiences following a period of prosperity. New techniques emerge that seem to solve problems better than ever before, forecasts and predictions are made about the future, hopes are raised, and then eventually the bubble of excitement bursts under the weight of its own expectations. The winter that follows is long – research funding is cut, tech startups shutter, businesses and governments withdraw interest, and the public loses its faith in the field. When Kurzweil wrote that the winter was over he may have been talking specifically about the winter that took place in the 1990s, but it’s possible he was also talking more generally – many AI researchers I’ve spoken to believe this is it, that there will be no more winters. In 2012 Demis Hassabis, founder of a then little-known company called DeepMind Technologies, declared that ‘the time is right for a push towards general AI’.
2017 is the summer solstice for artificial intelligence, the warmest and longest day, the kind of day that makes it feel like summer might last forever. But nothing lasts forever, and this season will pass like all the others before it. The only thing we can affect is how bitter and harsh the coming winter will be, and that is largely dictated by how badly let down people feel when the bubble finally bursts. What dream did we sell them, what did we let them believe, how did we advise them to act and spend their money? We need to start thinking about the image of artificial intelligence this year, and change it for the better.
Today is Ada Lovelace day, a day “about sharing stories of women — whether engineers, scientists, technologists or mathematicians — who have inspired you to become who you are today”. Lovelace was an incredible woman, described by Charles Babbage as “The Enchantress of Numbers”, and one of the first people to think about the kind of concepts that became integral to modern computer science. Through her understanding of Babbage’s Analytical Engine (a complex invention that was never built, but had many of the crucial features of a programmable computer) she wrote the first computer programs for a machine that didn’t exist. I wrote about how amazing she was as part of a piece for I, Science.
Today is about Ada, but it is also about celebrating inspirational women in science and technology, and given the huge influence she has had on my entire education, as well as on ANGELINA itself, it seems foolish not to write something today about Azalea Raad.
For the last two weeks I’ve been answering questions and chatting with schoolkids as part of I’m A Scientist, Get Me Out Of Here! and today I discovered I was the last scientist standing in my zone, and I’ve won funding to put towards science communication! IAS 2012 was a really great experience, and if you’re a scientist – in any field, doing any form of work – you should think about taking part.
Below, I’ve written a bit about the event, some misconceptions I had before taking part, and why I think it’s important you sign up.
A lot of very mad things happened in the past 24 hours, most of which I’m trying to document on ANGELINA’s ‘In The Press’ page, but in short – Engadget, Kotaku, The Verge, New Scientist… the list goes on. Lots of people have been talking about ANGELINA, and I’ve been getting a huge amount of traffic over here. I just wanted to spend a few hundred words explaining why this matters to people like me, and why I’m so grateful when things like this happen.
This week ANGELINA and I are featured in the Tech section of UK science magazine New Scientist, which is a great honour and was a huge amount of fun to do. As part of the interview, we made a platform game for their site in the same vein as the Santa games you might have played back in December. I made a few tweaks to the system to improve the layout of powerups and general map design – I really recommend you go check the game out! The game may appear up here eventually, but as long as the New Scientist link continues to work I will probably leave it only on their site.
The New Scientist exposure has also led to a bunch of other mentions of ANGELINA all over the net. I’m really honoured! Check out the Press page for more. Meanwhile, here’s what’s going on with the project…
I’m giving one of the Advanced Object-Oriented Programming lectures to Imperial’s first-year Computing students next week. Definitely looking forward to it, and I hope they enjoy the talk I’ve got in store. I also thought I’d digest it and post it up here, as it neatly does something I’ve been wanting to do for a while on the blog – discuss the basics of evolution that underpin ANGELINA.