AI People You Should Know About, Pt. 1

I saw an article today about the future of AI in games and suchlike, and I was tempted to start tweeting about it, but that inevitably leads to boring arguments and isn’t very constructive. Instead, I’m going to give you a list (in no particular order) of some researchers who I think are really interesting, who are important to the future of game AI, who have interesting things to say, and, most importantly, who I don’t see interviewed or talked about enough. They’d all make great people to talk to for articles, features and interviews, and each one has a research portfolio that paints a cool future for games. Go check them out!

Continue reading

Would You Look At That!

I’ve been really excited about level design recently, and I’ve been reading a lot of work by folks like Robert Yang about lighting, space, and building worlds in 3D. It’s amazing stuff and it links in really well to the research I want to do right now (mostly because it’s influencing the research I want to do right now!). I wanted to write a little update about some work I did recently along these lines – building a level generator that uses in-game cameras to evaluate levels.

Continue reading
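If you’re wondering what “using in-game cameras to evaluate levels” might look like in the abstract, here’s a very rough toy sketch of the general idea – generate candidate levels, drop virtual cameras into them, and keep the level whose cameras can see the most. This is just my own illustration (the grid, the scoring and all the numbers are invented for the example), not the actual generator described in the post.

```python
import random

GRID_W, GRID_H = 20, 20

def generate_level():
    """Randomly fill a grid with walls (1) and floor (0)."""
    return [[1 if random.random() < 0.35 else 0 for _ in range(GRID_W)]
            for _ in range(GRID_H)]

def line_is_clear(level, x0, y0, x1, y1):
    """Walk the straight line between two cells; blocked if we hit a wall."""
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(1, steps):
        x = x0 + round((x1 - x0) * i / steps)
        y = y0 + round((y1 - y0) * i / steps)
        if level[y][x] == 1:
            return False
    return True

def visible_cells(level, cam_x, cam_y, radius=6):
    """Count floor cells a camera at (cam_x, cam_y) can see within a radius."""
    seen = 0
    for y in range(GRID_H):
        for x in range(GRID_W):
            if level[y][x] == 0 and abs(x - cam_x) + abs(y - cam_y) <= radius:
                if line_is_clear(level, cam_x, cam_y, x, y):
                    seen += 1
    return seen

def score_level(level, num_cameras=10):
    """Drop cameras on random floor cells and average how much each one sees."""
    floors = [(x, y) for y in range(GRID_H) for x in range(GRID_W) if level[y][x] == 0]
    if not floors:
        return 0
    cams = random.sample(floors, min(num_cameras, len(floors)))
    return sum(visible_cells(level, x, y) for x, y in cams) / len(cams)

# Generate-and-test: keep the candidate level whose camera views score best.
best_level = max((generate_level() for _ in range(50)), key=score_level)
```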

Appreciating Bots

Recently, Appreciation Bot – my Twitter bot that responds to museum artefacts with pseudo-intellectual responses – tweeted something a bit off-colour. Not something intentionally offensive perhaps, but certainly something that would raise eyebrows were a human to tweet it. I didn’t include the tweet directly but you can view it here. Even a bot tweeting this elicited some responses from people, and I wanted to write a bit about the bot, why this happened, and what it made me think of. Before I go any further, let me just say: my bots shouldn’t offend people, and when they do it’s my fault. But this event did throw up some interesting things for me to think about. Continue reading

An EXAG Science IV

(This is a series of short ‘previews’ of papers to be presented at the upcoming Experimental AI for Games workshop at AIIDE 2014. Tune in live on Twitch on October 8th to catch the presentations of these papers, or find the PDFs online at http://www.exag.org)

Procedural content generation (PCG) is a thriving area for games. Everyone from indies to AAA developers is using it – Spelunky, Minecraft, Diablo, Dwarf Fortress, and many others have PCG at their core. But are the games we have now using PCG in all the ways they could? Where has PCG been, and where can it go next? Gillian Smith, in her paper “The Future of Procedural Content Generation in Games”, looks at PCG through five major lenses and asks what unexplored areas the future might hold. Read on for a preview. Continue reading

An EXAG Science III

(This is a series of short ‘previews’ of papers to be presented at the upcoming Experimental AI for Games workshop at AIIDE 2014. Tune in live on Twitch on October 8th to catch the presentations of these papers, or find the PDFs online at http://www.exag.org)

Game stories often have an intended path for the player to follow, but players don’t always play along. Sometimes they simply miss the main story thread; other times they actively try to foil the intended story arc. Is there a way to adjust the story or the world to keep players on track? Can an interactive narrative give players unconstrained choices while still maintaining the intended story? Justus Robertson and R. Michael Young, in their paper “Gameplay as Online Mediation Search”, present the General Mediation Engine (GME), a system that guides players along an author’s intended story path in a game world. Read on for a preview of how it works. Continue reading
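To give a flavour of what “mediation” means here, this is a toy sketch of the general idea – entirely my own illustration, not GME itself: after each player action, check whether the intended ending is still reachable, and if it isn’t, quietly adjust the world so that it is. The facts, actions and goal are all made up for the example.

```python
# Toy mediation loop: my own illustration of the general idea, not GME.
# The world is a set of facts; the "story" is a goal the author wants
# to remain reachable no matter what the player does.

INTENDED_GOAL = {"has_key", "door_open"}

def apply_action(world, action):
    """Update the world's facts in response to a player action."""
    world = set(world)
    if action == "take_key":
        world.add("has_key")
    elif action == "burn_key":
        world.add("key_destroyed")
    elif action == "open_door" and "has_key" in world:
        world.add("door_open")
    return world

def story_reachable(world):
    """Stand-in for a planner: the ending is reachable while a key exists."""
    return "has_key" in world or "key_destroyed" not in world

def mediate(world, action):
    """Apply the action; if it breaks the story, intervene to repair the world."""
    world = apply_action(world, action)
    if not story_reachable(world):
        # Intervention: a spare key turns up somewhere the player hasn't looked.
        world.discard("key_destroyed")
        world.add("spare_key_placed")
    return world

world = set()
for action in ["burn_key", "take_key", "open_door"]:
    world = mediate(world, action)
```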

An EXAG Science II

(This is a series of short ‘previews’ of papers to be presented at the upcoming Experimental AI for Games workshop at AIIDE 2014. Tune in live on Twitch on October 8th to catch the presentations of these papers, or find the PDFs online at http://www.exag.org)

AI is deeply connected to gameplay, perhaps more so than graphics, audio, or other in-game assets. Yet we’ve seen few games that put interaction with AI systems at the core of the game. Existing game AI developed in support of already-popular genres like first-person shooters and real-time strategy games, which led to refined systems for reactive gameplay situations. Classical AI, however, is best at using expressive formalisms for tasks like complex problem solving and question answering. In his paper “Game Design for Classical AI”, Ian Horswill designs new game mechanics around high-end classical AI. What problems does an AI-heavy game need to address? What game design supports this kind of AI? Read on for a preview. Continue reading

An EXAG Science

(This is a series of short ‘previews’ of papers to be presented at the upcoming Experimental AI for Games workshop at AIIDE 2014. Tune in live on Twitch on October 8th to catch the presentations of these papers, or find the PDFs online at http://www.exag.org)

‘The Ideas Person’ has a bad reputation in the games industry – someone who offers up game concepts but doesn’t want to pull their weight. But everyone needs ideas from time to time, and when we’re stuck for inspiration, maybe it wouldn’t hurt to have a source of ideas on hand? In their paper ‘Towards the Automatic Generation of Fictional Ideas for Games’, Maria Teresa Llano Rodriguez, Simon Colton, Rose Hepworth, Michael Cook and Christian Guckelsberger describe their ‘What-If Machine’ (WHIM) project and how it might be applied to invent ideas for games. Here’s a preview. Continue reading

Quick Guide: How To Set Up A Stream, Pt 1

Hey! This post is quite popular on The Googles but it’s a little outdated now. Apologies if some of this doesn’t work for you – most of it should do though! I’ll try and update it with a new version soon.

Earlier this year I streamed the talks from the International Conference on Computational Creativity live on Twitch. We had almost 100 unique viewers in total over the course of the conference, extending its reach to people who couldn’t afford to attend, who were from universities without travel budgets, or who were just curious about what a conference talk on computational creativity might look like. It was a huge success, and more events like this should stream their talks (where appropriate – many events avoid video recording for important reasons like the privacy, comfort or freedom of their speakers). I was asked several times for a guide on how to set up a stream like this – and I’ve finally written it.
Continue reading

Look At Things, Help Science!

While I was over at ICCC 2014 I met Dan Ventura, who heads up a research group over at BYU in Utah. Dan always presents interesting work at ICCC, and has wonderful students doing great work. Right now they’re running a survey to evaluate DARCI, a piece of software that can create, modify and evaluate images using a nice bit of visual intelligence that lets it understand the kind of image it’s looking at. They need your help! The survey only takes 10 minutes and it would really help them out.

Take The Survey Here