Artificial Intelligence in Gaming: Part 2 – Perception
Part 2 of Three Speech’s five-part discussion on Intelligent Design and AI is live, and this time it covers perception in the world of gaming: how NPCs are able to sense the player through sight, sound, smell, and so on. Included in today’s round-table are Jean-Christophe Capdevila from Alone in the Dark, Dominic Guay from Far Cry 2, Remco Straatman from Killzone 2, and Alex Champandard of http://aigamedev.com. What does it take to produce an interactive NPC capable of sensing its surroundings? Read on to find out.
Remco Straatman starts off the discussion with the factors involved in recreating perception, such as the viewpoint of an object and its relative size and distance. One method used by Straatman and his team is “threat prediction,” where NPCs use their knowledge of the environment to search for their target. Dominic Guay expands on this with stealth tactics, describing how enemies can detect movement in the foliage and adapt to changing situations. Both Alex Champandard and Jean-Christophe Capdevila discuss how NPCs behave differently when the player is not around and how memory plays an important role in creating more convincing perception.
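To make the distance-and-viewpoint idea concrete, here is a minimal sketch of the kind of sight check many games build perception on: the NPC “sees” a target only if it is within sensing range and inside a forward view cone. This is an illustrative toy, not the actual system from any of the games mentioned; the function name, the 120° field of view, and the 30-unit range are all assumptions chosen for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    y: float

def can_see(npc_pos: Vec2, npc_facing: Vec2, target_pos: Vec2,
            fov_deg: float = 120.0, max_range: float = 30.0) -> bool:
    """Return True if target is within max_range and inside the NPC's view cone.

    Hypothetical example values: a 120-degree field of view and a
    30-unit sight range.
    """
    dx = target_pos.x - npc_pos.x
    dy = target_pos.y - npc_pos.y
    dist = math.hypot(dx, dy)
    if dist > max_range:
        return False          # too far away to notice
    if dist == 0:
        return True           # target is on top of the NPC
    # Angle between the NPC's facing direction and the direction to the target,
    # wrapped into [-pi, pi] so "behind" works correctly.
    facing_angle = math.atan2(npc_facing.y, npc_facing.x)
    target_angle = math.atan2(dy, dx)
    diff = abs((target_angle - facing_angle + math.pi) % (2 * math.pi) - math.pi)
    return diff <= math.radians(fov_deg) / 2
```

A real implementation would layer more on top of this, in the spirit of the round-table: a line-of-sight ray cast so walls and foliage block vision, a detection probability that shrinks with distance and target size, and a memory of the last position where the target was seen.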
Another great aspect of AI has been checked off the list. The first thing that comes to mind when I think of perception in gaming is the Metal Gear Solid series and how well it pulls it off. To me, top-notch perception means covering all the senses: sight, sound, touch, smell, and taste. Well…maybe not taste. After all, I don’t see that affecting gameplay very often, if at all. Long gone are the days when running in, guns blazing, would actually work. Now players must think strategically about their next move, because their enemies are doing the same thing.