General Video Game Playing (GVGP) has become a popular line of research in recent years, leading to a wide range of general algorithms created to tackle this challenge. This paper proposes taking advantage of this research to assist game design and testing processes. It introduces a methodology based on a team of Artificial General Intelligence agents with differentiated goals (winning, exploring, collecting items, killing NPCs, etc.) and skill levels. Running several agents with distinct behaviours on the same game simultaneously can provide substantial information to inform design and bug fixing. Two methods are proposed to aid game design: 1) the evaluation of a game based on the expected performance of each agent's behaviour, and 2) the provision of visual information to analyse how the agents' experience evolves during the play-through. Making this methodology available to designers can help them decide whether the game or level under analysis fits the initial expectations. Including a Logging System also makes it possible to detect anomalies while development is still at an early stage. We believe this approach provides the flexibility and portability needed to apply it easily to games with different characteristics.
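As an illustration only, the sketch below (not taken from the paper) shows how the core idea could be expressed in Python: a team of agents, each pursuing one differentiated goal, plays the same level, and the observed performance is compared against the designer's expectations. All names here (TeamAgent, evaluate_level, the metric fields) are hypothetical and stand in for whatever agents and metrics a concrete framework provides.

from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# Hypothetical agent descriptor: each agent pursues a single goal
# (winning, exploring, collecting items, killing NPCs, ...) at a given skill level.
@dataclass
class TeamAgent:
    name: str
    goal: str                        # e.g. "win", "explore", "collect", "kill"
    play: Callable[[object], Dict]   # plays one level, returns observed statistics

def evaluate_level(level: object,
                   team: List[TeamAgent],
                   expectations: Dict[str, Tuple[str, float, float]]) -> Dict[str, Dict]:
    """Compare each agent's observed performance with the designer's expectations.

    `expectations` maps an agent name to (metric, expected value, tolerance).
    Returns a per-agent report flagging deviations worth inspecting.
    """
    report = {}
    for agent in team:
        stats = agent.play(level)                      # run a full play-through
        metric, expected, tolerance = expectations[agent.name]
        observed = stats.get(metric, 0.0)
        report[agent.name] = {
            "goal": agent.goal,
            "metric": metric,
            "expected": expected,
            "observed": observed,
            "flag": abs(observed - expected) > tolerance,   # unexpected behaviour
        }
    return report

In this sketch each play function could wrap any general game-playing algorithm paired with a goal-specific heuristic; the flagged deviations are the kind of signal the paper proposes surfacing to designers, alongside visual and logged information about each play-through.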
Cite this work
@inproceedings{guerrero2018using,
  author    = {Guerrero-Romero, Cristina and Lucas, Simon M and Perez-Liebana, Diego},
  title     = {{Using a Team of General AI Algorithms to Assist Game Design and Testing}},
  booktitle = {{Proc. of the IEEE Conference on Computational Intelligence and Games (CIG)}},
  year      = {2018},
  month     = {Aug},
  pages     = {1--8},
}