Postmortem


Did I achieve my goal?

When I reviewed my case study on AI, the key theme was selling the illusion that the computer-controlled AI Controllers had intelligence. Games such as The Last of Us have human characters with a purpose in the world: they are fully voice acted; they have custom animations that interact with the environment; and they react to the player with human-like responses. This artefact does not do that. What I have created is a simple AI response to sensing the player, which utilises multiple AI Controllers to conduct a very deliberate flanking attack. Everything is sudden and robotic, but this is the fault of the individual AI behaviours rather than the focus of this artefact, the Combat Director. As a prototype, this game demonstrates grading AI Controllers and assigning them different roles, which it does, thus achieving my goal.

Key takeaways

Overall, the production pipeline worked well and I would not remove any components. Some areas needed improvement, such as committing to the repository more frequently and using additional animations to sell the illusion; however, these did not hold up production and would be considered secondary efforts. Processes such as using Itch.io to template a developer’s log and the ease of using event graphs within Unreal Engine 5 were a great success, and I am excited to use them in the future.

Itch.io

I knew what I wanted to achieve from the start and mapped out a production timeline, noting that I had other competing commitments. Listing the timeline in a devlog on Itch.io worked well and allowed for adjustments as I progressed; however, it involved duplicating the list in each post. As a single developer on a small project this was not an issue, but a service such as Trello would be better suited to a team environment and would also allow larger milestones to be broken up into smaller tasks. Itch.io has been intuitive and prompted additional content during the production pipeline. I hope to continue using this service in the future.

GitHub

This was my first attempt at using GitHub Desktop, and I found it very easy to use as a single developer. I had no requirement to branch or merge with other users, so I will have to test these concepts before I can compare it to other services such as SourceTree. My commits were few and far between, mostly focused on completed milestones. They were stretched over numerous sessions and affected multiple functions, increasing my risk of data loss. At one point I did lose my local copy and had to clone from the server. I also found that writing commit messages was difficult for large commits, as I had made numerous changes and had to keep a notepad handy to document what I had done. None of these issues would exist had I committed after each change.

Game Engine

I chose Unreal Engine 5 as I wanted to use its Behaviour Trees. I am familiar with the event graph style of programming in UE5 and wanted a change from Unity, which I had been using prior to this artefact.

Level Design

Level design utilised the third person template. Minor changes were made to obstacles to test the AI sensing, but overall nothing significant needed to be changed and the template was sufficient to achieve the outcome.

Animations

The character models had basic locomotion animations by default. This was adequate for chasing the player; however, I wanted additional animations such as attacking. I downloaded premade animations from Adobe Mixamo, but including them in the build was more complicated than anticipated, and I decided it was not necessary to demonstrate the AI. What should have been a punching animation is currently a swinging T-pose, which remains in the current version. Given more time, I would like to replace the character rig and all of the animations.

Controllers

For the player’s Controller I included some debugging tools such as ‘reset’, ‘make a noise’ and ‘damage enemy’. These were simple tasks linked to the number keys. I added an audio cue, a particle emitter and some additional objects to the characters to test out some ideas and polish, but they were not necessary for the AI. Given a longer timeline, a menu widget with additional details should be added. The AI Controller essentially said “start and update the behaviour tree”, but also included functions detailing what to do when sensing the player and how to change states when directed to by the player or the Combat Director.

Behaviour Trees

Developing the Behaviour Tree was enjoyable but time consuming. Structures were built on designs by Ali Elzoheiry (2023). The concepts of selector and sequence nodes were straightforward, and custom tasks allowed any desirable action to be placed in a sequence. Unfortunately, I had to iterate over the tree numerous times to refine the AI’s actions. This process involved major changes and often removed functional code, only for it to be rebuilt when a new idea failed. The end state of the Behaviour Tree had bespoke Selector nodes that were linked to a finite state machine and keyed to the associated Blackboard. Through a combination of decorators and custom tasks, I was able to directly influence the behaviour of the AI, setting the conditions for the Combat Director.
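The selector/sequence control flow described above can be sketched outside the engine. This is a minimal, engine-agnostic C++ sketch under my own naming (`Node`, `Task`, `Selector`, `Sequence`); UE5’s built-in composite nodes implement the same control flow natively within Behaviour Tree assets.

```cpp
#include <functional>
#include <memory>
#include <vector>

// Illustrative sketch only; these are not UE5's Blueprint node classes.
enum class Status { Success, Failure };

struct Node {
    virtual ~Node() = default;
    virtual Status Tick() = 0;
};

// A custom task wraps any desired action as a callable.
struct Task : Node {
    std::function<Status()> action;
    explicit Task(std::function<Status()> a) : action(std::move(a)) {}
    Status Tick() override { return action(); }
};

// Selector: succeeds on the first child that succeeds.
struct Selector : Node {
    std::vector<std::unique_ptr<Node>> children;
    Status Tick() override {
        for (auto& c : children)
            if (c->Tick() == Status::Success) return Status::Success;
        return Status::Failure;
    }
};

// Sequence: fails on the first child that fails.
struct Sequence : Node {
    std::vector<std::unique_ptr<Node>> children;
    Status Tick() override {
        for (auto& c : children)
            if (c->Tick() == Status::Failure) return Status::Failure;
        return Status::Success;
    }
};
```

A decorator in this sketch would simply be a task placed first in a sequence: if its condition check fails, the rest of the sequence never runs.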

Combat Director

The Combat Director built on concepts by Ryan Laley (2020) to include the flanking manoeuvre. The concept was that each enemy would be graded by the Combat Director, the grade determining which AI would do what, with the individual AIs’ Behaviour Trees updated accordingly. In this artefact the AI Controllers are randomly assigned a grade; however, this can be weighted in future builds to take into consideration the distance to the player, enemy health or any other variable. Much like the development of the Behaviour Trees, this was a trial-and-error approach that required play-testing and reworking the code. One notable error was not assigning the random number generated each loop to a local variable. This caused each AI Controller to have one number assigned for grading and another number sent to the widget, making play-testing the grading system very difficult. It is a very good reason to always promote hard-coded values to variables as they are realised.
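The grading bug described above can be shown in a small sketch. Every name here (`Enemy`, `AssignGrades`, the grade labels) is hypothetical and stands in for the project’s Blueprint variables; the point is that rolling the random number once into a local variable keeps the grade and the debug widget in agreement.

```cpp
#include <cstdlib>
#include <string>
#include <vector>

// Illustrative names only; not the artefact's actual Blueprint identifiers.
struct Enemy {
    int grade = -1;          // index into kGrades; drives the Behaviour Tree
    std::string debugLabel;  // what the play-test widget displays
};

const std::vector<std::string> kGrades = {"Flanker", "Attacker", "Observer"};

// The original bug: calling the random generator once for the grade and
// again for the widget label produced two different numbers. Promoting the
// roll to a local variable guarantees both uses see the same value.
void AssignGrades(std::vector<Enemy>& enemies) {
    for (auto& e : enemies) {
        const int roll = std::rand() % static_cast<int>(kGrades.size());
        e.grade = roll;                // used for tasking the AI
        e.debugLabel = kGrades[roll];  // used for the on-screen widget
    }
}
```

Weighting the grade in a future build would replace the uniform `rand()` roll with a score built from distance to the player, enemy health, and so on, while keeping the same single-assignment structure.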

Alternative Management

Ali Elzoheiry (2023) also discusses combat management through the use of tokens. The concept involves the target controller holding a pool of tokens which attacking controllers borrow from, and return to, in order to conduct an attack. As attacking controllers borrow tokens, the target controller’s pool reduces to zero, blocking other attacking controllers from taking a turn. This concept is great and I would have enjoyed implementing it; however, at the time I believed grading the AI with the Combat Director would work better for tactical movement of characters.
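The token concept above reduces to a simple counter on the target. This is a sketch under my own assumed names (`TokenPool`, `Borrow`, `Return`); the source tutorial implements the same idea with Blueprint variables on the target’s controller.

```cpp
// Illustrative sketch of an attack-token pool held by the target.
class TokenPool {
public:
    explicit TokenPool(int tokens) : available(tokens) {}

    // An attacking controller must borrow a token before attacking.
    bool Borrow() {
        if (available == 0) return false;  // pool empty: attacker must wait
        --available;
        return true;
    }

    // Returned when the attack finishes, unblocking the next attacker.
    void Return() { ++available; }

    int Available() const { return available; }

private:
    int available;
};
```

The pool size directly tunes difficulty: a target holding two tokens can never be attacked by more than two enemies at once, regardless of how many are in the encounter.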

Conclusion

The Combat Director achieved the goal of creating an illusion of intelligence by using a grading system to task AI Controllers, which subsequently attacked the player using different tactics.

Reference List

Elzoheiry, A 2023, Smart Enemy AI | (Part 5: Enemy Types), viewed 1 September 2023,
https://www.youtube.com/watch?v=UDsbwnuD22Q

Elzoheiry, A 2023, Smart Enemy AI | (Part 11: Group Enemy Combat), viewed 1 September 2023,
https://www.youtube.com/watch?v=tGHjB1Bu8b4

Laley, R 2020, Unreal Engine 4 Tutorial - Melee AI Part 6 - Combat Director, viewed 1 September 2023,
https://www.youtube.com/watch?v=qReKN2lRA4o
