University Projects

Volumetric Lighting (Honours Project):

For my honours project in the fourth year of my undergraduate degree, I chose to tackle volumetric lighting via frustum-aligned voxels (froxels, for short). Over the course of this research module I learned a great deal about volumetric rendering and built critical-analysis skills for evaluating research resources such as academic papers and SIGGRAPH/GDC presentations.

My research compared volumetric fog implementations, including Wronski’s original froxel method, NVIDIA’s tessellated-mesh method and Kovalovs’ improvements to the froxel method for The Last of Us Part II, then assessed the impact each implementation difference had on the computational latency, memory footprint and visual results of the final application.

To create the application I researched past and current volumetric lighting solutions and implemented the froxel-based method in OpenGL. In a nutshell, this involves evaluating volumetric lighting properties such as scattering, absorption and extinction at points within a voxelised frustum, then storing the results in a 3D texture ready for composition onto the final scene. Along the way I created a rudimentary graphics framework so I could quickly and easily make changes to the final scene showcasing this rendering technique.
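To illustrate the idea, here is a minimal CPU-side sketch of the per-slice integration a froxel method performs (the real work happens in compute shaders writing into a 3D texture; all names and the homogeneous-medium assumption here are hypothetical, not the project's actual code). Marching front to back, each slice adds in-scattered light attenuated by the accumulated transmittance, and extinction reduces that transmittance via Beer-Lambert:

```cpp
#include <cmath>
#include <vector>

// One froxel slice along a single camera ray: how strongly the medium
// scatters light toward the camera, and how strongly it extinguishes light.
struct Froxel { float scattering; float extinction; };

// Per-slice accumulated result, as would be stored in the 3D texture.
struct Result { float inscattered; float transmittance; };

std::vector<Result> integrate(const std::vector<Froxel>& slices,
                              float sliceDepth, float lightIntensity) {
    std::vector<Result> out;
    float transmittance = 1.0f; // fraction of light surviving to the camera
    float inscattered = 0.0f;   // light scattered toward the camera so far
    for (const Froxel& f : slices) {
        // Light scattered in this slice, dimmed by everything in front of it.
        inscattered += transmittance * f.scattering * lightIntensity * sliceDepth;
        // Beer-Lambert extinction across the slice's depth.
        transmittance *= std::exp(-f.extinction * sliceDepth);
        out.push_back({inscattered, transmittance});
    }
    return out;
}
```

Compositing then only needs a per-pixel lookup: sample the slice at the scene depth and combine `inscattered` with the scene colour scaled by `transmittance`.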

Augmented Reality Game:

This fourth-year module had each of us on the course produce a mobile game for Android devices with augmented reality as the core gameplay. I created an escape room-style puzzle game in which the player walks around an environment, moving and rotating the mobile device and tapping on objects in the world to solve each puzzle. Puzzles include aligning a radio with a certain direction to learn some coordinates, pouring chemicals into a flask from test tubes, and melting a padlock with the final potion.

The game was built with Unreal Engine 4, and as this project was my first experience with the engine I had to learn its systems and workflows, much as I did with Unity for the professional project below. I learned a lot about Unreal’s material system (adjusting material parameters at runtime with C++), its audio system, and manipulating objects in the world using information from the mobile device, such as touch input and the device’s position and rotation via ARKit.

Though this project involved a lot of software design and gameplay programming, a significant amount of work went into designing the puzzles to make sure they all made proper use of AR, such as having the player physically pick up and pour chemicals out of the test tubes to mix a potion, or read a map in the world to input a town name into an encryption machine that uses a ROT-N algorithm.
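A ROT-N cipher itself is only a few lines; the sketch below (a standalone illustration, not the game's actual code) rotates each letter N places through the alphabet, leaving everything else untouched:

```cpp
#include <string>

// Rotate each letter of `text` forward by n places (n in 0..25),
// preserving case and leaving non-letters unchanged.
std::string rotN(const std::string& text, int n) {
    std::string out = text;
    for (char& c : out) {
        if (c >= 'a' && c <= 'z') c = 'a' + (c - 'a' + n) % 26;
        else if (c >= 'A' && c <= 'Z') c = 'A' + (c - 'A' + n) % 26;
    }
    return out;
}
```

Decryption is just another rotation by 26 − N, which keeps the in-game machine symmetric: the same mechanism encodes and decodes.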

Edinburgh Airport Training Simulator (Prof. Project):

For the professional project module in third year, I worked in a team of eight for our client, Edinburgh Airport, creating a training simulator for their security personnel working with the luggage scanners in the airport. As they make the transition from 2D to 3D scanning technology, this project was about creating a gamified simulation of the 3D scanning technology so that security personnel could play our game while building skills relevant to their jobs at the same time. As the project progressed, it caught the attention of the Civil Aviation Authority (CAA) who showed interest in our prototype and the project as a whole.

My role on the team was to program a “luggage editor”, which would allow players to create their own suitcase with prohibited items hidden within it (or not!), so they could send it to their colleagues and challenge them to assess the suitcase quickly. This was my first big project working with the Unity game engine, so I adapted to its systems and workflows while working with designers and artists on the team to produce the final editor in the prototype.

2.5D Game with Procedural Levels:

In the second year of my undergraduate degree we created a 2.5D game using the university’s “GEF” framework; I made a side-scroller inspired by “Super House of Dead Ninjas” (SHoDN), with procedurally generated levels.

Since I had no awareness of common procedural techniques like noise, agent-based generation or the drunkard’s walk at the time, I reverse-engineered SHoDN’s generation method to derive rules that would reliably generate playable levels. Beyond generation, the game used simple physics-based gameplay built on Box2D: the player had to hit an enemy in each room while avoiding being hit back, and could not progress to the next room/level until the enemy was defeated.
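For contrast with the reverse-engineered rules, the drunkard’s walk mentioned above — one of the standard techniques I later learned about, not what the game used — can be sketched in a few lines: a walker carves floor tiles out of a solid grid by stepping randomly until enough of the map is open (function and tile-value names are hypothetical):

```cpp
#include <random>
#include <vector>

// Carve `floorTarget` floor tiles (0) out of a w x h wall grid (1)
// by random-walking from the centre; the seed makes runs reproducible.
std::vector<std::vector<int>> drunkardsWalk(int w, int h,
                                            int floorTarget, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_int_distribution<int> dir(0, 3);
    std::vector<std::vector<int>> grid(h, std::vector<int>(w, 1));
    int x = w / 2, y = h / 2, carved = 0;
    while (carved < floorTarget) {
        if (grid[y][x] == 1) { grid[y][x] = 0; ++carved; }
        switch (dir(rng)) { // step in a random cardinal direction, clamped
            case 0: if (x > 0) --x; break;
            case 1: if (x < w - 1) ++x; break;
            case 2: if (y > 0) --y; break;
            case 3: if (y < h - 1) ++y; break;
        }
    }
    return grid;
}
```

Because every carved tile is reached by walking from an existing one, the resulting floor is always fully connected — the property that makes the technique attractive for guaranteed-playable levels.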

Animation API:

In this fourth-year module my coursework involved creating an API for loading and playing back 2D and 3D animation, including advanced 3D skeletal animation techniques such as animation blending, different transition types and inverse kinematics via cyclic coordinate descent (CCD).
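CCD itself is a pleasingly simple solver. The 2D sketch below (an illustration with hypothetical names, not the coursework's 3D implementation) shows the core loop: visit joints from the one nearest the end effector back to the root, rotating each so the effector swings toward the target:

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Position of joint `upTo` (or the end effector when upTo == bones.size())
// for a planar chain: `bones` are segment lengths, `angles` are joint
// rotations in radians relative to the parent segment.
Vec2 forward(const std::vector<float>& bones,
             const std::vector<float>& angles, size_t upTo) {
    Vec2 p{0.0f, 0.0f};
    float a = 0.0f;
    for (size_t i = 0; i < upTo; ++i) {
        a += angles[i];
        p.x += bones[i] * std::cos(a);
        p.y += bones[i] * std::sin(a);
    }
    return p;
}

// Cyclic coordinate descent: adjust one joint at a time, end to root,
// so each rotation points the effector at the target.
void ccdSolve(const std::vector<float>& bones, std::vector<float>& angles,
              Vec2 target, int iterations) {
    for (int it = 0; it < iterations; ++it) {
        for (size_t j = bones.size(); j-- > 0;) {
            Vec2 joint = forward(bones, angles, j);
            Vec2 eff = forward(bones, angles, bones.size());
            float toEff = std::atan2(eff.y - joint.y, eff.x - joint.x);
            float toTgt = std::atan2(target.y - joint.y, target.x - joint.x);
            angles[j] += toTgt - toEff;
        }
    }
}
```

The 3D version replaces the angle difference with an axis-angle rotation between the joint-to-effector and joint-to-target vectors, but the visit order and the per-joint “point at the target” step are the same.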

The part of this coursework I am most proud of is the blend tree system for 3D animation, which can synchronise an arbitrary number of animation clip nodes being blended between. A designated “relative node” is always running; when producing the final skeleton pose, each animation clip node is visited and sampled with its duration taken into account, keeping every clip in phase with the relative node.
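The synchronisation idea can be sketched in a few lines (names are hypothetical, not the coursework API): the relative node's elapsed time is converted to a normalised phase, and every clip in the blend is sampled at that same phase, so a 0.8 s walk and a 0.5 s run stay footstep-aligned however they are weighted:

```cpp
#include <cmath>
#include <vector>

// One clip participating in the blend: its length and its blend weight.
struct ClipNode { float duration; float weight; };

// Convert the relative node's elapsed time into a shared looping phase,
// then return each clip's local sample time at that phase.
std::vector<float> syncSampleTimes(const std::vector<ClipNode>& clips,
                                   size_t relativeIndex, float elapsed) {
    float phase = std::fmod(elapsed / clips[relativeIndex].duration, 1.0f);
    std::vector<float> times;
    for (const ClipNode& c : clips)
        times.push_back(phase * c.duration); // same phase, clip-local time
    return times;
}
```

Driving every clip from one always-running node is what lets the tree blend an arbitrary number of clips: adding a node only adds one more phase-to-time mapping, not a new timing relationship per pair.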

High-frequency volumetric shadows cast by a planet, rendered with a froxel-based method of volumetric lighting.

The luggage editor being used to fill some luggage with legal and prohibited items.

An Xbot animated with 3D skeletal animation, with a number of parameters for animation blending, animation transitions and inverse kinematics on the left.

Video demo of the full game, showing off how the puzzles work and physically moving around the environment to interact with and solve those puzzles.

A screenshot of the gameplay, with the player attacking an enemy (blue cube) in the game to progress to the next room.

A view of the level generated in a console application (left), with a zoomed-out orthographic view of the same level generated in-game (right).

Low-frequency volumetric shadows cast by a planet, rendered with a froxel-based method of volumetric lighting.