My name is Christian Burberry and here is a selection of what I've been up to during my years as a programmer (or programmer fledgling if more appropriate).
Having finished my undergraduate degree in Computer Science at the University of Warwick, I felt I needed a bit more hands-on experience in game development.
I recently finished a postgraduate course in games programming, GamerCamp: Pro, at Birmingham City University. As part of that I got to experience some of the joys and woes that
come with a game development environment, and I find myself wanting more.
I want to be in games development because I want to challenge myself and work with a team of aspiring people to create something we can be proud of.
As game developers, we can tell stories, we can bring people together, we can make hair look silky smooth unlike the shampoo adverts on TV.. we can do all sorts of things.
But most importantly, it's because I'm finding the fun here.
The game has 5 pre-loaded songs packaged in the FMOD master bank.
These songs can be selected by pressing on their thumbnail in the song select screen.
This prompts you with a beatmap selection box which dynamically loads any saved beatmap files.
After the selection and countdown, buttons play an animation based on the saved trigger value for that button.
The time at which the user hits the button as it's flashing determines the type of hit (great/cool/miss). Hits are counted for end of song display.
The animation values come from a beatmap struct that holds a 2D float array of hit timings.
Every frame, this array is iterated through to determine each button's current state and its resulting visual state via accompanying variables.
Each button is bound to a function that matches the pressed button and checks whether the button is within an active period or has already been hit, determining whether any
visual change occurs or the input is ignored.
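The hit-window judgment described above could be sketched roughly like this in plain C++. The window sizes and names here are illustrative assumptions, not the values used in the actual game:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical hit judgment: classify a press by how far (in seconds) it
// lands from the beatmap's stored trigger time. The 50 ms / 150 ms windows
// are placeholders, not the shipped values.
enum class EHitResult { Great, Cool, Miss };

EHitResult JudgeHit(float pressTime, float triggerTime)
{
    const float offset = std::fabs(pressTime - triggerTime);
    if (offset <= 0.05f) return EHitResult::Great; // within 50 ms
    if (offset <= 0.15f) return EHitResult::Cool;  // within 150 ms
    return EHitResult::Miss;                       // outside the window
}
```

Each judged hit would then be tallied per result type for the end-of-song display.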
The user can create beatmaps by pressing 'input mode' (red when active) and selecting the
song to make a beatmap for. The user gets a button grid identical to play mode and, as the music plays, can enter their own periods for the animations.
When hit, a button turns yellow and blocks any further input until the period length is over.
Many systems are recycled from the play mode for this functionality. At the end of the input, the user can choose to redo/save/play their new map.
Save functionality is done using UE4's standard save slots.
The game borrows heavily from the Atari classic 'Asteroids', with a few small differences.
For a gameplay description, please check out the itch.io page.
This was my first attempt with Unity and C#, and I had a little over two weeks to create it.
The competition organisers didn't provide any indication of when it would be or what we would use to create the game, so when it came in the middle of working on
compiler design coursework I left it for a while and tried to finish what I was already working on at the time.
The task was to use the existing basic project (which consisted of a magnet controllable with WASD that attracted or repelled spawning cubes).
While I was very tempted to extend towards an original design, I felt that my inexperience with Unity would be a huge hurdle, so I decided it would
be better to go with a non-original design and add some original bits afterwards; I would forgo the originality points to get as much as I could on working gameplay elements.
As luck would have it, Unity was a hell of a lot easier to get working with than Unreal. I spent about 5-6 days conceptualising the design around Atari's 'Asteroids' and
watching/following along with the Unity training day tutorials (space shooter/tanks/survival shooter/adventure game etc.).
For the remaining week I worked from about 9am till midnight (leaving out the last hour on the last day because I vowed not to touch it in the hour before submission).
While I got the majority of what I wanted to get in, there were a fair few setbacks:
On the final day I wanted to get in some pickups that are attracted to and picked up by the player(s), e.g. repair kits, but the day was spent bugfixing. Special thanks
to my housemates who offered to bugtest for me; without them I don't think I would have even gotten the 8th place I got.
Initially I had written a pause feature into the game, but as the project neared its close I came to the conclusion that, with the many systems that used coroutines,
I would be unable to reliably pause the game without bugs (chalking this up to my lack of knowledge about Unity's coroutines), so I commented out the feature.
I wanted to get in mouseless menu navigation (keyboard only), but the way I had structured my UI was less than optimal for the default UI navigator.
I wanted to get in Xbox 360 controller support, but time was just not on my side for this one. I considered it a low-priority item and hoped to get it done on the last day :'(.
Notes:
I've only provided the Windows executable on itch.io, but the game can be built for OSX & Linux using the project files in the GitHub repository.
'Echoes': Vertical Slice Project for PlayStation 4
Video Gallery
*Final vertical slice full gameplay video*
*Recording made of camera demo prototyping I had been preparing for pre-production sprint 2*
GitHub Repository (C++ Code Excerpts)
GitHub Repository (Discord Bot) - Batch / Python
Description:
This project was undertaken as the latter part of the GamerCamp: Pro MSc game development course offered at BCU. I worked within a team of 27 other students from various disciplines to deliver this project
from conception to vertical slice. The project was completed on 21st August 2018.
Based on our provided Minimum Viable Product, the game is a 3D Metroidvania with an overhead perspective. The concept follows a woman exploring a lost civilisation with a mysterious relic/weapon she finds in the ruins.
The main gameplay is battling enemies with a combat system that is enhanced through skill upgrades found throughout the environment.
On this project, I worked as the liaison for the code team and as a gameplay programmer. As code liaison, I handled most of the PS4-related deployment issues and task assignment for the team. As a gameplay programmer, I worked on various
facets of the project including camera systems, enemy AI, UI, and the boss battle. I also enforced coding practices with the other team members, such as buddy checking and pre-commit PS4 build tests.
My contributions to the project:
Boss Battle: As we got further into development, the team felt we needed an additional feature to vary the gameplay for the remainder of the demo's duration. I had been prototyping a boss battle as
part of an ongoing feature experiment. Due to a lack of personnel on the design team at the time (quite a few were on placement), I decided to adapt a boss fight that recurs in the Legend of Zelda games,
specifically the Ocarina of Time Ganondorf battle. The idea is that the boss is invulnerable until the player succeeds in a 'tennis' bout and stuns it, allowing for additional melee hits. This feature took six weeks to develop.
The boss uses the player's distance to gauge what its next action should be. If the player is hit by the projectile, they are stunned and the boss attempts a devastating attack in response.
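The distance-gated action choice could be sketched roughly as follows; the thresholds and action names are hypothetical placeholders (plain C++ standing in for the UE4 behaviour code):

```cpp
#include <cassert>

// Illustrative sketch of choosing the boss's next action from the player's
// distance. Thresholds (in Unreal units) are made up for the example.
enum class EBossAction { MeleeSwipe, FireProjectile, CloseDistance };

EBossAction ChooseBossAction(float distanceToPlayer)
{
    if (distanceToPlayer < 300.0f)  return EBossAction::MeleeSwipe;      // in melee range
    if (distanceToPlayer < 1500.0f) return EBossAction::FireProjectile;  // starts a 'tennis' bout
    return EBossAction::CloseDistance;                                   // player too far away
}
```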
Enemy AI Rework: Our pre-production AI were extremely basic and were viewed as not fun to fight. I decided to take over the enemy AI and add additional behaviours to improve the overall experience.
One of these was a grapple attack in which the crystal spiders try to latch onto the player's back if the player is not facing them. This feature stopped working correctly while I was working
on the boss battle and I did not have time to bugfix it; however, the same grappling behaviour was reused for the player stun in the boss battle (when the player is hit by the projectile).
Discord Bot for UE4-PS4 Remote Builds: As we used a discord server as our team's main form of communication, I felt it would be appropriate/fun to try and automate the building process for PS4 using it.
We have a dedicated build machine set up for packaging PS4 builds and deploying them to the development kits. The bot provides a manual call for a build request and remote PS4 package deployment.
The bot uses batch scripts I wrote to get the latest from Perforce, uses UE4's RunUAT.bat to package a PS4 build, and outputs formatted messages into the server's automated-ps4-builds text channel, including errors from RunUAT if it failed.
What it looks like
PS4 SDK & UE4 Setup for all team members: Because of an issue with our university machine setups, we could not use the Epic Launcher UE4 builds; instead we used the source version of 4.18 from Epic.
However, it wouldn't be appropriate for artists and designers to have to install Visual Studio and all the dependencies needed to compile the source build, so I made and distributed a binary version built from it.
Some programmers also use it to prevent accidental rebuilds of the engine source code. I documented on our wiki all the steps needed to correctly install and configure the PS4 SDK and UE4, as well as conducting a workshop
with the artists to get them all set up.
Camera Trigger Boxes: As shown in the preview video above, I made two trigger types that manipulate the character's attached camera directly: one time-based, the other distance-based.
The derived trigger box classes take curves that determine the camera transformation over distance or time, and designers can replace the curve assets on different instances to achieve different effects. This was implemented as a
stopgap of sorts for a more sophisticated camera system that we explored but never adopted due to time constraints.
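A minimal sketch of the curve-driven blend, assuming the camera property being driven is the boom length; in UE4 the curve would be a designer-swappable UCurveFloat asset, so a plain smoothstep function stands in for the curve lookup here:

```cpp
#include <cassert>

// Placeholder curve standing in for a designer-authored curve asset
// (e.g. UCurveFloat::GetFloatValue in UE4). Smoothstep ease-in/ease-out.
float EvalCurve(float alpha)
{
    return alpha * alpha * (3.0f - 2.0f * alpha);
}

// Blend the camera boom length between two values as the player moves
// through the trigger volume; alpha 0..1 maps entry to exit (by distance
// or elapsed time, depending on the trigger type).
float BlendBoomLength(float fromLen, float toLen, float alpha)
{
    const float clamped = alpha < 0.0f ? 0.0f : (alpha > 1.0f ? 1.0f : alpha);
    const float t = EvalCurve(clamped);
    return fromLen + (toLen - fromLen) * t;
}
```

Swapping the curve per instance changes the feel of the transition without touching code, which is what made the approach workable as a stopgap.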
A designer using it in a level mockup.
UI Prototyping: The player HUD consists of a set of simple progress bars that use custom assets, plus some additional images to display the element the player currently has active. The enemy HP bars are simple
progress bars with no custom assets, displayed in world space as a component above each enemy and rotated to always face the currently active camera. I also developed a shoulder-button-controlled pause menu UI that swaps
between tabs (e.g. Options, Skills, Codex).
Panda Palate Android Game
Preview video
*This video is a recording made of the windows v0.6 build that was made at the end of the module.*
This project was undertaken as part of Modules 1 & 2 of the GamerCamp postgraduate course at Birmingham City University. We prototyped the functionality in the first month and moved onto developing the feature
& content complete version of the game over the next 2 months leading up to winter. The team was made up of 4 artists, 3 designers & 2 programmers. The game is a linear single-screen platformer.
Cocos2d-x doesn't provide a user interface, so most aspects of the game had to be coded, such as in-game menus, asset loading, interfacing with an open-source level editor (OGMO), and physics.
We decided to use Box2D minimally for physics where appropriate, making limited use of dynamic bodies and forces. Control is handled by Cocos2d's touch listeners, which either drive behaviour in the game through
Cocos' Buttons/Widgets or take whole-screen touches that are then checked against their position on screen to determine position-based callbacks.
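The whole-screen dispatch idea can be sketched without any Cocos2d-x types; the left/right split and region names below are illustrative, not the game's actual mapping:

```cpp
#include <cassert>

// Sketch of position-based touch dispatch: a full-screen touch listener
// maps the touch location to a screen region, and the caller fires the
// callback registered for that region. The half-screen split is a
// made-up example of such a mapping.
enum class ETouchRegion { LeftHalf, RightHalf };

ETouchRegion ClassifyTouch(float touchX, float screenWidth)
{
    return (touchX < screenWidth * 0.5f) ? ETouchRegion::LeftHalf
                                         : ETouchRegion::RightHalf;
}
```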
Between the two of us, we split tasks into different areas to avoid having to merge our changes on Perforce too often. I wrote the UI, player movement, level loading through OGMO Editor XML files, and the platforms and their various types,
and made some changes to the OGMO editor as well to make it easier to use. In pre-production, we used a
text-based level loading class
I wrote for quick prototyping so that we didn't have to recompile the project every time to test
code changes (included in the code excerpts repo linked above). This included loading some global variables and spawning lists for entities like platforms or collectables. When we moved to production, we changed level/asset loading to the open-source OGMO editor.
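The core of such a text-based loader is a simple line parser; the "type x y" line format below is an illustrative assumption, not the actual file format the class used:

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Sketch of a text-based level loader: each line names an entity type and a
// spawn position, so levels can be tweaked in a text file without
// recompiling the project. Blank lines and '#' comments are skipped.
struct SpawnEntry { std::string type; float x; float y; };

std::vector<SpawnEntry> ParseLevel(std::istream& in)
{
    std::vector<SpawnEntry> entries;
    std::string line;
    while (std::getline(in, line))
    {
        if (line.empty() || line[0] == '#') continue; // skip blanks/comments
        std::istringstream ls(line);
        SpawnEntry e;
        if (ls >> e.type >> e.x >> e.y) entries.push_back(e); // "type x y"
    }
    return entries;
}
```

The game code would then iterate the parsed entries and spawn the matching entity class at each position.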
Notes:
The final application was released under the GamerCamp developer name, as that is the name of the postgraduate course.
3D Maze (2017)
Demonstration Video
Screenshots
GitHub Repository
Description:
This project involved using basic OpenGL 1 and C++ to create one of several tasks. I chose the 3D maze because I believed it would be sufficiently challenging
and possibly extensible into another project at some point, with its ability to have a top-down map and a first-person perspective.
I haven't made any moves in that direction thus far, though.
It would probably be best to learn modern OpenGL for that if I do end up taking that route.
The program has hard-coded walls and prevents the user's movement when it would take them into a wall.
The solution line is also hard-coded, but uses a vector as a stack-like structure: if the camera moves towards the line's vertices it pops,
and otherwise it pushes the previous camera position onto the stack.
The turning 'animation' was done by utilising the idle functionality: I set a flag to cue a custom idle function that starts a turn, and the flag is turned off
once two 45-degree turns have been made. Camera turns are performed by multiplying the camera by a rotation matrix. This project was completed in four days.
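The stack-like solution line can be sketched as follows; the exact vertex comparison is an assumption (the movement here is treated as grid-aligned, so exact equality suffices):

```cpp
#include <cassert>
#include <vector>

struct Vec2 { float x, y; };

// Sketch of the solution line as a stack: the vector's back is the next
// vertex of the drawn path. Moving onto that vertex pops it (the line
// retracts towards the exit); moving anywhere else pushes the previous
// camera position, extending the line back to the player.
void UpdateSolutionLine(std::vector<Vec2>& line, Vec2 prevCam, Vec2 newCam)
{
    if (line.empty()) return; // maze solved, nothing left to draw
    const Vec2& next = line.back();
    const bool reachedNext = (newCam.x == next.x) && (newCam.y == next.y);
    if (reachedNext) line.pop_back();        // retract the line
    else             line.push_back(prevCam); // extend it back to us
}
```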
Notes:
The 'You're Winner' text is a personal touch.
Enron Corpus Analytics (2016)
User Acceptance Test Video
*This video is the original in-project video sent to Deutsche Bank representatives to assess whether the requirement criteria were met.
As such, some parts differ from the final product. Apologies for the video quality.*
GitHub Repository
To run the prebuilt jar file, you need to 'cd' into the same directory as the .jar (dist folder) and run "java -jar Alpha_Project.jar".
Description:
This coursework required students to be sorted into random groups of five and prepare a solution to a common task.
We had to write the specification and requirements documents, create the application, and present a final report on the documents and the methodology used to create the application.
In addition, we were required to do a Dragon's Den styled presentation to some judges for our software, outlining the key features of the application.
I used PowerPoint to create the initial design (at the time I did not know the pleasure of Paint.NET, and as a kid I often played around with the image tools in PowerPoint).
Here was the result: GDrive (ppt). We were required to create a UI tool for navigating the data
provided (the Enron Email Corpus) and highlight communications between individuals in a selectable period. We had not been taught any UI tools previously
(with the exception of the Web Development module in Year 1) and had to start from scratch. Thanks to the experience of having to work with JavaFX without any real teaching or extensive textbooks,
I can appreciate the fact that UI programming is just not easy (and it doesn't help when everyone around you seems to think it is).
Notes:
Everything except Backend.java in the 'src' directory is my own work. For amusement or otherwise a lesson to myself, I have left it as it was when submitted.
There may be some display bugs when run on Windows 10 (although this was back when Win10 was first released).