The Little Guy

Sentinels: towering machines that vaporize anything within their sights. You: a puny robot that makes cute little squeaky noises. Experience the stealth genre gone sci-fi through VR as you try to escape from these towering monsters. You don't stand a chance against the sentinels, but you can slow them down. Melt special panels and absorb the remains to store up energy. Once you're ready, pull the trigger to release a devastating energy blast that will slow your target down.
The Little Guy went through one month of development and was made for the Oculus DK2 in 2016. I worked as the team's programmer. Level design was done by Nicholas Adams, 3D models by Roman Moskvichev, and concept art and art direction by Linden Li; all of us took part in the design of the game.
The game was covered by the Toronto Star; the article ran in print, but an archive can be found here.


C#

The main programming language I used to code the game.


JavaScript

I modified JavaScript components bought through the Asset Store and converted a lot of them to C#.


Unity

The engine used.

Oculus Rift DK2

The Rift was used with Unity's Oculus integration.


Git

Used with Bitbucket for version control.

Visual Studio

The IDE used with Unity.


Used to create and modify a lot of textures.


Used for pseudocode and problem solving.


Used to modify and mash together sound effects.

Google Sheets

Used to manage tasks for myself and the other team members.

Microsoft Project

Used to manage progress and stages of the project.

Xbox Controller

Integrated as the primary form of input.


AI Movement


One of the initial challenges I faced was getting the sentinels to move around without clipping into things. I could have just made them move between two set points placed by the level designer, but that would have been a pain to use. The game relied on good level design, and I wasn't willing to let the main feature hinder that. Instead I aimed to make the sentinels as automatic as possible, to the point where the level designer could place them and they'd just work.
Detection

I initially thought of simply sending a single ray forward to detect walls. It worked well enough that I used it to detect other sentinels as well. The main problem was that these guys were giant and would clip before the ray even got a chance to come into detection range. Optimally I would send a huge block-like ray outwards to account for this size, but that's just not efficiently possible (or possible at all). Sphere casting and triggers were possible solutions but were limited, as they're mainly used to detect within broad ranges.
Instead I played around with ray positions and even considered moving the rays around, kind of like a real robot. What I settled on were three rays, all placed near the bottom: the middle ray stayed, and two more on the sides ensured nothing would clip with the sides of the sentinel. This worked incredibly well; I haven't seen them clip once since. I also created an invisible wall object that could be used in case a tiny object got past these rays, but it was never actually needed.
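The three-ray setup described above could be sketched as a Unity component along these lines. This is a minimal illustration only; the class name, field names, and values (SentinelDetection, rayLength, sideOffset, the 90-degree turn) are my own assumptions, not the project's actual code.

```csharp
using UnityEngine;

public class SentinelDetection : MonoBehaviour
{
    public float rayLength = 10f;   // assumed detection distance
    public float sideOffset = 2f;   // assumed half-width of the sentinel's base

    bool ObstacleAhead()
    {
        // All three rays originate near the bottom of the sentinel and point
        // forward: one in the middle, one on each side to guard the edges.
        Vector3 origin = transform.position;
        Vector3 side = transform.right * sideOffset;

        return Physics.Raycast(origin, transform.forward, rayLength)
            || Physics.Raycast(origin - side, transform.forward, rayLength)
            || Physics.Raycast(origin + side, transform.forward, rayLength);
    }

    void Update()
    {
        if (ObstacleAhead())
        {
            // Turn away before clipping; the real game would pick a new
            // heading here rather than a fixed 90-degree turn.
            transform.Rotate(0f, 90f, 0f);
        }
    }
}
```

The side rays are what let a wide body get away with cheap raycasts: anything big enough to block the sentinel will cross at least one of the three lines before the body reaches it.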


Coding Effects


One of the most interesting features to incorporate was absorption. The original idea was to have the mesh move and rotate towards the player's face and gradually get eaten, like a trash compactor. I had no experience warping meshes through code, so I wasn't sure if it could be done within the time frame we had. Even if I were to accomplish this, I wasn't sure it would look good in VR; it'd probably just end up looking like a mess of clipping in front of the player's face. For these reasons I tried to think of a different way of depicting consumption using what I had. I first sketched out some ideas and looked through all the particle effects I had. One that really stuck out was a slime effect. This is where I came up with the idea of melting and absorbing as opposed to trash compacting.


To achieve the effect I tweaked the slime effect and spawned it on the upper front of the consumable object. Because we have no floating platforms, I decided to make the object move downwards into the floor to make it seem like it was melting. To enhance the effect I also bumped the emission up to max, matching the slime color. Finally, I had to think of a way to make players understand they're absorbing the object, not just melting it. I played with ideas like having the player glow green once the object melted, but what really worked well was creating a line of slime directly to the player. To do this I spawned particles from the bottom of the object and moved them towards the player in intervals. I had to make sure that these particles wouldn't start spawning until the particles coming from the top of the object hit the floor; otherwise players wouldn't be able to make the connection that the object is melting and being absorbed.
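The melt-then-absorb sequencing above could be sketched as a coroutine, roughly as follows. Everything here is a hypothetical reconstruction: slimePrefab, streamPrefab, sinkSpeed, and meltDelay are illustrative names and values, not the project's actual code.

```csharp
using System.Collections;
using UnityEngine;

public class MeltablePanel : MonoBehaviour
{
    public ParticleSystem slimePrefab;   // slime dripping from the top of the object
    public ParticleSystem streamPrefab;  // line of particles moving toward the player
    public Transform player;
    public float sinkSpeed = 0.5f;       // assumed speed the object sinks into the floor
    public float meltDelay = 1.5f;       // assumed time for slime to reach the floor

    public IEnumerator Melt()
    {
        // Spawn slime on the upper front of the object; the melt must read
        // first, so the absorption stream is deliberately delayed.
        Instantiate(slimePrefab, transform.position + Vector3.up, Quaternion.identity);

        // Only once the slime has had time to hit the floor does the stream
        // toward the player begin, selling "absorbed" rather than "melted".
        yield return new WaitForSeconds(meltDelay);
        ParticleSystem stream =
            Instantiate(streamPrefab, transform.position, Quaternion.identity);
        stream.transform.LookAt(player);

        // Sink the object into the floor to complete the melting illusion.
        while (transform.position.y > -1f)
        {
            transform.position += Vector3.down * sinkSpeed * Time.deltaTime;
            yield return null;
        }
        Destroy(gameObject);
    }
}
```

A caller would kick this off with StartCoroutine(panel.Melt()) when the player targets the panel; the coroutine keeps the delay-then-sink ordering in one readable place instead of scattering timers across Update.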





