[Theta Legion VR] Creating the drone

I wanted to make the first enemy simple to animate, so that I could test the results without too much throwaway work in case things went badly. Also, from the player’s perspective, it’s better to keep the “fancy” enemies as a surprise (and progression validation) later in the game.

I decided very early that I wanted my enemies to be animated sprites rather than 3D models. Several reasons motivated that decision:

  • Because the general aesthetic of the game is very much about pixel art, I wanted to make sure potentially complex enemy models wouldn’t distract from the pixel art work: the pixel art already tends to get lost in the perspective and the lighting of the environments, so I wanted my enemies to stay as “classic” as possible.
  • Modeling and animating the enemies would seriously increase the amount of work and time involved, and I’m not sure the trade-off would have been worth it in the first place. Since we would have to stay extremely low-poly, we would end up with enemies along the lines of Quake’s, and I happen to far prefer Doom’s enemy design over Quake’s. I’m much faster at drawing and animating than I am at modeling, texturing, rigging and animating in 3D, and I’m also much more comfortable with the classic approach.

MOODBOARDS

By now, the fact that I always create moodboards for everything I do shouldn’t surprise anyone! 🙂

For the drone, I was hellbent on a round shape: first, because it reminds me of the awkward menace of the spheres from the movie Phantasm, but also because round and cylinder-type shapes work best with the billboard approach (for those who don’t know, a billboarded sprite is a sprite that always faces the camera). The round shape can also be evocative of some sort of “evil eye” and fits the lore of a “guardian” mechanism that will attack anything on sight.

As you can tell by looking at the images on the left, I’m far from being the only one with this idea: the round oculus/beholder-like mecha-guardian is a staple of many sci-fi universes.

Even though the sprite was meant to stay simple to animate, I wanted to have multiple action-states:
– Looking around
– Suspicious/Hunting
– Firing

This means I needed additional elements to help animate the sphere, such as external mechanisms, a shell that opens up to reveal machine guns, and so on.

SPRITE WORK IN PHOTOSHOP

Because I knew I would move a lot of things around while animating, I decided to layer pretty much everything in Photoshop, including the sphere shading: that would allow me to animate the various shades and help convey that the sphere is slightly rotating when looking down/left/right.

The animation is composed of 8 frames in total, organized into the following states:

  • Idle/looking around: with the drone looking straight, then down, left and right (4 frames).
  • Alarmed: with the drone’s luminescent spikes popping out of its shell (1 frame).
  • Opening fire mechanism: the lower trap of the drone opens up to reveal the cannon (2 frames).
  • Firing: the drone has a fire frame that is combined with the last frame of the opening animation to give the illusion that the muzzle fire is flashing (1 frame).
  • Closing fire mechanism: for when the drone returns to the idle state; all the frames from the opening animation are played in reverse.

After animating everything and testing the animation in Photoshop, I exported the whole sequence as one long spritesheet to integrate into Unity.

UNITY INTEGRATION

Importing the sprite into Unity is as easy as dropping it into the right folder. Of course, the Texture Type is set to Sprite (2D and UI), the Wrap Mode is set to Clamp, the Filter Mode is set to Point (no filtering) and the Compression to None, so that no smoothing is applied to the sprite and the crisp look of the pixel art is preserved.
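For reference, the same settings can be enforced automatically with a small AssetPostprocessor; a minimal sketch, where the folder path is just a placeholder (setting the values by hand in the Inspector works just as well):

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch (place it in an "Editor" folder): applies the import settings
// described above to every texture under a hypothetical "Sprites/Enemies" folder.
public class PixelArtSpriteImporter : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.Contains("Sprites/Enemies")) return;

        var importer = (TextureImporter)assetImporter;
        importer.textureType = TextureImporterType.Sprite;     // Texture Type: Sprite (2D and UI)
        importer.spriteImportMode = SpriteImportMode.Multiple; // spritesheet, sliced later in the Sprite Editor
        importer.wrapMode = TextureWrapMode.Clamp;             // Wrap Mode: Clamp
        importer.filterMode = FilterMode.Point;                // Filter Mode: Point (no filtering)
        importer.textureCompression = TextureImporterCompression.Uncompressed; // Compression: None
    }
}
```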

With the Sprite Mode set to Multiple, I then go into the Sprite Editor and slice my spritesheet (Grid By Cell Size), making sure it slices every 64 pixels.

After validating the result, I have a sprite ready to integrate. In my scene, I create a new 2D object as a Sprite and assign it the right image in the Sprite Renderer (it should be assigned by default).
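Since the drone sprite lives in a 3D scene and is meant to be billboarded (as mentioned in the moodboard section), it also needs to keep facing the camera; a minimal sketch of such a script, where the component name and the Y-axis lock are illustrative choices:

```csharp
using UnityEngine;

// Minimal billboard sketch: reorients the sprite every frame so its visible face
// points at the main camera. Attach it to the object holding the SpriteRenderer.
public class SpriteBillboard : MonoBehaviour
{
    [SerializeField] bool lockYAxis = true; // keep the drone upright instead of tilting toward the camera

    void LateUpdate()
    {
        Camera cam = Camera.main;
        if (cam == null) return;

        // Point the quad's forward axis away from the camera so the sprite face stays visible.
        Vector3 away = transform.position - cam.transform.position;
        if (lockYAxis) away.y = 0f;

        if (away.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(away);
    }
}
```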

In the Animation panel, I create a new animation (as a rule of thumb, I always start with the idle animation) and drag and drop the right sprites onto the timeline, testing the animation to get the timing/spacing right. Not all frames should be spaced evenly if you want to preserve an organic feel to your animation. I also often play with the sample rate from one animation to another to keep that “natural” vibe and avoid having my animations come across as too mechanical.
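For illustration, that uneven spacing can also be expressed in an editor script that builds the clip directly; a minimal sketch, where the asset paths, frame times and sample rate are all placeholder values:

```csharp
using System.Linq;
using UnityEditor;
using UnityEngine;

// Editor-only sketch: builds a sprite animation clip with uneven key times,
// the code equivalent of dragging frames onto the Animation timeline by hand.
public static class DroneIdleClipBuilder
{
    [MenuItem("Tools/Build Drone Idle Clip")]
    static void Build()
    {
        // Load the sliced frames (hypothetical path; assumes alphabetical order matches frame order).
        Sprite[] frames = AssetDatabase.LoadAllAssetsAtPath("Assets/Sprites/Enemies/drone.png")
                                       .OfType<Sprite>()
                                       .OrderBy(s => s.name)
                                       .Take(4)
                                       .ToArray();

        var clip = new AnimationClip { frameRate = 12 };

        // Uneven key times: the drone lingers on the "straight" pose before glancing around.
        float[] times = { 0f, 0.5f, 0.75f, 1.1f };
        var keys = new ObjectReferenceKeyframe[frames.Length];
        for (int i = 0; i < frames.Length; i++)
            keys[i] = new ObjectReferenceKeyframe { time = times[i], value = frames[i] };

        // Bind the keys to the SpriteRenderer's sprite property on the animated object.
        var binding = EditorCurveBinding.PPtrCurve("", typeof(SpriteRenderer), "m_Sprite");
        AnimationUtility.SetObjectReferenceCurve(clip, binding, keys);

        AssetDatabase.CreateAsset(clip, "Assets/Animations/drone_idle.anim");
        AssetDatabase.SaveAssets();
    }
}
```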

I repeat the same operation for all the different states, until I have my Idle, Alert, Open Fire, Close Fire and Fire animations.

One thing I like to do is separate what I call intrinsic animation and extrinsic animation: by intrinsic animation, I mean all animations that happen to the object itself, and by extrinsic, I mean animations that happen to the object in relation to its environment.

The reason for separating them is that one shouldn’t influence the other. In this case, I wanted the sprite to bob up and down to simulate some sort of “hover” effect. This led me to nest my sprite (droneSprite) inside a drone object, and the whole thing inside a parent drone game object (just to keep things clean). The droneSprite object carries the spritesheet animation, while the drone object carries the up-and-down Transform animation.
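As a side note, the same “extrinsic” bob could also be handled by a small script on the drone object instead of a Transform animation; a quick sketch, with made-up amplitude and frequency values:

```csharp
using UnityEngine;

// Sketch of the "extrinsic" hover bob done in code: attach to the drone object so it
// never interferes with the spritesheet animation playing on the nested droneSprite.
public class DroneHover : MonoBehaviour
{
    [SerializeField] float amplitude = 0.15f; // how far the drone bobs, in world units
    [SerializeField] float frequency = 1.2f;  // bobs per second

    Vector3 basePosition;

    void Start()
    {
        basePosition = transform.localPosition;
    }

    void Update()
    {
        float offset = Mathf.Sin(Time.time * frequency * 2f * Mathf.PI) * amplitude;
        transform.localPosition = basePosition + Vector3.up * offset;
    }
}
```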

The last part of the integration is to string all the animations together using the Animator state machine. This can be more or less complex depending on how you handle, via code, the different triggers that control which state plays when. For now, though, I just wanted to see my animations in-game to make sure everything looped correctly, so the state machine was pretty straightforward.
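As a preview of what that code-side control might look like, here is a minimal sketch that fires Animator triggers from a hypothetical controller script; the parameter names simply mirror the states listed above and are assumptions, not the actual controller setup:

```csharp
using UnityEngine;

// Minimal sketch of driving the drone's state machine from code.
// Trigger names ("Alert", "Fire", "CloseFire") are illustrative assumptions.
[RequireComponent(typeof(Animator))]
public class DroneAnimationController : MonoBehaviour
{
    Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Called by the (future) AI logic.
    public void OnPlayerSpotted() => animator.SetTrigger("Alert");
    public void OnOpenFire()      => animator.SetTrigger("Fire");
    public void OnTargetLost()    => animator.SetTrigger("CloseFire");
}
```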

IN-GAME VIDEO OF THE DRONE

SCREENSHOT