Since I lack the ability to draw, I developed a pipeline where I render pixel art animations from 3D models with toon shaders. Not only does this ensure consistent artwork at high frame rates, it also produces normal maps that are used for 2D lighting effects sensitive to the shape of each character.
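The idea behind normal-mapped 2D lighting can be sketched in a few lines. This is a minimal illustration, not the shader code from the project: it assumes the common convention of normals encoded as RGB (each channel remapped from [-1, 1] to [0, 1]) and a simple Lambertian diffuse term.

```python
# Sketch: per-pixel diffuse lighting from a normal map.
# Assumes the usual tangent-space encoding where a flat surface
# is the familiar "flat blue" texel (128, 128, 255).

def decode_normal(rgb):
    """Convert an 8-bit RGB texel back to a unit normal vector."""
    n = [c / 255.0 * 2.0 - 1.0 for c in rgb]
    length = sum(x * x for x in n) ** 0.5
    return [x / length for x in n]

def diffuse(normal, light_dir):
    """Lambertian term: clamp(N . L, 0, 1)."""
    dot = sum(a * b for a, b in zip(normal, light_dir))
    return max(0.0, min(1.0, dot))

# A texel facing straight out of the screen, lit head-on, is fully lit.
flat = decode_normal((128, 128, 255))
print(diffuse(flat, (0.0, 0.0, 1.0)))
```

In a game engine the same dot product runs per fragment in the shader, so sprites rendered from the 3D models can react to moving lights as if they still had their original surface shape.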
The gameplay demo shows what was intended to be the first level of a larger world and narrative. While it prepares players for rhythm-sensitive combat and platforming, it only scratches the surface of what I imagine a music-centric gameplay experience could be. Here is a video of an earlier prototype that demonstrates more possibilities with musical inputs; some of the systems I have implemented for this purpose include:
- Adaptive soundtrack based on game progression
- Real-time sound synthesis in Unity
- Tempo/rhythm-based inputs and events
- Freeform music playing as interaction with the virtual world
- Game elements and visuals driven by frequency spectrum analysis
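To give a flavor of the tempo-based input system in the list above, here is a minimal sketch of how an input timestamp can be graded against a beat grid. The timing windows and BPM are hypothetical values for illustration, not the game's actual tuning, and the real implementation lives in Unity rather than Python:

```python
# Sketch of tempo/rhythm-based input judgement: an input is graded
# by its distance to the nearest beat of a fixed-BPM track.
# Windows (perfect/good) are illustrative, not the game's tuning.

def nearest_beat_offset(t, bpm):
    """Seconds between time t and the closest beat of a bpm grid."""
    beat = 60.0 / bpm
    phase = t % beat
    return min(phase, beat - phase)

def judge(t, bpm, perfect=0.05, good=0.12):
    """Grade an input timestamp against the beat grid."""
    off = nearest_beat_offset(t, bpm)
    if off <= perfect:
        return "perfect"
    if off <= good:
        return "good"
    return "miss"

# At 120 BPM a beat lands every 0.5 s; an input at t = 1.02 s
# is only 0.02 s off the beat at 1.0 s.
print(judge(1.02, 120))  # → perfect
```

In-engine, the same logic would typically be driven by the audio clock rather than wall-clock time, so judgement stays locked to the music even if the frame rate fluctuates.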