Friday, June 24, 2016

AIA Postmortem

Section 1
Hello to whoever may be reading this. My name is Sean Sparaco and I am a student majoring in Recording Arts at Full Sail University. I am very interested in sound design and audio post, so working on these projects has been interesting. This month I worked on two projects, one of which was done with a team. This postmortem is meant to state and address the positive and negative qualities of my experiences working on the UE4 Platformer game and our Bioshock-styled version of Limbo. I intend to discuss issues and strengths in our workflow and our implementation as well. I will also reflect on our new experiences with non-linear audio, which led us to new discoveries and understanding.

Section 2
In this section of my postmortem, the discussion pertains to the UE4 Platformer project that was worked on in and out of our lab hours. Unreal Engine 4 was very new to me, and I had no experience with this type of implementation before this project. For this project I aimed for a more comedic aesthetic because I did not want to limit myself to creating only realistic-sounding assets. A clear example of this is the choice of jump assets, which do not reflect the actual sound of a character jumping in real life. The 'time bonus pickup' was also meant to embody an arcade-game sound or aesthetic. If I were to suggest a reference for what I tried to achieve with this project, it would most likely be Super Mario 64 or any of the Mario series games. The sound design process for the footsteps was interesting because I wanted to make sure the character's footsteps didn't sound normal, since the character isn't a normal person.
For the music during a 'win' or 'lose', I aimed to create a polarizing reaction for each finish-line asset. For example, when a player wins, the coinciding music asset is positive and celebratory; when a player loses, they hear a gloomy, destructive music asset as an indication that they have died or lost. The music that loops throughout the game was composed by me and was meant to induce a mysterious yet familiar vibe. I consciously built the melody around an octave interval to provoke the reaction that interval creates; it is a fairly comfortable and simple melody for the game in this case. In lecture I learned that humans have an emotional connection to music, and I didn't want the audience/player to be too comfortable with what could happen.
The cinematic assets, out of all of my submitted assets, were meant to sound the most realistic and achieve hyperrealism in certain parts. Hyperrealism was meant to be most present during the explosions and gunshots, so additional low end was layered into those assets and implemented into the game.
I think I achieved a lot of my goals with this specific project. Through repetition I gained more understanding of how to properly implement my audio assets into Unreal Engine 4, and I have progressed since. Implementation of the sounds was difficult at first while I was experimenting with the engine and discovering the tools at my disposal. The trigger timing of my animations and cinematic scenes was implemented correctly, and the sounds were heard exactly when I wanted them to be, creating a realistic and believable experience for the player. There were times during gameplay when I noticed some minor level issues in the mix. I aimed to bring all assets within a peak-normalization range between -14 dB and -8 dB; however, the game sounded different when I demoed it on different monitoring systems.
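The peak-normalization target can be made concrete with a small sketch. This is not part of my actual workflow (I balanced levels by ear and meter in the DAW), just an illustration of the math: measure a buffer's peak in dBFS, then compute the linear gain that moves that peak to a target inside the -14 dB to -8 dB window.

```python
import math

def peak_dbfs(samples):
    """Peak level of a float audio buffer (range -1.0..1.0) in dBFS."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

def gain_to_target(samples, target_db=-11.0):
    """Linear gain that moves the buffer's peak to the target dBFS."""
    return 10 ** ((target_db - peak_dbfs(samples)) / 20)

buf = [0.0, 0.25, -0.5, 0.1]               # peak = 0.5, about -6 dBFS
g = gain_to_target(buf, target_db=-11.0)   # -11 dB sits inside -14..-8
normalized = [s * g for s in buf]
```

Applying the same target to every asset keeps relative loudness consistent, though as noted above it still sounds different on different monitoring systems.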
I would say that the schedule and quality expectations were very realistic with regard to achieving my goals. I was able to experiment with UE4 a lot during lab and outside of class because the engine is free and available to anyone. I became very familiar with creating Sound Cues and implementing assets this way. A couple of things went wrong with the final implementation of a few assets, though. The Foley asset for the ramp bending and breaking did not trigger correctly, and I could have spent more time on that. I also had some trouble correctly implementing the intro music asset after the first cinematic scene. Overall, the gameplay could have been more immersive if I had spent a bit more time playing through the game and perfecting it. On the positive side, the implemented Foley matched the character's animation seamlessly and was very believable. The triggers for the 'win' and 'lose' music and cinematics were perfectly timed and believable. This project was a great learning experience.
Section 3
In this part of the postmortem I will evaluate my involvement and progress on the Wwise Limbo project. My team and I chose a specific direction to take this game in, and after some in-depth discussion we decided that the Bioshock series would be an interesting reference. The aesthetic direction we attempted could be described as dark or gloomy, because the Bioshock games are dark, obscure, and filled with elemental mystery. The ambience we rendered should achieve that dark, abysmal vibe in a non-positional way.
In our Bioshock-styled take on the game, the main character wears a heavy metal suit and carries mechanical weapons made mostly of metal and other dense materials. For my character footstep assets I really wanted to give the impression that thick, heavy metal feet were stomping on each textured surface. To do this I consciously layered a metallic-sounding impact into each footstep asset, staying mindful that all the other footstep assets from my teammates would have to coincide with the sound of the impacts on each texture. All three of us used the same metal source in our footstep assets, and I specifically worked on the 'ground' and 'wood' character steps. While designing the ground footstep assets I kept in mind that the metal feet would be coming into contact with the earth, which would include elements of grass, dirt, or pebbles, so I layered in more low-end frequencies across these five assets to suit the heavy impacts on dirt. For the wood steps, on the other hand, I aimed for a more hollow-sounding asset to emphasize the impact of metal legs walking on timber. I asked my roommate for feedback during experimentation, and to him the assets sounded fairly realistic alongside the gameplay.
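The shared-metal-layer idea can be sketched as a simple mixdown: every surface footstep sums its own surface layer with the one metal impact layer we all reused. This is only an illustration with hypothetical buffers and gains, not our actual session settings; real assets would be mixed in the DAW, not in code.

```python
from itertools import zip_longest

def mix_layers(surface, metal, surface_gain=1.0, metal_gain=0.6):
    """Sum two float sample layers (-1.0..1.0), hard-clipping to stay in range."""
    mixed = []
    for s, m in zip_longest(surface, metal, fillvalue=0.0):
        v = s * surface_gain + m * metal_gain
        mixed.append(max(-1.0, min(1.0, v)))   # simple safety clip
    return mixed

metal_impact = [0.8, -0.4, 0.2]                # shared metallic transient
ground_step = mix_layers([0.3, 0.1, -0.2, 0.05], metal_impact)
wood_step   = mix_layers([0.5, -0.3, 0.1], metal_impact, metal_gain=0.5)
```

Because every surface variant pulls from the same metal source, the transient stays recognizable across 'ground' and 'wood' while the underlying texture changes.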
I would say that I achieved many of the goals I set for myself during this project. The Wwise implementation workflow was much different from that of Unreal Engine; I feel UE4 was more user-friendly and easier to navigate. One of my team members had a lot of issues with Wwise, and I myself had a lot of trouble adapting to Wwise after using Unreal Engine; however, we overcame this and succeeded in implementing all of our assets. I think our biggest success in terms of achieving realistic sounds was the assets that included water. We knew that water is a crucial element in Bioshock, and we wanted to implement realistic water layers within our assets. We recorded some water sources, including a bathtub, and used those recordings to our advantage.
We did run into a few implementation problems that we could not completely refine. For instance, the looping 'cart roll' asset that I created would not stop when we triggered its stop event, resulting in some assets playing longer than necessary. The platform-rotate assets I created were meant to sound mechanical, like in Bioshock, but did not quite match the visual because the platform was made of wood. We also ran into setbacks due to some glitches and incorrect backing up of files during the second Wwise lab, something we began paying close attention to afterward. The biggest thing that went wrong, however, was that the Wwise event titles in the Profiler view did not match the asset names, which made it very difficult to trigger the correct assets at the right times.
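A simple mapping table would have helped with the name mismatch: once the Profiler's event titles diverge from the asset file names, you need an explicit lookup from one to the other. The sketch below is hypothetical (these are not our real event or file names), but it shows the kind of cheat sheet we ended up reconstructing by hand, including the paired stop event the looping 'cart roll' needed.

```python
# Hypothetical event-to-asset cheat sheet; all names are illustrative.
EVENT_TO_ASSET = {
    "Play_cart_roll": "cart_roll_loop.wav",
    "Stop_cart_roll": "cart_roll_loop.wav",   # paired stop for the loop
    "Play_platform_rotate": "platform_rotate_mech.wav",
}

def asset_for(event_name):
    """Look up which designed asset a profiled event should trigger."""
    return EVENT_TO_ASSET.get(event_name, "<unmapped event>")
```

Keeping event names derived directly from asset names (or maintaining a table like this from day one) would have avoided most of the guesswork in the Profiler.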
The schedule and quality expectations were somewhat reasonable for achieving our goals. Wwise was particularly difficult for my team to comprehend, perhaps because the middleware is less user-friendly and more complex in terms of implementation. Event timing was not much of a problem in terms of playing an asset; it was more difficult to trigger certain assets to stop.

Working with a group was definitely a unique experience, as always. As a team we were able to combine our strengths and work together successfully, fusing our ideas to achieve our final product. Our group dynamic was very positive and beneficial, and we kept in contact via an iMessage group I created so that we would always be on the same page. One thing we discussed together was the direction to take the menu sounds. There were some negative aspects of working in the group, but nothing that would prevent me from wanting to work with these two people again. Not all group members were present during the second Wwise lab; we ran a little behind but were able to make up for the member's absence. Overall, the Wwise project was a successful learning experience, and I learned a lot about custom tools.
