MoCap to IKinema
This lab was definitely the most fun so far. It feels very rewarding to be able to combine everything we’ve learned so far into a live-action avatar in Unreal. Yuuuussss! I think my group got a better sense of occlusion, and of how to direct the actors to avoid it, during the charades exercise.
One thing I noticed in the IKinema tutorials that we didn’t do in the lab was aligning all the joint angles in the avatar to the data coming in from the live feed (seen here). I’m curious how important this step is. It seems like it could relieve a lot of headaches in the final output if some time were allocated to it.
One thing I have been struggling with is getting the textures to show up on the final avatar. I’m not sure where the issue keeps coming from, but I haven’t been able to export a character with its textures intact.