Abstract
Background: Generating locomotion for characters is a complex field with many challenges remaining for researchers to tackle. Whilst various research has been undertaken into how to create diverse motion using physical simulation, inverse kinematics and motion capture, there is still little research on how to relate changes in virtual characters’ body shape to the way they walk. This matters because audiences are capable of detecting repetition of character appearance and walking styles. By relating generated walk cycles to the body morphology of characters, we can improve their believability. Achieving this with a dynamic and automated system would save animators time when they need to create a variety of believable characters.
Objectives: This study explores how people perceive gait to change over variations in body shape, how gait actually varies, and whether it is possible to build a framework that believably correlates changes in gait parameters with changes in anthropometric parameters.
Implementing this framework could then produce a tool for animators that generates a variety of virtual characters with believable variations in walking style.
The goal of this project is to improve the believability of virtual characters by relating their body shapes to appropriate walk cycles.
Methods: 8 papers were analysed to generate 8 empirical appearance-to-motion trendline formulas. These formulas formed the basis of the scripted animation tool.
The animation tool was then used to create a point-light survey examining 6 motion parameters, assessing people’s perception of changes in motion over appearance.
n = 59 participants completed this perceptual online video survey.
The animation tool’s formulas were updated with the results of the point-light survey, and another survey was created using character meshes.
n = 69 participants completed this perceptual online video survey.
28 adult male gait patterns were motion captured and analysed using a Vicon motion capture suite.
5 parameters were found to have the strongest appearance-to-motion correlations and were ranked by order of perceptual dominance. These parameters were implemented in the scripted animation tool and a final perceptual poll was conducted.
n = 96 participants validated the final animation tool using an online video survey.
Findings: The empirical data analysis identified speed, stride length, step width, stance/step phase and foot progression as motion parameters that change with increases in Body Mass Index (BMI).
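For reference, the BMI used to index body morphology above is the standard ratio of mass to height squared; a minimal sketch (the example values are illustrative, not data from the study):

```python
def bmi(mass_kg: float, height_m: float) -> float:
    """Body Mass Index: mass (kg) divided by height (m) squared."""
    return mass_kg / height_m ** 2

# Illustrative example: a 95 kg person of 1.75 m height
print(round(bmi(95.0, 1.75), 1))  # prints 31.0
```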
The point-light perceptual survey found that changes to arm abduction, average arm bob and arm swing all produced motions associated with obese body morphologies.
The character mesh perceptual survey verified that speed and walking base were motion parameters associated with changes in body morphology, whilst confirming previously identified parameter strengths and combinations.
The motion capture sessions produced a framework of 5 appearance-to-motion formulas, ordered by perceptual dominance. The predictive correlations include:
1. Preferred walking speed over height
2. Average arm abduction over chest circumference
3. Walking base over waist-to-height ratio
4. Arm bob magnified over height
5. Arm swing over body fat percentage
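The five correlations above can be sketched as a mapping from anthropometric measurements to gait parameters. The abstract does not give the fitted trendline coefficients, so every number below is a hypothetical placeholder; only the structure (one predictor per gait parameter, ordered by perceptual dominance) follows the framework described:

```python
from dataclasses import dataclass

@dataclass
class Anthropometrics:
    """Body-shape measurements used as predictors in the framework."""
    height_m: float
    chest_circumference_m: float
    waist_to_height_ratio: float
    body_fat_percent: float

def gait_parameters(a: Anthropometrics) -> dict:
    """Predict gait parameters from body shape, in the order of
    perceptual dominance listed above. All coefficients are
    illustrative placeholders, not the thesis's fitted values."""
    return {
        # 1. Preferred walking speed over height (m/s)
        "walking_speed": 0.9 * a.height_m,
        # 2. Average arm abduction over chest circumference (degrees)
        "arm_abduction": 15.0 * a.chest_circumference_m,
        # 3. Walking base over waist-to-height ratio (m)
        "walking_base": 0.25 * a.waist_to_height_ratio,
        # 4. Arm bob magnified over height (m)
        "arm_bob": 0.02 * a.height_m,
        # 5. Arm swing over body fat percentage (degrees)
        "arm_swing": 30.0 - 0.4 * a.body_fat_percent,
    }
```

Such a mapping lets an animation tool derive a complete, character-specific walk cycle from a handful of body measurements, which is the automation the study aims to validate.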
A final perceptual video poll found that, when asked to rank 4 different types of generated obese motion, participants voted the anthropometric-to-locomotive parameter framework the most believable, with a 38% plurality.
Conclusions: This study identifies 5 gait parameters that people perceive as dominant. The motion capture analysis highlighted 5 gait parameters with significant correlations to appearance parameters. When the chosen combination of appearance-to-gait parameters was implemented, a significant proportion of participants ranked it as more believably representing an obese character’s walk than the lean, obese and keyframed obese alternatives. An efficient and believable method for generating diverse locomotion that relates to the body morphology of the character has been created and validated.
| Date of Award | 2018 |
| --- | --- |
| Original language | English |
| Awarding Institution | |
| Supervisor | Yifeng Zeng (Supervisor) & Shengchao Qin (Supervisor) |