4. Video to Biomechanics

4.1. Overview

These notes outline the project that was last carried out by Fleur Kleene, who gave her final MSc thesis presentation at TU Delft on 20 Feb 2025. Her work was based on (or, arguably, a continuation of) work by Bofan Lyu, who published his master’s thesis in late 2023. Zhi-Yi Lin also has local changes to Bofan’s original code, which were made in parallel with Fleur’s project.

Relevant links:

As a rough overview, these projects, in combination, involved:

  • Using an OpenSim model to record/replay realistic human motion and then mapping that model to a realistic-looking SMPL mesh that could be rendered as animation frames in Blender (Bofan, 2023; Fleur improved this in 2023/2024).

  • Parameterizing the resulting pipeline so that things like skin color, clothing, body shape, etc. could be modified to generate new video frames, which were then fed into machine-learning algorithms to see how those parameters affect the quality of the results (Fleur, 2024).
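To make the parameterization concrete, the sketch below shows one way such randomized render configurations could be generated. This is an illustrative assumption, not code from the actual pipeline: the parameter names, categories, and value ranges are hypothetical, though the ten "betas" mirror the default number of SMPL body-shape coefficients.

```python
import random

# Hypothetical sketch (NOT the actual pipeline code): drawing randomized
# appearance/shape parameters of the kind the pipeline varies per video.
# All names and value ranges below are illustrative assumptions.

SKIN_TONES = ["light", "medium", "dark"]
CLOTHING = ["t_shirt_shorts", "long_sleeve_trousers", "athletic"]

def sample_render_config(n_shape_params=10, seed=None):
    """Draw one randomized configuration for a synthetic video frame."""
    rng = random.Random(seed)
    return {
        # SMPL-style body-shape coefficients ("betas"); SMPL uses 10 by default.
        "betas": [rng.gauss(0.0, 1.0) for _ in range(n_shape_params)],
        "skin_tone": rng.choice(SKIN_TONES),
        "clothing": rng.choice(CLOTHING),
    }

# Generate a batch of configurations, one per synthetic video to render.
configs = [sample_render_config(seed=i) for i in range(100)]
```

Seeding each configuration makes a generated dataset reproducible, which matters when comparing how individual parameters affect downstream machine-learning results.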

4.2. Technical Details

These notes aren’t comprehensive technical documentation. They are rough notes collected by Adam Kewley and Hassan Osman with Fleur at the end of her MSc (Feb 2025). They have not yet been validated by any third-party developer/researcher (i.e. as far as we are aware, only Bofan/Fleur have run the pipeline end-to-end).

  • All project data is held on the TU Delft webdrive at W:/staff-umbrella/Video to Biomechanics/Files_fleur

  • Bofan’s original files are at W:/staff-umbrella/Video to Biomechanics/

  • The development timeline was:

    - Bofan originally wrote the OpenSim-to-SMPL-Blender pipeline in 2023. His original code is located at https://github.com/blyu413/Synthetic-human-motion-video-generation . Most of Bofan’s code runs via a typical Python/Anaconda pipeline, but some of it (e.g. rendering) is run via Blender.
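For the parts that run via Blender rather than plain Python, scripts are typically executed through Blender's bundled interpreter in headless mode. The sketch below builds such an invocation; `--background` and `--python` are real Blender command-line flags, but the file names and the `--out` argument are hypothetical, and nothing here is taken from Bofan's repository.

```python
import subprocess

def build_blender_render_cmd(blend_file, script, out_dir):
    """Build a headless Blender invocation (illustrative sketch).

    `--background` runs Blender without a GUI and `--python` executes the
    given script inside Blender's bundled Python interpreter; arguments
    after `--` are passed through to the script itself. The file names
    and `--out` flag here are hypothetical placeholders.
    """
    return [
        "blender", "--background", blend_file,
        "--python", script,
        "--", "--out", out_dir,
    ]

cmd = build_blender_render_cmd("scene.blend", "render_frames.py", "frames/")
# subprocess.run(cmd, check=True)  # uncomment on a machine with Blender installed
```

Running rendering this way (rather than from a normal Anaconda environment) is necessary because Blender's `bpy` module is only available inside Blender's own Python interpreter.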