Anatomy Re-engineering Framework – August 2022 update

Written by Lathreas and Ellie

It’s time for another update on the Anatomy Re-engineering Framework! To briefly recap, our aim with this project is to create a tool that gives us and others the ability to redesign anatomy in a way that is accurate enough for surgeons and geneticists to implement the changes without any trouble. In a sense, we want it to be easy for people to define a physically accurate “end goal” that engineers can rely on to create the technological solutions for performing those changes in real life. This is not just an artistic project; all physiological functions must remain preserved, and the patient must under no circumstances be put at risk of harm. Since every transformation will be unique, it is extra important that the tool is easy for engineers and artists to use while producing reliable results.

Our last update was in May of this year. In the past few months, we have continued to bring ARF closer to an initial release. Along the way, we discovered that the popular 3D modeling tool Blender has received a large update, Geometry Nodes, that covers a substantial part of the pipeline we have been working on. Although it might sound odd to consider an artistic program for something as technical as what we are attempting, a close comparison of many CAD tools ended with Blender coming out on top for us! Blender already solves many aspects of our intended modeling pipeline, so we benefit heavily from not having to reinvent the wheel. At the same time, we can fit our program into a workflow many people are already used to, sitting side-by-side with other powerful plugins that solve different tasks. We have therefore decided to port our existing work over to Blender while retaining our physically accurate mathematical model.

The transition is coming along quite well! We have found it much easier to add new joint types and to change user input in real time, and we have seen great performance improvements by using Blender’s optimized raytracing. We also benefit from Blender’s user interface, which lets us focus only on what needs to be added or streamlined. The Blender implementation has now caught up with the original, and has even moved past it, with an experimental implementation of muscles underway.
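
As a quick illustration of what that raytracing looks like from the scripting side, the sketch below casts a ray against a mesh through a BVH tree built with Blender’s Python API. It is only a minimal example under the assumption of an object named "ExampleBone" in the scene; it is not code from ARF itself.

```python
import bpy
from mathutils import Vector
from mathutils.bvhtree import BVHTree

# Build a BVH tree over the evaluated mesh so that repeated ray casts stay fast.
depsgraph = bpy.context.evaluated_depsgraph_get()
bone_obj = bpy.data.objects["ExampleBone"]  # placeholder object name
bvh = BVHTree.FromObject(bone_obj, depsgraph)

# Cast a ray from above the bone straight down and see where it hits the surface.
origin = Vector((0.0, 0.0, 2.0))
direction = Vector((0.0, 0.0, -1.0))
location, normal, face_index, distance = bvh.ray_cast(origin, direction)
if location is not None:
    print(f"Hit the bone surface at {location}, {distance:.3f} units from the ray origin")
```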

Demonstration of muscle fiber model

Our example bone now has joints (as explained in previous updates), as well as work-in-progress muscles consisting of individual muscle fibers. Although we haven’t added all muscles yet, the recent additions to our system make adding them as simple as a click of a button!
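
To make the fiber idea concrete, here is a rough sketch of what generating a bundle of fibers can look like in Blender’s Python API: one polyline curve per fiber, running from a point on the muscle’s origin to a matching point on its insertion. The attachment points below are made up for illustration, and in ARF the fibers are produced by the Geometry Nodes setup rather than by a script like this.

```python
import bpy
from mathutils import Vector

def add_fiber_bundle(name, origin_points, insertion_points):
    """Create one polyline per fiber, from an origin point to an insertion point.
    Purely illustrative; not the ARF implementation."""
    curve = bpy.data.curves.new(name, type='CURVE')
    curve.dimensions = '3D'
    for start, end in zip(origin_points, insertion_points):
        spline = curve.splines.new('POLY')
        spline.points.add(1)                 # a new POLY spline starts with one point
        spline.points[0].co = (*start, 1.0)  # POLY points are stored as (x, y, z, w)
        spline.points[1].co = (*end, 1.0)
    obj = bpy.data.objects.new(name, curve)
    bpy.context.collection.objects.link(obj)
    return obj

# Toy attachment sites: two short rows of points standing in for the
# origin and insertion areas on two bones.
origins = [Vector((0.0, 0.1 * i, 0.0)) for i in range(5)]
insertions = [Vector((1.0, 0.1 * i, 0.2)) for i in range(5)]
add_fiber_bundle("ExampleMuscle", origins, insertions)
```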

We can also add custom buttons to Blender’s UI. Shown here is a working custom UI panel by Videah, from an experimental development branch, which allows you to dynamically add a bone or a muscle to the model.
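
For those curious how such a panel is built, the sketch below shows the general pattern in Blender’s Python API: an Operator provides the button’s action, and a Panel draws it in the 3D viewport sidebar. The class names and the placeholder operator body are ours for illustration and do not reproduce Videah’s actual code.

```python
import bpy

class ARF_OT_add_muscle(bpy.types.Operator):
    """Hypothetical operator standing in for the real 'add muscle' button."""
    bl_idname = "arf.add_muscle"
    bl_label = "Add Muscle"

    def execute(self, context):
        # The real operator would instantiate the muscle setup here;
        # this placeholder only reports that it ran.
        self.report({'INFO'}, "Muscle added (placeholder)")
        return {'FINISHED'}

class ARF_PT_panel(bpy.types.Panel):
    """A sidebar panel with a button, similar in spirit to the panel shown above."""
    bl_label = "Anatomy Re-engineering Framework"
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_category = "ARF"

    def draw(self, context):
        # The real panel also offers an "add bone" button; one operator
        # is enough to show the pattern.
        self.layout.operator("arf.add_muscle")

def register():
    bpy.utils.register_class(ARF_OT_add_muscle)
    bpy.utils.register_class(ARF_PT_panel)

if __name__ == "__main__":
    register()
```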

One major feature is still under development: the physical simulation that will ensure that muscles deform properly when a person moves and that they won’t intersect each other or the bones. Liz is working on this with incredible speed and is getting close to a Python implementation. We will leave the mathematical details for one of the big updates, but it’s exciting stuff for sure!
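
To give a flavour of what a non-intersection constraint means, here is a deliberately simplified toy example that pushes a point back out of a sphere standing in for a bone. It only illustrates the general concept and is not the simulation method Liz is developing.

```python
from mathutils import Vector  # mathutils ships with Blender

def push_out_of_sphere(point, center, radius):
    """Toy non-penetration constraint: if `point` lies inside the sphere
    (center, radius), project it back onto the surface."""
    offset = point - center
    dist = offset.length
    if dist >= radius or dist == 0.0:
        return point
    return center + offset * (radius / dist)

bone_center, bone_radius = Vector((0.0, 0.0, 0.0)), 0.5
fiber_point = Vector((0.1, 0.2, 0.0))  # currently inside the "bone"
corrected = push_out_of_sphere(fiber_point, bone_center, bone_radius)
print(corrected)  # now exactly on the bone's surface
```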

Improving user editability of bone shapes

The joint constraints can be shown as wireframes or as visual surfaces, and the parameters can be changed in the right-most panel, shown zoomed in below.

You can now change the important parameters in a visual way. This is a great improvement over our previous setup, where you had to recompile the entire program to change a single parameter.
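
The same exposed parameters can also be set from a script, which is convenient for automated tests. The sketch below assumes a Geometry Nodes modifier named "GeometryNodes" whose exposed float input is stored under the identifier "Input_2"; the actual names in our files will differ, so treat this purely as an illustration of the mechanism.

```python
import bpy

# Exposed group inputs appear in the modifier panel, but they can also be
# driven from Python. "ExampleBone", "GeometryNodes" and "Input_2" are
# placeholder identifiers.
obj = bpy.data.objects["ExampleBone"]
mod = obj.modifiers["GeometryNodes"]
mod["Input_2"] = 0.75            # e.g. a joint radius, in Blender units
obj.update_tag()                 # ask Blender to re-evaluate the modifier
bpy.context.view_layer.update()
```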

There are a few technical questions we do need to answer while working with Blender. Although the Geometry Nodes system is very powerful, it is quite difficult to separate these node trees from .blend files and show them cleanly on GitHub. This makes code reviews harder, so we need to spend some time thinking about how to keep the project maintainable in the long run. Luckily, we have found a Blender plugin that converts node trees into Python code files, which might provide a scalable solution once we are out of the experimental phase. This is something we will be testing in the near future.

An example of a simple node tree.
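
To give an idea of what such a conversion produces, the sketch below rebuilds a comparably simple node tree from Python using the Blender 3.x API. It is an illustrative example of the general form of such generated scripts, not the plugin’s actual output or one of ARF’s node trees.

```python
import bpy

# Create an empty Geometry Nodes group with a geometry input and output.
group = bpy.data.node_groups.new("ARF_ExampleTree", "GeometryNodeTree")
group.inputs.new("NodeSocketGeometry", "Geometry")
group.outputs.new("NodeSocketGeometry", "Geometry")

group_in = group.nodes.new("NodeGroupInput")
group_out = group.nodes.new("NodeGroupOutput")

# A single Set Position node that nudges every point upward a little.
set_pos = group.nodes.new("GeometryNodeSetPosition")
set_pos.inputs["Offset"].default_value = (0.0, 0.0, 0.1)

group.links.new(group_in.outputs["Geometry"], set_pos.inputs["Geometry"])
group.links.new(set_pos.outputs["Geometry"], group_out.inputs["Geometry"])
```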

Looking ahead

To give an overview of the timeline, we are aiming first to handle bones, muscles, and skin, while leaving room and flexibility to add important anatomical details such as nerves and blood vessels for a later update. Once we are at such a stage, we can do a feature freeze and focus on polishing the program for a public release.

For now, you can find our work as a set of .blend files in a new repository on GitHub: https://github.com/Freedom-of-Form-Foundation/anatomy3d-blender. Please note, though, that the program is still under active development, and therefore cannot be considered stable or usable yet beyond some simple testing. As our work progresses, we will shift our focus from features to user-friendliness and stability!