Hair Sim Freeze Script
TheMysteryIsThePoint
Posts: 2,958
Just wanted to share a nugget of pure gold that I found. It's a way to be able to simulate hair, edit the results, and simulate again until you've got exactly the results you want. Something that I've been wanting to do for years. Maybe other people have, as well.
I'd been futzing around with this shot on and off for weeks until I got it just now in just two sim/edit cycles.
Change the extension to .py, and check out the link in the comment that is the first line.
hair_freeze.txt (1K)
Comments
thanks for the boulder, I mean nugget here
Thank you.
Thank you Donald for sharing this.
As a personal note: I'm not much into hair styling, but from what I understand the hair guides define the rest pose of the hair, so they should be brushed in edit mode to set the initial position for the hairstyle. It makes sense that the simulation doesn't affect the hair guides. In the stackexchange example the hair should probably be styled as desired before the simulation, and I'd use far fewer guides.
That said, the script, by letting the simulation affect the hair guides and thus the hairstyle, may always be useful for using the simulation as an additional "brush" for styling.
https://blender.stackexchange.com/questions/205531/
Edit: changed the video, as the one above shows the process better.
Hiya @Padone,
I'm honest in my assessment of both Blender's hair tools and my own artistic limits :) I had actually come across both of the videos you cited, but quickly came to the conclusion that I simply don't have the artistic talent to style a nice-looking hair system, so instead of using these types of tools, I concentrated on making Sagan better at exporting the excellent hairs made by Daz PAs.
But Blender hair sim is Blender hair sim: it has serious shortcomings, and the hair quickly diverges within a dozen or so frames unless you've damped it down pretty drastically, in which case not also killing the dynamics that you do want is a delicate, time-consuming process that makes Blender's finicky cloth sim look positively easy. It's even worse with the complex hairstyles exported from DS that just aren't made to simulate.
Thanks for your comments, as always, bro.
Edit - Another reason being able to freeze matters is this tool, which can apparently make "shape keys" for hair particle systems. It'd be a great new capability to just keyframe something as complicated as a hair particle system...
As for hair imported from daz studio, if it's SBH there's a trick to export some sort of guides: you can go into the SBH editor and set the density to a low value. This is not the same as exporting the real SBH guides, which can't be exported, but given that the hair follows the guides it can be close.
As for geometry hair it's easier, since Thomas provides a sparsity option to "decimate" the hair. But yes, in general daz hairs do require some work to get them usable in blender.
https://diffeomorphic.blogspot.com/p/hair-section-version-16.html
@TheMysteryIsThePoint As you mentioned, the hair simulation in Blender is a bit of a pain, but I have found a few things that make a world of difference.
Scale your characters up. I use centimeters for my measurements, so my characters are 150 to 200 cm tall. Note that if you scale them up, you will have to modify the Subsurface Radius for the materials and scale the values up by 100. If you use Diffeo to import your scenes, then just set the "Unit Scale" to 1 in Global settings before importing. I set the Blender unit scale to .01 and use grams for weight measurements.
The hair mass is critical. I found that .0001 to .0003 grams work OK. This is also the reason I use grams for measurement. If I use kg, the UI will not allow me to set the mass low enough.
Make sure that no hair guides (hair particles) penetrate the character's mesh. Check ears and lashes carefully. Any particles that penetrate the collision mesh will go haywire.
It's often easier to modify the imported hair mesh using the sculpting tools to clear ears and eyebrows and shape it the way you want it to look before converting it to particle hair rather than modifying the hair using the hair particle editor after conversion.
Disconnecting and reconnecting the hair particle systems can sometimes fix badly behaving hair.
Hope that helps.
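A minimal sketch of the Subsurface Radius adjustment from the scaling tip above. The helper function is my own invention (the factor of 100 matches a meters-to-centimeters rescale), and the commented-out lines only indicate where it would apply inside Blender:

```python
# Hypothetical helper: when a scene is scaled by `factor` (100 for a
# meters -> centimeters rescale), the Subsurface Radius values of the
# skin materials must be scaled by the same factor to render correctly.
def scaled_subsurface_radius(radius, factor=100.0):
    """Return the per-channel (R, G, B) radius scaled by `factor`."""
    return tuple(r * factor for r in radius)

# Inside Blender this would be applied to each Principled BSDF node, e.g.:
#   inp = node.inputs["Subsurface Radius"]
#   inp.default_value = scaled_subsurface_radius(inp.default_value)
print(scaled_subsurface_radius((1.0, 0.2, 0.1)))
```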
@Padone That's really interesting. Sagan just sets the SBH's Tessellation modifier to 2 sides (making it a degenerate ribbon), exports whatever geometry as-is using the same code that converts ribbon-based hair to particles, and uses the UV coordinates to determine which end is the root.
Because hair in Blender is so archaic, there needs to be a separate particle system for each different length: scaling the hair so it always has the same number of vertices regardless of its length changes its properties. Houdini has no such limitation, so I'm working on making Sagan export to Houdini and animating the hair there. The Houdini .hda then scales the hair to equal length and writes out .bphys files for the Sagan-generated particle systems to read. That way it's not strictly dynamic hair but rather a baked sim, so the changed vertex count of the guides, and the changes that makes to the hair's physical properties, doesn't really matter. Yes, I've really gone all-in on Houdini :)
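The equal-length resampling step described above can be sketched like this. This is my own illustration of the idea (resampling every guide curve to the same vertex count, evenly spaced along its arc length), not Sagan's or the .hda's actual code:

```python
import bisect

def resample_polyline(points, n):
    """Resample a 3D polyline to n points evenly spaced along its arc length,
    so every guide ends up with the same vertex count regardless of length."""
    assert n >= 2 and len(points) >= 2
    # cumulative arc length at each input vertex
    cum = [0.0]
    for a, b in zip(points, points[1:]):
        cum.append(cum[-1] + sum((q - p) ** 2 for p, q in zip(a, b)) ** 0.5)
    total = cum[-1]
    out = []
    for i in range(n):
        t = total * i / (n - 1)          # target distance along the curve
        j = min(bisect.bisect_right(cum, t), len(points) - 1)
        a, b = points[j - 1], points[j]  # segment containing t
        seg = cum[j] - cum[j - 1]
        f = (t - cum[j - 1]) / seg if seg else 0.0
        out.append(tuple(p + f * (q - p) for p, q in zip(a, b)))
    return out

# an L-shaped guide of total length 2, resampled to 5 vertices
guide = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
print(resample_polyline(guide, 5))
```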
@Cinus I should have known that you had so many clever tips :) especially using grams as units, because Blender's defaults are so many orders of magnitude off from the values that actually work. I don't generally do hair/cloth in Blender anymore, for the same reason I don't animate in DS (the problem of exporting DS content to software that actually works is easier to solve than making it work in DS), but with your above pointers, I think I'll give it another go, with managed expectations. Thanks!
But I do make a point of pushing Houdini further into my workflow. Version 19 is coming, and they are *really* piling on the functionality for character animators, like procedural secondary motion and simplifying the F-curves of mocap data so it can be worked on by keyframing rather than by using layers.
Yep, I always resize the hair to the same length when converting to particles with diffeo; I can't even imagine working with hundreds of particle systems, that would be crazy.
@Cinus Unfortunately, in my tests changing the unit scale doesn't always work fine. Some simulations seem very sensitive to the unit scale, and the results vary a lot. I mean just changing the unit scale, with the same physics properties. That shouldn't happen, since the simulation should depend only on the physics properties and not on the unit scale.
So this seems to me a serious bug in the blender simulation system. Unfortunately the blender developers don't agree; they say it's normal. The only solution I found is not to change the unit scale and to struggle with the limited parameters until something works well enough. Luckily this is somewhat manageable.
Overall I believe the blender simulation system just lacks precision, possibly due to optimizations for speed. If they could also code it to use the gpu with opencl instead of the cpu, that would be a big step forward.
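As a toy illustration of why results can depend on scale when parameters are left unchanged (this is a back-of-the-envelope model I made up, not Blender's actual solver): treat one hair segment as a spring of stiffness k sagging under gravity. The absolute sag m*g/k doesn't depend on scene scale, so scaling lengths by 100 while keeping mass and stiffness fixed changes the sag relative to the hair length by a factor of 100, and the motion looks completely different:

```python
def relative_sag(length, mass, g=9.81, k=10.0):
    """Static sag of a spring-like hair segment under gravity (x = m*g/k),
    expressed as a fraction of the segment length."""
    return (mass * g / k) / length

scale = 100.0                               # e.g. meters -> centimeters
base = relative_sag(0.1, 0.005)             # 10 cm hair modeled in meters
scaled = relative_sag(0.1 * scale, 0.005)   # same hair, scene scaled up,
                                            # physics properties unchanged
print(base / scaled)                        # ratio tracks the scene scale
```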
https://developer.blender.org/T62631#1126709
Thomas has thought of everything :) But luckily, I have not encountered an asset with more than 4 different lengths.