Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
These are awesome tips for directly painting materials in Blender 2.8.
Geografts are the deal-breaker for me. If they don't work, then there's no point in me even attempting an export to Blender. If that's the case, my only option would be to export in OBJ/MDD format. How the materials will look is something I'll need to see for myself, but I'd have to do all of the posing and animating in DAZ Studio and just export for rendering - which, using Eevee, should be a great time-saver over Iray. Perhaps Casual's mcjTeleBlender is a better option, but I believe he uses FBX to export, which seems to have its own limitations.
It is frustrating - there's always a gotcha.
There is another option: if you want geografts to use in Blender, I think you need to convert the geografts to be part of the original DAZ model's OBJ geometry, then go through and systematically add all the morphs contained in the original geograft to the newly created model geometry in DAZ that has the geograft integrated. Then export to Blender. You'll need to save a separate 'base character' as you integrate the geograft directly into the base geometry, though, to avoid messing up your DAZ originals.
TBH I'd have no clue where to start with that process, but a couple of things spring to mind immediately. That might be fine for a one-off export, but if I'm flipping back and forth between DAZ Studio and Blender, surely I'd want to be working on the same figure? Secondly, wouldn't the OBJ format export the figure and graft as one mesh anyway? I really need to find some time to try these things, because I'm doing a lot of guesswork here. I just need to get my current project done first.
[EDIT] I see what you are suggesting, though... if I want "working" geografts (rigged and morphed), then your solution might be the only way. Thanks. As I said, I might just have to settle for doing all the posing and animation in DAZ Studio and exporting the posed (and/or animated) figure to Blender in OBJ format just for rendering.
On further consideration: would this create a loadable, rigged character with integrated grafts? And would it be possible to change textures, morphs, etc.? If so, where is the procedure described? I'm guessing this would involve exporting as an OBJ and then using the character creation tools to transfer the rigging and so on. As I understand it, all the JCMs and weight mapping would be lost in the process, right?
Yes, it would, but it involves multiple exports: first the DAZ figure + geograft as a new base object, then repeat exports for each of the geograft's morphs and each of the character's morphs you'll need. You then re-import each export, integrate it into the new base as a morph, and repeat, potentially many times. You're essentially learning how to build a brand-new DAZ character from scratch, minus the modeling. You'd need to reconstruct UVs and textures, but because the figure is still the same shape, you shouldn't need to do excessive adjusting.
It's definitely a lot of work, but I think it's the only way to do what you're talking about if Blender never gets a plugin that can handle geografts.
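To make the "integrate each morph into the new base" step concrete: a morph is just a set of per-vertex offsets from the base shape, so transferring one means recomputing those deltas against the new integrated base. A minimal Python sketch with toy data (the vertex lists and function names are hypothetical stand-ins for real mesh data, not any DAZ or Blender API):

```python
# Conceptual sketch: a morph stored as per-vertex deltas from a base mesh,
# applied at an adjustable strength. The triangle below is toy data.

def morph_deltas(base_verts, morphed_verts):
    """Per-vertex offsets that define a morph."""
    return [(mx - bx, my - by, mz - bz)
            for (bx, by, bz), (mx, my, mz) in zip(base_verts, morphed_verts)]

def apply_morph(verts, deltas, weight=1.0):
    """Apply a morph to a mesh at the given strength (0..1)."""
    return [(x + weight * dx, y + weight * dy, z + weight * dz)
            for (x, y, z), (dx, dy, dz) in zip(verts, deltas)]

# Toy "mesh": a single triangle, and the same triangle pushed up in Z
base    = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
morphed = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.1), (0.0, 1.0, 0.1)]

deltas = morph_deltas(base, morphed)
print(apply_morph(base, deltas))        # full strength: reproduces `morphed`
print(apply_morph(base, deltas, 0.0))   # zero strength: reproduces `base`
```

The repeated export/import cycle described above is essentially rebuilding this delta set once per morph against the new figure+graft base.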
I'd study SickleYield's tutorials as they explain much of what you need to know and then search on YouTube (the DAZ 3D YouTube channel has short tutorials on how to do most of those things).
Finally, in the DAZ Store there is a big PDF tutorial on how to do much of what you want (and more) with regard to rigging, but I can't remember the name of the PA who wrote it.
I'm having an odd problem with importing an OBJ that was exported from Studio. It probably has a simple solution, but I must be missing it.
I exported a Genesis 8 figure with some simple clothing items as an OBJ from Studio and opened it in Blender. I don't need the textures to be pre-applied (since I want to do that myself with nodes), and I don't need it to be posable (Studio is far better for posing) or to carry over any morph dials. Basically, I just need a statue to come over to Blender with the proper UV mapping so that its regular textures will work as they do in Studio. It does load, but there are some weird issues.
- The figure is mirrored for some reason. It was facing right in Studio and is now facing left in Blender.
- When rendered in Cycles, the figure is entirely black. I've tried adjusting and adding nodes, lighting, and other options and the figure always renders black while the ground plane and such render as expected.
Anyone have any suggestions? Thanks in advance.
edit: never mind, I solved both of these two minutes after posting. ;P I needed to invert the X positive direction, and 'Write Material Library' had gotten unchecked.
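For reference, the "invert X positive direction" option is just mirroring the vertex data. Done by hand on OBJ text, it would look something like this sketch (note that a real mirror also needs normals flipped and face winding reversed, which is skipped here):

```python
# Sketch: mirroring OBJ vertex data on the X axis, which is roughly what
# the exporter's "invert X" option does. Normals and face winding are
# deliberately left untouched to keep the example short.

def flip_obj_x(obj_text):
    out = []
    for line in obj_text.splitlines():
        if line.startswith("v "):           # vertex position line
            _, x, y, z = line.split()
            out.append(f"v {-float(x)} {y} {z}")
        else:                               # UVs, normals, faces pass through
            out.append(line)
    return "\n".join(out)

obj = "v 1.0 2.0 3.0\nvt 0.5 0.5\nf 1 2 3"
print(flip_obj_x(obj))   # the vertex X becomes -1.0; other lines unchanged
```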
Is your texture set to UV or Generated? If it's set to UV and the texture is black, the UVs may not have transferred, or they got screwed up somehow. As for the mirroring, I'm not sure how to help there, other than double-checking your Blender import settings.
Try exporting your figure, hair and clothes separately and joining them in Blender (I know it's a pain); the program may be getting confused by the multiple UVs.
Laurie
Sorry, I must have edited my post as you were typing a reply. I did figure it out; the export settings from Studio were just wrong. Blender actually does a decent job of bringing everything in correctly, with the clothing in the right positions and the diffuse maps applied, at least. After bringing in a figure using a DAZ importer, everything became so slow and confusing that I figured I'd just stick to posing in Studio (I like Power Pose anyway) and export the stuff to Blender as static files.
Great. Glad ya got it solved :)
Laurie
No, I have not found a solution for this. I thought about just finding where they convert to base res and removing it. But all these small issues just make me want to go with Alembic and not have to deal with any of this.
Wait pdr0, what?! If that works, then why is the problem of exporting not considered solved? Are you saying you can get a G8 into Maya, with the JCMs intact and driven by joint rotations? Then what is the big deal over the DEX Exporter?
To clarify, you're saying you do see the bad geometry? There can be other issues related to this too; it's just that the teeth are the easiest place to see it right away.
BTW - DEX is going to have that problem too. You'd need to use workarounds as well.
To properly solve that specific issue, there has to be a way to convey the information in the .duf about which edges to subdivide, and in the same manner as DS. Other programs have the same algorithm available (Pixar OpenSubdiv, Catmark), yet experience the same problem. So it's not only the algorithm, but something about the implementation of it.
Not everyone has Maya ...
And what part of the export isn't solved? Did you mean specifically JCMs? (Because, yes, there are other export issues in those types of workflows.)
JCMs were always possible in Maya - you'd just have to set them up manually, which was a big pain in the ___
Yes, JCMs work right now with the MEL script, but that wasn't available until maybe a year ago. Also, I don't know if every single one works 100% correctly. And there are custom characters with custom JCMs; I don't know if those are transferred properly. You have to do a lot of testing to verify - so maybe that's what DAZ are working on for DEX.
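For anyone following along: a JCM (joint-controlled morph) is conceptually a corrective morph whose strength is driven by a joint's rotation. A minimal sketch of that driving relationship (the linear ramp and the 90-degree limit below are hypothetical illustrations, not DAZ's actual ERC driver formula):

```python
# Hypothetical sketch of a JCM driver: the corrective morph's weight is a
# function of a joint's bend angle. Real DAZ drivers use ERC formulas;
# the linear ramp here is only an illustration of the concept.

def jcm_weight(bend_degrees, full_at=90.0):
    """Ramp morph weight linearly from 0 at rest to 1 at `full_at` degrees."""
    return max(0.0, min(1.0, bend_degrees / full_at))

print(jcm_weight(0.0))    # 0.0 - limb straight, corrective morph off
print(jcm_weight(45.0))   # 0.5 - halfway bent
print(jcm_weight(120.0))  # 1.0 - clamped at full strength
```

Reproducing this linkage per morph, per joint, is what made the manual Maya setup such a pain before the script existed.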
Kind of unrelated question to other current topics in this thread, but could someone tell me how you save material presets or characters to use in other Blender scenes or on objects in other scenes? I haven't seen any sort of library like Studio or Substance Painter have, so I'm not quite sure how to, for example, download a sample scene with a skin shader and then save that shader to use on a figure in one of my own scenes. Thanks in advance.
Save in a blend file and then append that file when you want to use the material, choosing only the material when you append. There won't be a material library until 2.81.
Laurie
There will be one in 2.81 though? That's good, I can wait. :) Thanks very much.
Yep (they're shooting for this November). You can download the alpha and try it out if you want to; it's portable, and you can run it independently of 2.80. DON'T use it on any files you're worried about getting corrupted, but for testing and playing around it's fine ;)
Laurie
Thanks, I'll probably just wait (lots to learn still), but I'm definitely starting to like using Blender more than before.
Couple more questions if you guys don't mind; I haven't been able to find any real guide on setting up textures made for DAZ products/Iray in Blender. I know how to make nodes for the Diffuse and Normal maps, but I'd appreciate knowing if I'm getting the specular and sss maps set up correctly.
- If there's only a Specular map included for a figure/prop, does that connect to the 'Specular' dot in the Principled BSDF shader and you just use the Roughness and Anisotropic dials to control it?
- I'm honestly not sure what to do with subsurface scattering maps for Genesis 8 figures. Connecting it to the Subsurface dot in the Principled BSDF shader does seem to mask off SSS to certain areas, but not always the ones I'd expect. I've seen SSS skin example videos for Cycles, but most of them don't use maps and I don't know exactly what the DAZ SSS maps are specifically designed to define (location? strength?)
- I've forgotten how to apply both a normal map and a bump map. Do you use an Add shader to connect them both to the one Normal input?
Sorry for the many questions. ;) Thanks again.
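On the specular-map question above: a common rule of thumb when only a specular/glossiness map is available is to treat roughness as inverted glossiness. This is an approximation, not how DAZ's Iray maps are officially defined:

```python
# Rule-of-thumb conversion when no roughness map exists: a glossy (bright)
# map value becomes a low roughness value. This is an approximation only.

def glossiness_to_roughness(gloss):
    """gloss and roughness are 0..1 map values."""
    return 1.0 - gloss

print(glossiness_to_roughness(0.8))  # ~0.2: a shiny pixel -> low roughness
print(glossiness_to_roughness(0.0))  # 1.0: a dull pixel -> fully rough
```

In Blender the same inversion can be done in the node editor with an Invert node (or a Math node subtracting the map value from 1) wired between the map and the Principled BSDF's Roughness input.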
*incorrect setup...correction to follow*
Laurie
Sorry, I should have said I forgot how to connect a bump and a normal at the same time.
I'm not sure about the SSS, but a normal map can be applied by adding a Bump node between the texture and the Principled node: connect the texture's color output to the Bump node, then connect the Bump node's normal output to the Normal input on the Principled node. You can find the Bump node under "Vector".
Laurie
Two bump nodes...one for the bump and one for the normal.
Laurie
mCasual's mcjTeleBlender 4 is in beta now for going from DS to Blender, and he has animation working as well. Can I post a DA link here, or a link to their scripts?
There is a dedicated thread on the forums here: https://www.daz3d.com/forums/discussion/2877/mcjteleblender-daz-studio-scenes-animations-w-blender-s-cycles-engine, that has links to each new (daily) release
Using Blender, is there a way to easily create DAZ "material zones" or "surface zones" (I forget the correct terminology)?
The Geometry Editor method within DAZ itself is kind of tedious.
Normally you only have to add some materials in the Material tab, then select your model, enter Edit Mode, select some faces, and assign a material to them by pressing the Assign button...
If you're new to blender (as I am), I strongly recommend a beginners course like https://www.udemy.com/blendertutorial/?couponCode=YOUTUBE_VIMEO
(Atm, the course is available for about 11 bucks)
You can also "link" assets from other files, which is useful if you'll continue to work on the linked asset and want the changes reflected in the other blend file as well. You can later "resolve" the linked asset and have it placed in the blend file, much like "append" does.
Yes, I'm seeing it. I thought it was because Blender and DAZ used different SubD algorithms, but while beating my head against it trying to perfect an Alembic exporter, I came (slowly) to understand that the base-res model in DAZ is not the same thing as a SubD model at level 0. DAZ uses the base-res model as the SubD cage, instead of a real SubD cage intended to be subdivided. The difference is that surfaces pass through the vertices of the base-res model, but when it is repurposed as a SubD cage, surfaces are merely influenced by the vertices. So convex surfaces shrink and concave ones swell, and this is most pronounced around areas of high curvature, i.e. teeth, lips, noses, ears, and fingers. I believe that was the impetus behind the HD Lip Contour morphs; it's extremely noticeable around the lips.
Unfortunately, the issue is not fixable, because DAZ uses the same mesh for the base res and for the SubD cage, and those two roles interpret the vertices quite differently.
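The "surfaces pass through the vertices" vs "surfaces are merely influenced by the vertices" difference can be shown numerically. A minimal 2D Python sketch, using Chaikin corner cutting as a simplified stand-in for full Catmull-Clark subdivision:

```python
# 2D stand-in for the base-res-as-SubD-cage problem: treat a square's
# corners as a subdivision cage and refine it by corner cutting. The
# refined curve pulls inside the cage - convex shapes shrink - which is
# the same effect seen around teeth, lips and fingers.
import math

def chaikin(points):
    """One corner-cutting step on a closed 2D polygon."""
    out = []
    n = len(points)
    for i in range(n):
        (x0, y0), (x1, y1) = points[i], points[(i + 1) % n]
        out.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        out.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return out

cage = [(1, 1), (-1, 1), (-1, -1), (1, -1)]   # square cage
refined = cage
for _ in range(3):
    refined = chaikin(refined)

# The cage corners sit at distance sqrt(2) ~ 1.414 from the center, but
# the refined surface never reaches them: the corners have been cut away.
print(max(math.hypot(x, y) for x, y in refined))  # strictly less than 1.414
```

If the cage had been built as a true SubD cage, its corners would have been pushed outward in advance so that the refined surface lands where the artist intended; using the base-res mesh directly skips that compensation.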
Indie pricing :)
My head is spinning. I just exported a character from DAZ as FBX. It imported into Maya perfectly, except for a glitch with the armature on one of the character's rings. But the rig was there. I assume that I can characterize right there in Maya, just as I do in MotionBuilder, retarget mocap data, clean it, and animate right there. JCMs are not necessary for that, so I don't need to care about them, because I can just export a BVH back to DAZ and have it apply the JCMs natively. And then finally export Alembic to Blender for sim, VFX, and rendering. All this time, all I had to do was export via FBX?! If this works, then that's my workflow, right there... thank you for making the lightbulb go on.
I need to make a correction to this - you use a Normal Map node, rather than a Bump node, between the normal map and the Principled shader. It's still under "Vector". Sorry about that ;) It was late when I posted the wrong info. LOL The correct setup for normal and bump together is in the image below.
Laurie
Some consider them very special.
If someone has finally managed to make something idiot proof, they could make millions!
All joking aside, it is interesting to read about your experiences.
Thanks, I forgot the Bump node had dots for both a height and a normal map.
Do you happen to know how to use DAZ specularity maps in Blender? I'm used to a PBR workflow where I would have Roughness and Glossiness maps, but I can't seem to find the right setup when all I have is a Specular map.