Looking for MikuMikuDance enthusiasts who like using DAZ as an MMD platform

Comments

  • kwannie Posts: 865

    Here is yet another link that I hope the smart people can make heads or tails of. pdr0, you asked me to report back on it, so this is where I found the info. Just browse down the neko-sentai page and a few of the MMD folks are pondering it. I tried researching, and the best I could figure is that it is a project dealing with pose estimation. They were referring to it as Autotrace on this page. Anyway, all of the files are there, but it will take someone with some savvy to decipher it. If something like this works, that would be amazing.

    Wendy, did you try pmx2fbx yet? I know you dabble with all kinds of stuff; I hope you join us.

    https://forum.neko-sentai.com/showthread.php?t=14031&page=94

    https://hub.docker.com/r/errnommd/autotracevmd
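    For anyone who wants to poke at that Docker image, here is a minimal sketch of what pulling and running it might look like (Python driving the Docker CLI). The image name comes from the Docker Hub link above; the local folder, the mount point and the input filename are placeholders I made up, and the container's real entrypoint arguments may differ, so check the project's own notes before relying on this.

```python
import subprocess
from pathlib import Path

IMAGE = "errnommd/autotracevmd"              # image name from the Docker Hub link above
work_dir = Path.home() / "mmd_autotrace"     # hypothetical local folder holding the source video

# Fetch the image once.
subprocess.run(["docker", "pull", IMAGE], check=True)

# Run the container with the folder mounted so it can read the video and write
# the traced VMD back out. The in-container path and the trailing argument are
# assumptions, not the project's documented usage.
subprocess.run([
    "docker", "run", "--rm",
    "-v", f"{work_dir}:/data",
    IMAGE,
    "/data/dance_clip.mp4",                  # hypothetical input clip
], check=True)
```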

  • WendyLuvsCatz Posts: 38,230

    I won’t, simply because the licensing is so unclear on much of this stuff.

    I can only use resources with permission in plain English.

  • pdr0 Posts: 204

    I don't think Shape Rigger Plus will help; you're not really adjusting the rigging or using different morphs. The goal is to use that VMD motion (or any motion source) on your target character - you don't want to have to change the character.

    I don't think any DS product will help with transferring motions. You can clean them up with GraphMate, or that 4.12 beta looks interesting too; but that only helps to fix problems afterwards.

    You still have to clean stuff up with MotionBuilder; it won't magically fix everything - especially on a source motion that has problems to begin with. But mocap cleanup is its job, so it has better tools for that sort of thing.

     

    Thanks for the links.

    Those pose estimation / autotrace projects look very promising... but they're tough to use unless you're very techy and understand Japanese.

    It would be nice if they were repackaged into some friendly GUI where you click a button, or some Blender addon.

    It looks like OpenPose has a Unity plugin in early alpha stage, but usable. I wonder how good it is? I can't find decent reviews with a quick search.

  • kwannie Posts: 865

    The niconico page in the link shows a video on how to set it up, and it doesn't look too involved. I'm not sure what Docker Hub is, but I always get edgy when something requires networked applications. I'm always paranoid about data mining. Too many years in the military, I guess. Please let me know if you figure something out.

  • Padone Posts: 3,700
    pdr0 said:

     

    This does not only apply to MMD VMD transfers - it's applicable to DS in general: all animation, all motions, all sources - Mixamo, Carnegie Mellon, Unity, Unreal, etc. To illustrate the issue better, here is an animated gif (motion not from an MMD transfer, but you can see this with a VMD transfer too, and in your BVH). Notice the hands/arms are in roughly the correct position, but slightly off. Also notice the ugly skin deformations. In DS, that's what the twist bones are for - weight mapping.

    It's a known issue that G3/G8 don't work well with mocap because of the twist bones; you have to fix them manually. G1/G2 work fine.

  • pdr0 Posts: 204
    Padone said:
    pdr0 said:

     

    This does not only apply to MMD VMD transfers - it's applicable to DS in general: all animation, all motions, all sources - Mixamo, Carnegie Mellon, Unity, Unreal, etc. To illustrate the issue better, here is an animated gif (motion not from an MMD transfer, but you can see this with a VMD transfer too, and in your BVH). Notice the hands/arms are in roughly the correct position, but slightly off. Also notice the ugly skin deformations. In DS, that's what the twist bones are for - weight mapping.

    It's a known issue that G3/G8 don't work well with mocap because of the twist bones; you have to fix them manually. G1/G2 work fine.


    G3/G8 can work fine with mocap and twist bones; it just needs to be retargeted correctly.

    That was about the weight mapping and deformations of G3/G8 in DS, and the additional "off slightly" - beyond what you would expect. The reason is that 3DX is pasting the X-rotation values into the shoulder bend, not the shoulder twist. It's not retargeted correctly. E.g. Mobu does it correctly if it is set up properly (look at the gif).

    That's the underlying reason why people unlock nodes before applying data - it's a workaround, but with issues. Without unlocking the hidden X-rotation channel in the shoulder bend, all the X-rotation values would be discarded. Then you would have arms flapping - massive issues. It's the same for the thigh twist and bend: "off" issues with foot sliding, floor penetration, floating. Pasting into the wrong channel gets you closer to the desired motion, but it results in those ugly mesh distortions from the weight mapping.
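    To make the bend-vs-twist point concrete: the usual way to split an incoming rotation correctly is a swing-twist decomposition about the bone's long axis, with the twist part going to the twist bone and the swing part to the bend bone. This is a generic sketch of that decomposition, not what 3DX or Mobu literally do internally, and the bone split it implies (lShldrBend/lShldrTwist) is just the Genesis naming.

```python
import numpy as np

def quat_conjugate(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def quat_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def swing_twist(q, bone_axis):
    """Split unit quaternion q = (w, x, y, z) into swing * twist about bone_axis.

    The twist part belongs on the twist bone; the swing part on the bend bone.
    Dumping everything into the bend bone's hidden channel is the workaround
    that keeps the limb roughly in place but distorts the weight-mapped mesh.
    """
    q = np.asarray(q, dtype=float)
    axis = np.asarray(bone_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    proj = np.dot(q[1:], axis) * axis            # rotation component along the bone axis
    twist = np.concatenate(([q[0]], proj))
    n = np.linalg.norm(twist)
    twist = twist / n if n > 1e-9 else np.array([1.0, 0.0, 0.0, 0.0])
    swing = quat_mul(q, quat_conjugate(twist))   # what's left, perpendicular to the axis
    return swing, twist
```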

    But VMD transfers are even more off besides those (beyond the regular mocap or skeletal-difference issues), because the VMD keys are typically not specified every frame (not baked to every frame). The interpolation curves are different between MMD and PMX2FBX. You can see that the same model with the same motion looks different in MMD than in the converted FBX - that's the other post with the foot through the floor.
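    To illustrate the interpolation point: as I understand the VMD format, each sparse key carries Bezier easing parameters (control points stored in a 0-127 range), so a converter that resamples those keys with a different curve will land the in-between frames in slightly different places - exactly the kind of drift you see on the feet. A rough Python sketch of a per-frame bake using that kind of easing (data layout simplified, not the actual PMX2FBX code):

```python
def bezier_y_for_x(x, x1, y1, x2, y2, iters=20):
    """Eased fraction for a linear fraction x in [0, 1]: cubic Bezier with
    control points (x1, y1), (x2, y2) given in 0..127 and endpoints fixed at
    (0, 0) and (127, 127) -- the MMD-style easing as I understand it."""
    x1, y1, x2, y2 = (c / 127.0 for c in (x1, y1, x2, y2))
    lo, hi = 0.0, 1.0
    for _ in range(iters):                       # invert x(s) by bisection
        s = (lo + hi) / 2.0
        bx = 3*(1-s)**2*s*x1 + 3*(1-s)*s**2*x2 + s**3
        lo, hi = (s, hi) if bx < x else (lo, s)
    s = (lo + hi) / 2.0
    return 3*(1-s)**2*s*y1 + 3*(1-s)*s**2*y2 + s**3

def bake_channel(keys):
    """keys: sorted list of (frame, value, (x1, y1, x2, y2)) sparse keys.
    Returns one value per frame -- what a fully baked channel would store."""
    baked = []
    for (f0, v0, _), (f1, v1, interp) in zip(keys, keys[1:]):
        for f in range(f0, f1):
            t = (f - f0) / (f1 - f0)
            baked.append(v0 + bezier_y_for_x(t, *interp) * (v1 - v0))
    baked.append(keys[-1][1])
    return baked

# Swapping bezier_y_for_x for plain linear interpolation here is enough to
# shift where a foot contact lands by a noticeable amount.
```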

  • kwannie Posts: 865

    model info

  • nicstt Posts: 11,715
    marble said:
    kwannie said:

     

    Also, I have used many tutorials from Dreamlight to get the rendering times down within DAZ, since most MMD animations are in the 8000-plus-frame range, but I would always love to hear more tips on being able to render MMD-style scenes with all of the complex assets in the scene simultaneously. I'm sure compositing is probably the way to do it, but I'd love to hear more ideas.

     

    Wow ... 8000+ frames! I doubt I have enough years left in me to wait for that render. This is exactly why I've been looking at Blender and Eevee but that's a whole new learning curve in itself. Judging by the variety of software being discussed in this thread, yet again I fear I am running out of years. So much to learn!

    LOL - I'll be but a distant grandparent to my non-existent grandkids.

  • WendyLuvsCatz Posts: 38,230
    edited August 2019
    nicstt said:
    marble said:
    kwannie said:

     

    Also, I have used many tutorials from Dreamlight to get the rendering times down within DAZ, since most MMD animations are in the 8000-plus-frame range, but I would always love to hear more tips on being able to render MMD-style scenes with all of the complex assets in the scene simultaneously. I'm sure compositing is probably the way to do it, but I'd love to hear more ideas.

     

    Wow ... 8000+ frames! I doubt I have enough years left in me to wait for that render. This is exactly why I've been looking at Blender and Eevee but that's a whole new learning curve in itself. Judging by the variety of software being discussed in this thread, yet again I fear I am running out of years. So much to learn!

    LOL - I'll be but a distant grandparent to my non-existent grandkids.

    my last UE4 render was likely more than 8K frames, and I have done numbers like that with Octane too, but it also depends on what's in it

    iClone does such numbers easily; those are all game-engine-quality renders, though, not PBR

    Iray is terribly slow for me, slower now that my 980ti is gone

    you would not honestly want to render a MikuMikuDance figure in Iray, though; they are very low poly

     

    UE4

    another huge cheat is using the binary to "play" it and ShadowPlay to capture it

    Post edited by WendyLuvsCatz on
  • wolf359 Posts: 3,828

    This is exactly why I've been looking at Blender and Eevee but that's a whole new learning curve in itself. Judging by the variety of software being discussed in this thread, yet again I fear I am running out of years. So much to learn!

    IMHO, the best way to learn is to start an actual project using one or two of the new programs/options you wish to incorporate into your creative pipeline.

    Chasing down every new shiny program or breast-jiggle option posted in the forums will only distract you from ever actually creating anything, as you will be spending all of your time in theoretical framework discussions instead of actually using any of the many non-Daz Studio options we have available.

    If you want to use Blender as an external rendering option, then send a DAZ scene over with whatever transfer method you prefer, and start some test still/animation renders to get familiar with what is actually involved, and go from there.

  • kwannie Posts: 865

    The big issue with trying to render in external platforms is that you lose some of the functionality that you had in DS, such as morphs or dynamics. I like the way DS handles skin and surface textures with Iray, and I also like the addons such as VWD and the breast and glute physics by masatok. I really like using aniBlocks for environment features like the ones from The Philosopher. So to me the challenge is to find tricks to work with Iray, and to find ways to get things out in a speedier fashion.

    From what I have seen here, the biggest time saver for large complex scenes comes in the form of composite rendering (sending rendered pics out in layers to composite software). I have seen a few touch on the topic, but it is really not exploited as a viable option, and there are so many tools available to achieve it - canvases, for example, right here in DS. Dreamlight has put together many tutorials and tools to aid in composite rendering, but it is really overlooked by many DAZ users. I have been able to render 1 or 2 characters with MMD animations applied, generally getting a 4 to 6 frames per minute rate at 1080 resolution. Rendering the stage (or background) separately, then combining it in layers with the characters in a composite platform like GIMP, saves a lot of time.

    Obviously, all of the high-resolution, high-content scene or environment products sold here were never intended to be practical for animations. The best thing to use for animations in most cases is vignettes, so I use the geometry editor extensively to delete the parts I don't need in my scenes. I also import many scenes from MMD, which are natively low poly, and add some of my own texturing to beef them up slightly, which usually works great if you don't have too many round surfaces to give away the blockiness. I have even been able to use AtmoCam and dust particles (from The Philosopher) with a spotlight to emulate haze in a club setting with a character render at 800 resolution and still get at least 4 to 6 frames a minute. You just gotta manage the geometry and textures. So I like DS as a platform and would love to hear more ideas about ways of doing compositing and importing fantastic assets from free resources like MMD. Just my two cents.
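    In case it helps anyone trying this, here is a minimal sketch of that last compositing step done in Python with Pillow instead of GIMP: a character pass rendered with a transparent background (e.g. from a DS canvas) laid over a stage render that only had to be rendered once. The file and folder names are placeholders, and both passes are assumed to be the same resolution.

```python
from pathlib import Path
from PIL import Image

background = Image.open("stage_render.png").convert("RGBA")   # stage/background, rendered once
char_dir = Path("character_frames")                           # frame_0001.png, frame_0002.png, ...
out_dir = Path("composited")
out_dir.mkdir(exist_ok=True)

for frame_path in sorted(char_dir.glob("frame_*.png")):
    fg = Image.open(frame_path).convert("RGBA")               # character pass with alpha
    combined = Image.alpha_composite(background, fg)          # stage behind, character in front
    combined.convert("RGB").save(out_dir / frame_path.name)
```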

  • kwannie Posts: 865

    Created separate OBJ files for the Fox Konko outfit; trying to figure out how to bring them into DS and fit them to G3 and G8. Textured OBJ files attached.

  • kwannie Posts: 865

    Sorry, I attached the wrong file before.

    Created separate OBJ files for the Fox Konko outfit; trying to figure out how to bring them into DS and fit them to G3 and G8. Textured OBJ files attached.

    Konko Outfit Seperate OBJ.zip (853K)
  • 3dOutlaw Posts: 2,471

    OBJ import works, but I tried it on the shirt and the normals are reversed. I switched to wireframe mode so you can see the mat zones and the tris. I don't think any of that is a problem. You may need to bring it through Blender to reverse the normals; otherwise, it should work fine. Just size and scale, rig and drape. :) Easy as that!

     

    shirt.jpg (390 x 491 - 73K)
  • kwannie Posts: 865

    Outlaw,

    So, are the normals reversed just because of the way the creator packaged it as a PMX file, or is it something that most likely happened during my export process? I did use the DS geometry editor to remove parts and export as OBJ. Also, when you say size, scale and rig, does rig mean send it through the Transfer Utility? Lastly, how do I prep these objects to be used with dForce? Whenever I try to run a sim, DS tells me I need to add a dForce modifier.

  • 3dOutlaw Posts: 2,471

    The normals may have been reversed during export; I am not sure. In Blender you can show normals, and they should be facing out. I literally just ran into this - you can see it in the Blender thread. Yes, rig means the Transfer Utility. Adding a dForce modifier is a process; see https://thinkdrawart.com/daz-studio-dynamic-cloth-dforce-tutorial-for-beginners - it is older, but it shows the concepts. Steps 2-5 show adding a modifier.
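    If it helps, the Blender fix can be scripted too. A small sketch, run from Blender's Python console with the imported shirt selected and active (recalculating outside is usually the safer choice when only some faces are flipped):

```python
import bpy

# Assumes the imported shirt is the selected, active mesh object.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)   # "Recalculate Outside"
# bpy.ops.mesh.flip_normals()                        # alternative if every face is inverted
bpy.ops.object.mode_set(mode='OBJECT')
```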

     

  • outrider42 Posts: 3,679
    marble said:
    kwannie said:

    marble, I have been able to get decent-quality animations by tweaking Iray parameters, and most of the time I can get 5 to 6 frames a minute. Yep, about 10 seconds per frame, and the trick is the complexity of the scene. All of the pics I included were frames from animations that I did with only a few seconds spent on each rendered frame. I always use image series renders so I can save and come back to continue rendering. I would love, however, to find more ways of adding more complexity (more elaborate stages, lighting, more background characters) and getting rendering times down. Using older generations and doing toon styling cuts down on the memory-hog textures of G3 and G8. But again, always looking for new ideas.

    Again, wow! 5-6 frames a minute - the best I have been able to achieve for a scene with 2 Genesis figures (3 and 8) and very little by way of furniture or props is about 2.5 minutes per frame. I could use noise reduction but that produces quite poor results to my eye. However, we are probably talking apples and melons here. I tend to produce scenes with the character much closer to the camera than the figures in the MMD dance animations and the closer, the slower the render. 

    Don't know what system you have, but the trick to rendering animation with Iray faster is to always use the Iray Viewport. Otherwise the scene has to reload all the information between every single frame. Using the Iray Viewport skips this step, saving a ton of time. Then you cap each frame in some way, either by limiting the render time or limiting the iteration count. You can also use the denoiser, but this is optional since some people don't like it. The denoiser can also look better or worse depending on the scene setup. You can experiment with different combos until you find one that renders quickly while still retaining just enough detail. Remember, this is going to be in motion; you can get away with a lot of imperfections in individual frames. You can also use some optimization tricks to make far away objects that are not in clear view simpler, so that they render faster. You can use camera tricks like depth of field to cover up the background not being fully detailed.
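    To put rough numbers on why that per-frame cap matters so much for an 8000-frame clip (just arithmetic; the caps here are examples):

```python
frames = 8000
for seconds_per_frame in (10, 24, 60, 150):   # 150 s/frame is roughly marble's 2.5 min
    hours = frames * seconds_per_frame / 3600
    print(f"{seconds_per_frame:>4} s/frame -> {hours:6.1f} hours total")
# 10 s/frame is about 22 hours of rendering; 2.5 min/frame is about 333 hours.
```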

    But yeah, both Unreal and Unity offer ray tracing in real time, so they can accomplish quite a bit these days.

  • FSMCDesigns Posts: 12,755
     

    But yeah, both Unreal and Unity offer ray tracing in real time, so they can accomplish quite a bit these days.

    Agreed. After playing with DAZ figures and animations in Unity, I don't get why users still try to use DS and Iray.

    Real-time animations in Unity with GF2 - easy peasy, especially cool in VR.

  • kwannie Posts: 865

    FSM, you know what... you win, lol. I really, really need to look into Unity; as a matter of fact, I have had my interest tweaked by DVVR, because they are importing MMD assets in also. So, my goal was always to find a way to make MMD-type videos with realistic-looking characters and physically based rendering, which is why I use DS as the platform. However, atmosphere and lighting effects, physics animation tools, and IK that is comprehensible - all are very attractive in Unity. My biggest concern is conversion of my DS assets: clothes, morphs, scripts. But who says I can't use both Unity and DS? As a matter of fact, I have so many different platforms already - DAZ, Poser, Blender, Carrara, iClone, IKinema - and still have not been able to put together a full flashy video like one free program like MikuMikuDance can produce in a matter of minutes. I mean, the characters are redundant and toony, but the stages, the clothes, the effects - no contest. I have attached the DVVR links for you to enjoy.

    https://www.youtube.com/watch?v=V1NbW-_V3vE

    https://www.youtube.com/watch?v=so_hrdVIo2I

  • kwannie Posts: 865

    outrider,

    I have tried opening the Iray viewport but never got a significant improvement; I'm sure I did it wrong. I never use more than 100 iterations. The stats I was quoting above were from rendering only the characters, and I can usually get up to 3 or 4 characters and keep the render time very low. My biggest hang-up with DS, besides the render times, is getting lighting effects, because I try to do a lot of concert, theater or jazz club themes. It would be great to get light effects cutting through the haze - and not with just a fog plane, but with real atmosphere effects where the haze has movement. I have forked out a ton of money here at DAZ to achieve minimal effects, but it is simply not getting anywhere anytime fast... As many have communicated here before, I like DAZ, but it can be so frustrating to spend so much money and then have to jump through hoops just to get instruction on how the features work... Can anybody explain the new IK system and what its full capabilities are - and if you can, where the heck did you find the info? OK, OK, OK... Hakuna Matata!!

  • Ruphuss Posts: 2,631
    edited September 2019

    I have managed to make some small games with Unity, but

    how do you get a movie out of a game engine?

    Please, FSMC, tell me.

    Post edited by Ruphuss on
  • wolf359 Posts: 3,828

    @Ruphuss there may be some useful info here:
    http://standsurestudio.com/gamedev_dazanim2unity/
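    Whichever route you take, if what you end up with is a numbered image sequence rather than a video file, ffmpeg will stitch it into a movie. A sketch via Python; the frame pattern, frame rate and output name are placeholders:

```python
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "30",                 # match the frame rate you rendered at
    "-i", "frames/frame_%04d.png",      # hypothetical numbered image sequence
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",              # keeps the file playable in most players
    "out.mp4",
], check=True)
```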

     

  • marble Posts: 7,500
    marble said:
    kwannie said:

    marble, I have been able to get decent-quality animations by tweaking Iray parameters, and most of the time I can get 5 to 6 frames a minute. Yep, about 10 seconds per frame, and the trick is the complexity of the scene. All of the pics I included were frames from animations that I did with only a few seconds spent on each rendered frame. I always use image series renders so I can save and come back to continue rendering. I would love, however, to find more ways of adding more complexity (more elaborate stages, lighting, more background characters) and getting rendering times down. Using older generations and doing toon styling cuts down on the memory-hog textures of G3 and G8. But again, always looking for new ideas.

    Again, wow! 5-6 frames a minute - the best I have been able to achieve for a scene with 2 Genesis figures (3 and 8) and very little by way of furniture or props is about 2.5 minutes per frame. I could use noise reduction but that produces quite poor results to my eye. However, we are probably talking apples and melons here. I tend to produce scenes with the character much closer to the camera than the figures in the MMD dance animations and the closer, the slower the render. 

    Don't know what system you have, but the trick to rendering animation with Iray faster is to always use the Iray Viewport. Otherwise the scene has to reload all the information between every single frame. Using the Iray Viewport skips this step, saving a ton of time. Then you cap each frame in some way, either by limiting the render time or limiting the iteration count. You can also use the denoiser, but this is optional since some people don't like it. The denoiser can also look better or worse depending on the scene setup. You can experiment with different combos until you find one that renders quickly while still retaining just enough detail. Remember, this is going to be in motion; you can get away with a lot of imperfections in individual frames. You can also use some optimization tricks to make far away objects that are not in clear view simpler, so that they render faster. You can use camera tricks like depth of field to cover up the background not being fully detailed.

    But yeah, both Unreal and Unity offer ray tracing in real time, so they can accomplish quite a bit these days.

    I knew about the viewport trick, but I am pretty sure that only makes a difference for the first frame. With an animation using the same characters in the same scene, the load time is only a penalty once. I've tried tests which seem to confirm this.

  • marble Posts: 7,500
     

    But yeah, both Unreal and Unity offer ray tracing in real time, so they can accomplish quite a bit these days.

    Agreed. After playing with DAZ figures and animations in Unity, I don't get why users still try to use DS and Iray.

    Real-time animations in Unity with GF2 - easy peasy, especially cool in VR.

    Yes, that is very impressive if rendered in real time. I guess it may be time for me to look again at these game engines, but it did look like quite a learning curve last time I looked. I also got the impression that the characters had to be converted to low-poly versions and that the animations were limited to a set of stock motions. But I really have not looked hard down that avenue, so I could be out of date or just plain wrong. I have to say that the figure in your video example looks quite high-res to me, so that is even more impressive for real time.

  • wolf359 Posts: 3,828

    As a matter of fact, I have so many different platforms already - DAZ, Poser, Blender, Carrara, iClone, IKinema - and still have not been able to put together a full flashy video like one free program like MikuMikuDance can produce in a matter of minutes. I mean, the characters are redundant and toony, but the stages, the clothes, the effects - no contest.

    Consider that the MikuMikuDance program is purpose-built for creating flashy dance videos... and not much else. Blender, Carrara, iClone, etc. are designed for a wider array of more general character animation purposes.

    However, I tend to agree with FSM. Rendering animation in a brute-force path tracer that is more suited to architectural and product visualization stills is not the best course, IMHO.

    Reallusion introduced Iray to iClone Pro about a year ago, and I see no one in the iClone community using it for animation renders.

    They have since introduced a real-time Live Link to Unreal 4.

  • kwannie Posts: 865

    wolf,

    You are another one of those who brings fact-based comments to every conversation; I gotta say I love it when all of you who do have experience chime in on a topic. I know you are a true animator, wolf, based on many of your posts. I am not an animator at all, but all I create with DAZ is animations. I absolutely love the fact that I can use tools such as FaceGen, Altern8, Sloshwerks, VWD, dForce, strand-based hair... all in one package. But... the animation tools... oh where, oh where are you!!! I have looked at iClone's Curve Editor and it looks so intuitive and capable for timeline control; with iClone's ability to bring in G3 and G8, it is really a contender. How do these other platforms compare with DS's capability with cloth simulation and, more importantly, cloth interaction - for example, can the character grab the edge of a tablecloth and pull it with their hand? This can be done in DS; can it be done in Unity, iClone, Blender or another platform with almost any cloth product in your database? Can shaders be applied on the fly, or age changed from one frame to the next?... I mean, this industry has got us just where they want us... lol.

  • Artini Posts: 9,471
    edited September 2019
     

    But yeah, both Unreal and Unity offer ray tracing in real time, so they can accomplish quite a bit these days.

    Agreed. After playing with DAZ figures and animations in Unity, I don't get why users still try to use DS and Iray.

    Real-time animations in Unity with GF2 - easy peasy, especially cool in VR.

    Pretty impressive.

    What hair did you use, and how did you get it to fly like that?

     

    Post edited by Artini on
  • Artini Posts: 9,471
    kwannie said:

    wolf,

    You are another one of those who brings fact-based comments to every conversation; I gotta say I love it when all of you who do have experience chime in on a topic. I know you are a true animator, wolf, based on many of your posts. I am not an animator at all, but all I create with DAZ is animations. I absolutely love the fact that I can use tools such as FaceGen, Altern8, Sloshwerks, VWD, dForce, strand-based hair... all in one package. But... the animation tools... oh where, oh where are you!!! I have looked at iClone's Curve Editor and it looks so intuitive and capable for timeline control; with iClone's ability to bring in G3 and G8, it is really a contender. How do these other platforms compare with DS's capability with cloth simulation and, more importantly, cloth interaction - for example, can the character grab the edge of a tablecloth and pull it with their hand? This can be done in DS; can it be done in Unity, iClone, Blender or another platform with almost any cloth product in your database? Can shaders be applied on the fly, or age changed from one frame to the next?... I mean, this industry has got us just where they want us... lol.

    If you wish to learn more about Unity, there is a nice bundle of books and videos right now at HumbleBundle.com

     

  • kwannie Posts: 865

    Looks like a pay site, Artini. I could probably learn enough about Unity on YouTube - thanks for the info though!

  • Artini Posts: 9,471
    kwannie said:

    Looks like a pay site, Artini. I could probably learn enough about Unity on YouTube - thanks for the info though!

    Definitely. There are a lot of free resources available on YouTube and on the Unity website, as well.

     
