The "Animators Assemble!" thread for Daz animation WIPs, clips, and tips


Comments

  • Stonemason Posts: 1,179

    Hey guys.. not sure if I ever shared this project from a couple years back. Everything was rendered in Daz Studio using Iray; not much animation involved other than moving cameras around, but it was fun to work on. Wish I had more time to work on these kinds of projects.

     

  • mindsong Posts: 1,701

    The old Mimic Pro still seems to be easier and more accurate than a lot of newer solutions. Anilip, for instance, doesn't look good, even in their demos. Mimic Pro still works in Windows 10 on my new machine, so I'm glad about that, but I wish they would update it for the newer figures. This is a quick test I rendered right in Mimic, so no other animation was added, but I think the lip sync is pretty good, considering this software is from 2008!

    I'm another mimic-pro hold-over. I use it with Genesis1 and while the tool is rough-edged, it just works the way I think, so it's hard to let it go. (I also still like Carrara for the same reason.)

    I also have IC7, but the time and learning curve to enter/master that world isn't practical right now as my projects go. Same for Blender animation - both are clearly more capable than I am at this point. I gotta say that the 4.12 animation updates indicate a good development direction, but the kinks are thick and unforgiving at this point, and I am rapidly losing faith in their development model. I see the Iclone, Blender, and Unreal development and commitment and am really questioning the depth of my DS investment, but I'm also wary of the greener-grass effect of any platform you get to know 'too well'. Fortunately I can poke at all of 'em when I need a particular function.

    Of note, for more cartoony work, mjcasual has a really simple but powerful utility called animotion that simply maps audio volume to arbitrary (user-set) pz2-based controls (e.g. mouth morphs, eyebrows, dog tails, VU-meter needles - anything on a slider). It's a small EXE that lets you map sound volume to DS/Poser controllers, and it's surprising how well vocal volume mapped to simple mouth motion works. Certainly not an end-all, but a good little hack to add to the toolbox for any number of clever uses. I'd be curious how well a frequency-domain equivalent would work - such that the rise and fall of the voice/sound frequency could also be mapped to such actions. I should ping him on that.
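    For anyone curious what that volume-to-slider mapping looks like in practice, here is a minimal Python sketch of the idea (my own illustration, not mjcasual's actual tool): take per-frame RMS loudness of the audio and clamp it into a 0..1 morph-slider range.

```python
import math

def volume_to_slider(samples, rate=44100, fps=30, gain=4.0):
    """Map per-frame RMS loudness of a mono signal (floats in -1..1)
    to 0..1 slider values, one per animation frame."""
    hop = rate // fps  # audio samples per animation frame
    values = []
    for i in range(0, len(samples) - hop + 1, hop):
        chunk = samples[i:i + hop]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        values.append(min(1.0, rms * gain))  # clamp to the slider's range
    return values

# One second of a 1 kHz test tone: a steady "mouth open" value every frame
tone = [0.25 * math.sin(2 * math.pi * 1000 * t / 44100) for t in range(44100)]
frames = volume_to_slider(tone)
print(len(frames))  # 30 values, one per frame at 30 fps
```

    A frequency-domain variant would swap the RMS step for a dominant-pitch estimate (e.g. an FFT per chunk) and drive the control from pitch rather than volume.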

    Has anyone been playing with Brekel's new release candidates? Each past version has an Achilles heel to my workflow, but the specs and modular nature of the latest version look promising for markerless work.

    Iclone's new integrations look $eriou$ too - Bryan, have you poked in that direction at all?

    Good stuff in here, btw - I lurk, but very-much appreciate the wit and wisdom that congregates here.

    (and Ivy, the banana suit clip was a hoot. :)

    best,

    --ms

     

  • mindsong Posts: 1,701

    Stonemason said:

    Hey guys.. not sure if I ever shared this project from a couple years back. Everything was rendered in Daz Studio using Iray; not much animation involved other than moving cameras around, but it was fun to work on. Wish I had more time to work on these kinds of projects.

     

    While your modeling and world-building talents are unrivaled, your visual sensibilities are apparently equally refined.

    clean. crisp. elegant.

    --ms

  • Stonemason, that is jaw-droppingly beautiful.. thanks for sharing! Sometimes simplicity is best, and this is a fantastic example.

     

    wolf359 Posts: 3,827
    This may have potential for Blender.
  • That might be very interesting, Wolf.

    On another note, on a lark, I wanted to see how the same animation worked in Studio with one of 3d Universe's new Toon Generations 4 characters - I think the animation actually looks better on this character!

    https://vimeo.com/418108551

  • laststand@runbox.com Posts: 866
    edited May 2020
    wolf359 said:

    Iclone has an optional cam based Facial motion capture plugin requiring an $$$ IphoneX $$$

    But I won't buy back into the Apple ecosystem just to use one program feature.

    I like Apple too (tic), but the things I have seen done with that 3D camera they have are irresistible. I found I could connect an iPhone to my local LAN without having phone service or anything to do with Apple. And I found the iPhone X to be nearly affordable, comparable to a digital camera. I'm considering it.

     

    Post edited by [email protected] on
  • Hi Mindsong, sorry, I missed your post

    I'm really not looking at iclone at all; Motionbuilder is way more powerful in regards to animation and realtime, and with some of the plugins from Neill3d, it has some realtime rendering capabilities (suitable for things like demos, and since I do some freelance work for the industry, it's a standard). For other things, I've gotten somewhat good at UE4, which as time passes gains more and more features (and virtually everyone is going that route).

     All of the mocap systems I own have plugins for both motionbuilder and UE so at this point I don't see the need to plunk down around 2k for the same featureset.  If I didn't have motionbuilder,  I would probably look at iclone (it is around 2k cheaper..)

    Cinus Posts: 118
    edited May 2020

    You guys have probably seen this, but just in case you have not:

    https://www.youtube.com/watch?v=CXl8hz1q93U&list=LLjPmAeuxi9X_nwOF9_fObrA&index=2&t=0s

    and

    https://www.youtube.com/watch?v=iMVj44w4EUk

    I am not associated with the product in any way, just stumbled on it a couple of days ago and found it interesting.

    Post edited by Cinus on
    wolf359 Posts: 3,827
    edited May 2020
    When the government starts handing out free IphoneX's, I will become interested in those facial capture products.
    Post edited by wolf359 on
    nonesuch00 Posts: 18,107
    wolf359 said:
    When the government starts handing out free IphoneX's, I will become interested in those facial capture products.

    They don't even hand out those huge tubs of peanut butter or blocks of cheddar cheese anymore.

    Auroratrek Posts: 218
    edited May 2020
    wolf359 said:
    Hey there Tim!!

    Welcome back to your own thread :-)

    LOL--thanks, Wolf! Yeah, sorry, it's been a while. Crazy busy with work, but I had a little time for my favorite pastime: trying to update my workflow. I see you've really made the leap to an almost entirely new workflow--very brave! Very cool render for your lip sync test. You inspired me to take a stab at it, and it seems to work fairly well, if a little slow for me--for instance, when you delete a phoneme key, it takes about 10 seconds to go away--I assume it's because it's updating the realtime animation. A little frustrating, but I think I got a decent result, seeing as it was my first try (please excuse the male voice, I just grabbed something quick):

    Post edited by Auroratrek on
    Auroratrek Posts: 218

    And this is the final render...  https://vimeo.com/417876497  I brought it into Marmoset Toolbag (1 hr render vs. the 12 hrs my previous test took in Studio); I will render it out in Studio for comparison.

    Quick compositing in AE to add rain, smoke, and a slight blur on the character to knock down the sharpness. As far as I'm going to go with this one, but I think I will use it for the tutorial I'm planning (yes, still going to do that).

    Thanks for sharing that, Tim! Ahh, the good old days..

    Very cool clip, Bryan! Dynamixyz looks really interesting. I had always kind of shied away from the idea of facial capture, partly because I usually edit the heck out of my audio so I'm not sure a live face capture would help me, but it weirdly never occurred to me that you could use face capture on recorded video, so that might allow for a bit of flexibility.

    Auroratrek Posts: 218
    edited May 2020

    Wolf, I had a question regarding importing Genesis 8 into iClone. Is it supposed to retain the weight mapping on the figure, like on the elbows? I imported G8 as a test and it didn't seem to do that. Not sure if I'm doing something wrong. The elbows on this figure are kind of kinked and rounded.

    (attachment: g8_elbow.png, 463 x 580, 175K)
    Post edited by Auroratrek on
    Auroratrek Posts: 218

    And since we're supposed to post WIPs here, this is an excerpt from the prologue of my latest project "Quest of the Key" that I'm working on. The music is not final. V4/M4 characters animated in Daz and Poser, rendered in Cinema 4D. 

     

    Auroratrek Posts: 218

    tikiman-3d said:
    This is part one of my Star Trek Fanimated Film "Blood of Tiberius". I've been working on it for about two years now. Part two is coming along... slowly.
    https://www.youtube.com/watch?v=_a1OnpNGt40

    This was animated entirely in DAZ 3d and rendered with Octane. Post work in After Effects then edited with Premiere. Audio Post was done with Protools. 
     

    (Sorry, still catching up on this thread.) This looks great, Tikiman! And, of course, I'm geeking over your story being based on TOS! I'd never considered rendering in Daz, and didn't know there was an Octane plugin for it. Thanks for sharing--I think! Now I have another possible workflow to explore. Certainly Daz 4.12 has much better animation tools than previous versions, which I was never able to work with. Your lip sync is quite good--that was done with the native  Daz Lip Sync? Did you have to tweak it at the keyframe level?

    wolf359 Posts: 3,827

    Auroratrek said:

    Wolf, I had a question regarding importing Genesis 8 into iClone. Is it supposed to retain the weight mapping on the figure, like on the elbows? I imported G8 as a test and it didn't seem to do that. Not sure if I'm doing something wrong. The elbows on this figure are kind of kinked and rounded.

    Thanks Tim

    When doing audio-based lip sync in Iclone, remember to activate your viseme track in the timeline to expose the lip-smoothing options that were added in version 6.5.

    On the G8 question I cannot help you, I am afraid.

    For me, Genesis 8 was dead on arrival, as they offered nothing new but more headaches, with the twist bones behaving badly with most retargeting systems.

    In fact I have used GenX2 to move ALL of my legacy and custom shapes backwards to the venerable Genesis 1, which I support with my own custom clothing & morphs, along with FaceGen Artist Pro for random variety.

    Loving the new pipeline!!.. it will be mostly FBX from Iclone to Blender, with some occasional Alembic after I get the Iclone cloth physics sorted to where I can make my own dynamic garments for Iclone.

    Maxon Cinema4D is dead!!..Long live Blender. Final cut pro is dead!!..Long live Davinci Resolve
    wolf359 Posts: 3,827
    @Bryan I could not resist taking a shot at the "Roy Batty" death scene.

    I sort of screwed things up by recording my own version of the speech to get a clean voice track without the music in the original movie clip, so the lip sync is off in some parts.

    But I think I captured that wet, rainy, late-night atmosphere of the movie scene, with EEVEE and some lens effects comped in with the free Fusion app that is part of the FREE Davinci Resolve suite.
  • @Auroratrek  

    That clip is amazing, Tim; I see nothing I can nitpick at all. I especially like how you paid attention to the secondary characters in the background. Are you still using the Optitrack system for body mocap? You mentioned Poser.. I used to do everything there, until they broke the FBX importing (I'm assuming you are using BVH for animation?). Your flow and shots are excellent.

    That is one of the things I like about Dxyz: you are not tied to any particular hardware and you can use any video from any source. It does help to have an HMC when using body mocap, but you can do it separately. Another thing: you can capture in sync with the Optitrack system (you are still using Arena, no? I might have to test that; I know Motive works well with it) - as you capture in Optitrack, it captures in Grabber (Dxyz's video capture utility). However, if I remember correctly, Arena would let you plug in a USB camera and capture video from it as a reference camera (tested it several times), which you could in theory use with Dxyz. Now, if you want to test it (and this goes for everyone reading this), you can get a free 30-day trial. And I shared some files earlier that can help with that if you're using Gen8 characters (I could do the same for other Genesis products).

    @Tikiman  Same goes for you; that clip is very, very good.

    One of the things that was gratifying about both clips was this: you paid a lot of attention to the speed at which people move. Most of what I see looks like it was filmed under water or in a dream, both for body and facial movement. I know mocap helps a lot with this; you realize after a while how people move (Tikiman, are you using a mocap system? If not, your keyframe skills are pretty good).

    @Wolf.  Nice! You especially nailed the wet look, which is something I was having some trouble with. I've been using Resolve for some things too, getting the hang of it, but for effects I won't replace my AE CS6; there's simply too much I can do with it (but Resolve is FREEE!, so I have it on the laptop that I carry everywhere).

    One of the reasons I moved away from phoneme-based solutions is that I was never really satisfied with how they looked. I could never completely remove that tendency of snapping between poses, especially when speaking fast, without it becoming, for lack of a better word, "mushy".

    Now, I will be the first to admit, facial mocap is not easy; it is a TON of work, especially the more realistic you want to get.

    wolf359 Posts: 3,827
    After Effects CS (with Video Copilot's Optical Flares) was used heavily for ALL of the VFX in "Galactus Rising"

    But it, like C4D and the rest of Adobe CS, is installed on my OLD Mac computer, which is literally in "sudden death overtime"

    That old Mac rendered uncompressed Targas 24/7 for nearly six years straight from C4D to make my 92-minute animated feature.

    As a sci-fi nerd, there were two things I had been loath to give up with my AE CS:

    Holographic effects and anamorphic lens flares.

    I just discovered that I can do BOTH in the FREE version of Davinci Resolve, with a node-based workflow, when comping the "Roy Batty" animation.

    When I get my new AMD PC in two weeks, I am not even installing any of the "kiddie pool" software on my new machine (Daz Studio, Iclone etc). I will leave them on my older PC and transfer character assets, as needed, across the network.

    The new PC is going to have only pro production & finishing software, and Windows itself will be the only "paid" software:

    Blender 2.8x (free alternative to Maya, Max, C4D, Modo)

    Davinci Resolve (free alternative to Premiere & Final Cut)

    Davinci Fusion (free alternative to Adobe After Effects)

    Natron compositor (free alternative to Nuke)

    Krita (free alternative to Photoshop)
    Auroratrek Posts: 218
    wolf359 said:
    When doing audio-based lip sync in Iclone, remember to activate your viseme track in the timeline to expose the lip-smoothing options that were added in version 6.5.

    In fact I have used GenX2 to move ALL of my legacy and custom shapes backwards to the venerable Genesis 1, which I support with my own custom clothing & morphs, along with FaceGen Artist Pro for random variety.

    Thanks, Wolf--I added a little smoothing, but could probably use more. Still, it was nice to find a lip sync utility I could work with.

    Does Gen1 keep the weight maps, or does that just naturally get lost in translation?

    Auroratrek Posts: 218
    wolf359 said:
    @Bryan I could not resist taking a shot at the "Roy Batty" death scene.
     

    Fun stuff! Maybe we should make this a "challenge" and everybody here takes a shot at it!  ;-)

    Auroratrek Posts: 218

    @Auroratrek  

    That clip is amazing, Tim; I see nothing I can nitpick at all. I especially like how you paid attention to the secondary characters in the background. Are you still using the Optitrack system for body mocap? You mentioned Poser.. I used to do everything there, until they broke the FBX importing (I'm assuming you are using BVH for animation?). Your flow and shots are excellent.

    That is one of the things I like about Dxyz: you are not tied to any particular hardware and you can use any video from any source. It does help to have an HMC when using body mocap, but you can do it separately. Another thing: you can capture in sync with the Optitrack system (you are still using Arena, no? I might have to test that; I know Motive works well with it) - as you capture in Optitrack, it captures in Grabber (Dxyz's video capture utility). However, if I remember correctly, Arena would let you plug in a USB camera and capture video from it as a reference camera (tested it several times), which you could in theory use with Dxyz. Now, if you want to test it (and this goes for everyone reading this), you can get a free 30-day trial. And I shared some files earlier that can help with that if you're using Gen8 characters (I could do the same for other Genesis products).

    @Tikiman  Same goes for you; that clip is very, very good.

    One of the things that was gratifying about both clips was this: you paid a lot of attention to the speed at which people move. Most of what I see looks like it was filmed under water or in a dream, both for body and facial movement. I know mocap helps a lot with this; you realize after a while how people move (Tikiman, are you using a mocap system? If not, your keyframe skills are pretty good).

    @Wolf.  Nice! You especially nailed the wet look, which is something I was having some trouble with. I've been using Resolve for some things too, getting the hang of it, but for effects I won't replace my AE CS6; there's simply too much I can do with it (but Resolve is FREEE!, so I have it on the laptop that I carry everywhere).

    One of the reasons I moved away from phoneme-based solutions is that I was never really satisfied with how they looked. I could never completely remove that tendency of snapping between poses, especially when speaking fast, without it becoming, for lack of a better word, "mushy".

    Now, I will be the first to admit, facial mocap is not easy; it is a TON of work, especially the more realistic you want to get.

    Thanks, Bryan! Yes, still using the Optitrack, outputting BVH. It's surprisingly forgiving in some ways. I moved to a new house with a much smaller, cramped basement, but I am still able to get decent capture.

    I only used Poser in this case because the dress for the queen character only worked with Poser's cloth room. I used the PoserFusion plugin to send her to Cinema 4D. For the robot character I used 3DXchange and FBX. The other characters were done in Daz and exported as OBJ and MDD files to C4D via Riptide. Both PoserFusion and Riptide are zombie apps at this point, so that's why I'm interested in other workflows. Wolf has moved beyond this problem, but I'm hoping to stay with C4D if I can. I looked into Blender and was a bit intimidated, but maybe I should suck it up and take another look.

    I am using Motive, tho an older version. I may have to check out the free version of Dxyz when I know I have a block of time to give it a good test. Yes, it's hard to avoid that "snapping" with phoneme editing. The trick I found is to get rid of as many phonemes as you can get away with. Yes, mocap is such a godsend. I hand-keyed the first half of my first Star Trek movie, and tho I got better as I went along, mocap looks a million times better.

    Auroratrek Posts: 218
    wolf359 said:
    After Effects CS (with Video Copilot's Optical Flares) was used heavily for ALL of the VFX in "Galactus Rising"

    But it, like C4D and the rest of Adobe CS, is installed on my OLD Mac computer, which is literally in "sudden death overtime"

    That old Mac rendered uncompressed Targas 24/7 for nearly six years straight from C4D to make my 92-minute animated feature.

    As a sci-fi nerd, there were two things I had been loath to give up with my AE CS:

    Holographic effects and anamorphic lens flares.

    I just discovered that I can do BOTH in the FREE version of Davinci Resolve, with a node-based workflow, when comping the "Roy Batty" animation.

    When I get my new AMD PC in two weeks, I am not even installing any of the "kiddie pool" software on my new machine (Daz Studio, Iclone etc). I will leave them on my older PC and transfer character assets, as needed, across the network.

    The new PC is going to have only pro production & finishing software, and Windows itself will be the only "paid" software:

    Blender 2.8x (free alternative to Maya, Max, C4D, Modo)

    Davinci Resolve (free alternative to Premiere & Final Cut)

    Davinci Fusion (free alternative to Adobe After Effects)

    Natron compositor (free alternative to Nuke)

    Krita (free alternative to Photoshop)

    Intriguing! Taking notes....

    wolf359 Posts: 3,827
    wolf359 said:
    When doing audio-based lip sync in Iclone, remember to activate your viseme track in the timeline to expose the lip-smoothing options that were added in version 6.5.

    In fact I have used GenX2 to move ALL of my legacy and custom shapes backwards to the venerable Genesis 1, which I support with my own custom clothing & morphs, along with FaceGen Artist Pro for random variety.

    Thanks, Wolf--I added a little smoothing, but could probably use more. Still, it was nice to find a lip sync utility I could work with.

    Does Gen1 keep the weight maps, or does that just naturally get lost in translation?

    Hi Tim,

    The default Iclone audio-based lip sync is better than any of the available Daz options IMHO, and you can layer expressions on with Face Puppet in realtime, as I did with the "Roy Batty" clip.

    On the matter of joints:

    If you do an "old style" FBX import (as I am doing) via 3DXchange, after loading the .duf face key from the Genesis extension pack in Daz Studio, then NO, you won't have the Daz JCMs in Iclone.

    If you have the CC3 Pipeline version (I do not, obviously), your FBX import is shape-projected onto a NATIVE Iclone CC3 base avatar, and I believe there are some Iclone-native joint corrective morphs or something.

    The whole "joint thing" is not a showstopper for me, as my characters are dressed head to toe 99% of the time.

    I had a similar thought about a "Blade Runner" challenge.
    Here is the "Roy Batty" audio for anyone else here wishing to have a go at it.

    https://drive.google.com/file/d/1ERgIhgj7SItICAvrdePshswvg_k3qm7i/view?usp=drivesdk
    Auroratrek Posts: 218
    wolf359 said:
    Hi Tim,

    The default Iclone audio-based lip sync is better than any of the available Daz options IMHO, and you can layer expressions on with Face Puppet in realtime, as I did with the "Roy Batty" clip.

    On the matter of joints:

    If you do an "old style" FBX import (as I am doing) via 3DXchange, after loading the .duf face key from the Genesis extension pack in Daz Studio, then NO, you won't have the Daz JCMs in Iclone.

    If you have the CC3 Pipeline version (I do not, obviously), your FBX import is shape-projected onto a NATIVE Iclone CC3 base avatar, and I believe there are some Iclone-native joint corrective morphs or something.

    The whole "joint thing" is not a showstopper for me, as my characters are dressed head to toe 99% of the time.

    I had a similar thought about a "Blade Runner" challenge.
    Here is the "Roy Batty" audio for anyone else here wishing to have a go at it.

    https://drive.google.com/file/d/1ERgIhgj7SItICAvrdePshswvg_k3qm7i/view?usp=drivesdk

    Okay, thanks! I have to confess that I bought CC3 but never quite figured out what it did, exactly, since it didn't seem to be part of the import/export of Daz characters, but maybe I need to take a look...

    hookflash Posts: 169
    wolf359 said:
    If you have the CC3 pipeline version ( I do not obviously) your FBX import is shape projected onto a NATIVE Iclone CC3 base avatar and I believe there are some Iclone native joint corrective morphs or something...

    Unfortunately, iClone & CC3 still don't support JCMs, despite the fact that they were on the 2019 roadmap at one point (RL ran into "performance issues"). It's just vanilla skin weighting; although, if you're a sucker for punishment, you could theoretically animate corrective morphs manually using Morph Animator. :(

    tikiman-3d Posts: 35

    This is part one of my Star Trek Fanimated Film "Blood of Tiberius". I've been working on it for about two years now. Part two is coming along... slowly.
    https://www.youtube.com/watch?v=_a1OnpNGt40

    This was animated entirely in DAZ 3d and rendered with Octane. Post work in After Effects then edited with Premiere. Audio Post was done with Protools. 
     

    (Sorry, still catching up on this thread.) This looks great, Tikiman! And, of course, I'm geeking over your story being based on TOS! I'd never considered rendering in Daz, and didn't know there was an Octane plugin for it. Thanks for sharing--I think! Now I have another possible workflow to explore. Certainly Daz 4.12 has much better animation tools than previous versions, which I was never able to work with. Your lip sync is quite good--that was done with the native  Daz Lip Sync? Did you have to tweak it at the keyframe level?

    Hi Tim! Thank you for the kind words!
    I did all the lip syncing in the DAZ 32-bit version because (to me) Mimic in it has better features, i.e. the ability to import a .wav file and a text file. I turn off everything else, like head movement and expressions, save as a pose/animation, then go back to 64-bit. I didn't do too much tweaking - only closing the mouth after they speak.
    Star Trek Aurora was actually my inspiration for doing mine as animation. It was around the same time as Axanar-gate, when I was originally going to do my script as live action. But then, well, yeah, you know what happened. So I turned to doing it as animation, on my own schedule.

          

     

    tikiman-3d Posts: 35
    edited May 2020

     


    This is a test I did on the speed of Octane vs. Iray. I wanted to try an experiment, purely out of my own curiosity, as it relates to my system.

    I hope this is of interest.

    Here's a scene I did a few days ago. 800 Frames.
    Using DAZ:
    Genesis 8 – Carmelita
    dForce - Bardot dress
    Pony Knots – Hair
    Free Animation from Flipbook Market (defunct)
    Hotel Balcony

    Scene was put together in about 10 minutes.

    dForce simulation took about 45 minutes.

    VWD used to simulate hair. 45 minutes.

    For the render test:
    Each scene was rendered in three passes.

    Using Octane:
    After doing an auto import, I had to adjust the materials and environment. 30 minutes.

    *NOTE* "CARMELITA" should have a slight Latina skin coloring. I was trying to achieve this in Octane without much luck.

    Figure pass was rendered at 1920X1080 at 100 samples with denoiser.
    (based on file creation time stamp)
    5:40pm – 11:36pm = 5 hours: 54 minutes render time.
    Background pass was rendered at 960 x 540 at 50 samples with denoiser.
    7:20am – 8:30am = 2 hours: 10 minutes render time.
    Foreground Railing from frame 523 - 789. Rendered at 960 x 540 at 50 samples with denoiser.
    9:16am – 9:25am = 9 minutes

    Total render time for Octane: 8 hours: 13 minutes

    Using Iray:

    Did not need to adjust the materials except for the environment which was Sun/Sky. 1 minute.

    Figure pass was rendered at 1920X1080 at 100 iterations with denoiser.
    12:02pm – 7:13pm = 7 hours: 11 minutes

    Background pass was rendered at 960 x 540 at 50 iterations with denoiser.
    8:05pm – 11:53pm = 3 hours: 48 minutes

    Foreground railing from frame 523 - 789, rendered at 960 x 540 at 50 iterations with denoiser.
    8:56am - 9:07am = 11 minutes

    Total render time for Iray: 11 hours: 10 minutes

    **NOTE**
    (The pixelation in the hair was due to my forgetting to turn on the noise degrain filtering. In my tests it didn't affect the render time.)
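    For anyone wanting to reproduce the totals above, the per-pass clock times tally with a few lines of Python (a throwaway helper sketched here for illustration; the times are copied from the Iray passes):

```python
from datetime import datetime, timedelta

def pass_minutes(start, end):
    """Duration in whole minutes between two 12-hour clock times."""
    fmt = "%I:%M%p"
    s = datetime.strptime(start.upper(), fmt)
    e = datetime.strptime(end.upper(), fmt)
    if e < s:              # pass ran past midnight
        e += timedelta(days=1)
    return int((e - s).total_seconds() // 60)

# Iray passes: figure, background, foreground railing
passes = [("12:02pm", "7:13pm"), ("8:05pm", "11:53pm"), ("8:56am", "9:07am")]
total = sum(pass_minutes(s, e) for s, e in passes)
print(f"{total // 60} hours: {total % 60} minutes")  # 11 hours: 10 minutes
```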

    Using After Effects:
    Both scenes:
    Composited the scene.
    Slightly blurred the background for depth of field.
    I added an adjustment layer for color.

    My specs:
    ASRock Z370 Extreme4
    Intel Core i7-8700K -Overclocked to 4.5
    NVIDIA GeForce GTX 1060 6GB
    32gigs of ram.

     

    Post edited by tikiman-3d on
    wolf359 Posts: 3,827
    Link is refused by Chrome

    '"Too many Facebook redirects"