A simple script that uses the viewport preview to dramatically reduce animation render times.

I recently got an RTX 3090 as a companion to my RTX 2080. Render times are certainly blazing fast for static images. However, for animations or image sequences, overall render times weren't much better. This is due to the way Daz handles multi-frame rendering: it seems that at the end of each rendered frame, the entire Iray engine is completely restarted. For my example I was using a single character with a black background and two mesh lights, rendering to images sized at 500 x 500 pixels. The actual rendering of each frame took 2-3 seconds before 95% convergence was reached. Then there was a delay of 15-20 seconds, while it does who knows what, before rendering the next frame. Total time for a 10-frame test was 182 seconds, with only 20-30 s of that time spent actually rendering.

I am pleased with the performance of the viewport and thought, "Why not just capture the viewport frame by frame?" After all, it reaches an acceptable appearance within 2-3 seconds after an adjustment is made, and I'm not looking for extreme quality. So I came up with this script.

The script waits a specified number of milliseconds on each frame, captures the viewport, and then saves the image with a sequential suffix. The overhead between frames is reduced to around 200 ms. A test sequence of 10 frames rendering for 3 seconds each had a total execution time of 32.238 s, which is a time reduction of 82.2%!
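As a quick sanity check on that figure (plain JavaScript, using the 182-second baseline from the 10-frame Iray test above):

```javascript
// Percentage time reduction from the traditional 10-frame render test
// to the viewport-capture script, using the timings reported above.
const baselineSeconds = 182;     // traditional Iray render, 10 frames
const viewportSeconds = 32.238;  // viewport-capture script, 10 frames
const reduction = (1 - viewportSeconds / baselineSeconds) * 100;
console.log(reduction.toFixed(1) + "% faster"); // roughly 82%
```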
The downside of capturing the viewport is that the image will reflect exactly what the viewport is displaying. If a node is hovered over and highlighted, it will show in the image, as will any tool gizmos. Thus, greater care needs to be taken in the setup.

This won’t work for all situations and is highly dependent on hardware and the timePerFrame value.  But I imagine it would reduce the overhead between frames substantially even for CPU renders. 

 

const timePerFrame = 3000; // ms
const timeStep = Scene.getTimeStep();
const playbackRange = Scene.getPlayRange();
const startFrame = playbackRange.start / timeStep;
const endFrame = playbackRange.end / timeStep;
const sFilePath = "D:/renderTest2/";
const filename = "test";
const vp = MainWindow.getViewportMgr().getActiveViewport().get3DViewport();

(function(){
	var timeOfNextFrame = 0;
	var totalFrames = endFrame - startFrame;
	startProgress( "Rendering " + totalFrames + " Frames:", totalFrames, true, true );
	for( var i = startFrame; i <= endFrame; i++ ){
		Scene.setFrame( i );
		stepProgress( 1 );
		statusPrinted = false;
		timeOfNextFrame = Date.now() + timePerFrame;
		print( "    Rendering frame " + i + " of " + totalFrames );
		// Let the viewport keep refining until the per-frame time budget is spent
		while( Date.now() < timeOfNextFrame ){
			processEvents();
		}
		saveImg( i );
		if( progressIsCancelled() ) break;
	}
	finishProgress();
})();

function saveImg( i ){
	var numS = pad( i, 4 );
	var filePath = sFilePath + filename + numS + ".png";
	vp.captureImage().save( filePath );
}

function pad( num, size ){
	num = num.toString();
	while( num.length < size ) num = "0" + num;
	return num;
}
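The `pad` helper at the end of the script is plain JavaScript, so its zero-padded suffix behaviour can be checked outside Daz Studio; for example:

```javascript
// Zero-pads a frame number to a fixed width, matching the sequential
// image suffix the script produces (e.g. "test0001.png").
function pad(num, size) {
    num = num.toString();
    while (num.length < size) num = "0" + num;
    return num;
}

console.log(pad(1, 4));     // "0001"
console.log(pad(123, 4));   // "0123"
console.log(pad(12345, 4)); // "12345" (wider numbers are never truncated)
```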

 

Comments

  • Amazing, and very clever. Thanks.

    So most people would be able to capture a viewport of about 1400 pixels widescreen with this? Then upscale that to 1920 pixels with the Topaz AI software, though that could take an overnight run.

    Also, if you were filtering the renders in another way, by running the frames through a GMIC Photoshop action to get lineart and a painterly effect, then that size would be useful. It would make the filtering faster than at 4K. I wonder what the combination of a painterly filter and AI 4x upscale would look like?

  • Windamyre Posts: 142

    This is an excellent idea.  Thanks for sharing this!

  • WendyLuvsCatz Posts: 31,933

    I think the interactive render can be set to reflect what is in the viewport; it does not render everything a photorealistic render does.

    I know strand-based hairs and certain shader options, among other features, do not render the same.

  • Sci Fi Funk Posts: 1,115
    edited January 5

    @gamplod It's a great idea and thank you for sharing your work!

    I've spent my 3D time today testing a number of setups with this script vs using the render engine the traditional way. I have an RTX 2070 Super card.

    1. Speed. It's faster! (Some example tests below).

    G8F, no hair, 2 clothing items, 344 MB texture memory consumption - average 16 frames a minute (Viewport) vs 12 f/min (Render Output) - so 33% faster.

    G2F, no hair, dress and shoes, 172 MB texture memory consumption - avg 16 f/min (Viewport) vs 9 f/min (Render Output) - so 74% faster!

    G2F, hair, dress and shoes, 209 MB texture memory consumption - avg 15 f/min (Viewport) vs 9 f/min (Render Output) - so 69% faster!

    Now these were 1K renders, so no issues with the viewport info getting in the way (I got it down to just one small icon in the top right anyway).

    2. Issues / improvements.

    1. At 4K(ish) I ran into problems with 3 seconds of rendering time: if the G8 had hair and 4 clothing items, it couldn't actually get to the render in time some of the time (before finishing morph corrections, smoothing, etc.). Also, there are little variations in render speed (I guess due to background tasks, although I had no other programs open), so in these ultra-fast rendering scenarios, being able to render by number of samples would help those of us with lesser graphics cards. Of course this problem disappears as the RTX 3090 becomes bog standard in 202x (when it's an RTX 5080 or whatever) and we are all on faster cards.

    Now you might say, well, just increase the render time. Well, my record is 6-7 seconds rendering the traditional way (G2 model), 4 seconds of which was waiting for the render to start. So given that these are coming out at 4 seconds each (1 s prep, 3 s render time), it starts to get close to not being worth it if I have to increase the render time.

    2. This leads me to another consideration. Viewport rendering is great if it's a figure against an HDRI, for example, but for those of us who layer frames in Sony Vegas or similar, a PNG output with no background is a must. (OK, not a must, but better than green screen, as it has fewer issues and more separation.) For some reason the viewport render outputs the PNG with a colour, whereas the render output tab renders it as a cutout, which is what I need.

    Overall then, as things stand, it's great! I will use it for HDRI scenes, but probably not for cutout scenes. If you, or anyone, felt motivated to make either of these improvements, fantastic! If not, well, I am giving a little time to programming (must spend most of my time on animating), and I might be able to make it driven by samples in time, but probably not; my programming days are behind me and I'm firmly in bodge-it-and-leg-it mode these days.

     

  • TromNek Posts: 7

    Thanks gamplod !

    I'm modifying the script to only render 'keyframes', and this is working very well for me.

    But instead of guessing at the time (timePerFrame=3000;), I would like to wait until a certain number of Iray iterations/samples (e.g. 100 or 200).

    Do you know if the current Iray iteration/sample is available through an API?

    Or I could set 'Max Samples' in the render settings and check to see if the renderer is actively rendering. But none of the .isRendering() methods seem to work for Iray viewport rendering.

    Any ideas?

  • Do note that the Iray Drawstyle does not show everything that the full renders do; for example, there is no displacement.

    On the script: const is just a synonym for var; it doesn't make the value constant in the current version of Qt Script. It may help as a reminder that there are values that should not vary, but it doesn't have any effect at the code level that is different from simply declaring a var. Also,

    statusPrinted=false; is assigned a value without any var (or const) declaration, which makes it a global variable; in fact, it is never used in any way, so it could be deleted.
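    A minimal illustration of that point, in plain non-strict JavaScript (Qt Script behaves the same way): assigning to an undeclared name inside a function creates a global rather than a local.

    ```javascript
    // In non-strict mode, assigning to an undeclared name creates a
    // global variable instead of a local one.
    function demo() {
        statusPrinted = false; // no var/const: leaks into the global scope
    }
    demo();
    console.log(typeof statusPrinted); // "boolean", still visible out here
    ```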

  • TromNek Posts: 7

    Thanks, I'll keep in mind the lack of displacement. These are just quick renders to make sure all my keyframes are lit properly before I kick them off for batch rendering overnight.

    I guess the lack of displacement will mess up water ripples and such, but now that I know, it won't worry me.

    I'm new to Qt so all the const and var declarations are good for me to learn properly.

    Thanks again.
