AWE Shading Kit for DAZ Studio and 3delight

Comments

  • Mustakettu85 Posts: 2,933
    khorneV2 said:

    thanks, too bad as the shading kit is fantastic and very flexible/"open source".

    Great results!

    Thanks Kettu & Wowie for the quick answers and for the fix, I'll try that

    You're welcome, enjoy =)

    BTW, since you're using RIBs, it means you should be able to manually fix (by RIB editing) yet another DS bug that messes up shadows and the like on subdivided instances. Let me know if you ever run into it.

  • khorneV2 Posts: 146

    Ouh ha! I think this is beyond my abilities!

  • Mustakettu85 Posts: 2,933
    khorneV2 said:

    Ouh ha! I think this is beyond my abilities!

    What, running into the bug? You'll know when you do =)

    But it's really a very easy fix, honest. Involves deleting a single line. So keep that in mind =)

  • Sven Dullah Posts: 7,621

    So now we just need AWE grass and rock shaders, AWE ghost opacity and AWE Uber/EasyVolume, to name a few...

    ...and of course the AWE Dynamic LensFlares that work with HDRI lighting.

    Ok, I was kind of joking, but seriously, how do you make things like godrays and UberVolume stuff within aweSurface?

  • Mustakettu85 Posts: 2,933

    So now we just need AWE grass and rock shaders, AWE ghost opacity and AWE Uber/EasyVolume, to name a few...

    ...and of course the AWE Dynamic LensFlares that work with HDRI lighting.

    Ok, I was kind of joking, but seriously, how do you make things like godrays and UberVolume stuff within aweSurface?

    We don't. As of now.

    UberVolume and EasyCamera are ray marchers - it's an old way of basically numerically integrating volume density along the view vector. Almost like stacking semi-transparent planes, just without the planes LOL. You can guess it's a pretty slow technique.
    Moreover, it's limited to picking up light from illuminate() constructs only - i.e. you can't ray-march a path-traced area light.
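    Roughly, the idea looks something like this little Python sketch - purely illustrative, nothing to do with how UberVolume is actually written; density() and light_at() are made-up stand-ins for real volume and lighting lookups:

    import math

    # Toy ray marcher: numerically integrate volume density along one view ray.
    def ray_march(origin, direction, length, steps, density, light_at):
        dt = length / steps
        transmittance = 1.0   # how much of the background still shows through
        scattered = 0.0       # light picked up by the volume along the ray
        for i in range(steps):
            # sample point in the middle of this step
            p = tuple(o + d * dt * (i + 0.5) for o, d in zip(origin, direction))
            sigma = density(p)
            # light scattered towards the camera here, dimmed by the volume in front of it
            scattered += transmittance * light_at(p) * sigma * dt
            # Beer-Lambert attenuation for this step
            transmittance *= math.exp(-sigma * dt)
        return scattered, transmittance

    # e.g. uniform fog, evenly lit:
    # ray_march((0, 0, 0), (0, 0, 1), length=10.0, steps=64,
    #           density=lambda p: 0.1, light_at=lambda p: 1.0)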

    The solution is OpenVDB, and 3Delight Studio does support it in RSL, but via a DLL only. Trying to link this DLL within DS leads to a crash.

    So this is why nobody really bothered with it.

    Not even me.

    See, there is a pretty simple compositing solution: you render your scene normally, then you save out a copy, turn all the surfaces 100% black (no diffuse, no reflection, nothing), load any volumetric shader, edit the lights so that there are no environment lights and the path-traced area lights are replaced by traditional illuminate()-based ones (or even spotlights), and render out a volume-only layer. 

    If rendering to a gamma-corrected format like PNG, you then "screen" it on.
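    (If you'd rather script that last step than do it in an image editor: "screen" is just 1 - (1 - a)(1 - b) per channel. A minimal Python sketch, assuming Pillow and NumPy are installed; the file names are made up:)

    import numpy as np
    from PIL import Image

    # load the beauty render and the volume-only pass as 0..1 floats
    beauty = np.asarray(Image.open("beauty.png").convert("RGB"), dtype=np.float32) / 255.0
    volume = np.asarray(Image.open("volume_pass.png").convert("RGB"), dtype=np.float32) / 255.0

    # screen blend: 1 - (1 - a) * (1 - b), per channel
    composite = 1.0 - (1.0 - beauty) * (1.0 - volume)

    Image.fromarray((composite * 255.0 + 0.5).astype(np.uint8)).save("composite.png")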

     

  • Sven Dullah Posts: 7,621

    So now we just need AWE grass and rock shaders, AWE ghost opacity and AWE Uber/EasyVolume, to name a few...

    ...and of course the AWE Dynamic LensFlares that work with HDRI lighting.

    Ok, I was kind of joking, but seriously, how do you make things like godrays and UberVolume stuff within aweSurface?

    We don't. As of now.

    UberVolume and EasyCamera are ray marchers - it's an old way of basically numerically integrating volume density along the view vector. Almost like stacking semi-transparent planes, just without the planes LOL. You can guess it's a pretty slow technique.
    Moreover, it's limited to picking up light from illuminate() constructs only - i.e. you can't ray-march a path-traced area light.

    The solution is OpenVDB, and 3Delight Studio does support it in RSL, but via a DLL only. Trying to link this DLL within DS leads to a crash.

    So this is why nobody really bothered with it.

    Not even me.

    See, there is a pretty simple compositing solution: you render your scene normally, then you save out a copy, turn all the surfaces 100% black (no diffuse, no reflection, nothing), load any volumetric shader, edit the lights so that there are no environment lights and the path-traced area lights are replaced by traditional illuminate()-based ones (or even spotlights), and render out a volume-only layer. 

    If rendering to a gamma-corrected format like PNG, you then "screen" it on.

     

    Ok, tks Mustakettu! Yeah there are workarounds, so not a major problem... I just try to limit the amount of postwork because I suck at it=)

  • Mustakettu85 Posts: 2,933

    I just try to limit the amount of postwork because I suck at it=)

    I can relate to that =)

    "Postwork" as in paint or mask stuff in the resulting render is what I prefer to avoid as well. Straight-out compositing, on the other hand, is often the easiest and fastest way to get stuff done.

  • Sven Dullah Posts: 7,621

    Would it be hard to include a pathtraced point light in a future update? I mean, it's apparently impossible to configure the area light to get a uniform 360-degree spread? So when you need that, you have to use a primitive sphere or something else with geometry (more polys), causing longer render times. Just thinking out loud here.

  • Mustakettu85 Posts: 2,933

    Would it be hard to include a pathtraced point light in a future update? I mean, it's apparently impossible to configure the area light to get a uniform 360-degree spread? So when you need that, you have to use a primitive sphere or something else with geometry (more polys), causing longer render times. Just thinking out loud here.

    Point lights do not exist IRL, nor can you make one from a flat area light. A tilted cube, however, is just six polys, but the light it emits should look close enough to most small solid-body sources with more complex shapes.

    [Attachment: awePT_cameraon.png]
  • Sven Dullah Posts: 7,621
    edited October 2018

    Would it be hard to include a pathtraced point light in a future update? I mean, it's apparently impossible to configure the area light to get a uniform 360-degree spread? So when you need that, you have to use a primitive sphere or something else with geometry (more polys), causing longer render times. Just thinking out loud here.

    Point lights do not exist IRL, nor can you make one from a flat area light. A tilted cube, however, is just six polys, but the light it emits should look close enough to most small solid-body sources with more complex shapes.

    Tks, good point!

  • wowie Posts: 2,029
    edited October 2018

    Would it be hard to include a pathtraced point light in a future update? I mean, it's apparently impossible to configure the area light to get a uniform 360-degree spread? So when you need that, you have to use a primitive sphere or something else with geometry (more polys), causing longer render times. Just thinking out loud here.

    No point lights needed. All you need to get a 360-degree spread from an area light is a single plane with one instance of the plane. Even fewer polys than a cube.

    That's actually what I used for the fire (combined with an opacity mask) on one of the promos.

    I did notice the files distributed by DAZ are missing the actual AWE Environment light node (when you load AWE Environment Light.duf, it only loads the environment sphere). Thankfully, the files are all there. For now, you can just reuse the ones included in the quickstart scene. If you don't mind a little bit of work, you can basically save the light/sphere combo as a scene subset, then replace AWE Environment Light.duf with the .duf of that saved subset.

    [Attachment: fake pointlight.jpg]
  • Sven Dullah Posts: 7,621
    wowie said:

    Would it be hard to include a pathtraced point light in a future update? I mean, it's apparently impossible to configure the area light to get a uniform 360-degree spread? So when you need that, you have to use a primitive sphere or something else with geometry (more polys), causing longer render times. Just thinking out loud here.

    No point lights needed. All you need to get a 360-degree spread from an area light is a single plane with one instance of the plane. Even fewer polys than a cube.

    That's actually what I used for the fire (combined with an opacity mask) on one of the promos.

    I did notice the files distributed by DAZ are missing the actual AWE Environment light node (when you load AWE Environment Light.duf, it only loads the environment sphere). Thankfully, the files are all there. For now, you can just reuse the ones included in the quickstart scene. If you don't mind a little bit of work, you can basically save the light/sphere combo as a scene subset, then replace AWE Environment Light.duf with the .duf of that saved subset.

    Tks wowie, yeah I thought of that, but basically you need three planes (or one plane, two instances) to get the same effect (point light/candle), right? One plane needs to be horizontal. Well, still fewer polys than the cube, have to try.

  • wowie Posts: 2,029

    Tks wowie, yeah I thought of that, but basically you need three planes (or one plane, two instances) to get the same effect (point light/candle), right? One plane needs to be horizontal. Well, still fewer polys than the cube, have to try.

    Not really if you don't mess with the spread angle (180 degrees).

  • Mustakettu85 Posts: 2,933
    wowie said:

    No point lights needed. All you need to get a 360-degree spread from an area light is a single plane with one instance of the plane. Even fewer polys than a cube.

    Awesome! XD I totally forgot that polys can intersect; talk about a "duh moment"

  • Sven Dullah Posts: 7,621
    wowie said:

    No point lights needed. All you need to get a 360-degree spread from an area light is a single plane with one instance of the plane. Even fewer polys than a cube.

    Awesome! XD I totally forgot that polys can intersect; talk about a "duh moment"

    I have a render going and can confirm this is working great. Speeding things up significantly ;)

  • Sven Dullah Posts: 7,621
    wowie said:

    No point lights needed. All you need to get a 360-degree spread from an area light is a single plane with one instance of the plane. Even fewer polys than a cube.

    Awesome! XD I totally forgot that polys can intersect; talk about a "duh moment"

    I have a render going and can confirm this is working great. Speeding things up significantly ;)

    So here it is: 24 emissive planes, two for each light, render time 1h35m.

  • Sven Dullah Posts: 7,621

    @wowie How difficult and time-consuming would it be to implement a bloom effect? I know it is easily done in postwork, but considering how well the environment blur works, I have a gut feeling it wouldn't be impossible?

    Just tossing out ideas here ;)

  • khorneV2 Posts: 146

    Is it possible to convert a standard generated RIB file (standard 3DL) in order to implement the scripted renderer features? In order to bypass DAZ Studio bugs/problems?

  • Mustakettu85 Posts: 2,933

    @wowie How difficult and time-consuming would it be to implement a bloom effect? I know it is easily done in postwork, but considering how well the environment blur works, I have a gut feeling it wouldn't be impossible?

    Just tossing out ideas here ;)

    Proper bloom needs two things: an imager shader and rendering to EXR (so as not to get the luminance clipped to screenspace). Which should both be possible in theory - rendering to EXR is possible in practice actually, but.

    You can't render to EXR using that DS window. So it's either render to file only, or RIB export and then rendering to 3Delight i-display (that's how I do "serious work").

    Neither of these options will let you adjust bloom parameters on the fly, though.

    So IMO the best way is rendering to EXR (doesn't matter which way) and then adding bloom in postprocessing. Something like the Blender compositor works like a charm. Maybe GIMP too, but let's just say I haven't managed to find where they have bloom there.
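    (If anyone wants to script the bloom itself, the basic recipe is: keep only the pixels above a luminance cutoff in the linear EXR data, blur them, and add the halo back on top. A rough Python sketch, assuming NumPy and SciPy; loading and saving the EXR is left to whatever reader you prefer, and add_bloom is just an illustrative name:)

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # img is a linear float32 array of shape (height, width, 3), e.g. read from an EXR
    def add_bloom(img, threshold=1.0, radius=8.0, strength=0.5):
        # luminance per pixel (Rec. 709 weights)
        luminance = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
        # keep only the "hot" pixels above the cutoff
        hot = np.where(luminance[..., None] > threshold, img, 0.0)
        # blur them; sigma 0 on the channel axis so colours don't bleed into each other
        halo = gaussian_filter(hot, sigma=(radius, radius, 0))
        # add the glow back on top of the original
        return img + strength * halo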

  • Mustakettu85 Posts: 2,933
    khorneV2 said:

    Is it possible to convert a standard generated RIB file (standard 3DL) in order to implement the scripted renderer features? In order to bypass DAZ Studio bugs/problems?

    Editing RIBs - yes, perfectly possible. They're just huge text files.

    Which bugs/problems specifically are you looking to avoid?

  • Sven Dullah Posts: 7,621

    @wowie How difficult and time-consuming would it be to implement a bloom effect? I know it is easily done in postwork, but considering how well the environment blur works, I have a gut feeling it wouldn't be impossible?

    Just tossing out ideas here ;)

    Proper bloom needs two things: an imager shader and rendering to EXR (so as not to get the luminance clipped to screenspace). Which should both be possible in theory - rendering to EXR is possible in practice actually, but.

    You can't render to EXR using that DS window. So it's either render to file only, or RIB export and then rendering to 3Delight i-display (that's how I do "serious work").

    Neither of these options will let you adjust bloom parameters on the fly, though.

    So IMO the best way is rendering to EXR (doesn't matter which way) and then adding bloom in postprocessing. Something like the Blender compositor works like a charm. Maybe GIMP too, but let's just say I haven't managed to find where they have bloom there.

    Ok, no biggie. I use GIMP for bloom; what I do is make a b&w layer, crank up contrast and intensity until I only see the hotspots, then apply the artistic glow filter and/or gaussian blur, then use some of the layer settings like soft light, screen or whatever ;) Tks for enlightening me once more!

  • Mustakettu85 Posts: 2,933
    Ok, no biggie. I use GIMP for bloom; what I do is make a b&w layer, crank up contrast and intensity until I only see the hotspots, then apply the artistic glow filter and/or gaussian blur, then use some of the layer settings like soft light, screen or whatever ;) Tks for enlightening me once more!

    You're welcome =) So would you happen to know if that "artistic glow filter" has an adjustable luminance cutoff? Or is it a remnant from the days when GIMP did not support HDR images?

  • Sven Dullah Posts: 7,621
    Ok, no biggie. I use GIMP for bloom; what I do is make a b&w layer, crank up contrast and intensity until I only see the hotspots, then apply the artistic glow filter and/or gaussian blur, then use some of the layer settings like soft light, screen or whatever ;) Tks for enlightening me once more!

    You're welcome =) So would you happen to know if that "artistic glow filter" has an adjustable luminance cutoff? Or is it a remnant from the days when GIMP did not support HDR images?

    I haven't upgraded in a long time; my version doesn't support HDR. IIRC there are three sliders for the filter - radius, intensity and sharpness - no luminance cutoff.

  • Sven Dullah Posts: 7,621

    So I'm just a bit curious, what's the reason normal maps are not supported?

  • khorneV2 Posts: 146

    khorneV2 said:

    Is it possible to convert a standard generated RIB file (standard 3DL) in order to implement the scripted renderer features? In order to bypass DAZ Studio bugs/problems?

    Editing RIBs - yes, perfectly possible. They're just huge text files.

    Which bugs/problems specifically are you looking to avoid?

    Creating RIB files is very stable with the standard 3Delight (vanilla) renderer in DAZ Studio. I managed to copy the settings into the generated RIB and aweSurface worked, even if I don't know whether all the settings were correct.

    So, I wonder if it is possible to have a script to modify the classic RIB files to use path tracing and so on. This would avoid the DS bug...

    ...well... I guess.

  • Sven Dullah Posts: 7,621

    So what about procedural shaders? Possible?


  • ArkadySkies Posts: 206
    edited October 2018

    Are AoA shaders incompatible with AWE? I'm not thrilled with the results I've been getting with AWE skin (not that I've had a lot of time to practice), but I am thrilled with the results I get with AoA as the result of a LOT of time and practice. However, I'm getting black areas where I have AoA applied in a scene set up with AWE lights and render settings.

  • Sven Dullah Posts: 7,621

    Are AoA shaders incompatible with AWE? I'm not thrilled with the results I've been getting with AWE skin (not that I've had a lot of time to practice), but I am thrilled with the results I get with AoA as the result of a LOT of time and practice. However, I'm getting black areas where I have AoA applied in a scene set up with AWE lights and render settings.

    AFAIK, yes, you need to convert everything.

  • ArkadySkies Posts: 206
    edited October 2018

    Spent a bit more time with it, and it looks like Metalness and Transmission work OK (but not the reflection of transmitted surfaces) when you use either the Standard Example or Point Cloud 3DL scripted render engines; it's probably just Subsurface that doesn't work in a desirable way. Haven't tested with Translucency. Looks like Standard Example is faster than non-scripted 3DL.

    EDIT:

    I removed the AWE lights for the other scripted 3DL test and replaced them with a sky dome with the same texture applied (the exposure would need to be altered in Photoshop or GIMP, I suppose), and I didn't try to make the standard UE + Distant light setup match the AWE lights except in direction. Color tone is quite different, but I wasn't focused on that.

    Also, the eyes and lashes still have AWE applied. AWE subsurface requires AWE's scripted 3DL.

    [Attachments: awe-original.png, awe-standardExample.png]
  • Mustakettu85 Posts: 2,933
    khorneV2 said:
    Creating RIB files is very stable with the standard 3Delight (vanilla) renderer in DAZ Studio. I managed to copy the settings into the generated RIB and aweSurface worked, even if I don't know whether all the settings were correct.

    So, I wonder if it is possible to have a script to modify the classic RIB files to use path tracing and so on. This would avoid the DS bug...

    You mean full raytracing, right? Without being limited to progressive rendering? Because pathtracing happens with aweArea lights and aweSurface regardless of whether you're using the scripted renderer to call the raytracer with full options or the vanilla tab with all its limits via "progressive mode".

    Any script that would do that for you would be a Windows sorta script (Python?), which isn't something that I do. But if you have a good text editor (Notepad++ or something better), it should be able to help you automate text replacements, via its own script engine or some other interface.

    Let's look at a header (up to WorldBegin) generated by the vanilla tab:

    ##RenderMan RIB-Structure 1.0
    ##Creator 3Delight 12.0.27 win64 (Oct  6 2015, 9cc783) "Suspiria"
    ##CreationDate Thu Oct 18 20:47:28 2018
    Option "searchpath" "string shader" [ "C:/Users/Forever/AppData/Roaming/DAZ 3D/Studio4 Public Build/temp/shaders;C:/Users/Forever/AppData/Roaming/DAZ 3D/Studio4 Public Build/shaders;C:/Program Files/DAZ 3D/DAZStudio4 Public Build/shaders;&" ]
    Option "render" "bucketorder" [ "spiral" ]
    Option "limits" "bucketsize" [ 16 16 ]
    Option "render" "integer standardatmosphere" [ 0 ]
    Option "trace" "maxdepth" [ 2 ]
    Attribute "trace" "maxdiffusedepth" [ 2 ]
    Attribute "trace" "maxspeculardepth" [ 2 ]
    Attribute "trace" "bias" [ 0.01 ]
    Hider "raytrace" "int progressive" [ 1 ]
    PixelFilter "box" 1 1
    Exposure 1 2.2
    ShadingRate 1
    PixelSamples 8 8
    Sides 2
    Display "render.tiff" "file" "rgba"
    Imager ""
    Format 442 619 1
    Projection "perspective" "fov" [ 30.957276 ]
    ScreenWindow -0.71405495 0.71405495 -1 1
    Clipping 5 4e3
    Identity
    Scale 1 1 -1
    Rotate 26.030424 0.5311948 -0.84093445 -0.10325368
    Translate -299.1719 -324.1207 -740.4766

    Obviously your search paths are going to be different :)

    The exact order of commands is not really relevant, but if a command is duplicated, the one coming last supercedes the first one.

    What you generally want to retain are the following lines:

    Display "render.tiff" "file" "rgba" // Path to rendered file and its typeImager ""  // Supposing you were using any; if this is empty, as here, disregard itFormat 442 619 1  // This and the following lines describe your camera positionProjection "perspective"   "fov" [ 30.957276 ] ScreenWindow -0.71405495 0.71405495 -1 1 Clipping 5 4e3 Identity Scale 1 1 -1 Rotate 26.030424 0.5311948 -0.84093445 -0.10325368 Translate -299.1719 -324.1207 -740.4766 

    The scripted method generates something like this:

    ##RenderMan RIB-Structure 1.0
    ##Creator 3Delight 12.0.27 win64 (Oct  6 2015, 9cc783) "Suspiria"
    ##CreationDate Wed Jul 25 08:37:34 2018
    ErrorHandler "print"
    Hider "raytrace" "integer progressive" [ 0 ] "float aperture[4]" [ 6 60 1 1 ]
    ShadingInterpolation "smooth"
    Option "render" "integer standardatmosphere" [ 0 ]
    Option "trace" "int diffuseraycache" [ 1 ]
    PixelFilter "gaussian" 2 2
    Option "searchpath" "string shader" [ "C:/Users/Forever/AppData/Roaming/DAZ 3D/Studio4/temp/shaders;C:/Users/Forever/AppData/Roaming/DAZ 3D/Studio4/shaders;C:/Program Files/DAZ 3D/DAZStudio4/shaders;&" ]
    Option "render" "string bucketorder" [ "zigzag" ]
    Option "limits" "integer bucketsize[2]" [ 64 64 ]
    Option "trace" "integer maxdepth" [ 10 ]
    Attribute "trace" "integer maxdiffusedepth" [ 2 ]
    Attribute "trace" "integer maxspeculardepth" [ 3 ]
    Exposure 1 2.2
    PixelSamples 10 10
    Sides 2
    Format 600 600 1
    Projection "perspective" "fov" [ 30.957276 ]
    ScreenWindow -1 1 -1 1
    Clipping 5 4e3
    Identity
    Scale 1 1 -1
    Rotate 26.030426 0.5311947 -0.8409345 -0.10325368
    Translate -22.84611 -195.36575 -56.847825
    Display "C:/Users/Forever/Documents/!_3D models/!_RSL RIB/testneutral.rib.tif" "file" "rgba" "int associatealpha" [ 0 ]

    You can see that the path to the rendered file comes after the camera commands here, which is basically the only thing you need to look out for when swapping headers.

    Trace depths for various ray types, pixel samples, etc. - I think it's clear where they are and hence where you can change their values; if I'm wrong, just ask and I'll point you to them.
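    And since Python came up: here's a rough sketch of what such a header-swap script could look like. Treat it as a starting point only - the file names are examples, the KEEP list is just a guess at what to retain based on the lines above, so do eyeball the result before rendering:

    # Swap RIB headers: scripted-style options + vanilla camera/output lines + vanilla world block.
    KEEP = ("Display", "Imager", "Format", "Projection", "ScreenWindow",
            "Clipping", "Identity", "Scale", "Rotate", "Translate")

    def split_rib(path):
        text = open(path, encoding="utf-8").read()
        head, sep, body = text.partition("WorldBegin")
        return head.splitlines(), sep + body

    vanilla_head, vanilla_body = split_rib("vanilla.rib")
    template_head, _ = split_rib("scripted_header.rib")

    # camera/output lines from the vanilla header...
    camera_lines = [l for l in vanilla_head if l.strip().startswith(KEEP)]
    # ...plus everything from the scripted-style header except its own camera/output lines
    option_lines = [l for l in template_head if not l.strip().startswith(KEEP)]

    with open("out.rib", "w", encoding="utf-8") as f:
        f.write("\n".join(option_lines + camera_lines) + "\n")
        f.write(vanilla_body)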
