Support for 3Delight - Is it Fading? . . . and why?


Comments

  • DaWaterRat Posts: 2,885

    As I've said, for some of us, it's not about reality.  I like being able to render things "in camera" that break how light really works.  (I also hate postwork, and see it as a necessary evil.)

    Like, say I wanted to render the scene from the beginning of Peter Pan when he's trying to catch his shadow.

    Iray would require (as I understand it) at least 3 passes.  One base image of the Nursery, one with Fantom Pan for his shadow, and one with Peter rendered alone so that he's not casting a shadow on anything.  Render times in my experience would average about 30 minutes per canvas pass (some passes longer, some shorter), plus the necessary, if quick, postwork to composite, get the layering right for the shadow pass, and possibly add Tink's glow, since I wouldn't want the same level of bloom on the rest of the room as I'd want on the sphere that was Tink.  This doesn't include the time to optimize the scene to render on my system.

    In 3DL, I can do one pass running about an hour to an hour and a half, with no compositing postwork necessary, including Tink's glow.  This will require the use of pwCatch and pwEffect, of course.  And much less time spent on optimizing the scene to render on my system, not just because I know what needs optimizing, but also because it doesn't need as much optimization.

  • kyoto kid Posts: 41,040
    edited December 2017

    Two years old doesn't matter. Most development for that time went to OSL. You don't miss a lot of things. It's not as if anything revolutionary has been made.

    There hasn't been a revolution in CG for quite a while. But the evolution is palpable. Small things that matter. Subsurface sampling, general sampling, all that making-life-easier progress.

    Oh yeah BTW, the latest builds have fixed the major slowdown with UE2 bounce light that occurred somewhere between 11 and 12, which Parris apparently managed to report.

    ...so you are saying that UE is a bit faster than in earlier versions of Daz 4.x?

     

    What Iray is currently sorely missing is a curve primitive (like for LAMH hair).

    Neither does it have a dedicated outliner ("inker"). But apparently some folks managed to do NPR without it.


    ...lack of compatibility with strand/curve based hair is one of the downsides of Iray, as I have and use Garibaldi.  Importing hair as an .obj can add significantly to the polycount based on length, style, and such, which adds to the scene load and slows rendering time. For grass you need to use geometry in Iray rather than a procedural shader like the one AoA created, which also adversely impacts render time.

    Post edited by kyoto kid on
  • Sven Dullah Posts: 7,621
    edited December 2017
    kyoto kid said:

    Two years old doesn't matter. Most development for that time went to OSL. You don't miss a lot of things. It's not as if anything revolutionary has been made.

    There hasn't been a revolution in CG for quite a while. But the evolution is palpable. Small things that matter. Subsurface sampling, general sampling, all that making-life-easier progress.

    Oh yeah BTW, the latest builds have fixed the major slowdown with UE2 bounce light that occurred somewhere between 11 and 12, which Parris apparently managed to report.

    ...so you are saying that UE is a bit faster than in earlier versions of Daz 4.x?

    wowie is mentioning something about "Heavily optimized opacity handling that works with both direct lighting and global illumination" in the 3DL Laboratory thread.

    Parris said the "bug" with transmapped hair has been solved, or something along those lines.

    Post edited by Sven Dullah on
  • Sven Dullah Posts: 7,621

    Quoting Parris:

    As if that were not enough, with IBL Master rendering with Image Based Light in 3Delight is significantly faster than ever before. For the better part of a decade the popular opinion has been that 3Delight slows to a crawl when IBL and transmaps share the same scene. But as it turns out, it's a glitch in our surface shaders, not the render engine. The IBLM Light shader bypasses this glitch.

  • Oso3D Posts: 15,009

    As an aside, my upcoming shader OMS1 allows Fantom in Iray.

    ta daaaa

  • In general,  I like competition in business,  because variety of choice gives us options.  I still use 3DL whenever a scene is too big to fit on my 4 GB card to render with Iray.  I'm totally neutral on which tool or render engine.  As long as the result looks good enough,  I'm okay with that.

  • kyoto kid Posts: 41,040

    Two years old doesn't matter. Most development for that time went to OSL. You don't miss a lot of things. It's not as if anything revolutionary has been made.

    There hasn't been a revolution in CG for quite a while. But the evolution is palpable. Small things that matter. Subsurface sampling, general sampling, all that making-life-easier progress.

    Oh yeah BTW, the latest builds have fixed the major slowdown with UE2 bounce light that occurred somewhere between 11 and 12, which Parris apparently managed to report.

     

    3DL is a full-featured production renderer with programmable shaders. Iray shader programming is very limited.

    MDL is pretty powerful actually. You shouldn't be that dismissive, really.

    What Iray is currently sorely missing is a curve primitive (like for LAMH hair).

    Neither does it have a dedicated outliner ("inker"). But apparently some folks managed to do NPR without it.

    Not if you think in terms of exports. What is more standard? Obj/FBX with equivalent channels, or Uber/whatever?

    And what about products that are designed with these shaders? Are you going to abolish them too?

    Deleting these shaders would prevent loading and rendering products that use them. Leaving them in does no harm.

    Dude! Chill, I beg you! It's me, Kettu, and my sense of humour. *rolls eyes*

    However, vendors should not be encouraged to use DS Default for their 3Delight mats. UberSurface will look miles better if only because its grazing highlights are prettier. And it pwns AoA Subsurface in terms of performance because it does not have that Shader Mixer bug where it assigns the shader hitmode to the surface even when there is no opacity map.

    Speaking of standards - it's 2017. The standard for texturing is PBS. Your renderers may vary.

    99% of 3Delight's features are accessible, e.g. UberSurface has all these features.

    Dude, you're messing with lurkers' minds here. The features we were talking about are new physically plausible shading models - not in UberSurface because of age; and fine control over RiOptions and RiAttributes. Of which UberSurface cannot do RiOptions because in DS only "scripted renderer" can set RiOptions, and of RiAttributes it only does visibility.

     

    ebergerly said:

    So you guys would rather have a bunch of different shader systems and renderers so you run into issues like the one that brought up this thread in the first place? 

    And it seems like the market has agreed that 3DL isn't preferable, in favor of Iray. So at least that worked.  smiley

    Yes, I want options. I don't want someone developing a shader to have to make it the exact same as every other shader. Rigid adherence to the status quo is really bad for innovation. Think where we'd be if DAZ still used cr2s because that's what everyone used, and changing it would break things.

    ..indeed.  Look what happened when MS tried to apply a "one size fits all devices" philosophy to an OS.

    ...we got Windows 8.0.

  • kyoto kid Posts: 41,040

    As I've said, for some of us, it's not about reality.  I like being able to render things "in camera" that break how light really works.  (I also hate postwork, and see it as a necessary evil.)

    Like, say I wanted to render the scene from the beginning of Peter Pan when he's trying to catch his shadow.

    Iray would require (as I understand it) at least 3 passes.  One base image of the Nursery, one with Fantom Pan for his shadow, and one with Peter rendered alone so that he's not casting a shadow on anything.  Render times in my experience would average about 30 minutes per canvas pass (some passes longer, some shorter), plus the necessary, if quick, postwork to composite, get the layering right for the shadow pass, and possibly add Tink's glow, since I wouldn't want the same level of bloom on the rest of the room as I'd want on the sphere that was Tink.  This doesn't include the time to optimize the scene to render on my system.

    In 3DL, I can do one pass running about an hour to an hour and a half, with no compositing postwork necessary, including Tink's glow.  This will require the use of pwCatch and pwEffect, of course.  And much less time spent on optimizing the scene to render on my system, not just because I know what needs optimizing, but also because it doesn't need as much optimization.

    ...I've done "in render" effects like that as well.

    lili final pw.png
    1200 x 1200 - 2M
  • agent unawares Posts: 3,513
    edited December 2017
    kyoto kid said:
    ebergerly said:

    So you guys would rather have a bunch of different shader systems and renderers so you run into issues like the one that brought up this thread in the first place? 

    And it seems like the market has agreed that 3DL isn't preferable, in favor of Iray. So at least that worked.  smiley

    Yes, I want options. I don't want someone developing a shader to have to make it the exact same as every other shader. Rigid adherence to the status quo is really bad for innovation. Think where we'd be if DAZ still used cr2s because that's what everyone used, and changing it would break things.

    ..indeed.  Look what happened when MS tried to apply a "one size fits all devices" philosophy to an OS.

    ...we got Windows 8.0.

    Scary.

    Post edited by agent unawares on
  • Havos Posts: 5,361

    I remember a thread a year or so back, where a couple of the 3DL experts (one of them could have been Wowie) were arguing that UE could be made just as fast as the AoA ambient light by altering the UE light's parameters to make it the same quality as the AoA light. I recall lots of sample renders using both lights being posted, and the times for both, where the image quality was similar, were around the same. The 3DL experts concluded that the main difference between the two lights speedwise, was the flagging, which the AoA lights had, and UE did not. Thus AoA could flag a hair, and change its render settings to significantly bring down the render time.

    I still think the AoA lights are better for most "ordinary" users, as they were just fast "out of the box", without all the parameter fiddling needed. I never really used UE lights at all, as on the few occasions I tried it, it was far too slow (for me 15-30 mins is around the max I can be bothered to wait for a render).

    I believe the real issue with 3DL is that it is not easy to get right for novice users. I stuck with Poser for a period because I preferred the look of Firefly over 3DL, but at that time my expertise was little more than being able to position spot/distant lights, and using just those, Firefly looked better. The appearance of the AoA ambient light, plus how easy people were saying it was to use, was one of the main factors that contributed to me switching from Poser to 3DL, as I could finally create renders that looked as good as my Firefly renders in a reasonable render time. However, as easy as 3DL seemed to me at the time, Iray came along and was even easier. Pose your subject, and with the default render settings the results looked great. I am sure many new DS users are seeing this and quickly abandon any attempts to learn 3DL, and as we all know, new users are the lifeblood that keeps something moving along and developing (since they provide the money by buying products).

     

     

  • kyoto kid Posts: 41,040

    ...I still feel there is a need for a more efficient set of default Iray to 3DL conversion presets like there is for 3DL to Iray.

    Again, I'd have no issue with even plopping down money for such a utility.

  • ebergerly Posts: 3,255
    edited December 2017

    I guess one of the issues I have with some non-realistic renderers and shader systems is this...

    When I see a render of, say, a character, that is like 80% real, but not quite "realistic", it looks really strange to me. Now if it's a comic like Peter Pan, it's obvious that it's a comic and all the lighting and shading and stuff is appropriately non-realistic and "cartoony". Dimensions are distorted, colors are cartoony, and so on.

    But when I see a render that looks like it's trying to be realistic, but the lighting and shadows and shaders look clearly fake, it is a bit of a turn-off for me. Like shadows that are a uniform gray or something. I see that a lot in many 3D renders. It just looks flat and dull and almost strange to me. 

    Maybe I'm missing something, but it seems like there is realistic and there is cartoony, and there shouldn't be much in-between. And my assumption (probably incorrect) was that the older renderers were the "in-between" renderers that just look strange. I'm not an art person so I'm probably way off, but anyway.... 

    Post edited by ebergerly on
  • kyoto kid Posts: 41,040
    edited December 2017

    ...so by that token a painting of, say, a person or scene would fall into the category of an NPR render engine, as it isn't photo-real and the shadows, as well as the SSS of the skin, may seem a bit off, but otherwise it is still a pretty accurate rendition of the real subject. I guess that makes them "mediocre renderings" as well (the term "render" was used in the visual art world long before CGI). 

    Whether what I create looks totally photo-real, sort of, or not at all matters not, as long as the final result touches someone and/or tells a story.  We all have our own personal tastes (I for one am not very taken by the abstract expressionist school of painting but on the other hand have a strong appreciation for the Bauhaus/Minimalist schools of architecture), and the idea of what is "good" or "inferior" art styles can be (and usually is) totally subjective.

    Post edited by kyoto kid on
  • kyoto kid said:

    ...so by that token a painting of, say, a person or scene would fall into the category of an NPR render engine, as it isn't photo-real and the shadows, as well as the SSS of the skin, may seem a bit off, but otherwise it is still a pretty accurate rendition of the real subject. I guess that makes them "mediocre renderings" as well (the term "render" was used in the visual art world long before CGI). 

    Whether what I create looks totally photo-real, sort of, or not at all matters not, as long as the final result touches someone and/or tells a story.  We all have our own personal tastes (I for one am not very taken by the abstract expressionist school of painting but on the other hand have a strong appreciation for the Bauhaus/Minimalist schools of architecture), and the idea of what is "good" or "inferior" art can be totally subjective.

    Well said.

  • kyoto kid said:

    ...so by that token a painting of, say, a person or scene would fall into the category of an NPR render engine, as it isn't photo-real and the shadows, as well as the SSS of the skin, may seem a bit off, but otherwise it is still a pretty accurate rendition of the real subject. I guess that makes them "mediocre renderings" as well (the term "render" was used in the visual art world long before CGI).

    Nope. A real-life "mediocre rendering" would be one of those where the art student kind of has anatomy down by now, but they just aren't bothering to look at the light for reference, so colors and shapes and bounce all go wrong, and they keep polishing and adding details on top anyway, because enough bling and no one will notice, right? Right? (You've seen these, yeah?)

  • ebergerly Posts: 3,255

     

     

    kyoto kid said:

    ...so by that token a painting of, say, a person or scene would fall into the category of an NPR render engine, as it isn't photo-real and the shadows, as well as the SSS of the skin, may seem a bit off, but otherwise it is still a pretty accurate rendition of the real subject. I guess that makes them "mediocre renderings" as well (the term "render" was used in the visual art world long before CGI).

    NPR = National Public Radio? smiley

    I guess with paintings and stuff you know are not photographs, the intent is clear. You don't expect a photo-real painting, so when you see one that's even close it's pretty amazing. I guess it depends on what you expect. 3D renders have always been a little strange to me. Paintings you know are not real, so you don't expect realism. Photographs are, so you know what to expect...realism. Cartoons are cartoons, so you don't expect realism. 

    But 3D renders are kind of in-between. Many kind of straddle the real/unreal area, so you don't really know what to make of it. 

    I dunno, maybe it's just me. 

  • ebergerly said:
     

    But 3D renders are kind of in-between. Many kind of straddle the real/unreal area, so you don't really know what to make of it. 

    NPR = Non-Photorealistic Rendering, which is any render that obviously isn't intended to be photo-realistic.

  • DaWaterRat Posts: 2,885
    edited December 2017
    ebergerly said:

    I guess one of the issues I have with some non-realistic renderers and shader systems is this...

    When I see a render of, say, a character, that is like 80% real, but not quite "realistic", it looks really strange to me. Now if it's a comic like Peter Pan, it's obvious that it's a comic and all the lighting and shading and stuff is appropriately non-realistic and "cartoony". Dimensions are distorted, colors are cartoony, and so on.

    But when I see a render that looks like it's trying to be realistic, but the lighting and shadows and shaders look clearly fake, it is a bit of a turn-off for me. Like shadows that are a uniform gray or something. I see that a lot in many 3D renders. It just looks flat and dull and almost strange to me. 

    Maybe I'm missing something, but it seems like there is realistic and there is cartoony, and there shouldn't be much in-between. And my assumption (probably incorrect) was that the older renderers were the "in-between" renderers that just look strange. I'm not an art person so I'm probably way off, but anyway.... 

    When did I say I was doing a toon take on Peter Pan?  It was a stage play and a novel well before Disney got his hands on it, and there have been several live action takes on the story as well.

    I don't think of myself as an art person so much as a storyteller, and I like to tell stories of fantastical things - like Faerie playboys seducing pretty young things who ignore the tell of his shadow showing the Goat's Horns that they (and the viewer) cannot see, or sorcerers manipulating the shadows of a room to reach out and grab someone.  Or benevolent sun deities who, by their nature, never have shadows on their faces.

    Are these things possible in Iray?  Yes, with varying degrees of postwork.  But since I also like to do serial art using the same characters, it's faster and easier if I can set it up to work "in camera" rather than having to go through the same composite sequence every time.

    and as for the final look I'm after... I'm trying to make renders that look like paintings (though without the brush strokes).  Again, it is something that is possible in Iray.  I just need to work out what settings will get me there.

    Post edited by DaWaterRat on
  • kyoto kid Posts: 41,040
    ebergerly said:

     

     

    kyoto kid said:

    ...so by that token a painting of, say, a person or scene would fall into the category of an NPR render engine, as it isn't photo-real and the shadows, as well as the SSS of the skin, may seem a bit off, but otherwise it is still a pretty accurate rendition of the real subject. I guess that makes them "mediocre renderings" as well (the term "render" was used in the visual art world long before CGI).

    NPR = National Public Radio? smiley

    I guess with paintings and stuff you know are not photographs, the intent is clear. You don't expect a photo-real painting, so when you see one that's even close it's pretty amazing. I guess it depends on what you expect. 3D renders have always been a little strange to me. Paintings you know are not real, so you don't expect realism. Photographs are, so you know what to expect...realism. Cartoons are cartoons, so you don't expect realism. 

    But 3D renders are kind of in-between. Many kind of straddle the real/unreal area, so you don't really know what to make of it. 

    I dunno, maybe it's just me. 

    ...in the 1980s when I first dabbled in CG, nothing looked "real". 

    In traditional painting and drawing there were two branches known as the "Photorealist" and "Hyperrealist" schools.  I experimented in these styles back in college and I can say that (save for coding, rather than modelling, your own CG meshes and textures from scratch) it is a lot harder than anything we do here. I found it interesting and quite the challenge, but I could just as well get the same results by grabbing my camera, plopping in a roll of Kodachrome or Tri-X Pan, and going out to take photos.

    This is actually a pencil drawing:

    This is an oil painting (notice the reflected light on the pavement from the windows)

    This is an airbrush painting (not merely touching up a photo but a complete painting)

  • Two years old doesn't matter. Most development for that time went to OSL. You don't miss a lot of things. It's not as if anything revolutionary has been made.

    There hasn't been a revolution in CG for quite a while. But the evolution is palpable. Small things that matter. Subsurface sampling, general sampling, all that making-life-easier progress.

    Oh yeah BTW, the latest builds have fixed the major slowdown with UE2 bounce light that occurred somewhere between 11 and 12, which Parris apparently managed to report.

    I know, but all of that doesn't really have a big impact on the subject at hand, which is 3DL support from PAs inside DS.

    Having these evolutions inside DS will only make a few people like you and me happy.

    3DL is a full-featured production renderer with programmable shaders. Iray shader programming is very limited.

    MDL is pretty powerful actually. You shouldn't be that dismissive, really.

    What Iray is currently sorely missing is a curve primitive (like for LAMH hair).

    Neither does it have a dedicated outliner ("inker"). But apparently some folks managed to do NPR without it.

    I don't dismiss MDL. I already mentioned that it can handle other renderers and is more than just a programming language, which is a good point. I just say it's inferior to RSL. I didn't say it's bad. It's more than enough for the DS crowd, and limited only if you start trying to do some specific thing you could easily do with RSL.

    Nvidia will introduce AI-driven features, and that may counterbalance the comparison. But right now they are not there.

    Not if you think in terms of exports. What is more standard? Obj/FBX with equivalent channels, or Uber/whatever?

    And what about products that are designed with these shaders? Are you going to abolish them too?

    Deleting these shaders would prevent loading and rendering products that use them. Leaving them in does no harm.

    Dude! Chill, I beg you! It's me, Kettu, and my sense of humour. *rolls eyes*

    OK wasn't sure

    However, vendors should not be encouraged to use DS Default for their 3Delight mats. UberSurface will look miles better if only because its grazing highlights are prettier. And it pwns AoA Subsurface in terms of performance because it does not have that Shader Mixer bug where it assigns the shader hitmode to the surface even when there is no opacity map.

    Speaking of standards - it's 2017. The standard for texturing is PBS. Your renderers may vary.

    99% of 3Delight's features are accessible, e.g. UberSurface has all these features.

    Dude, you're messing with lurkers' minds here. The features we were talking about are new physically plausible shading models - not in UberSurface because of age; and fine control over RiOptions and RiAttributes. Of which UberSurface cannot do RiOptions because in DS only "scripted renderer" can set RiOptions, and of RiAttributes it only does visibility.

    No, I'm talking about math. Physically correct = energy conservation + Fresnel-driven specular. With DS shaders (Uber and default), you can do energy conservation by hand, but that could have been done easily in the code. Same for Fresnel and other things.

    You are talking about "new" built-in models that could have been coded manually. The fact that some models are now included doesn't mean you couldn't do it yourself before. That is how it was done in old 3Delight, and that is what I'm talking about. Just trying to fight the myth that PBR is some new tech discovered in the last few years and that it can't be achieved in older renderers.
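    For the lurkers: "energy conservation + Fresnel-driven specular" really is just a couple of lines of math. Here's a minimal sketch in Python using Schlick's approximation; the function names are mine, not from any DS shader.

```python
def schlick_fresnel(f0, cos_theta):
    """Schlick's approximation: reflectance rises toward 1 at grazing angles.

    f0 is the reflectance at normal incidence; cos_theta is the cosine of
    the angle between the view direction and the surface normal.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def conserve_energy(diffuse, specular_weight):
    """Scale diffuse down so diffuse + specular never exceeds the incoming light."""
    return diffuse * (1.0 - specular_weight)

# A typical dielectric reflects ~4% head-on but ~100% at a grazing angle.
head_on = schlick_fresnel(0.04, 1.0)   # 0.04
grazing = schlick_fresnel(0.04, 0.0)   # 1.0
```

    Wiring those two expressions into the specular and diffuse channels of Uber or the default shader is exactly the "by hand" part.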

     

    From what I saw looking at the Blender manual, the Cycles Principled shader is a cut-down version of the DAZ Iray Uber shader, since Blender can be set to provide the added details that are usually handled by texture maps in Studio.

    In fact it's the other way around. Blender is following Pixar's principled shader. DAZ Uber took it as its inspiration and added a bunch of channels to it.

    ebergerly said:

    And here's a side-by-side showing the Blender Principled BSDF node settings on the left, and the D|S shader settings on the right. Much different. 

    Try the metallicity workflow on the Iray Uber, should be closer.

    Either way, it's all just terms.

    Yeah, I wish it were as easy as waving my hands and saying it's all just terms, but it isn't. Somebody has to wade through all of those terms and figure out how to export and import them. Blender to SP, then SP to D|S. 

    I'm hoping we get to the point where the market agrees that a single, simple shader system is the way to go. It's better for PAs (they don't have to worry about designing for different shader systems), and it's certainly better for customers, who won't have to work with 30 different nodes just to set up a surface. 

    Why wait? Every render engine gives you the capability to create your own shader. Make your own and duplicate it in all the apps. With many apps using a node system, that is easier than before, and even non-coders could do it. Nobody is forcing you to use the built-in monolithic shader.

  • ebergerly Posts: 3,255

    Why wait? Every render engine gives you the capability to create your own shader. Make your own and duplicate it in all the apps. With many apps using a node system, that is easier than before, and even non-coders could do it. Nobody is forcing you to use the built-in monolithic shader.

    It's an issue of exporting/importing between apps and having the existing shaders match up, not writing shaders. I'm looking into doing some scripting to do that so it's not all manual. But it means scripts for Blender -> Substance -> D|S. A major pain, when all you want to do is produce renders. The plan is to just send UV'd OBJs from Blender to SP, do materials in SP, then import into D|S. Which means I'll need to get up to speed on D|S scripting to figure out how to convert the SP materials automatically upon import. 

    Has anyone done much D|S scripting? I think I've seen a tutorial in the store. I hope it's similar to Blender scripts, cuz I'm not looking forward to having to learn something new. Or maybe it's easier to pack the materials on export from SP into D|S format. Arrgghh...... 

     

  • kyoto kid Posts: 41,040
    edited December 2017

    ...this is exactly my position as well. I don't wish to spend a fair amount of time scripting, as that is not what I got into this for. 

    When I painted, if I needed a brush I'd go to an art store and buy one, not make one myself.  The most "extracurricular" work I did outside of painting and drawing was making my own stretcher frames and stretching my own canvas (primarily an economic concern but also an artistic one, as not many places sell 6' x 4' or 8.5' x 1.5' pre-stretched canvas), along with cutting mats for, and framing, drawings.

    Post edited by kyoto kid on
  • ebergerly Posts: 3,255

    Hold on...this could be easier than I thought. In Substance Painter you can configure your material exports, and even set up your own. You can create an output map that's grayscale, RGB+A, or whatever. They give you a ton of flexibility. So it looks like I can select Metal/Roughness preset in SP, then tweak those materials so they more closely match D|S, and then do a D|S script to pull those channels into the correct materials.

    Fingers crossed.....

    BTW, has anyone done this before?  

  • algovincian Posts: 2,610
    ebergerly said:

    It's an issue of exporting/importing between apps and having the existing shaders match up, not writing shaders. I'm looking into doing some scripting to do that so it's not all manual.

    Yeah - this is precisely why MDL defines physical characteristics of materials and not shading functions:

    "THE NVIDIA MATERIAL DEFINITION LANGUAGE (MDL) gives you the freedom to share physically based materials and lights between supporting applications."

    https://www.nvidia.com/en-us/design-visualization/technologies/material-definition-language/

    I have never used SP, so I don't know whether modifying the parameters of the DAZ IRay Uber Base via script is sufficient or not. In the back of my mind, I'm kind of wondering why nobody has written a utility to do so yet. It makes me wonder if there's anything proprietary about what SP exports/references.

    Anyway, DS imports MDL, and Shader Mixer can be used to make the process rather easy - you might find this old thread interesting:

    https://www.daz3d.com/forums/discussion/59018/playing-around-with-iray-mdl/p1

    Also, here's a rather long list of scripting examples (including materials) written by the man himself (Rob):

    http://docs.daz3d.com/doku.php/public/software/dazstudio/4/referenceguide/scripting/api_reference/samples/start#materials

    Hope this helps.

    - Greg

  • ebergerly Posts: 3,255
    edited December 2017

    Thanks algovincian. I think the challenge will be matching up the SP Metallicity/Roughness channels with the MANY D|S Metallicity/Roughness channels. Below is a comparison; in some cases SP uses a single map where D|S can have multiple maps. For example, SP has Diffuse, while D|S has Diffuse Roughness and Diffuse Overlay Weight. And for solid glass SP uses a "Transmissive" map, while D|S uses "Volume", with "thin walled" turned off. 

    Hopefully some of the references you posted will make this a lot easier.  

    DS Channels.PNG
    348 x 620 - 14K
    Substance Channels.PNG
    448 x 526 - 21K
    Post edited by ebergerly on
  • ebergerly Posts: 3,255

    I have never used SP, so I don't know whether modifying the parameters of the DAZ IRay Uber Base via script is sufficient or not. In the back of my mind, I'm kind of wondering why nobody has written a utility to do so yet. It makes me wonder if there's anything proprietary about what SP exports/references.

    No, in fact SP gives you tons of control over what you export. It uses Metallicity/Roughness just like D|S Iray, but you can also define your own material maps very quickly. So I'm guessing I can just add the extra maps that D|S uses and save that as a preset. Hopefully once I do that I won't have to tweak the D|S import, cuz the maps will already be there. 

  • ebergerly Posts: 3,255
    edited December 2017

    If anyone was wondering exactly how many possible maps/settings are available in D|S for a Metallicity/Roughness surface, the answer is 78. That's if you go through and make sure every option is turned on (or off, as in "thin walled"), because those options aren't shown when the feature is turned off. That's compared to the 19 Substance Painter Metallicity channels. SP doesn't even have stuff like Metal Flakes and the "top coat" options, as far as I can tell. 

    Looks like I'll have to stick with the limited SP channels for now and just map those to the equivalent D|S channels.  
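    If it helps anyone attempting the same thing, here's a rough Python sketch of the lookup I have in mind. It assumes SP's usual export-suffix naming; every D|S property label on the right-hand side is a guess and would need checking against the actual surface tab.

```python
# Hypothetical mapping from Substance Painter export-map suffixes to
# Daz Studio Iray Uber property labels. The suffixes follow SP's usual
# Metal/Roughness export naming; the labels on the right are guesses
# and must be verified against the real D|S surface properties.
SP_TO_DS = {
    "BaseColor": "Base Color",
    "Metallic": "Metallic Weight",
    "Roughness": "Glossy Roughness",
    "Normal": "Normal Map",
    "Height": "Displacement Strength",
    "Emissive": "Emission Color",
}

def match_maps(filenames):
    """Pair each exported texture file with the D|S channel it feeds."""
    paired = {}
    for name in filenames:
        stem = name.rsplit(".", 1)[0]      # strip the file extension
        suffix = stem.rsplit("_", 1)[-1]   # e.g. "Roughness"
        channel = SP_TO_DS.get(suffix)
        if channel is not None:            # skip maps with no table entry
            paired[channel] = name
    return paired

# "balls_AO.png" is ignored here: no "AO" entry in the table.
maps = match_maps(["balls_BaseColor.png", "balls_Roughness.png", "balls_AO.png"])
```

    A D|S import script would then walk that dict and assign each file to the named property, with the extra D|S-only channels left at their defaults.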

    Post edited by ebergerly on
  • I'm trying to support 3DL with my new freebie (Chinese Health Balls and Case) - want to get it sorted before Christmas - but I'm having trouble with mirror settings on the balls.

    See the comparison pic (no shadows on the right pic, but it makes no difference to the problem), then if you think you can help, kindly check this thread in Nuts&Bolts:

    https://www.daz3d.com/forums/discussion/219901/mirrored-sphere-in-3dl

    to see what's been tried already (basically, the mirror on the right pic (not my model) works fine and has the same settings as the spheres but the spheres come out all dark and not metal-looking).

    Thanks for any help :)


  • kyoto kid Posts: 41,040
    edited December 2017

    ...are you using the default environment setting in the Iray one?   For 3DL, you need something to reflect for the reflectivity to work: either a reflection map (like Poser uses) or, better, a skydome or HDRI sphere added to the scene.

    Also check your reflection channel colour; it should be white (255, 255, 255).

    Post edited by kyoto kid on