LuxRender 1.5 Released


Comments

  • Bobvan Posts: 2,652

    Guess it's dead, like tufan's old plugin

  • mork Posts: 278

    I hadn't tried LuxRender yet because I already had enough things to figure out, but I finally gave it a try yesterday.
    I have to say... I'm astonished. It has what I'm really missing in DAZ: a render path for both AMD and NVidia.
    On my R290x, render time is down from 2+ days to a couple of hours at most, which is a HUGE improvement and makes rendering fun again. It renders so fast that I have a new problem: I need a new CPU that can feed the GPU data quickly enough. It looks like I'm capped by the CPU right now at ~2.0-5.0 MS/s (rough numbers below).
    The downsides are the transfer of (iray) materials and G3F, as well as hair sometimes, and there's room for improvement in the default skin shader, but I really hope Reality will improve on this.
    It would have been so cool if DAZ had AMD support as well, but that's obviously not going to happen, for quite a while at least. :(
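    For scale, assuming MS/s means megasamples per second, and taking a hypothetical 1920x1080 frame at 1,000 samples per pixel:

    \[
    \frac{1920 \times 1080 \times 1000~\text{samples}}{3 \times 10^{6}~\text{samples/s}} \approx 690~\text{s} \approx 11.5~\text{min}
    \]

    A clean, low-noise render usually needs several thousand samples per pixel, which is how a CPU-capped 2.0-5.0 MS/s still works out to a couple of hours per image.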

  • nDelphi Posts: 1,850

    I am waiting for the update to Reality, and I really want to start using LuxRender again. If it is faster I will be using it more and more. The one thing I like about LuxRender is the ability to continue a render from where it last saved its progress - something that should have been added to iRay in DAZ Studio. I have lost nine-hour renders because of a sudden loss of electricity and the battery backup not doing its job correctly. It isn't fun.

  • mork Posts: 278
    nDelphi said:

    The one thing I like about LuxRender is the ability to continue a render from where it last saved its progress.

    I like that feature very much as well; it's really cool. It also saves the image every so often, so even if it crashed and you couldn't resume (which you can), you'd still have the latest saved render; it's not lost like in DAZ/Iray. That's happened to me a few times already, and it's really not cool when you've rendered for two days and everything goes *poof*. (See the settings sketch below.)
    And even if it doesn't crash, there are still my cats, which LOVE to jump on the tower and stomp on the power switch with their feet: *poof*, power off. Linux asks me if I really want to shut down; Windows just powers off. I love my cats, but sometimes... :-D
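    For anyone who wants to tune this, both behaviors are set in the Film block of the exported scene file. A minimal sketch from memory of LuxRender's fleximage film (the parameter names should match the LuxRender documentation, but the values and filename here are just examples, so double-check against the wiki):

        Film "fleximage"
            # example output name - Reality normally fills this in
            "string filename" ["my_render"]
            "integer xresolution" [1920] "integer yresolution" [1080]
            # write the output image every 120 seconds, so a crash
            # costs at most two minutes of progress
            "integer writeinterval" [120]
            # save the .flm resume film alongside the image
            "bool write_resume_flm" ["true"]
            # as I recall, true here discards an existing .flm and starts
            # fresh, so leave it false if you want restarts to resume
            "bool restart_resume_flm" ["false"]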

  • Bobvan Posts: 2,652
    edited September 2015

    From two-plus days to a few hours? Killer! I am familiar with R4, so it will be easy to resume using it. This took 18 hours: http://fav.me/d8t9l8g so having it in an hour or less would be great! My kitty Moo sometimes steps on my power bar as well. How can you get mad at that...

  • Ghosty12 Posts: 2,006
    edited September 2015

    Just got an email notification from Preta-3D about Reality 4.1..

    Quoted from the email I received.. :D

    "Dear Reality artist.

    Reality 4.1 has reached Gamma stage. This means that the code for the program has been frozen, and changes will not be made, not even cosmetic ones. This is it! We are just making sure that we covered all the bases and that we test every part of the program as much as possible.

    At this stage we are updating the documentation and we will be announcing a release date soon."

    So not long to go before we get Reality 4.1.. :)

  • Taoz Posts: 9,789
    ghosty12 said:

    That would be the case: when the 570GTX was in my old i7 920 system, the case was one of those cheap ones with terrible ventilation, and I'd often have the side off to get air going into the case, so that did not help. But as before, I want to keep the 650GTX Ti I have been given, since I can't afford a new card right now.

    Yes, case ventilation is very important - even the best CPU/video card coolers/fans are useless if the temperature inside the case is too high.

  • Taoz Posts: 9,789
    mtl1 said:

    *All* semiconductor products fail eventually.

    I've heard claims that old Commodore 64s lasted longer if they remained turned on all the time, even though the chips got quite hot. Allegedly the reason was that the repeated expansion and contraction of the materials when heating up and cooling down did more damage than the constant heat. How it is with the chips they make today, I don't know.
  • Apothis68
    nicstt said:

    Cards will wear out quicker using them for rendering, provided it is a lot of rendering; they are consumer cards and not meant for the long-term wear that prolonged and extensive rendering gives; rendering pushes a card harder than gaming, on average.

    I thought this too, but my experience with my 980ti has been different. Playing GTA5 for a little while will have the card easily hitting 65C with a reasonably aggressive fan profile. When rendering in iRay, it never gets above 57C with the same fan profile. Which worries me a little, as I'm concerned it's not being driven as hard as it should be...

  • thd777 Posts: 932
    Apothis68 said:

    I thought this too, but my experience with my 980ti has been different. Playing GTA5 for a little while will have the card easily hitting 65C with a reasonably aggressive fan profile. When rendering in iRay, it never gets above 57C with the same fan profile. Which worries me a little, as I'm concerned it's not being driven as hard as it should be...

    That's quite normal. When you render with a gaming GPU you are only using a subset of the hardware on the card (the parallel processing units). You are using that part very intensively, but many other functions of the GPU hardware (antialiasing, video encoding, PhysX, ...) are not used at the same time. My cards also get significantly warmer when playing, for example, Witcher 3, but it depends on the game, of course. (A quick way to check is below.)

    TD
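
    If you want to see how hard the card is actually being worked, temperature alone is misleading; utilization and power draw tell you more. Something like this with NVIDIA's nvidia-smi tool should do it (query field names from memory; they can vary by driver version):

        nvidia-smi --query-gpu=temperature.gpu,utilization.gpu,power.draw --format=csv -l 5

    A card that is rendering flat out will typically sit near 100% utilization even while running cooler than it does in a game.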

  • acanthis
    mtl1 said:
    acanthis said:
    nicstt said:

    Cards will wear out quicker using them for rendering, provided it is a lot of rendering; they are consumer cards and not meant for the long-term wear that prolonged and extensive rendering gives; rendering pushes a card harder than gaming, on average.

    Please can you provide a quote from NVidia that confirms this - or is this just a personal point of view?

    The more a semiconductor product is run at full load, the more thermal energy it dissipates, thus reducing its thermal budget. That's discounting the normal wear and tear of mechanical parts such as fans, TIM, etc.

    *All* semiconductor products fail eventually.

    Yes, that much is obvious and well known. But where is the specific detailed quote from NVidia warning users not to use their cards for long iRay renders in case they cause heat damage?
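
    To be fair, the general heat-versus-lifetime relationship is usually modelled with the Arrhenius acceleration factor - an industry rule of thumb, not anything NVidia publishes about iRay. With an assumed activation energy of about 0.7 eV and Boltzmann's constant k_B = 8.617e-5 eV/K:

    \[
    AF = \exp\!\left[\frac{E_a}{k_B}\left(\frac{1}{T_1} - \frac{1}{T_2}\right)\right] \approx \exp\!\left[\frac{0.7}{8.617 \times 10^{-5}}\left(\frac{1}{330} - \frac{1}{350}\right)\right] \approx 4
    \]

    So a sustained 20 K rise in junction temperature roughly quadruples the wear-out rate under that model - which still says nothing about whether rendering actually runs a card hotter than gaming does.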

  • Apothis68
    thd777 said:
    Apothis68 said:

    I thought this too, but my experience with my 980ti has been different. Playing GTA5 for a little while will have the card easily hitting 65C with a reasonably aggressive fan profile. When rendering in iRay, it never gets above 57C with the same fan profile. Which worries me a little, as I'm concerned it's not being driven as hard as it should be...

    That's quite normal. When you render with a gaming GPU you are only using a subset of the hardware on the card (the parallel processing units). You are using that part very intensively, but many other functions of the GPU hardware (antialiasing, video encoding, PhysX, ...) are not used at the same time. My cards also get significantly warmer when playing, for example, Witcher 3, but it depends on the game, of course.

    TD

    Interesting, thank you. It wasn't my experience with my previous 670 card, which was definitely pushed hotter by iRay than by gaming.

  • Mistara Posts: 38,675

    Does this pertain to Luxus for DS also?

    thanks!

  • Bobvan Posts: 2,652
    edited September 2015

    Just as I'm getting into Iray with much faster times! It will be good to have both run faster; no more 6-to-20-hour times, yay. I brought in my tower which I used to render all the Lux stuff; sounds like my ATI card has had it. I bought a plan, so it will most likely be replaced with an NVIDIA card, since they may no longer make mine.

  • grinch2901 Posts: 1,246
    mjc1016 said:
    mtl1 said:
    *All* semiconductor products fail eventually.

    And it's those mechanical parts that are the weak links...because in most cases once they go, it's all over...and the question is regular or extra crispy?

    I'm working on a project that uses the newer Gallium Nitride (GaN) semiconductors for a very high power transmitter. Holy cow, those things are BEASTS. We tried to cook them running at elevated heat and full power for months at a time and couldn't do it. The boards will melt before the GaN amplifiers do. Plus they're faster and use less energy. Once they get those things into PC electronics it will make a huge change in the industry.

  • mjc1016 Posts: 15,001

    The boards will melt before the GaN amplifiers do. Plus they're faster and use less energy. Once they get those things into PC electronics it will make a huge change in the industry.

    Yeah...you'll be able to hook up a water line, run it over your CPU and make your coffee, without having to move away from your computer.

  • grinch2901 Posts: 1,246
    mjc1016 said:

    The boards will melt before the GaN amplifiers do. Plus they're faster and use less energy. Once they get those things into PC electronics it will make a huge change in the industry.

    Yeah...you'll be able to hook up a water line, run it over your CPU and make your coffee, without having to move away from your computer.

    I prefer hot chocolate! But yeah, heat will always be an issue. The difference here is that the chipsets won't be the weakest link anymore. Now please pass the whipped cream; I have an Iray render going, so it's a perfect time to brew that cocoa!

  • RAMWolff Posts: 10,155
    mjc1016 said:
    mtl1 said:
    *All* semiconductor products fail eventually.

    And it's those mechanical parts that are the weak links...because in most cases once they go, it's all over...and the question is regular or extra crispy?

    I'm working on a project that uses the newer Gallium Nitride (GaN) semiconductors for a very high power transmitter. Holy cow, those things are BEASTS. We tried to cook them running at elevated heat and full power for months at a time and couldn't do it. The boards will melt before the GaN amplifiers do. Plus they're faster and use less energy. Once they get those things into PC electronics it will make a huge change in the industry.

    WOW.. bet those will run a hideously pretty penny too!

  • mjc1016 Posts: 15,001
    RAMWolff said:
    mjc1016 said:
    mtl1 said:
    *All* semiconductor products fail eventually.

    And it's those mechanical parts that are the weak links...because in most cases once they go, it's all over...and the question is regular or extra crispy?

    I'm working on a project that uses the newer Gallium Nitride (GaN) semiconductors for a very high power transmitter. Holy cow, those things are BEASTS. We tried to cook them running at elevated heat and full power for months at a time and couldn't do it. The boards will melt before the GaN amplifiers do. Plus they're faster and use less energy. Once they get those things into PC electronics it will make a huge change in the industry.

    WOW.. bet those will run a hideously pretty penny too!

    So did the plain old silicon we are currently running...back in the day.  Remember when memory was measured in kilobytes and cost an arm and a leg per chip?

  • <nod> And with the speed at which prices are coming down these days, it might be a surprisingly short time before it's affordable. I still remember clearly when USB sticks over 1GB were hideously expensive; a few years ago I looked at the prices after ignoring them for a long time and got an amazingly pleasant surprise.

  • RAMWolff Posts: 10,155

    We shall see... New stuff never comes down in price right away; I bet something like that will have an elevated tag for a few years at worst... Might see some competition coming up with a similar chip and technology, but... yeah, we shall see...

  • RAMWolff Posts: 10,155

    Just got a newsletter; looks like the new Reality 4.1 is going to trump iRay in a lot of ways. COOL! Competition is always good...

    http://preta3d.com/announcing-reality-4-1/

    Watch the video on the page. 

    Full listing of features:

    http://preta3d.com/reality4-1-in-detail/?utm_source=Reality+users&utm_campaign=2c485c2811-Watch+the+new+Reality+4.1+render&utm_medium=email&utm_term=0_95f5d1828e-2c485c2811-140738077

  • nDelphi Posts: 1,850
    edited September 2015
    RAMWolff said:

    Just got a newsletter; looks like the new Reality 4.1 is going to trump iRay in a lot of ways. COOL! Competition is always good...

    The announcement mentions a release date, as well. It will be released on September 21st, 2015. Not long to wait.

  • nDelphi Posts: 1,850
    edited September 2015

    iRay will still have the upper hand as far as popularity is concerned, as it is built right into DAZ Studio and all new figures and props will be designed with iRay in mind.

  • RAMWolff Posts: 10,155

    True, but being able to do a render without having to have an NVIDIA card on board, or other specific specs, will be great for a lot of other folks. I still mostly use 3Delight. I'm building up my iRay bits slowly, and I'll jump in head first at some point, but I'm honestly in no hurry!

  • IceCrMn Posts: 2,122

    Check your emails, everyone; I found one this morning from Preta-3D.

     

    ***Spoiler***

     

    it's due Sep 21st

    and it does automatic iray shader conversion :)

  • RAMWolff said:

    Just got a newsletter; looks like the new Reality 4.1 is going to trump iRay in a lot of ways. COOL! Competition is always good...

    http://preta3d.com/announcing-reality-4-1/

    It looks really nice.
