
Comments

  • Leana Posts: 11,728

    Iray is developed by Nvidia and delivered as a "black box" to the Daz team, who only integrate it into Daz Studio.

    So while that project seems interesting, I'm not sure Daz could do anything with it.

  • Seven193 Posts: 1,080
    edited February 13

    CUDA and iRay are two entirely different things.  Anyone who uses iRay must pay nVidia a license fee, so I see no changes here.

  • I think that ZLUDA is software/firmware that runs at the driver/video card level, so it should make an AMD video card "look" like an NVidia Cuda card.  Changes to IRay and other rendering software should not be necessary, though lots of testing would need to be done.  As for CUDA and iRay - I recall that CUDA is the underlying library/interface that iRay uses, and DAZ has already licensed iRay.
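
    For anyone wondering what "look like an NVidia Cuda card" means in practice, here is a minimal sketch (not taken from ZLUDA or Iray) of the standard CUDA driver API calls an unmodified application makes at startup. As I understand it, ZLUDA's whole trick is to provide a replacement CUDA driver library that answers exactly these calls on top of a Radeon.

    ```cuda
    // Sketch only: what a CUDA application asks the driver before it renders anything.
    // Under ZLUDA, a substitute driver library answers these calls using the AMD GPU,
    // so the application believes a CUDA device is present.
    #include <cuda.h>    // CUDA driver API (link with -lcuda / nvcuda)
    #include <stdio.h>

    int main(void) {
        if (cuInit(0) != CUDA_SUCCESS) {               // initialize the driver
            printf("No usable CUDA driver found\n");
            return 1;
        }

        int count = 0;
        cuDeviceGetCount(&count);                      // how many "CUDA" devices are visible?
        printf("CUDA devices reported: %d\n", count);

        for (int i = 0; i < count; ++i) {
            CUdevice dev;
            char name[256];
            cuDeviceGet(&dev, i);
            cuDeviceGetName(name, (int)sizeof(name), dev);  // with ZLUDA this can name the Radeon
            printf("  device %d: %s\n", i, name);
        }
        return 0;
    }
    ```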

  • PerttiA Posts: 10,024

    nakamuram002 said:

    I think that ZLUDA is software/firmware that runs at the driver/video card level, so it should make an AMD video card "look" like an NVidia Cuda card.  Changes to IRay and other rendering software should not be necessary, though lots of testing would need to be done.  As for CUDA and iRay - I recall that CUDA is the underlying library/interface that iRay uses, and DAZ has already licensed iRay.

    Iray checks, for example, the version of the nVidia driver to determine whether it supports the Iray version being used.
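
    To make that concrete: a rough guess at the kind of gate involved (Iray's real check is not public, and the minimum value below is made up purely for illustration). The renderer queries what CUDA version the installed driver exposes and refuses GPU rendering below some threshold, so a translation layer would have to report something acceptable here as well.

    ```cuda
    // Hypothetical sketch of a driver-version gate - not Iray's actual code.
    #include <cuda.h>    // CUDA driver API
    #include <stdio.h>

    int main(void) {
        int driver_version = 0;
        cuInit(0);
        cuDriverGetVersion(&driver_version);   // CUDA version exposed by the installed driver, e.g. 12020 = CUDA 12.2
        printf("CUDA driver version: %d\n", driver_version);

        const int required = 11080;            // made-up minimum (CUDA 11.8) for illustration
        if (driver_version < required) {
            printf("Driver too old for this renderer build - GPU rendering disabled\n");
            return 1;
        }
        printf("Driver accepted\n");
        return 0;
    }
    ```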

  • NylonGirl Posts: 1,819
    So we just need them to spoof that driver number and make it report the ID of a modern Nvidia card and maybe it will finally be possible to do an all AMD system. Maybe I could even run my large language model locally and get on with my ingenious plan.
  • Gogger Posts: 2,400

    CUDA cores refer to actual math processing hardware on the Nvidia cards. Not really sure how you would emulate that without some serious hardware behind it on another card to run the emulation software.

    Please, FTLOG, don't make us deal with yet another render engine with proprietary textures!  

  • NylonGirl Posts: 1,819

    Gogger said:

    CUDA cores refer to actual math processing hardware on the Nvidia cards. Not really sure how you would emulate that without some serious hardware behind it on another card to run the emulation software.

    Please, FTLOG, don't make us deal with yet another render engine with proprietary textures!  

    It doesn't matter that they named their little processing units "cuda cores". The software is meant to run on lots of little parallel processors. All of the graphics cards have lots of little parallel processors. I've read on a random, iffy website that the performance isn't much different running the apps on AMD instead of Nvidia. This is probably something to celebrate. It makes us less dependent on proprietary things like a true Nvidia driver and graphics card.
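
    For what it's worth, the kind of code those little parallel processors run looks like this: a trivial data-parallel CUDA kernel (a sketch, not from any particular renderer). Nothing about the model is conceptually Nvidia-only; AMD's HIP uses an almost identical one, which is what makes a translator like ZLUDA plausible in the first place.

    ```cuda
    // Minimal data-parallel example: many tiny threads, each handling one element.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void add(const float* a, const float* b, float* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one element per thread
        if (i < n) out[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        float *a, *b, *out;
        cudaMallocManaged(&a, n * sizeof(float));        // unified memory keeps the example short
        cudaMallocManaged(&b, n * sizeof(float));
        cudaMallocManaged(&out, n * sizeof(float));
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        add<<<(n + 255) / 256, 256>>>(a, b, out, n);     // ~4096 blocks of 256 threads
        cudaDeviceSynchronize();

        printf("out[0] = %f\n", out[0]);                 // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(out);
        return 0;
    }
    ```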

  • WendyLuvsCatz Posts: 38,226

    if it ran iray it would use its MDL shaders

    I don't think iray is the priority though for the developers of this

    it is Large Language Models and Latent Space Algorithms 

  • The ZLUDA development site is hosted on GitHub - https://github.com/vosen/ZLUDA. I just inquired whether anyone has tried ZLUDA with DS - https://github.com/vosen/ZLUDA/discussions/83. Maybe someone has.

    If anyone wants to try ZLUDA, the source code and builds can be obtained on GitHub.  I would give it a shot, but I do not have a Radeon Card.

  • scyhoce Posts: 69
    edited February 14

    Leana said:

    Iray is developed by Nvidia and delivered as a "black box" to the Daz team, who only integrate it into Daz Studio.

    So while that project seems interesting, I'm not sure Daz could do anything with it.

     

    I'm not sure Daz needs to do anything with it. From what I'm seeing, you can run this with any software that needs CUDA cores to run. For example, they ran it on Blender's renderer, which only uses CUDA cores, and it worked; they did not modify Blender in any way. They did the same with some other software too.

  • scyhoce Posts: 69

    nakamuram002 said:

    The ZLUDA development site is hosted on GitHub - https://github.com/vosen/ZLUDA. I just inquired whether anyone has tried ZLUDA with DS - https://github.com/vosen/ZLUDA/discussions/83. Maybe someone has.

    If anyone wants to try ZLUDA, the source code and builds can be obtained on GitHub.  I would give it a shot, but I do not have a Radeon Card.

    Yes, we need to find people with AMD GPUs to test it with Daz. I'm pretty sure no one here has one, for obvious reasons.

  • richardandtracy Posts: 5,688
    edited February 14

    Are Radeon cards enough of a leap ahead of nVidia to be worth it? Asking because I don't know anything about Radeon cards, not because I'm trying to throw in a hand grenade. I assume there would be a processing overhead, so the Radeons would probably have to be 3-5% faster at the same price point to get equivalent performance for the money when being used for Iray - do they show this?

    Regards,

    Richard

  • prixat Posts: 1,588

    richardandtracy said:

    Are Radeon cards enough of a leap ahead of nVidia to be worth it?

    I would say yes. Not necessarily in terms of 'a leap ahead' but because you can get 20GB Radeons for half the price of the equivalent VRAM Geforce.

    So very much worth finding out if this works.

  • Leana Posts: 11,728

    scyhoce said:

    Leana said:

    Iray is developed by Nvidia and delivered as a "black box" to the Daz team, who only integrate it into Daz Studio.

    So while that project seems interesting, I'm not sure Daz could do anything with it.

    I'm not sure Daz needs to do anything with it. From what I'm seeing, you can run this with any software that needs CUDA cores to run. For example, they ran it on Blender's renderer, which only uses CUDA cores, and it worked; they did not modify Blender in any way. They did the same with some other software too.

    For it to work we would need DS and Iray to consider the AMD GPU + Zluda as a valid Iray rendering device. I don't know if there would be changes needed in DS / Iray for that, for example in the window where you select which devices you render with, or when Iray checks if you have the minimum driver version required, and so on. The fact that they managed to use it with Blender "as is" is promising though.

    A point to consider is that Nvidia drivers and Iray evolve quite often. Not sure how much work would be required on Zluda's side to "keep up" with that, but judging by the first article, the original developer himself doesn't seem very interested in keeping it updated. It's open source, so other developers might decide to contribute, but there's no guarantee someone will.

  • Laurita Posts: 222
    edited February 14

    I used Radeon cards exclusively before venturing into DAZ and still have one running on my main computer. All in all I very much prefer them, especially the driver stability. If I could bin NVidia I'd do it in a heartbeat.

  • Interesting..

    One of the reasons I still use Poser is that I prefer AMD, always have. If this works, I'll actually consider trying out DS again.

    I'll keep an eye on this!

  • outrider42 Posts: 3,679

    prixat said:

    richardandtracy said:

    Are Radeon cards enough of a leap ahead of nVidia to be worth it?

    I would say yes. Not necessarily in terms of 'a leap ahead' but because you can get 20GB Radeons for half the price of the equivalent VRAM Geforce.

    So very much worth finding out if this works.

    I strongly disagree. While AMD offers more VRAM for cheaper, the performance in apps like this is terrible. While you can get AMD GPUs to render in Blender, they are much slower. Comically slower, even.

    That also discounts the many hoops you need to jump through to use AMD for many applications. With Nvidia, I have to quote the CEO here...it just works. CUDA is the standard for a reason. It is not because Nvidia cornered the market, it is because Nvidia simply did it better.

    ZLUDA development has already been killed. It is not going to receive support from AMD, nor is it going to get updates from the fellow who created it. It has been made open source, but that is no guarantee that others will pick it up. It still faces many challenges going forward.

    The benchmarks that have been posted showing ZLUDA in action are deceptive. I shall explain. Here is one of the Blender benches posted.

    At first glance, this looks pretty good. The AMD cards are not so far off their Nvidia counterparts. And with more VRAM in many models, this sounds great, doesn't it?

    However the benchmark is done with one very important catch...this is pure CUDA rendering, Optix was not used. And that is the problem. Who in their right mind is going to use pure CUDA rendering if they have an RTX card? Literally nobody. When you turn on Optix, the AMD cards get absolutely destroyed. 

    People talk about VRAM, well, Nvidia has a 16GB 4060ti that can still beat these AMD cards at rendering tasks. Oh, and while using half the electricity, too.

    Here is that same classroom scene rendered with Optix, tested by the same people who made the chart above:

    There is no 4060ti in this chart, but the 4060 non-ti renders the scene in 22.88 seconds. That places it on par with the beefy 7900xt; the 4060ti would be closer to the fastest AMD GPU, the 7900xtx. Last I checked the 4060ti is CHEAPER than the 7900xt by a fair margin. So I am not seeing any value here, guys. And surely the new 4070ti Super, which has 16GB, will be even faster, as the 4070ti is significantly faster than the 4060 in Iray. See our own benchmarks. Oh, and the 4070ti Super also uses less power than the big AMD cards. Good grief, even the 2060 Super is faster than a brand new 7700xt. That is embarrassing.

    If you specifically want 20GB, then sure, you have no such option with Nvidia. And that does suck. But this idea that AMD is the cheaper and better option just doesn't work when it comes to creation software like Daz Studio and Iray. AMD is a distant second place for a reason, guys. These are not video games. If you play video games, then AMD is a decent choice for GPU, but it becomes a handicap with creation software. Ray tracing is not an optional feature for Iray, it's an absolute must.

    And this is all assuming you can even get AMD to work with Iray. Iray is built on Optix, yes, the same Optix shown just above. So if ZLUDA cannot run Optix then ZLUDA cannot possibly run Iray at all. You will need to jump through many hoops to make this work, even if it is possible.
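
    To spell out why Optix is the wall: the OptiX 7+ implementation ships inside the Nvidia display driver itself, and the very first call a renderer makes tries to load it from there. A minimal sketch of that first step, assuming the standard OptiX SDK headers (this is not Iray's actual code):

    ```cuda
    // Why a CUDA-only translator is not enough: optixInit() looks for the OptiX
    // library that ships with the Nvidia display driver. With no Nvidia driver
    // installed there is nothing to load, regardless of how the CUDA calls are handled.
    #include <optix.h>
    #include <optix_stubs.h>
    #include <optix_function_table_definition.h>  // defines the table optixInit() fills in
    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        cudaFree(0);                      // force CUDA initialization first
        OptixResult r = optixInit();      // loads the driver-provided OptiX implementation
        if (r != OPTIX_SUCCESS) {
            printf("optixInit failed (code %d) - no driver-side OptiX available\n", (int)r);
            return 1;                     // roughly where an OptiX-based renderer would bail out
        }
        printf("OptiX initialized - RTX-accelerated path available\n");
        return 0;
    }
    ```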

    Ultimately if you want to use Daz Studio Iray, you need Nvidia. There is no way around it, and this ZLUDA is just wishful thinking. I suppose you could export all your Daz scenes into Blender if you want to use AMD that much, but why punish yourself? I always recommend getting the hardware for the software you use most. Sometimes AMD is a fine fit for certain software, but not here.

  • Gordig Posts: 10,062

    outrider42 said:

    So I am not seeing any value here, guys.

    The value is for people who already have AMD cards and can't afford to replace them, and as AMD and NVidia cards don't like to play together in the same system, also don't have the option of supplementing their AMD card with an NVidia. This is, admittedly, subject to the Sunk Cost Fallacy, and I would never discourage anyone interested in rendering from getting an NVidia card.

    If you specifically want 20GB, then sure, you have no such option with Nvidia.

    Not in the consumer line, but the RTX 4000 and RTX 4000 SFF have 20GB of VRAM and can be purchased directly from NVidia for prices comparable to, and in some cases much lower than, many of the 16GB cards. They also have smaller form factors (single-slot or short form factor dual-slot, respectively) and use a lot less power than GeForce cards, though at the cost of some performance.

    Ultimately if you want to use Daz Studio Iray, you need Nvidia.

    It's not just Iray, either. Octane requires an NVidia card to function, and many other renderers that can be GPU-accelerated can only do so with an NVidia card. Whatever value one might gain from emulating CUDA in different render engines would be subject to the same limitations as in Iray.

  • outrider42 Posts: 3,679

    Gordig said:

    outrider42 said:

    So I am not seeing any value here, guys.

    The value is for people who already have AMD cards and can't afford to replace them, and as AMD and NVidia cards don't like to play together in the same system, also don't have the option of supplementing their AMD card with an NVidia. This is, admittedly, subject to the Sunk Cost Fallacy, and I would never discourage anyone interested in rendering from getting an NVidia card.

    If you specifically want 20GB, then sure, you have no such option with Nvidia.

    Not in the consumer line, but the RTX 4000 and RTX 4000 SFF have 20GB of VRAM and can be purchased directly from NVidia for prices comparable to, and in some cases much lower than, many of the 16GB cards. They also have smaller form factors (single-slot or short form factor dual-slot, respectively) and use a lot less power than GeForce cards, though at the cost of some performance.

    Ultimately if you want to use Daz Studio Iray, you need Nvidia.

    It's not just Iray, either. Octane requires an NVidia card to function, and many other renderers that can be GPU-accelerated can only do so with an NVidia card. Whatever value one might gain from emulating CUDA in different render engines would be subject to the same limitations as in Iray.

    I still don't see the value in trying to force oneself to make AMD work for this, regardless of the situation. The amount of energy and effort to make AMD work in this area is simply not worth it, period. If you have an AMD card, I suggest selling it. If you cannot afford to buy Nvidia, I am not sure what to say. If you can afford to shop Daz, which you kind of need to do to get anywhere with Daz, then you must have something to put toward a piece of hardware to use Daz Studio properly. I am not trying to argue, this is just how it is. Trying to force AMD to work is going to end in pain for the user. At some point you just have to suck it up and move on.

    When I started using Daz, I owned an ATI 5870. That's how old it was - it still carried the ATI branding, from before AMD retired the name. And this card was doing fine in the video games of the day, but it was useless here. I was pretty poor at this time, but I saved up and dropped a little over $100 on a used GTX 670 from ebay. Every GPU I bought up until 2020 was used. You do what you have to do. Some people may not want to buy used, but...are you really going to suffer with AMD for years on end because...reasons? Have a yard sale, whatever it takes.

    Very few people really look at the pro cards for rendering because the price to performance is simply terrible. The cards you linked are still $1250. They are also slow. They might have 20GB, but that is their one and only advantage for Iray users, since Iray doesn't use any of the features that pro cards offer. At that price you would be far better off just going all in on a 4090, which doesn't cost much more than these (all things considered). I mean, if you've got $1250 to drop on a GPU, I am going to assume you can push a few hundred more for a 4090. The 4090 has more than 10,000 more CUDA cores than these cards (16,384 vs 6,144). But that is not all. Both RTX 4000 cards are also down-clocked, running at reduced clock speeds. So not only do they have drastically fewer CUDA cores, those cores are running slower. This makes the 4090 so much faster that it isn't even funny, and thus choosing one of these over a 4090 is not very logical, unless you have a single-slot system, which is an odd setup to have. The pro cards also have no HDMI outputs, so anybody using a TV as a monitor will require an awkward adapter (using a TV as a monitor is happening more and more these days; the LG 42" C series OLED is a popular choice, and I know people who have one for PC). But a DisplayPort to HDMI adapter cuts out numerous features in the process, so it is very much not ideal.

    Also, going used again, you can get 24GB on discount with a 3090. There are 3090s in the $750 range on ebay right now. Research the seller, make sure you have buyer's protection, and you can have 24GB and a pretty fast Nvidia GPU. If I didn't already own a 3090 I would be pretty tempted to get one myself off ebay, even knowing some may have been used for mining.

    Indeed, working in the creative field in general practically requires Nvidia. CUDA is Nvidia's key feature; their entire product stack revolves around CUDA. So much software, and not just render engines, uses CUDA as a base, which makes it all the more important to have Nvidia if someone is interested in using any creative software at all, not just Daz Studio. There is a lot of software where Nvidia GPUs are the only ones supported. In other software that can use AMD, Nvidia dominates pretty much all the benchmarks, and often by comical margins. In video editing software, AMD can be abysmal. There are some video editors that have frequent crashes with the AMD 6000 series. I have seen youtubers talk about trying to encode videos for upload to youtube, and their AMD 6000 series card would crash too frequently. They said a 3070, in spite of having just half the VRAM of the 6800xt, worked far better at encoding their videos. And since they are a youtuber, they chose the 3070 over the 6800xt to ensure their videos would get made, in spite of the fact that the 6800xt is generally stronger and has twice the VRAM of the 3070. Stable Diffusion can work on AMD now, but it took a while to get that support, and even now you are still far better off using Nvidia. Nvidia cards destroy AMD at Stable Diffusion (even worse than the charts I showed above). Even in the area of gaming, while AMD can handle rasterization pretty well, they lag behind badly in games that use ray tracing, and AMD's image upscaling is well known to be inferior to Nvidia's DLSS. Also, AMD GPUs do not run VR applications well. The change they made going to the 6000 series basically broke VR.

    It just goes to show that VRAM isn't everything, at least when it is AMD.

    I can go on and on about this subject. It goes back to what I said, "It just works". With Nvidia, you can rest assured that pretty much every software package is not only going to work, but work well with Nvidia GPUs. With AMD this simply is not the case. A lot of people never really put much thought into this. If you have any curiosity about trying out other software, then you should just get Nvidia. It is that significant. For example, if you play video games now, and never have used a GPU for anything else, then maybe AMD is fine. But if you have any aspiration or are just curious about something else, you should just get Nvidia, so that you will be covered for whatever it is you might want to try.

  • When looking for a new computer I did a lot of comparisons. My processor comparisons were between AMD and Intel, though the GPU comparisons were between various nVidia cards.

    When it came to the processor I found a number of very high benchmarks for the AMD processors compared to the Intel one I eventually went for. It looked as if the AMD processors were superb at a very limited set of actions, which, when aggregated into the benchmarks, made the processors seem very fast. However, what matters for DS is single thread speed and [if the GPU drops out] multi-core rendering. The i5-12600k I went for seemed to have [at the time] a single thread speed the same as more expensive AMDs, though its multi-core rendering speed could be lower depending on what was actually needed in the rendering process, and I don't know those details. The price for a CPU tailored to the needs of DAZ Studio was cheaper for the Intel than for the AMD.

    As a result, I think I'd be cautious about the performance of an AMD GPU. Even if the software can be used to tailor the Iray input to native AMD input, it looks as if AMD tailor their hardware to be good at different things from [say] Intel. And it may not be great at rendering Iray even if the software is tweaked to allow it to work. Basically, what I'm saying is that I'd rather someone else be an early adopter and find out whether it works well or not! The idea of 20GB of GPU memory seems attractive, but I'd rather see evidence that it's not glacial, either.

    Regards,

    Richard

     

  • AMD cards are great for general gaming... I have interests in virtual pinball and some of the hobbyist stuff, and what I'm always seeing is that certain graphics features don't work on the AMD cards (presenting artifacts), and that there are a few other issues that crop up there as well that make it clear that AMD has a hard focus on the more casual players of games and not much else. Yes, you can get a cheaper card that will play pretty much anything in the Steam store or Epic store, but once you step outside of that into hobbyist spaces you start to get into issues.

    That said, I'd love for this to be taken up by someone, if only to provide options for people who do not like the way nVidia is treating the market. The 40x0 series has had a massive price hike compared to the previous generation, partly driven by the cryptominers and scalpers showing how much more the cards could go for. I suspect nVidia has hiked them too high and priced some people out of their cards entirely, albeit with the side effect of making them much less enticing to scalpers (it's hard to make a profit when the things are still on the shelf in the store and they have to mortgage the house to empty that shelf) and miners (by making it so that they might barely break even before destroying the card).

  • nakamuram002 Posts: 788
    edited February 16

    I suspect that "AMD Radeon ProRender" is the reason why AMD is no longer interested in VLuda.  ProRender supports any GPUs that support OpenCL and/or DirectX12.

    https://www.amd.com/en/products/graphics/software/radeon-prorender.html

    https://gpuopen.com/radeon-pro-render/

    https://community.amd.com/t5/radeon-prorender-discussion/any-news-for-daz-studio-prorender-plugin/td-p/466834

    On the DS5 forum, I have suggested that support for "AMD Radeon ProRender" be added to DS5.

  • nakamuram002 said:

    I suspect that "AMD Radeon ProRender" is the reason why AMD is no longer interested in VLuda.  ProRender supports any GPUs that support OpenCL and/or DirectX12.

    https://www.amd.com/en/products/graphics/software/radeon-prorender.html

    https://gpuopen.com/radeon-pro-render/

    https://community.amd.com/t5/radeon-prorender-discussion/any-news-for-daz-studio-prorender-plugin/td-p/466834

    On the DS5 forum, I have suggested that support for "AMD Radeon ProRender" be added to DS5.

    Anyone could write a plug-in to support ProRender - I suspect the challenge would be adapting materials.

  • nakamuram002 Posts: 788
    edited February 18

    Richard Haseltine said:

    nakamuram002 said:

    I suspect that "AMD Radeon ProRender" is the reason why AMD is no longer interested in VLuda.  ProRender supports any GPUs that support OpenCL and/or DirectX12.

    https://www.amd.com/en/products/graphics/software/radeon-prorender.html

    https://gpuopen.com/radeon-pro-render/

    https://community.amd.com/t5/radeon-prorender-discussion/any-news-for-daz-studio-prorender-plugin/td-p/466834

    On the DS5 forum, I have suggested that support for "AMD Radeon ProRender" be added to DS5.

    Anyone could write a plug-in to support ProRender - I suspect the challenge would be adapting materials.

    ProRender's documentation says that it has a shader called "Uber Shader".  At the top level, its parameter groups seem to match up with those from the DS Iray Uber Base.

    https://radeon-pro.github.io/RadeonProRenderDocs/en/uber/about.html

  • Seven193 Posts: 1,080
    edited February 18

    Leana said:

    A point to consider is that Nvidia drivers and Iray evolve quite often. Not sure how much work would be required on Zluda's side to "keep up" with that, but judging by the first article, the original developer himself doesn't seem very interested in keeping it updated. It's open source, so other developers might decide to contribute, but there's no guarantee someone will.

    Yes, official support in this area would probably be the first step for anyone putting money and their time behind it. And it has to come from AMD or nVidia, or no go, imho.

  • Seven193 said:

    Leana said:

    A point to consider is that Nvidia drivers and Iray evolve quite often. Not sure how much work would be required on Zluda's side to "keep up" with that, but judging by the first article, the original developer himself doesn't seem very interested in keeping it updated. It's open source, so other developers might decide to contribute, but there's no guarantee someone will.

    Yes, official support in this area would probably be the first step for anyone putting money and their time behind it. And it has to come from AMD or nVidia, or no go, imho.

    Two people are trying to run Zluda on DS.  Unfortunately, they have been unsuccessful so far.

    https://github.com/vosen/ZLUDA/discussions/83

     

  • NylonGirl Posts: 1,819

    nakamuram002 said:

    Seven193 said:

    Leana said:

    A point to consider is that Nvidia drivers and Iray evolve quite often. Not sure how much work would be required on Zluda's side to "keep up" with that, but judging by the first article, the original developer himself doesn't seem very interested in keeping it updated. It's open source, so other developers might decide to contribute, but there's no guarantee someone will.

    Yes, official support in this area would probably be the first step for anyone putting money and their time behind it. And it has to come from AMD or nVidia, or no go, imho.

    Two people are trying to run Zluda on DS.  Unfortunately, they have been unsuccessful so far.

    https://github.com/vosen/ZLUDA/discussions/83

     

    I wonder if it's that Optix thing or whatever. Maybe DAZ has something else besides Iray trying to use the phantom Nvidia card, and those things can be disabled.

  • outrider42 Posts: 3,679

    Optix is the reason. AMD still cannot run it, they can only do pure CUDA. Optix uses the ray tracing cores, which ZLUDA is unable to access. Maybe you could run an older pre-Optix version of Iray, but that would probably not work because of how Iray checks for Nvidia specific drivers. You have to be able to trick Iray into thinking there is a compatible Nvidia GPU present, and who knows what else needs to be done.

    Professional and prosumer use is why Nvidia costs more than AMD. Of course I don't like that Nvidia costs what it does. But the fact is that people still buy Nvidia products at these prices. If Nvidia couldn't sell their products, the prices would drop, simple as that. Think of it this way, guys: would you take a commission for $100 or one for $1000? Obviously you would take the commission at the higher price, and a company is going to do the same with its products. Some prices have indeed fallen; the "Super" refresh is a course correction of sorts. Miners and scalpers were buying GPUs before, not so much now. Mining has crashed and has not recovered. AI has taken over the demand. And AI is dominated by Nvidia, just like rendering and so many other fields. Scalping can still happen, but it's nothing like it was before. Scalpers jumped on the 4090 ban in China. But also, the various GPU distributors tried to push as much 4090 stock to China before the ban as possible. So this is why the 4090 has gone up in price again. Once things cool off in China the prices will come back down. Hard to say how long that will take, but by that time the 5000 series may be around the corner.

    People getting into AI are leaping onto Nvidia cards. While they also want VRAM just like we do, it is indeed possible to run Stable Diffusion locally on your computer with 8GB just fine. You can even use less by using smaller models and some memory-saving features. Just like Daz users, the people in these fields run the full spectrum. You have some more professional users who can afford a more expensive card, but there are plenty of people who, for different reasons, will pick a smaller GPU, whether it be finances or just a hobby.

    Regardless of the situation, though, they are buying Nvidia GPUs for it. As a result, AMD cannot possibly charge what Nvidia does for a GPU. It would be a horrible move on their part. Even the techtubers say this. The well respected Hardware Unboxed has said numerous times that they would expect AMD to be cheaper at similar performance levels because of the lack of feature parity.

    I believe AMD should have pumped money into Stable Diffusion when it was getting started. They should have written them a check for tens of millions, maybe even hundreds of millions, to ensure that they built their software to run as fast as possible on AMD hardware. It would have been a ballsy move, but it would have paid off many times over. This was their chance, with a brand new technology, to get in on the ground floor. Instead, Stable Diffusion was written with CUDA in mind from the start, and it took time for AMD to even get support at all. And even now AMD is shamefully behind in performance metrics. So any prospective buyer interested in AI, which is a lot of people right now, is going to be looking at Nvidia. AMD needs to be more aggressive in these fields. Nobody is going to just do it for them, and this has really held AMD back. That is something a lot of people don't quite understand. People like to blame Nvidia for the current conditions. Rather, I say AMD failing to compete is how we got here.

    And just think about this, many people here are rooting for AMD to compete well, yet will STILL buy Nvidia for rendering Iray because they can't buy AMD to do that. That says a lot about the situation. Certainly, I want to see healthy competition, too, but the fact is I will still buy Nvidia no matter how well AMD does because of Iray and other software. AMD has no chance like this.

  • Leana Posts: 11,728

    Seven193 said:

    Leana said:

    A point to consider is that Nvidia drivers and Iray evolve quite often. Not sure how much work would be required on Zluda's side to "keep up" with that, but judging by the first article, the original developer himself doesn't seem very interested in keeping it updated. It's open source, so other developers might decide to contribute, but there's no guarantee someone will.

    Yes, official support in this area would probably be the first step for anyone putting money and their time behind it. And it has to come from AMD or nVidia, or no go, imho.

    The first article mentions that AMD has evaluated it and decided they were not interested. As for support from nvidia, that's very, very unlikely to happen.

  • Leana said:

    As for support from nvidia, that's very, very unlikely to happen.

    Very much this. As much as they don't have to compete under the circumstances, nVidia will do nothing that even hints at breaking their effective vendor lock-in.
