CUDA can now run on AMD GPUs?


Comments

  • NylonGirl Posts: 1,819

    If Optix requires ray tracing cores specific to newer Nvidia hardware, then it must be possible to run Iray without Optix. My computer has a GTX 1050. I'm pretty sure it doesn't have those ray tracing cores. But it still runs Iray without CPU fallback.

    Regardless of any performance advantage from using Nvidia, there are people who may want to use AMD for other reasons such as preventing a monopoly or vendor lock-in. Or because they use their computer for other things besides DAZ Studio. And there may be situations in which an AMD CPU and AMD graphics card work together better than some other combination due to hardware optimization by the manufacturer.

  • NylonGirl said:

    If Optix requires ray tracing cores specific to newer Nvidia hardware, then it must be possible to run Iray without Optix. My computer has a GTX 1050. I'm pretty sure it doesn't have those ray tracing cores. But it still runs Iray without CPU fallback.

    Yes, it takes about another 1GB of memory to emulate the RTX features.

    Regardless of any performance advantage from using Nvidia, there are people who may want to use AMD for other reasons such as preventing a monopoly or vendor lock-in. Or because they use their computer for other things besides DAZ Studio. And there may be situations in which an AMD CPU and AMD graphics card work together better than some other combination due to hardware optimization by the manufacturer.
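The fallback described above can be pictured with a small sketch. This is a hypothetical illustration, not real Iray or OptiX code: it infers from a card's CUDA compute capability whether hardware RT cores are likely present. Capability alone is an imperfect signal (for example, GTX 16-series Turing cards report sm_75 but lack RT cores), so real applications should query the driver or OptiX itself.

```python
# Hypothetical sketch: map CUDA compute capability to the likely presence
# of hardware RT cores. Not real Iray/OptiX detection logic.

def likely_has_rt_cores(major: int, minor: int) -> bool:
    """Pascal (6.x) and earlier have no RT cores; Turing (7.5)+ usually do."""
    return (major, minor) >= (7, 5)

# A GTX 1050 is Pascal, compute capability 6.1: no RT cores, so OptiX
# emulates ray traversal in software, at the cost of extra VRAM.
print(likely_has_rt_cores(6, 1))  # GTX 1050 -> False
print(likely_has_rt_cores(8, 6))  # RTX 3060 -> True
```

On cards where this returns False, OptiX still runs, but the BVH traversal that RT cores would accelerate is done in ordinary CUDA code, which is where the extra memory overhead comes from.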

  • nonesuch00 Posts: 18,131

    I had waited and hoped that AMD would deliver real ray tracing and other AI features in high-speed cards competitive with Nvidia's, but they still haven't made sufficient inroads.

  • When people talk about the suitability of AMD GPUs for gaming, they're correct, but only for rasterization. Unfortunately, AMD is no real competition (yet) for Nvidia when it comes to ray tracing.

  • outrider42 Posts: 3,679

    NylonGirl said:

    If Optix requires ray tracing cores specific to newer Nvidia hardware, then it must be possible to run Iray without Optix. My computer has a GTX 1050. I'm pretty sure it doesn't have those ray tracing cores. But it still runs Iray without CPU fallback.

    Regardless of any performance advantage from using Nvidia, there are people who may want to use AMD for other reasons such as preventing a monopoly or vendor lock-in. Or because they use their computer for other things besides DAZ Studio. And there may be situations in which an AMD CPU and AMD graphics card work together better than some other combination due to hardware optimization by the manufacturer.

    If somebody uses their computer for things besides Daz Studio, the odds are extremely high those things work better with Nvidia as well. Significantly better at that, not just a small difference. There are only a few very niche situations where AMD can edge out Nvidia in performance. I mean, good grief, the performance gap in Stable Diffusion is multiplicative, guys. AMD is not just slower; it is orders of magnitude slower at times. And in some software it doesn't even work right, with errors or crashes. Nothing sucks more than losing time on a project because it keeps crashing.

    As for preventing a monopoly, again, I blame AMD for this, not Nvidia. Nvidia is simply executing a strategy that almost any other business would pursue in its position. AMD had its chances, and blew them. When CUDA was first introduced, AMD did not do nearly enough to slow its adoption. CUDA practically took over the world overnight.

    Here's the thing, guys: this is not charity. It is not your job to make these things work. It is AMD's. They had their chances. Did AMD have a competing render engine when Daz picked Iray? (No.) You should not have to do the work for them. AMD should be doing the work for you. I am not going to buy AMD simply to support them like they are some kind of charity. They are a large company capable of fending for themselves.

    CUDA is not just a piece of hardware or even software. There is also support. The key difference is that Nvidia has helped other companies use CUDA and figure out ways to make the best use of it. Nvidia provides companies access to its engineers. If a company working with CUDA has questions, it can ask Nvidia for help. AMD has utterly failed at offering the same kind of support, and the result is what we have today. This is not simply "Oh, Nvidia made a thing, let's use it"; no, companies actively choose Nvidia's CUDA because it is a platform that works not just for them, but with them. Shoot, just look at how Nvidia added new ghost lights to Iray after initially killing them off. They actually listened.

    AMD has long had a certain philosophy towards their products: they build the hardware and just expect people to figure it out. But that doesn't work in every professional field. There are indeed organisations with so much money that they have entire teams dedicated to creating every aspect of the software they run. However, not every company can do that, and most can't. Many companies, even fairly large ones, still prefer to have some assistance in these regards. Iray is a perfect example of such a solution. You could design and build your own ray tracing software, but that is hard to do. So Daz worked a deal to make Iray part of DS. Perhaps they could have chosen a different engine, but back in 2014 and 2015, when this came to fruition, Iray was one of the first GPU rendering engines and it was easy to use. AMD has been getting better at helping companies use its hardware in the past few years, but it still needs to do more. It has to be significantly better if it really wants to catch Nvidia.

    Just look at ZLUDA. AMD stopped funding it, and the developer stopped working on it. If it is ever going to get any updates, somebody else will have to pick up the development, and that might never happen. We might not see any more updates for ZLUDA, ever. Certainly Nvidia isn't going to provide them. Maybe it works for Blender today, but for how long? It might not work after some driver updates or when future versions of Cycles and Blender are released, and performance may also deteriorate. AMD should have seen this project through and kept supporting it. Octane also tried for many years to make AMD work with its render engine, and after a long period of time they just gave up. We discussed Octane and AMD in the forum years ago, guys. Why didn't AMD do more to make that happen? Octane is a fairly popular render engine and actually wanted to make AMD work, so it would have been logical for AMD to put more engineers on the task to help figure it out. They could have opened new doors to hardware sales. That is what this is all about.

    As for AMD ProRender, I see so few people using it. Trying to find anybody benchmarking it is not easy. I found benchmarks from 5 years ago, which are pointless now, and I couldn't find a forum like this one that keeps up with it. I found one video that liked ProRender in Blender, but it still had a problem: the renders it produced looked completely different from Cycles and Optix. One would expect things to not look exactly the same, but this was drastic. It was very dark and the colors lacked life. Maybe the lighting could be adjusted; the video suggested changing the materials. At any rate, it sounds like a lot of work.

    I get the idea that people don't want a monopoly. But at some point you also have to make a decision for yourself. If it were easier to use AMD, then great, go for it. But it is not. Using AMD for Daz Studio requires time-consuming exports to other software, more time-consuming efforts to adjust every single material in a scene, more time to relight the scene, and, oh yeah, more time to render because AMD ray tracing is so much slower. It is a massive time sink just to say you can "stick it to Nvidia". I don't have time for that. I have a job that takes enough time as it is. When I want to use Daz Studio I want to work quickly and get it done, and no matter what AMD does, it just isn't going to get the job done for me.

    Plus Nvidia works well with the other software I dabble in. I don't have to do any tricks or jump through hoops to make it work. I wish AMD the best, really I do. But I am not going to handicap myself just to use their GPUs. 
