Carrara's favorite CPU?

Comments

  • chickenman Posts: 1,202
    edited December 1969

    I compared both AMD and Intel when I built my system.
    For a difference of $200 CDN I went for the i7. All other parts were the same except the CPU and motherboard.

    I looked at going for the cheap i7 hex-core chips, but they all require the new DDR4 RAM, which is still very expensive.

    I also looked at going the dual server processor route, but again it was too cost prohibitive.

  • protovu Posts: 194
    edited December 1969

    Yes, the DDR4 is expensive!
    I wonder when it will come down?
    I looked at a 5930 last night at my local Central Computer (they are out of 4930s):
    With a motherboard, it would be about $900.
    To max out the RAM... $1000.

  • CoolBreeze Posts: 207
    edited December 1969

    There's honestly nothing wrong with going with a DDR3-based system.

    It comes down to how much you want to spend. And in some cases like mine, DDR4-based RAM isn't even going to be a viable option for a few years yet (I went with a Xeon-based system). With the CPUs I have and 64 GB of DDR3, I'm getting great render performance results.

    Games are usually more demanding of the fastest RAM speeds, as well as overclocking. Faster RAM speeds mean higher latency too.
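
    As a rough illustration of that latency point (a sketch only; the CL9/CL16 timings below are just typical retail kits from around this time, not anything from my build):

        # First-word latency in nanoseconds: CAS latency (in clock cycles)
        # divided by the memory clock, which runs at half the transfer rate.
        def first_word_latency_ns(cas_latency, transfer_rate_mt_s):
            return 2000.0 * cas_latency / transfer_rate_mt_s

        print(first_word_latency_ns(9, 1600))   # DDR3-1600 CL9  -> ~11.3 ns
        print(first_word_latency_ns(16, 2400))  # DDR4-2400 CL16 -> ~13.3 ns

    So the faster DDR4 kit actually waits slightly longer for the first word, even though its bandwidth is higher.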

  • protovu Posts: 194
    edited December 1969

    Thanks, Mohandai.

    More good information.
    So you have 12 cores?
    This system was rather expensive, would you say?

    I am not a game guy, but I would be very happy to have large files open and save faster. So the RAM speed is less critical for Carrara, but maxed-out RAM is still good?

  • mikael-aronsson Posts: 566
    edited February 2015

    DDR4 is more about lower power consumption = less heat (more so for sleep mode on laptops, because it does not need to refresh the RAM in the same way as DDR3).

    In terms of performance there's not that much difference, so DDR3 is a much better choice (more memory for the money).

    3D eats memory, and since RAM is slooooooow compared to the CPU, a big, fast cache is much better to go for; get more DDR3 instead, if you ask me.

  • protovu Posts: 194
    edited December 1969

    I see, thanks. So really in terms of value and performance, it would be better to go with a 4930, because a 5930 cannot use DDR3, yes?

  • mikael-aronsson Posts: 566
    edited February 2015

    Yes, the 5930 requires DDR4.

    A 5930 is a little faster and has a slightly larger L3 cache; that's about it, so you get much more for your money with a 4930.

  • CoolBreeze Posts: 207
    edited December 1969

    Not 12 cores... that would be the E5-2697 v2 series Xeon CPU: 12 physical cores / 24 threads @ 2.7 GHz, 3.5 GHz turbo.

    I went for the E5-2690 v2, which is 10 cores / 20 threads @ 3.0 GHz / 3.6 GHz turbo... same L2 and L3 cache sizes, and at the time it was almost $1000.00 cheaper, and right now still $600.00 cheaper per chip.

    Same price-to-performance reasoning here as I mentioned about the 5930K vs 5820K.
    The E5-2690 v2 is still a better price/performance ratio than even the E5-2697 v3...

    When idle, Win 8.1 throttles the CPUs down to 1.1 GHz, so they're quite power efficient.

    That being said, if you wanted to save up about $3000.00 you could go for a single E5-2690 v2 with a dual-socket ASUS board and 32 GB of RAM, and add another E5-2690 v2 later down the road; you'd be scaling your performance 2x... although that's quite high-end extreme for just rendering. I did it, but this is a once-in-a-lifetime sort of thing: I had my previous rig for 7 years (and still have it), and I plan on keeping this one just as long... This is where you also look at cost of ownership and return on investment... my previous quad core cost $1200.00; break that down over 8 years, for example.
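
    As a quick back-of-the-envelope sketch of that cost-of-ownership point (only using the dollar figures and lifespans mentioned above; the assumption is simply that each build gets kept for about 8 years):

        # Rough cost of ownership: purchase price spread over the years the rig is kept.
        def cost_per_year(price_usd, years_kept):
            return price_usd / years_kept

        print(cost_per_year(1200, 8))  # old quad core: ~$150 per year
        print(cost_per_year(3000, 8))  # dual E5-2690 v2 build: ~$375 per year

    Spread over that long a lifespan, even the expensive build works out to a fairly modest yearly cost.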

  • 3drendero Posts: 2,024
    edited February 2015

    3drendero said:
    For the price of one 3930k system, I can build 2 AMD FX systems and use both to render in Carrara. Performance per dollar is also important for those of us on limited budgets.

    Sure, but even a cheap quad core 4770k or 4790k will give you 9590 performance and not break the bank.

    A cheap 4770K? Never seen that...
    A cheap CPU is the AMD FX8300, which can be bought for about 100 US$, compared to the 4770K at 300+.
    The 8300 can be overclocked to 4.5-5 GHz, and considering that there are cheap AM3+ mainboards compared to the expensive Z97 boards needed for Intel, it should be possible to build 2 AMD rigs for a similar price as one 4770K rig.
    For a render farm on a budget, AMD wins.

    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9494387

    They even have bundle deals on a complete AMD-based PC for the same price as a 4770K CPU alone:
    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9496733&csid=_61
    But overclocking is limited since the mainboard is too cheap.

  • mikael-aronsson Posts: 566
    edited December 1969

    The FX8300 is not bad in any way, and yes, it's cheaper, but I am not sure you should compare them much either; they are two different beasts.

    First of all, the L2 cache is silly small on the FX8300, and with 3D graphics you need all you can get for decent performance; it will not matter how much you overclock the CPU if it just sits idle waiting for the memory to keep up. (The FX8300 needs more juice as well, so with overclocking you will need a good rig that can remove the heat; I am not sure the cheap ones are so good at this.)

    Single-threaded performance on the 4770 is about 80% faster, multithreaded about 40%, so if the FX8300 were not cheaper it would be junk (rough numbers sketched just below).

    And setting up a render farm is one thing; having a good, fast machine for modeling is another thing. It all depends on what you want to use it for.
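
    To put those rough numbers side by side (a sketch only, using the ballpark figures from this thread: the i7 about 1.4x the FX8300 in multithreaded work, at roughly $300 vs $100):

        # Rough multithreaded performance per dollar, using the thread's ballpark numbers.
        chips = {
            "FX8300": {"relative_mt_perf": 1.0, "price_usd": 100},
            "i7-4770K": {"relative_mt_perf": 1.4, "price_usd": 300},
        }
        for name, c in chips.items():
            ratio = c["relative_mt_perf"] / c["price_usd"]
            print(f"{name}: {ratio * 100:.2f} units of performance per $100")

    The FX8300 comes out at roughly twice the multithreaded performance per dollar, which is exactly why it only makes sense because of the price.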

  • protovu Posts: 194
    edited December 1969

    Hi Guys,
    For what it is worth, regarding render farming (and I am new to it), I am finding that some of my files just don't play nicely with nodes. So far, it seems that
    the larger files are the sticky wickets (300 MB, though there could be some other reason). They will get stuck, not rendering the first frame. For this reason, I would definitely lean towards a single, heartier Intel machine that is a proven renderer. And, as has been pointed out to me, not all cores are equal.

  • evilproducer Posts: 9,050
    edited December 1969

    I've network rendered files easily that large or larger with C7.2 Pro, which is 32 bit. Are you sure you're giving the file long enough to load? Are there plugins or custom leaves on the host machine that are not installed on the node machine(s)?

  • protovu Posts: 194
    edited December 1969

    That is really good to know. So, with these really large files, what happens is that the nodes seem to be rendering, but the master is stuck.
    There are no tiles or "n"s to be seen. Then I bail out.
    I am using some shader ops, but this does not seem to be the problem. When it was missing from the node machine, I still got frames, just with unwanted red for the missing shader op shader.
    Perhaps I should just hang on longer?

  • evilproducer Posts: 9,050
    edited December 1969

    What I see when I render over a network is that the host machine loads the file, and then, after it is loaded, it starts to send the data to the node machines. There is no apparent activity on the nodes until the scene is sent to them.

    Once the scene is sent, a progress bar appears in the render node telling me that it is loading the scene. If I have my Activity Monitor set to monitor my network, I can see MBs of data being sent to the node as the Loading Scene progress bar advances. It is my experience that this is when the image maps are being transferred from the host to the node. When the node has finished loading the scene, you may get progress bars for filling the grid (if there are lots of replications), light calculations, etc. Once all those are done, it looks as if the node is just sitting there. If it is a complex scene, it is not uncommon for the network tiles to be delayed a bit before appearing in the host's render window.

    Like I mentioned above, the host doesn't send data to the nodes until the host finishes loading the scene, so if you're using Indirect Light, as an example, it will be sending the data to the nodes as the host is working on its own light calculations.

    It's kind of odd that the host doesn't display anything. Have you checked CPU usage and RAM usage to make sure Carrara hasn't frozen on the nodes or host? I know you mentioned that you get a red shader if you use the shader plugin, but I wonder if it's possible that not having the plugin installed makes a larger scene a bit more unstable?
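
    If you want a quick, rough way to check that, here's a sketch using Python's psutil package (the process name "Carrara" is just a guess on my part; check the actual executable name in Task Manager and adjust it, and install psutil with pip first):

        import time
        import psutil  # pip install psutil

        def watch(name="Carrara", pause=5):
            """Print CPU and RAM usage for any process whose name contains `name`."""
            while True:
                found = False
                for proc in psutil.process_iter(["name", "memory_info"]):
                    pname = proc.info["name"] or ""
                    mem = proc.info["memory_info"]
                    if name.lower() in pname.lower() and mem is not None:
                        found = True
                        try:
                            cpu = proc.cpu_percent(interval=1.0)  # sample over one second
                        except psutil.NoSuchProcess:
                            continue
                        print(f"{pname}: CPU {cpu:.0f}%  RAM {mem.rss / 2**20:.0f} MB")
                if not found:
                    print("No matching process found")
                time.sleep(pause)

        watch()

    If the numbers sit at zero CPU with the RAM not changing for a long stretch, that's a pretty good hint the process has hung rather than still loading.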

    One other question: are you networked wirelessly with the nodes? In the past, Carrara's network rendering was more unstable on wifi networks and seemed to prefer hard-wired networks. The main reason that I remember reading was that wifi networks were a little more forgiving of packet corruption, and Carrara wasn't. Just a possibility, as wifi technology has improved.

  • protovu Posts: 194
    edited December 1969

    Thanks, Evil.
    All hard wired.
    Actually, the failed network render shows a static grey image with no tiles, for what it's worth.

    When I have a successful network render, it goes as follows.
    The first frame or so, maybe 3 frames, will be rendered by the master before any "n" tiles appear.

    Honestly, I have not had the time to dig into the failed network renders yet, but I think your suspicions regarding loading are probably right.
    It is as if the nodes get busy too early. The Render Nodes manager shows the node machines progressing with the master stuck, or the RNM will show
    all frames stuck on 1, 2, 3. I have checked Performance in the Task Manager, and it showed high CPU activity on the master machine.

    I will be trying again next week. Will inform if I can figure anything out.

  • umblefugly Posts: 53
    edited December 1969

    3drendero said:
    3drendero said:
    For the price of one 3930k system, I can build 2 AMD FX systems and use both to render in Carrara. Performance per dollar is also important for those of us on limited budgets.

    Sure, but even a cheap quad core 4770k or 4790k will give you 9590 performance and not break the bank.

    A cheap 4770K? Never seen that...
    A cheap CPU is the AMD FX8300, which can be bought for about 100 US$, compared to the 4770K at 300+.
    The 8300 can be overclocked to 4.5-5 GHz, and considering that there are cheap AM3+ mainboards compared to the expensive Z97 boards needed for Intel, it should be possible to build 2 AMD rigs for a similar price as one 4770K rig.
    For a render farm on a budget, AMD wins.

    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9494387

    They even have bundle deals on a complete AMD-based PC for the same price as a 4770K CPU alone:
    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9496733&csid=_61
    But overclocking is limited since the mainboard is too cheap.

    No way you will ever convince me to go to AMD ever again, and I wouldn't ever recommend them. Intel will whip AMD EVERY time. The benchmarks should speak for themselves. I can also buy some old Xeons for cheap and build a single-system render farm that would beat the crap out of any AMD rig you could ever come up with.

    But I won't convince an obvious AMD fanboy, so good luck. :)

  • evilproducer Posts: 9,050
    edited December 1969

    3drendero said:
    3drendero said:
    For the price of one 3930k system, I can build 2 AMD FX systems and use both to render in Carrara. Performance per dollar is also important for those of us on limited budgets.

    Sure, but even a cheap quad core 4770k or 4790k will give you 9590 performance and not break the bank.

    A cheap 4770K? Never seen that...
    A cheap CPU is the AMD FX8300, which can be bought for about 100 US$, compared to the 4770K at 300+.
    The 8300 can be overclocked to 4.5-5 GHz, and considering that there are cheap AM3+ mainboards compared to the expensive Z97 boards needed for Intel, it should be possible to build 2 AMD rigs for a similar price as one 4770K rig.
    For a render farm on a budget, AMD wins.

    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9494387

    They even have bundle deals on a complete AMD-based PC for the same price as a 4770K CPU alone:
    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9496733&csid=_61
    But overclocking is limited since the mainboard is too cheap.

    No way you will ever convince me to go to AMD ever again, and I wouldn't ever recommend them. Intel will whip AMD EVERY time. The benchmarks should speak for themselves. I can also buy some old Xeons for cheap and build a single-system render farm that would beat the crap out of any AMD rig you could ever come up with.

    But I won't convince an obvious AMD fanboy, so good luck. :)

    Are you here for a discussion or to spout troll-isms? :blank:

    Myself, I couldn't care less about AMD vs. Intel.

  • JoeMamma2000 Posts: 2,615
    edited December 1969

    Personally, I'd recommend the latest and greatest Intel 8086 processor. It screams at 10MHz and blows the shorts off the competition. A full 8 bit data bus. Although you may have to install the latest liquid cooling and a huge refrigeration unit in your home to handle the heat this thing generates. And don't be surprised if your lights dip when you render a Carrara scene. This thing needs LOTS of POWER. :) :) :)

  • JoeMamma2000 Posts: 2,615
    edited February 2015

    By the way, there's nothing I love more than a good old-fashioned debate over computer processors. Especially in a hobbyist community where people do 3-hour renders because they are unaware of how to modify and optimize their scenes to cut the render time down to 12 minutes, yet are fiercely concerned with using the best processor so they can shave 12 seconds off their 3-hour renders.

    Don'tcha love it? :) :) :)

  • Jonstark Posts: 2,738
    edited December 1969

    The thought of shaving 12 seconds off a 3-hour render made me chuckle. Because I think that's pretty close to the mark :)

  • Jonstark Posts: 2,738
    edited December 1969

    I remember reading somewhere that there was a next generation of CPU tech coming out (I think it was supposed to be out in November 2014?) that would be more than twice as fast as the previous stuff. I don't know where I read it or whatever happened with that; maybe it's out now for those willing to spend big bucks.

    I'm pretty happy with 8 buckets rendering away on my laptop, but I wouldn't object if the technology filters down so that it becomes affordable to have 16 or more cores rendering on a laptop within my price range.

  • umblefugly Posts: 53
    edited December 1969

    No one is spouting any BS; I offered sound advice. Take it or leave it. Have fun, guys. I'm out.

  • Jonstark Posts: 2,738
    edited December 1969

    Whoa, I must have missed something... I don't know how I insulted anyone. Sorry if I did somehow, umblefugly; not sure what I said, but I apologize and didn't mean to offend.

  • umblefugly Posts: 53
    edited December 1969

    I'm not referring to you, Jonstark. :) Read up a little. :)

  • evilproducer Posts: 9,050
    edited February 2015

    No one is spouting any BS; I offered sound advice. Take it or leave it. Have fun, guys. I'm out.

    How is this factual advice? Sounds like a statement designed to generate an argument to me:


    But I won't convince an obvious AMD fanboy, so good luck.

  • ProPose Posts: 527
    edited December 1969

    Personally, I'd recommend the latest and greatest Intel 8086 processor. It screams at 10MHz and blows the shorts off the competition. A full 8 bit data bus. Although you may have to install the latest liquid cooling and a huge refrigeration unit in your home to handle the heat this thing generates. And don't be surprised if your lights dip when you render a Carrara scene. This thing needs LOTS of POWER. :) :) :)

    I still have one of those, ha ha. Haven't fired it up in years. Runs on DOS 6.22. Also has Word for DOS.

  • TangoAlpha Posts: 4,584
    edited December 1969

    Somewhere in the attic I've got a 4.7 MHz 8088 with 512K of RAM and two 5¼-inch floppies (no hard drive)...

  • mikael-aronsson Posts: 566
    edited December 1969

    I beat you: I have a 1 MHz Z80 machine with 1 KB of RAM... (708 bytes available for applications), and it's still working.

  • 3drendero Posts: 2,024
    edited December 1969

    No way you will ever convince me to go to AMD ever again, and I wouldn't ever recommend them. Intel will whip AMD EVERY time. The benchmarks should speak for themselves. I can also buy some old Xeons for cheap and build a single-system render farm that would beat the crap out of any AMD rig you could ever come up with.

    But I won't convince an obvious AMD fanboy, so good luck. :)

    I'm running an Intel system, so your guess about my being an AMD fanboy is wrong.
    At the time of buying, AMD was not cheap, so I went with Intel.

    If you would bother to look up actual benchmark data at Anand Bench, you would see that a new Intel is at best 25% faster in some 3D rendering apps (Cinebench; only 5% in POV-Ray) than the cheapest overclocked AMD FX, at >200% higher cost.
    In some of the other benchmarks the AMD even wins, so it all depends on which apps you run, but you can still buy 2 AMD rigs for the price of one Intel that has a small edge in some apps and loses in others.

    I've never seen any cheap Xeons; got any suggestions? What cheap mainboards?

  • JoeMamma2000 Posts: 2,615
    edited December 1969

    I beat you: I have a 1 MHz Z80 machine with 1 KB of RAM... (708 bytes available for applications), and it's still working.

    Yeah, now you're talking...

    My first computer was a Heathkit H89 with a Z80 processor...I remember staying up all night building the kit. And I finally finished and turned it on and saw ">C:\" on the screen and was jumping with joy.
