Realistically, how many years from now can we render the whole Pixar Frozen at home in one hour?

iSeeThis Posts: 552
edited June 2017 in The Commons

The number of character rigs in Frozen is 312 and the number of simulated costumes also reached 245 cloth rigs, which were far beyond all other Disney films to date. Fifty effects artists and lighting artists worked together on the technology to create "one single shot" in which Elsa builds her ice palace. Its complexity required 30 hours to render each frame, with 4,000 computers rendering one frame at a time.

The above is from Wikipedia. I wonder how many years from now we'll be able to use a home computer to render the whole movie in one hour. By the way, I still can't understand it: if Frozen needs 30 hours to render a frame (and Cars 2 needed 11 hours per frame), how could they finish in less than a hundred years? (It's said that 4,000 computers work on a frame simultaneously, which is surely no less than the best supercomputer.)

Post edited by iSeeThis on

Comments

  • Cybersox Posts: 8,950

    dungtrin said:
    The number of character rigs in Frozen is 312 and the number of simulated costumes also reached 245 cloth rigs, which were far beyond all other Disney films to date. Fifty effects artists and lighting artists worked together on the technology to create "one single shot" in which Elsa builds her ice palace. Its complexity required 30 hours to render each frame, with 4,000 computers rendering one frame at a time.

    The above is from Wikipedia. I wonder how many years from now we'll be able to use a home computer to render the whole movie in one hour. By the way, I still can't understand it: if Frozen needs 30 hours to render a frame (and Cars 2 needed 11 hours per frame), how could they finish in less than a hundred years? (It's said that 4,000 computers work on a frame simultaneously, which is surely no less than the best supercomputer.)


    Well, the inhouse renderfarm that Pixar used on Monsters University had 2000 computers with a total of 12,500 cores. However, the mega-renderfarm that Disney used for Frozen had more than double that, a whopping 30,000 cores. You can see that insane sucker here: https://www.usenix.org/sites/default/files/conference/protected-files/lisa13_geibel_johnson_slides.pdf

    So even if each frame really did take 30 hours to render (which they obviously all didn't), given that Frozen is 108 minutes, that would mean 24 frames per second X 60 seconds (1440 frames) X 108 minutes for a grand total of 155,520 frames. Multiply that by 30 hours rendertime each and it would be 4,665,600 hrs. However, divide that by 30,000 and you're only looking at 155.52 hours, or just under a week. Easy peasy, assuming that you don't suck the entire western electric grid dry. :)
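
    For anyone who wants to check or play with that arithmetic, here's a rough back-of-the-envelope sketch in Python. The 24 fps, 108-minute, 30-hours-per-frame and 30,000-core figures are the ones quoted above; treating the work as perfectly divisible across all cores is an assumption, not how a real farm actually schedules jobs:

        # Worst-case render-time estimate, mirroring the figures quoted above.
        FPS = 24                  # projection frame rate
        RUNTIME_MIN = 108         # Frozen's runtime in minutes
        HOURS_PER_FRAME = 30      # worst-case render time per frame (quoted)
        FARM_CORES = 30_000       # cores in Disney's render farm (quoted)

        total_frames = FPS * 60 * RUNTIME_MIN                 # 155,520 frames
        total_render_hours = total_frames * HOURS_PER_FRAME   # 4,665,600 hours
        wall_clock_hours = total_render_hours / FARM_CORES    # ~155.5 hours
        print(total_frames, total_render_hours, round(wall_clock_hours, 2))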

    As for the answer to the question as to when this is in the home... looks to be about 10 years using this new tech being developed by Disney - http://io9.com/disneys-new-rendering-technique-could-usher-in-a-new-e-1467435361/all

  • CypherFOX Posts: 3,401
    edited April 2014

    Greetings,
    Alternatively, using Cybersox13's numbers without a new rendering technology, you would need 2^14 times as much CPU horsepower to bring it down to about a 2-core system. So if CPU power doubles every 24 months (cf. Moore's Law), then you need 14 doublings, or 28 years.

    That's in theory; there may be switching limitations and speed-of-light concerns at that point, but...folks have been giving that caveat for a lot of years and it hasn't really slowed down that I can tell.

    [Edit: That's to match the 155 hours described above. To bring that down to under a single hour requires approximately 8 more doublings, or 16 more years, for a total of 44 years. To be clear, since the movie is more than an hour, that implies rendering projection-movie resolution at significantly better than real-time.]
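
    A quick sketch of that extrapolation, if anyone wants to check the doublings. The 155.52-hour farm figure and the 24-month doubling period are the assumptions; everything else follows from them:

        import math

        # How many 2x performance doublings a 2-core home machine would need,
        # first to match the farm, then to finish in under an hour.
        farm_cores = 30_000
        home_cores = 2
        farm_wall_clock_hours = 155.52     # from the post above
        years_per_doubling = 2             # ~24 months per doubling (assumption)

        to_match_farm = math.ceil(math.log2(farm_cores / home_cores))     # 14 doublings
        to_under_one_hour = math.ceil(math.log2(farm_wall_clock_hours))   # 8 more doublings

        print(to_match_farm * years_per_doubling)                          # ~28 years
        print((to_match_farm + to_under_one_hour) * years_per_doubling)    # ~44 years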

    -- Morgan

    Post edited by CypherFOX on
  • iSeeThis Posts: 552

    Thanks, Cybersox13. This topic is actually quite a controversy around the net, but, unbelievably, I've got the correct answer in the first reply of this small thread. I'd read a lot and it took many hours, but here I finally got the short, right answer from you in one minute! Thanks very much!

  • iSeeThis Posts: 552

    Thank you, Cypherfox. You've really broadened my knowledge.

  • Takeo.Kensei Posts: 1,303

    You're only taking CPU power into account. I'm not sure a single computer alone could render that. There's also the size of each asset, but maybe we'll have computers with many terabytes of memory in 10 years?

  • Totte Posts: 13,876

    Cypherfox said:
    Greetings,
    Alternatively, using Cybersox13's numbers without a new rendering technology, you would need 2^14 times as much CPU horsepower to bring it down to about a 2-core system. So if CPU power doubles every 24 months (cf. Moore's Law), then you need 14 doublings, or 28 years.

    That's in theory; there may be switching limitations and speed-of-light concerns at that point, but...folks have been giving that caveat for a lot of years and it hasn't really slowed down that I can tell.

    [Edit: That's to match the 155 hours described above. To bring that down to under a single hour requires approximately 8 more doublings, or 16 more years, for a total of 44 years. To be clear, since the movie is more than an hour, that implies rendering projection-movie resolution at significantly better than real-time.]

    -- Morgan

    Moore's Law only applies to CPU cores, not processors. Just look at how the GPU industry totally broke Moore's Law by going from four to 4,096 cores (or kernels) on one chip.

    The real limiting factor for CPUs is basically "the speed of light", followed by other physical laws like energy turning into heat.
    Remember that it takes today's computers to design tomorrow's computers.

  • mjc1016 Posts: 15,001

    That also assumes current rendering methods will still be used.

    By switching to a different renderer, maybe a GPU-based one, you have to work it all out again... and I'm not even going to attempt a guess, because there is no baseline to use; all I'll say is that something like Octane should be a lot quicker per frame than a CPU-based renderer (the current limit on GPU-based renderers is memory).

  • CypherFOX Posts: 3,401

    Greetings,
    Just to clarify, Moore's Law is not about CPU cores, it's about transistor density. If you use those transistors in a more efficient purpose-built manner it doesn't 'break' Moore's Law in any significant way. General purpose CPUs are not purpose-built. My approximations above are using whatever existing method they did use to render the scenes, and using the same rendering technology as I indicated, so if they're already using GPUs, it encompasses that.

    Because Moore's Law applies at a lower level, we can also extrapolate that in order for 16TB to be as common as 16GB are today, it will take a multiple of 1024, or 2^10, or 10 doublings (at 24 months per doubling) before we get there. So in 20 years what you pay for a system that (today) would have 16GB will have 16TB of RAM.
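
    The same arithmetic as a sketch, with the 24-month doubling period again being the assumption:

        import math

        # Doublings needed for 16 GB of RAM to grow to 16 TB at the same price point.
        ratio = (16 * 1024) / 16            # 16 TB / 16 GB = 1024
        doublings = math.log2(ratio)        # 10
        print(int(doublings) * 2, "years")  # ~20 years at ~24 months per doubling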

    (Again, these are extrapolations. Past performance is no guarantee of future results. But it's a pretty good indicator. ;) )

    -- Morgan

  • Totte Posts: 13,876

    Moore's Law stopped working around 2006 or so; since then the curve has flattened out.

    To compensate for the limitations of physics, multicore and manycore have been the new way.

    I saw a talk on this in 2008 by Intel's lead CPU engineering team.

  • Morpheon Posts: 738

    "Reallistically, how many years from now can we render the whole Pixar Frozen at home in one hour?" -- I guess the more appropriate question would be "Why would you want to?" I'm sorry, but "Frozen" was simply not that great a movie.

  • SickleYield Posts: 7,633

    dungtrin said:
    The number of character rigs in Frozen is 312 and the number of simulated costumes also reached 245 cloth rigs, which were far beyond all other Disney films to date. Fifty effects artists and lighting artists worked together on the technology to create "one single shot" in which Elsa builds her ice palace. Its complexity required 30 hours to render each frame, with 4,000 computers rendering one frame at a time.

    The above is from Wikipedia. I wonder how many years from now we'll be able to use a home computer to render the whole movie in one hour. By the way, I still can't understand it: if Frozen needs 30 hours to render a frame (and Cars 2 needed 11 hours per frame), how could they finish in less than a hundred years? (It's said that 4,000 computers work on a frame simultaneously, which is surely no less than the best supercomputer.)


    Well, the inhouse renderfarm that Pixar used on Monsters University had 2000 computers with a total of 12,500 cores. However, the mega-renderfarm that Disney used for Frozen had more than double that, a whopping 30,000 cores. You can see that insane sucker here: https://www.usenix.org/sites/default/files/conference/protected-files/lisa13_geibel_johnson_slides.pdf

    So even if each frame really did take 30 hours to render (which they obviously all didn't), given that Frozen is 108 minutes, that would mean 24 frames per second X 60 seconds (1440 frames) X 108 minutes for a grand total of 155,520 frames. Multiply that by 30 hours rendertime each and it would be 4,665,600 hrs. However, divide that by 30,000 and you're only looking at 155.52 hours, or just under a week. Easy peasy, assuming that you don't suck the entire western electric grid dry. :)

    As for the answer to the question as to when this is in the home... looks to be about 10 years using this new tech being developed by Disney - http://io9.com/disneys-new-rendering-technique-could-usher-in-a-new-e-1467435361/all

    Not true! Actually you can do this tomorrow with a small crew of heavily armed people. Assuming Disney doesn't see you coming, and you've got a good strategy for holding the facility long enough to get your movie rendered.

    Seriously, though, that is a sexy, sexy render farm. Of course, by the time you CAN do that at home with one computer, Disney will be using some kind of technology not yet dreamed of to render realistic virtual reality simulation for interactive theater experiences or something even more over the top, and it will seem completely obsolete. ;)

  • mjc1016 Posts: 15,001

    Seriously, though, that is a sexy, sexy render farm. Of course, by the time you CAN do that at home with one computer, Disney will be using some kind of technology not yet dreamed of to render realistic virtual reality simulation for interactive theater experiences or something even more over the top, and it will seem completely obsolete. ;)

    I'll be in Holodeck 3...

  • Cybersox Posts: 8,950

    "Reallistically, how many years from now can we render the whole Pixar Frozen at home in one hour?" -- I guess the more appropriate question would be "Why would you want to?" I'm sorry, but "Frozen" was simply not that great a movie.

    And no Frozen thread would be complete without that. :roll:

    Seriously, whether someone personally liked it or not, the fact is that a large percentage of the world's filmgoers apparently did, given that it came out on DVD & Blu-ray nearly a month ago and is still in the top twenty in U.S. theaters. (Not to mention that it looks set to end up as the #8 all-time box office champion.)

    And needless to say, the technical aspects of the film are dazzling, which I believe was the point of the original post.

  • Richard Haseltine Posts: 99,525
    edited April 2014

    mjc1016 said:
    Seriously, though, that is a sexy, sexy render farm. Of course, by the time you CAN do that at home with one computer, Disney will be using some kind of technology not yet dreamed of to render realistic virtual reality simulation for interactive theater experiences or something even more over the top, and it will seem completely obsolete. ;)

    I'll be in Holodeck 3...

    No way am I going into a holodeck - I'd sooner be a redshirt on an away mission.

    Post edited by Richard Haseltine on
  • robkelk Posts: 3,259

    "Reallistically, how many years from now can we render the whole Pixar Frozen at home in one hour?" -- I guess the more appropriate question would be "Why would you want to?" I'm sorry, but "Frozen" was simply not that great a movie.

    Was the threadcrapping really necessary?

    Maybe you could do a better job - which would require you to render an entire movie, thus bringing us back on-topic.

  • Takeo.Kensei Posts: 1,303

    "Reallistically, how many years from now can we render the whole Pixar Frozen at home in one hour?" -- I guess the more appropriate question would be "Why would you want to?" I'm sorry, but "Frozen" was simply not that great a movie.

    Technically every time these people with each movies push the limit further. You should have a look at what was behind in term of research and dev just for the snow

    The movie is for kid so that is not always everyone's taste but I was impressed by the tech all along.


    mjc1016 said:
    Seriously, though, that is a sexy, sexy render farm. Of course, by the time you CAN do that at home with one computer, Disney will be using some kind of technology not yet dreamed of to render realistic virtual reality simulation for interactive theater experiences or something even more over the top, and it will seem completely obsolete. ;)

    I'll be in Holodeck 3...

    No way am I going into a holodeck - I'd sooner be a redshirt on an away mission.

    I'll be doing stuff with my quantum supercomputer

  • Subtropic Pixel Posts: 2,379

    Frozen was a fabulous movie! The first one I've seen in 5 years with an actual plot twist. Great visuals, and it had everything but sex. Happiness, family, tragedy, loss, violence, passion, love, kindness, deviance, deception and betrayal, loyalty, mystery, beautiful scenery, great characterizations, awesome music and singing, and a talking snowman! Without a brain!

    I am sure we won't have the technology to do that piece of work on any one single computer system within the lifetime of anybody who can read this. And to have any hope, I think we need to prevent World War III from happening. First things first, please.

  • starionwolf Posts: 3,670

    Frozen? I think that movie is on DVD now? I usually don't follow movies much. I haven't even seen the trailer for Frozen.

  • ghastlycomic Posts: 2,531

    "Reallistically, how many years from now can we render the whole Pixar Frozen at home in one hour?" -- I guess the more appropriate question would be "Why would you want to?" I'm sorry, but "Frozen" was simply not that great a movie.

    Technically every time these people with each movies push the limit further. You should have a look at what was behind in term of research and dev just for the snow

    The movie is for kid so that is not always everyone's taste but I was impressed by the tech all along.


    mjc1016 said:
    Seriously, though, that is a sexy, sexy render farm. Of course, by the time you CAN do that at home with one computer, Disney will be using some kind of technology not yet dreamed of to render realistic virtual reality simulation for interactive theater experiences or something even more over the top, and it will seem completely obsolete. ;)

    I'll be in Holodeck 3...

    No way am I going into a holodeck - I'd sooner be a redshirt on an away mission.

    I'll be doing stuff with my quantum supercomputer

    20 years from now I'll be almost 70, so I'll probably be getting ready to retire this ugly bag of mostly water and go for a nice comfortable Vite-Rack.

  • arcady Posts: 340

    You're only taking CPU power into account. I'm not sure a single computer alone could render that. There's also the size of each asset, but maybe we'll have computers with many terabytes of memory in 10 years?

    In my opinion the biggest issue is going to be heat, not CPU power.

    We almost "hit the limit" on Moore's law in the 1990s when the Pentiums around the time of Windows 95 started melting in the labs... took them a bit of time to figure out how to keep them cool.

    Eventually we will simply hit the limit of physics on keeping things cool - and Moore's law itself is capped by the size of a molecule that can conduct, and the speed of light - but I don't know how far we are from those limits (all electricity is already going at close to the speed of light - and you can't speed that up... so you run the limit when you hit the molecule limit).

    Unless somebody proves Einstein was an idiot - we're going to hit a point at some time in the next 50-200 years where we will have invented everything that can be invented, but are still stuck on this rock.

  • Herald of Fire Posts: 3,504
    edited August 2014

    Totte said:
    Moore's Law only applies to CPU cores, not processors. Just look at how the GPU industry totally broke Moore's Law by going from four to 4,096 cores (or kernels) on one chip.

    The real limiting factor for CPUs is basically "the speed of light", followed by other physical laws like energy turning into heat.
    Remember that it takes today's computers to design tomorrow's computers.

    Actually, the real limiting factor now is the real-world physics of getting more transistors onto a chip. We've been shortening the gap between them for quite some time, but we're now getting close to the point where quantum physics physically prevents them from getting any smaller. The smaller things get, the more erratic their behaviour. At very small scales, this means electrons can occasionally make the jump between transistors that you didn't want them to, which would corrupt the data stored in the system and produce incorrect results in a calculation.

    New materials might be able to fix this to some degree, but that limit will ultimately still define just how much processing power we can squeeze out of any new CPU we produce, so sooner or later we're going to have to move from conventional PCs to something different, such as quantum computing.

    Eventually we will simply hit the limit of physics on keeping things cool - and Moore's law itself is capped by the size of a molecule that can conduct, and the speed of light - but I don't know how far we are from those limits (all electricity is already going at close to the speed of light - and you can't speed that up... so you run the limit when you hit the molecule limit).

    A minor correction. While the energy wave moves at the speed of light, the speed of the electrons is actually considerably slower. Since it's these electrons which ultimately store the information inside the transistors, we'll need to revise the way we look at computer design entirely in order to build upon what we've already accomplished.
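
    For the curious, here's a rough illustration of just how slowly the electrons themselves drift, using standard textbook values for copper; the 1 A current and 1 mm² cross-section are example numbers, not anything specific to a CPU:

        # Electron drift velocity v = I / (n * A * q) in a copper conductor.
        I = 1.0           # current in amperes (example value)
        A = 1e-6          # cross-section in m^2 (a 1 mm^2 wire, example value)
        n = 8.5e28        # free electrons per m^3 in copper (textbook value)
        q = 1.602e-19     # elementary charge in coulombs

        v = I / (n * A * q)
        print(f"{v * 1000:.3f} mm/s")   # ~0.07 mm/s, while the signal itself moves near light speed
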
    Post edited by Herald of Fire on
  • iSeeThis Posts: 552

    I've gained a lot of knowledge from this thread; it's the best thread I've started here. Thank you, guys. You're really top-notch.

  • Rashad Carter Posts: 1,799

    Yes, but quantum computing would truly change everything.

    If you could entangle the spins of multiple electrons as qubits, then in theory one could calculate thousands to millions of operations in what we would call an "instant." It wouldn't require lots of heat; in fact, there would be no thermal footprint at all because no energy was exchanged. The electrons didn't move, they just spun up or down during a given calculation.

    The issue with quantum computing isn't the speed of the calculations, as adding more qubits quickly increases the processing power of the system, but the uploading of the data for processing in the first place and the downloading of that data once calculated for later use. How we can get memory access fast enough to take advantage of a quantum qubit system is, for me, the real question.
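
    As a toy illustration of why adding qubits scales so fast: an n-qubit register is described by 2^n amplitudes, so each extra qubit doubles the size of the state a classical machine would have to track. A minimal sketch, nothing more:

        # Each added qubit doubles the number of amplitudes in the state vector.
        for n_qubits in (1, 10, 20, 30, 40):
            amplitudes = 2 ** n_qubits
            print(n_qubits, "qubits ->", amplitudes, "amplitudes")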

    This is a really fun topic to pontificate upon.

  • iSeeThis Posts: 552

    >> This is a really fun topic to pontificate upon.

    It's fun because someone like you, Rashad Carter, hands us what you know and what you think. Thank you very much!

  • StratDragon Posts: 3,167

    Frozen? I think that movie is on DVD now? I usually don't follow movies much. I haven't even seen the trailer for Frozen.


    There's an Oscar Wilde zinger in there somewhere, but damned if I know what it is. Do yourself a favor and go buy the movie on DVD (and buy yourself a DVD player if you don't own one), if only to see this film.

  • Cybersox Posts: 8,950


    I am sure we won't have the technology to do that piece of work on any one single computer system within the lifetime of anybody who can read this. And to have any hope, I think we need to prevent World War III from happening. First things first, please.

    That depends upon how you define "computer" and how much that computer costs. The future of computers is currently going in the direction of small home terminals linked to the cloud, and you can already tap into almost unlimited render space that way. That redefines the real limitations down to 1) the speed of internet connections and data transfers and 2) the actual cost of renting that much virtual computer time. Given that the former HAS to increase in order to continue servicing the streaming video markets as the industry moves from 1080 to 4K, that companies like Amazon are making online computing cheaper and cheaper, and that full 4K animated feature films like the Blender Foundation's Big Buck Bunny have already been produced online in exactly this fashion, my guess is that we'll easily hit the possibility of making a Frozen-level film online within the next ten years. At that point the only real question becomes how much it would actually cost in terms of services and how long it would take to input and process all the data.
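
    As a very rough sketch of what renting that time might cost: the core-hour total is the worst-case figure worked out earlier in the thread, and the price per core-hour is purely an illustrative assumption, not a quote from any provider:

        # Ballpark cost of renting the render time from a cloud provider.
        total_core_hours = 4_665_600          # worst-case estimate from earlier in the thread
        price_per_core_hour = 0.05            # USD, illustrative assumption only
        print(f"${total_core_hours * price_per_core_hour:,.0f}")   # ~$233,280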

  • Sfariah D Posts: 26,136

    I use my computer as a movie player. I actually bought myself a Blu-ray player, but right now my mum is borrowing it, as she needed it more than I did. She had an old DVD player that was broken. She has a big-screen TV and the DVD player could not play videos correctly; it kept fading in and out, which made dark scenes hard to watch.
    I have a Blu-ray writer on my desktop which can play Blu-ray and DVD movies. Also, my computer has a TV tuner. I converted my TV into a second monitor and I actually use the computer to record TV shows.

    Isn't Frozen a bit over an hour long? I do not think those render farms that Pixar has can render Frozen in one hour.

  • LordHardDriven Posts: 937
    edited August 2014

    I used to use, as a benchmark for future technological advances, "When they make a video card with 1 gigabyte or more of onboard memory." However, that's been done for some time now, so I've had to re-evaluate my benchmark and revise it to "When they make a video card with 1 terabyte or more of onboard memory." :)

    Post edited by LordHardDriven on
  • Taoz Posts: 9,883
    edited August 2014

    Someone predicted many years ago that we'd end up using some sort of organic/biological computers. Just imagine how fast our brains are working - when you dream or daydream/visualize you're actually rendering a complete movie in real time. They're actually working on it now:

    "In a previous Nanowerk Spotlight we reported on the concept of a full-fledged massively parallel organic computer at the nanoscale that uses extremely low power ("Will brain-like evolutionary circuit lead to intelligent computers?"). In this work, the researchers created a process of circuit evolution similar to the human brain in an organic molecular layer. This was the first time that such a brain-like 'evolutionary' circuit had been realized.

    The research team, led by Dr. Anirban Bandyopadhyay, a senior researcher at the Advanced Nano Characterization Center at the National Institute of Materials Science (NIMS) in Tsukuba, Japan, has now finalized their human brain model and introduced the concept of a new class of computer which does not use any circuit or logic gate.

    Over the past decades, digital computers have consistently increased in speed and complexity – China's Tianhe-2, currently the fastest supercomputer in the world, can execute a blistering 33.86 petaFLOPS, or 33.86 quadrillion floating point operations per second. Nevertheless, these machines are limited by their reliance on sequential processing of instructions; i.e. no matter how fast they are, they still process only one bit at a time.

    By contrast, individual neurons in our brain are very slow: they fire at only about 1000 times per second; however, since they are operating in a massively parallel way, with millions of neurons working collectively, they are able to complete certain tasks more efficiently than even the fastest supercomputer. Another important distinction of our brain is that, during computing, information processing circuits evolve continuously to solve complex problems....

    The brain jelly architecture has negligible power consumption. While existing supercomputers have enormous power requirements, a brain jelly computer works on a few watts only."

    http://www.nanowerk.com/spotlight/spotid=34328.php

    Post edited by Taoz on
  • StratDragon Posts: 3,167

    The production of this film was carried out by hundreds of highly skilled technicians who were tasked with pushing the limits of 3D animation and optimizing every aspect of the images to make rendering as efficient as possible. The time needed to render would undoubtedly be significantly longer for one person working alone, without all the insights of the professional teams who do this for a living. A light, a group of lights, or a reflective surface seen from different angles can be optimized to render very quickly with the right set of variables, or can slow a render to a trickle without the know-how of the science behind the modeling techniques required to produce those results.

    That being said, CPU gains have slowed drastically from the leaps they were making a few years ago, and so have internal storage and RAM capacity; we may see the same slowdown in GPU power next, and it may last a while. All things considered, the computing power needed to produce Pixar's Monsters U was a supercomputer with 24,000 cores, taking approximately 29 hours to render a single frame of that film; on a single core, the oft-quoted figure is that the whole film would have taken something like 10,000 years to render. Personally, I don't see rendering a film like Frozen in an hour happening in our lifetimes.

    No, Homer. Very few cartoons are broadcast live. It's a terrible strain on the animators' wrists.
