Realistically, how many years from now can we render the whole Pixar Frozen at home in one hour?


Comments

  • arcady Posts: 340

    Pixar is a 1 square block walled compound in Emeryville, CA - just over the border from Oakland. Actually about a half mile from my work.

    Half of the space inside is a giant lawn, and the buildings in there are about 4 to 6 stories tall (been a while since I looked in).

    Probably not hundreds of them in there, but I suspect a few of those buildings hold nothing but render farm machines.

    Half the space likely goes to executives, sales types, and space for the celebrity voice actors. Everybody else is probably stuck in a tiny cubicle off to the left... :)

  • SickleYield Posts: 7,633

    arcady said:
    Pixar is a 1 square block walled compound in Emeryville, CA - just over the border from Oakland. Actually about a half mile from my work.

    Half of the space inside is a giant lawn, and the buildings in there are about 4 to 6 stories tall (been a while since I looked in).

    Probably not hundreds of them in there, but I suspect a few of those buildings hold nothing but render farm machines.

    Half the space likely goes to executives, sales types, and space for the celebrity voice actors. Everybody else is probably stuck in a tiny cubicle off to the left... :)

    I bet they have the best air conditioning on the Western seaboard in there, though. :D

  • arcady Posts: 340

    Probably. Now if they'd built on the other side of the bay - by the ocean, it's so cold and foggy you can skip that. :)

    (Actually that might be one reason they stay in Emeryville - of the East Bay cities it's pretty cool.)

  • DustRider Posts: 2,717

    arcady said:
    Pixar is a 1 square block walled compound in Emeryville, CA - just over the border from Oakland. Actually about a half mile from my work.

    Half of the space inside is a giant lawn, and the buildings in there are about 4 to 6 stories tall (been a while since I looked in).

    Probably not hundreds of them in there, but I suspect a few of those buildings hold nothing but render farm machines.

    Half the space likely goes to executives, sales types, and space for the celebrity voice actors. Everybody else is probably stuck in a tiny cubicle off to the left... :)


    It's hard to say just how many people work at Pixar, but I'd say there are easily over 500. I just did a rough count of the parking lot on Bing Maps and got about 450 cars. Keep in mind that they have more than one film in production (4 or more?) at the same time, and few staff work on more than one film.

    There was a news special on Pixar a few years back, and it definitely looks like one of the best places to work. A lot of great perks, like a volleyball court, basketball court, swimming pool, exercise room, free breakfast bar, restaurant, and of course a lot of really smart, talented, and creative people to work with.

    Since you are that close, you should really try to get a tour of the studio. From what I understand, it's not easy, but it's not impossible, even though their official stance is "No Public Tours". I know if I was that close, after seeing the news special, I'd definitely try to get a tour. Here is a link to some ideas to get a tour: http://www.wikihow.com/Get-a-Tour-of-Pixar-Studios

  • jorge dorlando Posts: 1,156

    I used to use as a benchmark for future technological advances, "When they make a video card with 1 gigabyte or more of onboard memory". However, that's been done for some time now, so I've had to re-evaluate my benchmark and revise it to, "When they make a video card with 1 terabyte or more of onboard memory" :)

    Well, still far from what you expect, but next month the next generation of PCs arrives:
    http://www.redsharknews.com/technology/item/1920-the-next-generation-of-pcs-is-about-to-arrive?utm_source=www.lwks.com+subscribers&utm_campaign=63c9da3053-RSN_Aug19_2014&utm_medium=email&utm_term=0_079aaa3026-63c9da3053-75117061

  • RorrKonn Posts: 509
    edited August 2014

    All games run in real time and they look killer.
    With real-time render engines like Octane, which work with DAZ Studio, Unity, and Unreal, you could render Frozen in real time today.

    So a two-hour movie renders in two hours.
    Maybe it's not one hour, but two hours is still not that bad.

    Post edited by RorrKonn on
  • nemesis10 Posts: 3,390

    I was once given a tour of ILM (with my brother) by the Director of Compositing, and we got a peek at the ILM (Lucas Studios) render farm; it was impressive. They had just finished some stuff for one of the Pirates of the Caribbean movies, so it was idle, but the place was sprawling and linked with fibre-optic cable. My sister was doing some job interviews at the same time (at the now-defunct LucasArts, and at Pixar), and her impression was that Pixar was all smaller in scale, but either company books other render farms all the time.

  • Sfariah D Posts: 26,136

    The actual question, I think, should be: how long until a regular person can remake the movie Frozen at home in under a year? That includes everything from getting the idea to remake the movie until it is finished rendering.

  • nicstt Posts: 11,715
    edited September 2015
    Totte said:

     

    Cypherfox said:

    Greetings,
    Alternatively, using Cybersox13's numbers without a new rendering technology, you would need 2^14 times as much CPU horsepower to bring it down to about a 2-core system. So if CPU power doubles every 24 months (cf. Moore's Law), then you need 14 doublings, or 28 years.

     

    That's in theory; there may be switching limitations and speed-of-light concerns at that point, but...folks have been giving that caveat for a lot of years and it hasn't really slowed down that I can tell.

    [Edit: That's to match the 155 hours described above. To bring that down to under a single hour requires approximately 8 more doublings, or 16 more years, for a total of 44 years. To be clear, since the movie is more than an hour, that implies rendering projection-movie resolution at significantly better than real-time.]

    -- Morgan
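    As a rough sketch of that doubling arithmetic in Python (the 155-hour figure, the 2^14 farm-to-home ratio, and the 24-month doubling period are all the thread's assumptions, not measured numbers):

        import math

        # Restating the doubling arithmetic quoted above.
        farm_to_home_ratio = 2 ** 14   # "2^14 times as much CPU horsepower" vs. a 2-core box
        farm_hours = 155               # render time quoted earlier in the thread
        target_hours = 1               # "render the whole movie in one hour"
        doubling_period_years = 2      # Moore's Law taken as one doubling every 24 months

        total_speedup = farm_to_home_ratio * (farm_hours / target_hours)
        doublings = math.ceil(math.log2(total_speedup))
        years = doublings * doubling_period_years

        print(f"total speed-up needed: ~{total_speedup:,.0f}x")
        print(f"doublings: {doublings}, i.e. roughly {years} years")
        # -> 22 doublings (14 + about 8), roughly 44 years, matching the estimate above.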

     

    Moore's Law only applies to CPU cores, not processors. Just look at how the GPU industry totally broke Moore's Law by going from four to 4,096 cores, or kernels, in one chip.

    The real limiting factors for CPUs are basically "the speed of light", followed by other physical laws like energy turning into heat.
    Remember that it takes today's computers to design tomorrow's computers.

    Quantum tunnelling is an issue at such small microscopic scales; it is one of the reasons why silicon is close to end of life. Quantum tunnelling is where electrons don't follow the path on the silicon wafer, but 'tunnel' through to somewhere else (which is apparently unknown). There is a finite chance of quantum tunnelling at any time, but the continued shrinkage in dies increases the likelihood.

    The production of this film was carried out by hundreds of highly skilled technicians who were tasked with pushing the limits of 3D animation and optimizing every aspect of the images to make rendering as efficient as possible. The time needed to render would undoubtedly be significantly longer for one person, who would not have all the insights of professional teams who do this for a living. A light, a group of lights, or a reflective surface seen from different angles can be optimized to render very quickly with the right set of variables, or can slow a render to a trickle without the know-how of the science behind the modelling techniques required to produce those results.
    That being said, CPU gains have slowed drastically from the leaps they were making a few years ago, and so have internal storage and RAM capacity; we may see the same decline with GPU power next, and that slowdown may last a while. All things considered, the computing power used to produce Disney's Monsters U was a supercomputer with 24,000 cores, and it took approximately 29 hours to render a single frame of that film; on a single core of that computer, the whole film would have taken on the order of 10,000 years. Personally, I don't see rendering a film like Frozen in an hour happening in our lifetimes.
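    To tie that figure back to the thread's question, here is the same kind of back-of-the-envelope arithmetic as a short Python sketch; the 8-core home machine and the 24-month doubling period are assumptions, and the starting point is the "roughly 10,000 years on a single core" figure above:

        import math

        # Back-of-the-envelope: starting from "roughly 10,000 years of single-core
        # time for the whole film" (the figure above), how big a speed-up does a
        # home machine need to finish the render in one hour?
        single_core_hours = 10_000 * 24 * 365   # ~87.6 million core-hours
        home_cores = 8                           # assumed home machine
        target_hours = 1
        doubling_period_years = 2                # one doubling every 24 months

        speedup_needed = single_core_hours / (home_cores * target_hours)
        doublings = math.log2(speedup_needed)
        years = doublings * doubling_period_years

        print(f"speed-up needed: ~{speedup_needed:.1e}x")
        print(f"~{doublings:.0f} doublings, i.e. roughly {years:.0f} years at that pace")
        # -> about 23 doublings, roughly 47 years, in the same ballpark as the
        #    Moore's Law estimate earlier in the thread.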

    No, Homer. Very few cartoons are broadcast live. It's a terrible strain on the animators' wrists.


    I'd be cautious about making predictions. Consider the last fifty years of computers, and what was possible, or mainly not, and what now is possible.

    Personally, I'm looking forward to seeing what happens next; and I keep wondering if Quantum computers will become mainstream; can't help but feel the various agencies would love their ability to read everyone's encrypted data to remain theirs alone. But then there would be Quantum Cryptography - apparently it is literally unbreakable, as it takes advantage of the fact that observing an atom fixes it in a state. So strange...

    Post edited by nicstt on
  • nicstt said:
    Quantum tunnelling is an issue at such small microscopic scales; it is one of the reasons why silicon is close to end of life. Quantum tunnelling is where electrons don't follow the path on the silicon wafer, but 'tunnel' through to somewhere else (which is apparently unknown). There is a finite chance of quantum tunnelling at any time, but the continued shrinkage in dies increases the likelihood.

    Quantum tunneling is when an electron (or other particle) gets from point A to point B despite an apparent barrier. In a semi-conductor device that means that there may be a 1 where there should be a 0, or vice versa. The electron doesn't disappear.

  • SimonJM Posts: 5,971

    Way past time to comment, but isn't/wasn't Moore's Law based on a doubling every 18 months, not 24?

  • nicstt Posts: 11,715
    edited September 2015
    nicstt said:
    Quantum tunnelling is an issue at such small microscopic scales; it is one of the reasons why silicon is close to end of life. Quantum tunnelling is where electrons don't follow the path on the silicon wafer, but 'tunnel' through to somewhere else (which is apparently unknown). There is a finite chance of quantum tunnelling at any time, but the continued shrinkage in dies increases the likelihood.

    Quantum tunneling is when an electron (or other particle) gets from point A to point B despite an apparent barrier. In a semi-conductor device that means that there may be a 1 where there should be a 0, or vice versa. The electron doesn't disappear.

    Never said it did disappear; its destination is unknown, which isn't the same thing. And whilst it can be other particles, it shouldn't be in electronics, but I suppose it's not impossible due to the nature of Quantum Mechanics; it is just such a weird field, fascinating though.

    Post edited by nicstt on
  • SimonJM Posts: 5,971
    nicstt said:
    Quantum tunnelling is an issue at such small microscopic scales; it is one of the reasons why silicon is close to end of life. Quantum tunnelling is where electrons don't follow the path on the silicon wafer, but 'tunnel' through to somewhere else (which is apparently unknown). There is a finite chance of quantum tunnelling at any time, but the continued shrinkage in dies increases the likelihood.

    Quantum tunneling is when an electron (or other particle) gets from point A to point B despite an apparent barrier. In a semi-conductor device that means that there may be a 1 where there should be a 0, or vice versa. The electron doesn't disappear.

    If memory serves, that is the explanation for Hawking radiation (energy 'given out' by a black hole), isn't it?

  • Joe Cotter Posts: 3,259
    edited September 2015

    Predicting the future based on current trends provides many potholes of error. New things come from left field that revolutionize everything we know or think, and technology that people expect to be around the corner continues to be 'just around the corner' 30 years later. But for speculation's sake, there is HP's take on the future of computing. On a related note, as to quantum computing: I don't have a link unfortunately, but there were a couple of articles recently that explained how quantum computing isn't really the silver bullet everyone has thought it to be. It is good for certain scenarios but, it turns out, provides no improvement over traditional designs for much of computing (at least according to the articles I'm referring to; again, apologies for not having the link).

    As to how long it would take, re: aspects beyond a simple increase in computing power... that comes down to refined tools that make once-complex tasks more approachable, more efficient, and more intuitive than current tools, which should also continue to evolve, but who knows how or how fast. Speculation can be fun, but it remains speculation. ;)

    If I was a betting man though, I think I'd bet the future would involve optical.

    Post edited by Joe Cotter on
  • Gedd said:
     

    If I was a betting man though, I think I'd bet the future would involve optical.

    I say it will be optical-mental.  The interface would be mounted at birth, Matrix-like, to the back of the human head.  The interface itself could be fibre optic with a small interchangeable module on the back of your head, maybe with WiFi or Bluetooth.  There would be no electrical exposure, and it should therefore be waterproof and dirtproof.

    Hopefully the Nvidia OptiMent Plug-N-Think with CUDA (tm) interface connector will be very small and won't interfere with ladies' hairstyles or bald men's shaving practices.  surprise

     

  • iSeeThis said:

     

    Wonder how many years until we can use a home-user computer to render the whole movie in one hour.

    Render only, or to set up the scene? I think within my lifetime (I'm 41 right now) it'll be possible... The advances in computing science have been astounding so far... When I was in school, a TRS-80 or an IBM 8088 with cassette tape storage was a highly desirable machine, and in high school, when 3 1/2" floppies and 100 MB hard drives ruled, we marveled at the popular science article about a 1" cube being used to store 1 GB of data, and couldn't even fathom how much 1 GB really was. Now we have 1 TB flash drives... With light-based and quantum computers on the horizon, I'm not even going to try and predict what the future will hold; it's just not possible. History has shown most of those who say "we won't need more than X" to be wrong more often than not.

     

    Now as far as set-up goes... unless they come up with a user interface that makes the keyboard/mouse obsolete (such as some sort of mind-reading halo or direct brain interface), the limiting factor here will always be the person setting up the scene. It takes me days to put a scene together and choose lighting/materials/placement to get it to a point I like. If we get to the point where we can just think a scene and it appears, I suspect rendering computer images, and even watching them, will to some degree lose much of their entertainment/art-form value.


  • That being said, CPU gains have slowed drastically from the leaps they were making a few years ago, and so have internal storage and RAM capacity; we may see the same decline with GPU power next, and that slowdown may last a while.

    I'd agree that if we stick with the current silicon transistor tech, Moore's Law will fade, as is already being noted in the tech industry... however, I don't see silicon-based computers being the final form. If quantum or light-based computers take over, we may see a whole new Moore's Law in the next 10-20 years.

  • Meh, in the future they will all be organic and able to grow the things you need. You will interface with them via brain synapses, Matrix-like, but on a more organic, chemical level. As our DNA mutates we shall become more like them, until human and organic computer are one thing; we shall have living spacecraft, cities, etc., growing their own fungi and vegetation to feed themselves and recycling their wastes and ours.

    Movies and animation as such will be chemical mind experiences.

  • Meh, in the future they will all be organic and able to grow the things you need. You will interface with them via brain synapses, Matrix-like, but on a more organic, chemical level. As our DNA mutates we shall become more like them, until human and organic computer are one thing; we shall have living spacecraft, cities, etc., growing their own fungi and vegetation to feed themselves and recycling their wastes and ours.

    Movies and animation as such will be chemical mind experiences.

    Personally, I don't discount this as a possibility, but I really hope not to live to see it.

     

  • daveleitz Posts: 459
    edited September 2015

    Rendering a movie is so 20th Century!  cheeky

    Seriously, video games can already be played in 4k resolutions and look pretty danged good, too!  Fallout 4 is coming out soon, and you can bet that within a short amount of time there will be mods to add even more "realism" to the game.  The future is interactive, and the gaming industry is outpacing Hollywood already.  So, while a small portion of the CG enthusiast market frets over how long their still frames render, the vast majority of development and money is going into real time rendering using increasingly powerful gaming engines.

    Post edited by daveleitz on
  • Valandar Posts: 1,417

    One thing to note - Rendering is very nearly the last thing for making a movie like Frozen. You also have:

    1. Plotting
    2. Storyboarding
    3. Scripting
    4. Art Direction
    5. Concept Art
    6. Modelling
    7. Rigging
    8. Texturing
    9. Voice Acting
    10. Animating
    11. Lighting
    12. Cinematography
    13. Sound Design / editing
    14. And FINALLY rendering.
    15. Then comes compositing, editing, etc.

    Oh, and I've seen a minor misunderstanding crop up in some posts - if a single frame of "Frozen" takes 29 hours, that doesn't mean their 20,000 cores (or what have you) EACH took 29 hours, it means 29 CPU-hours. So for 20,000 cores, that would be roughly 5.22 seconds per frame. That's like references to a manufactured product requiring so many "man-hours" to produce - a total of all workers on the project, and how long each worked on that project, not the time from start to finish.
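    In rough numbers (a quick sketch; the 20,000-core figure is just the round number used above):

        # Valandar's point in numbers: "29 hours per frame", read as CPU-hours and
        # spread across the farm, is only a few seconds of wall-clock time per frame.
        cpu_hours_per_frame = 29
        cores = 20_000                      # round figure used above

        wall_clock_seconds = cpu_hours_per_frame * 3600 / cores
        print(f"{wall_clock_seconds:.2f} s of wall-clock time per frame")   # ~5.22 s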

  • IceCrMn Posts: 2,127

    I got that number from an article I read ( http://jimhillmedia.com/editor_in_chief1/b/jim_hill/archive/2013/10/09/countdown-to-disney-quot-frozen-quot-the-flaky-design-idea-behind-the-look-of-elsa-s-ice-palace.aspx )

    "..... Jennifer Lee, co-director of Disney "Frozen." "And that shot is so complex that just one frame took 30 hours to render. That's a perfect example of how much the 'Frozen' production team has put into this movie. And it really shows. The finished film is just breathtaking." " .

    I'm sure not every frame took that long, but IMO, the wait was worth it. :)

     

  • nicstt Posts: 11,715

    There is something else to bear in mind; that post was two years ago, and Frozen started a year or so before that; there have been advances since then.

    Just look at what this generation of gfx cards can do, and what Nvidia is expecting from its next generation.

    Also, as technology develops and speeds increase, there is a constant desire to get the hardware and software to do more; so it isn't just rendering faster, it is adding features onto that rendering process - more so with real-time than static, but not completely. So sometimes there might not seem to be much of a speed increase, but the rendering engine is doing far more, and a little quicker too.

    Take the simple process of increasing resolution: that increases render times simply because of the sheer increase in the number of pixels being rendered compared with before. Did they render it at 1080 or higher and then reduce the size, or pick the exact portion of the rendered scene they wanted to appear?
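    The resolution part, at least, is easy to put numbers on; a rough sketch, assuming render time scales roughly in proportion to pixel count (all else being equal):

        # Pixel counts at common delivery resolutions; all else being equal, render
        # time grows roughly with the number of pixels, so 1080p -> 4K is about a
        # 4x jump on its own.
        resolutions = {
            "1080p":  (1920, 1080),
            "UHD 4K": (3840, 2160),
            "DCI 4K": (4096, 2160),
        }
        base = 1920 * 1080
        for name, (w, h) in resolutions.items():
            print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")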

  • StratDragon Posts: 3,167
    edited September 2015
    nicstt said:
    Totte said:


    I'd be cautious about making predictions. Consider the last fifty years of computers, and what was possible, or mainly not, and what now is possible.

    No prediction was made by me. I said "personally" because it was my opinion.

     

    P.S. The quoting system on this software is not liking my browser, or it works poorly; I can't tell.

    Post edited by StratDragon on
  • larsmidnatt Posts: 4,511
    edited September 2015
    nicstt said:
     

    Take the simple process of increasing resolution: that increases render times simply because of the sheer increase in the number of pixels being rendered compared with before. Did they render it at 1080 or higher and then reduce the size, or pick the exact portion of the rendered scene they wanted to appear?

    I don't know, but I would imagine they would render at IMAX resolutions. I thought they showed this in IMAX, but I could be wrong.

     

    StratDragon, I hate the quote system too.

    Post edited by larsmidnatt on
  • I'm holding my breath for graphene processors. IBM got a wireless receiver made out of graphene working that was 10,000x faster than its silicon counterpart. There are obvious and monstrous hurdles to overcome, but IBM has dumped around $3b into development of the technologies and expects to have a working processor by as early as 2019. I believe right now the theoretical clock is 1 THz.

    That's 2 orders of magnitude faster than our current processors, and then some. It's the same step as going from MB to GB. It's big.

  • IceCrMn Posts: 2,127

    I'm holding my breath for graphene processors. IBM got a wireless receiver made out of graphene working that was 10,000x faster than its silicon counterpart. There are obvious and monstrous hurdles to overcome, but IBM has dumped around $3b into development of the technologies and expects to have a working processor by as early as 2019. I believe right now the theoretical clock is 1 THz.

    That's 2 orders of magnitude faster than our current processors, and then some. It's the same step as going from MB to GB. It's big.

    That's amazing. The highest clock speed I've ever read about is 10 GHz on an AMD CPU using liquid nitrogen as the coolant. I think I read somewhere that 5 GHz is more or less the max stable frequency that can be had from silicon, and that's one of the reasons we don't see consumer CPUs exceeding much more than 4 GHz from the factory.

  • The 1 THz number is the theoretical limit right now, I believe. I would expect a commercial-grade processor to be ~200 GHz if that were the case. This is all speculation at this point, however. Still exciting.

  • nicstt Posts: 11,715
    edited September 2015
    nicstt said:

    Take the simple process of increasing resolution: that increases render times simply because of the sheer increase in the number of pixels being rendered compared with before. Did they render it at 1080 or higher and then reduce the size, or pick the exact portion of the rendered scene they wanted to appear?

    I don't know, but I would imagine they would render at IMAX resolutions. I thought they showed this in IMAX, but I could be wrong.

    StratDragon, I hate the quote system too.

    I was reading/watching about the 4K resolution, and how they film at larger resolutions already, because it allows them to pick the exact part they want.

    We're used to increases like 1, 2, 3, 4... that is a linear increase; it shows on a graph as a straight line. With Moore's Law, however, we've had exponential increases; this has largely kept pace if one considers not just the speed increases but other factors that determine a processor's performance. This kind of increase, which doubles every time, doesn't show great gains immediately: 1, 2, 4, 8, 16, 32, 64, 128; fairly quickly, though, it does show amazing gains. We have got to the stage whereby those changes, presuming Moore's Law continues to hold true, are going to be truly massive.

    A BBC program, Horizon, said that the processing power of a PS4 is greater than that of a military supercomputer of fifteen years ago; now that is pretty damn staggering.

    Think of what a supercomputer can do now; in fifteen years' time, the PS10 will match it? cool

    Post edited by nicstt on
  • IceCrMn Posts: 2,127

    Judging from what we have today, in 15 more years the line between what is real and what isn't will be impossible to discern with the naked eye. The actors on one's favorite TV show could be Daz Studio characters and you wouldn't be able to tell that they are only CGI.

     

    ...or have we crossed that line already?

    nicstt said:
    nicstt said:
     

    Take the simple process of increasing resolution: that increases render times simply because of the sheer increase in the number of pixels being rendered compared with before. Did they render it at 1080 or higher and then reduce the size, or pick the exact portion of the rendered scene they wanted to appear?

    I don't know, but I would imagine they would render at IMAX resolutions. I thought they showed this in IMAX, but I could be wrong.

    StratDragon, I hate the quote system too.

    I was reading/watching about the 4K resolution, and how they film at larger resolutions already, because it allows them to pick the exact part they want.

    We're used to increases like 1, 2, 3, 4... that is a linear increase; it shows on a graph as a straight line. With Moore's Law, however, we've had exponential increases; this has largely kept pace if one considers not just the speed increases but other factors that determine a processor's performance. This kind of increase, which doubles every time, doesn't show great gains immediately: 1, 2, 4, 8, 16, 32, 64, 128; fairly quickly, though, it does show amazing gains. We have got to the stage whereby those changes, presuming Moore's Law continues to hold true, are going to be truly massive.

    A BBC program, Horizon, said that the processing power of a PS4 is greater than that of a military supercomputer of fifteen years ago; now that is pretty damn staggering.

    Think of what a supercomputer can do now; in fifteen years' time, the PS10 will match it? cool

     
