OT: Effects of GPUs on Room Temps

ebergerly Posts: 3,255
edited December 2017 in The Commons

A while back I recall a discussion here about how, or whether, the power drawn by GPUs could raise room temps enough to turn on the air conditioner, thereby adding to the cost of running the GPU. Coincidentally, Gamers Nexus posted a video today where they ran some tests along those lines. Of course, the results can vary immensely depending on a ton of variables (room size, starting temps, etc.), but I thought what they found for their particular room setup was kind of interesting.

They ran a 900+ watt mining rig with three 1080 Tis and one 1080, and that did cause the air conditioner to turn on, with the test room starting at 68F (20C) and the thermostat in the next room set to 70F (21C). After an hour or so the temps at the thermostat crept up a couple of degrees, enough to turn it on.

With a 600 watt rig, though, the temps never got high enough to turn on the A/C, since the temps in the other room stayed pretty flat for the 7-hour test. The temps in the test room with the rig rose steadily by only about 5C over 7 hours (going from 68F to 77F). And that's a heavy load, more than two 1080 Tis running flat out for 7 hours.

And with a 400 watt GTX 1080, the temps in the test room and the adjacent room were essentially flat, with a negligible rise throughout the test.

So can a GPU cause your air conditioner to turn on? Not likely. Maybe if you're running multiple cards flat out continuously, in a small room, with the thermostat in the same room and the door closed, or something like that. Otherwise, probably not.
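
For a sense of scale, here's a rough back-of-envelope sketch (the room size and other numbers below are my assumptions, not from the video): if none of the heat escaped, even the 600 watt rig would warm the air in a typical bedroom absurdly fast. The fact that they measured only about a 5C rise over 7 hours tells you nearly all of the heat was soaking into walls and furniture or leaking out of the room.

```python
# Rough back-of-envelope sketch (numbers assumed, not from the video):
# how fast would a rig warm a sealed room if NO heat escaped?

AIR_DENSITY = 1.2         # kg/m^3 at roughly room temperature
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)

def sealed_room_temp_rise(power_watts, hours, room_volume_m3):
    """Temperature rise of the air alone, assuming zero heat loss."""
    energy_joules = power_watts * hours * 3600
    air_mass_kg = AIR_DENSITY * room_volume_m3
    return energy_joules / (air_mass_kg * AIR_SPECIFIC_HEAT)

# A 600 W rig running 7 hours in an assumed 3 m x 4 m x 2.5 m room:
print(f"{sealed_room_temp_rise(600, 7, 30):.0f} C")  # ~420 C (!)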


Comments

  • nonesuch00 Posts: 18,120

    My GPU+CPU generates about 35 watts of heat when it is rendering.

  • JamesJAB Posts: 1,760

    It's simple math: your computer is a fancy space heater. The "can your computer heat a room?" argument should pit various computer configurations against known-wattage heat sources equipped with fans, like hair dryers and space heaters. It would be interesting to see whether a computer that uses 1100 W under load heats a room at the same rate as an 1100 W hair dryer, or whether computers matching the rated output of a space heater heat a room at the same speed.

  • ebergerly Posts: 3,255
    JamesJAB said:

    It's simple math: your computer is a fancy space heater. The "can your computer heat a room?" argument should pit various computer configurations against known-wattage heat sources equipped with fans, like hair dryers and space heaters. It would be interesting to see whether a computer that uses 1100 W under load heats a room at the same rate as an 1100 W hair dryer, or whether computers matching the rated output of a space heater heat a room at the same speed.

    The answer to your question is that it makes no difference which device dissipates the power and generates the heat. If heat is generated, it dissipates into the room. What changes is the speed at which the heat is transferred into the room, and that depends on whether fans are blowing it around; reflective surfaces, like those in a space heater, also affect the transfer. But whether it's a space heater, a hair dryer, or a GPU, the heat moves from the hot thing to the cooler thing (the room).
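
    A quick sanity check of that point (the 1100 W figure is just JamesJAB's example above): any device drawing P watts turns essentially all of it into heat, so equal wattages dump equal energy into the room, whatever the device is.

    ```python
    # Sanity-check sketch: for room heating, only the wattage matters.
    # A device drawing P watts converts essentially all of it to heat.

    BTU_PER_KWH = 3412  # 1 kWh of heat is about 3412 BTU

    def heat_output(power_watts, hours):
        """Energy released into the room, as (kWh, BTU)."""
        kwh = power_watts * hours / 1000
        return kwh, kwh * BTU_PER_KWH

    # An 1100 W hair dryer and an 1100 W rendering PC, one hour each,
    # release exactly the same heat:
    print(heat_output(1100, 1))  # (1.1 kWh, ~3750 BTU) for either one
    ```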

  • If you search for "data center heat load" you'll see this is a real issue when you have lots of computer equipment in one location.

  • dragotx Posts: 1,138

    There is a noticeable difference in temperature between the living room and the dining room (where I have my computers set up) when I have either of my machines rendering for a while, and a bigger difference if they have both been running. If the thermostat were in this room with my computers and the ambient temperature were close to the setpoint, the temperature difference would be enough to kick on my AC. It's not enough of a difference to heat the living room enough to affect the thermostat on the other side of that room, though.

  • Gator Posts: 1,294

    There's certainly a noticeable difference here if both rigs are rendering or mining, or a combination of both; we're talking two hybrid water-cooled 1080 Tis and two water-cooled Titan Xs. I mention the water cooling because it's important: with water cooling the cards get pushed harder even if you don't overclock, so they put out more heat. With stock air cooling they will likely end up throttling down to lower clocks and possibly lower voltage.

  • nonesuch00 Posts: 18,120

    I've used 1500 W space heaters in a roughly 200 sq ft room in winter, and you practically have to hug them to feel the heat if there is any draftiness in the room. They're effective in modern houses, but not so effective in older houses that have more outdoor air drafting through.

    I've also done system administration for about 300 Solaris servers, and those server rooms got so hot it'd be over 100F in them. Great for getting warm after being outside in the winter. The machines got so hot that when we replaced broken parts, the various heat sinks would often just fall out when we tipped the machines up.

  • AllenArt Posts: 7,169

    My last computer used to get pretty darn toasty when it was rendering, but nothing raised the temp in my room more than my plasma TV. That WAS a heater in and of itself. LOL

    Laurie

  • joseft Posts: 310

    Do there really need to be tests to confirm that a computer can heat up a room?

    Mine certainly does. Even with ducted air conditioning, there is a very noticeable temperature difference between rooms when my machine is under load.

    Great in winter... not so great in summer.

  • Petercat Posts: 2,321

    Since I run electric heat in my home, I replace the LED lights with incandescents during heating season. There's no point in burning the expensive bulbs when there are no savings on my electric bill. If your computer uses 1500 watts, it gives off as much heat as a 1500 watt space heater. The only practical difference is that you can aim a space heater to put the heat where you want it, like the small one I have under my desk for my feet.
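
    A tiny sketch of that arithmetic (the electric rate and usage numbers are assumed for illustration): with resistance heating, every watt-hour a bulb doesn't give off as heat is a watt-hour the heater has to supply instead, so the monthly total comes out the same either way.

    ```python
    # Illustrative sketch (assumed rate and usage, not real bills): with
    # electric resistance heat, a bulb's "wasted" watts offset the heater
    # one-for-one, so total electricity use is unchanged in heating season.

    RATE = 0.13  # $/kWh, assumed

    def monthly_cost(bulb_watts, hours_per_day, heat_demand_kwh):
        bulb_kwh = bulb_watts * hours_per_day * 30 / 1000
        # Resistance heating supplies whatever heat the bulb doesn't.
        heater_kwh = max(heat_demand_kwh - bulb_kwh, 0)
        return round((bulb_kwh + heater_kwh) * RATE, 2)

    # 60 W incandescent vs 9 W LED, 5 h/day, 300 kWh of heat needed:
    print(monthly_cost(60, 5, 300))  # 39.0
    print(monthly_cost(9, 5, 300))   # 39.0 -- same total
    ```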

  • DustRider Posts: 2,739
    joseft said:

    Do there really need to be tests to confirm that a computer can heat up a room?

    Mine certainly does. Even with ducted air conditioning, there is a very noticeable temperature difference between rooms when my machine is under load.

    Great in winter... not so great in summer.

    +1

    Especially since it's a well-known fact that large server rooms require their own cooling systems. We had to have air conditioning in the server rooms I was in charge of, even when temps outside dipped below 0F (yes, it would have made sense to just bring in the cold air, but the HVAC people couldn't figure that one out). I fondly remember spending my New Year's Eve setting up fans to cool the server room (it was below zero outside) when the AC froze up, then monitoring the temp for a couple of hours to make sure I had enough cooling and didn't need to go back to Walmart for more fans. Everywhere I worked, my office was always the hottest office in the building because I had three to four computers working pretty hard all day long (other employees would come into my office to warm up).

  • fred9803 Posts: 1,564

    They're forecasting 40 degrees (104F) for Thursday, so I might give rendering a miss. But I haven't had any overheating issues with ambient temps like this before.

    I have a standalone air conditioner that vents hot air out the window. A PVC pipe and some duct tape, position your PC near a window, and Bob's your uncle with regard to heating your room.

  • ebergerly Posts: 3,255
    edited December 2017
    joseft said:

    Do there really need to be tests to confirm that a computer can heat up a room?

    Mine certainly does. Even with ducted air conditioning, there is a very noticeable temperature difference between rooms when my machine is under load.

    Great in winter... not so great in summer.

    Any heat source in a room can "warm it up". Even your body generates at least 100 watts just sitting there, as does everyone else in the room, and so do light bulbs, sun coming in the window, and so on. The question is how much. "Noticeable" can mean anything, and can be caused by many things. Actual measurements like these erase misconceptions, like the misconception that "if I buy a GTX 1080 Ti I'll pay higher electric bills for air conditioning". In general that's probably not true. Of course there are exceptions, but for most users it's probably not true.
  • McGyver Posts: 7,050
    edited December 2017

    On some CGI forum a couple of years ago I saw someone post their solution to their monster computer's heat issues... They built a duct that connected a small window A/C to a cabinet the computer was kept in... It seemed like a "goodish" idea, but the cabinet and ductwork were crummy and the cabinet door looked inconveniently placed. The way it was built, the airflow didn't seem good or well planned, but the idea itself was probably sound... It also wasn't apparent how he dealt with a thermostat.

    Small window A/Cs can be pretty cheap... But if you don't think out your airflow, thermostat, and ductwork, you could probably cause condensation damage to the computer... Done right, though, that's pretty dry, cool air... The cabinet would be like a mini server room.

  • Masterstroke Posts: 1,983
    edited December 2017

    LoL. My PC is right next to my right hand. No more freezing "mouse hand" while rendering. :-D

  • Peter Wade Posts: 1,622

    I'm in the UK and haven't got air conditioning, so if the GPU could help with the room heating it would be a bonus. Unfortunately my 1050 Ti isn't powerful enough to make much difference.

  • dreamfarmer Posts: 2,128

    I have, more than once, started a render because my feet were cold.

  • ebergerly said:
     

    So can a GPU cause your air conditioner to turn on? Not likely. Maybe if you're running multiple cards flat out continuously, in a small room, with the thermostat in the same room and the door closed, or something like that. Otherwise, probably not.

    Sure it can. The main factor is insulation. If the room is completely insulated, the generated heat will warm up the room. If the room is badly insulated but the house containing it is well insulated, then you're really heating up the whole house. Otherwise....

    Depending on the volume to be heated and whether there is thermal exchange with the "outside", the time needed to warm up the room will vary.

    Since you have no data about room size and how much heat is exchanged with or lost to the "exterior", the test doesn't mean anything.

    If the room is badly insulated, there is a minimum amount of energy you have to put in before the room heats up at all; below that, it's all dissipated outside (which would explain the result with just a 400 W GTX 1080). With bad insulation, you're trying to heat up the planet with your GPU (let's say just your city).

    Why not another meaningless test? Try to cool a room with the fridge's door open.
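
    A minimal lumped-model sketch of this argument (all parameters are assumed for illustration): the room warms according to dT/dt = (P - UA*(T - T_out)) / C, so the steady-state rise is P/UA, and the insulation (the loss conductance UA) decides how much of the GPU's power actually stays in the room.

    ```python
    # Minimal lumped thermal model (all parameters assumed): a room with
    # heat capacity C leaking to the outside through conductance UA warms
    # as dT/dt = (P - UA*(T - T_out)) / C. Steady state: T - T_out = P/UA.

    def room_temp(power_w, ua_w_per_k, c_j_per_k, t_out, hours, dt_s=60.0):
        t = t_out
        for _ in range(int(hours * 3600 / dt_s)):
            t += (power_w - ua_w_per_k * (t - t_out)) / c_j_per_k * dt_s
        return t

    T_OUT = 20.0  # C, temperature outside the room
    C = 2.0e6     # J/K, effective capacity (air plus some wall/furniture)

    # Well-insulated room (UA = 30 W/K) vs leaky room (UA = 150 W/K),
    # each with a 400 W GPU running for 7 hours:
    print(room_temp(400, 30, C, T_OUT, 7))   # heading toward 20 + 400/30  = 33.3 C
    print(room_temp(400, 150, C, T_OUT, 7))  # heading toward 20 + 400/150 = 22.7 C
    ```

    With the leaky-room numbers, most of the 400 W escapes the room almost as fast as it's generated, which is consistent with the flat temps in the 400 W GTX 1080 test above.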

  • drzap Posts: 795
    ebergerly said:
     

    So can a GPU cause your air conditioner to turn on? Not likely. Maybe if you're running multiple cards flat out continuously, in a small room, with the thermostat in the same room and the door closed, or something like that. Otherwise, probably not.

    Sure it can. The main factor is insulation. If the room is completely insulated, the generated heat will warm up the room. If the room is badly insulated but the house containing it is well insulated, then you're really heating up the whole house. Otherwise....

    Depending on the volume to be heated and whether there is thermal exchange with the "outside", the time needed to warm up the room will vary.

    Since you have no data about room size and how much heat is exchanged with or lost to the "exterior", the test doesn't mean anything.

    If the room is badly insulated, there is a minimum amount of energy you have to put in before the room heats up at all; below that, it's all dissipated outside (which would explain the result with just a 400 W GTX 1080). With bad insulation, you're trying to heat up the planet with your GPU (let's say just your city).

    Why not another meaningless test? Try to cool a room with the fridge's door open.

    I once designed a theater for a home where the room was so densely insulated that more than two or three people in it would noticeably heat it. He over-insulated it to keep the sound energy from escaping, but then had to spend more money on a properly engineered cooling system for the room. The guy must have spent $200,000 on that room after all was said and done. But yes, you are correct: the thermal efficiency of the ventilation and insulation makes a world of difference. In China, all houses are basically concrete boxes, even the expensive ones. I have yet to see any insulation installed anywhere; heat escapes a room almost as fast as it is generated.

  • ebergerly Posts: 3,255
    edited December 2017

    Sure it can. The main factor is insulation. If the room is completely insulated, the generated heat will warm up the room. If the room is badly insulated but the house containing it is well insulated, then you're really heating up the whole house. Otherwise....

    The question is not whether "it can". Of course anyone can come up with a configuration where a single GPU will cause the air conditioner to turn on, which is why I said "not likely". But that's not the issue. The issue is whether it's reasonable to assume it applies to many or most users here, or to the average user. Is "a GPU will turn on your air conditioner" a statement that's generally true, or just a rare occurrence in a special situation?

    Guaranteed, in a small enough room with, as you say, good enough insulation, a closed door, AND THE THERMOSTAT IN THE SAME ROOM, plus certain thermostat settings and a certain starting room temperature, you can probably get the air conditioner to turn on solely because of the additional power from the GPU after a long enough time of rendering flat out. But as a general statement, will the average user have those exact conditions? Will the thermostat be in the same room, or maybe downstairs? Will the door be open, letting heat dissipate into the rest of the house? And so on....

    Again, an exception does not make a rule; you can always find exceptions to anything. But as a general statement it seems reasonable, and is supported by at least this one set of tests, that a single GPU probably won't cause your air conditioner to turn on. Obviously, if you have a 5,000 watt server farm inside a building it will require major air conditioning, but that's somewhat irrelevant for users here.

    Why not another meaningless test? Try to cool a room with the fridge's door open.

    For those who want to believe that a single GPU (or even two) has a reasonable chance of causing their air conditioner to turn on, I encourage them to run their own unbiased tests. Personally, I place more weight on actual test data, using thermometers and such, than on hunches based on personal opinion.

     
