Memory and DAZ 4.5
This is what I did. I put in one of those W7 gadgets to monitor my CPU and RAM. The CPU stays around 2% usage, while RAM is at 38% just typing this, without DAZ running. CPU stays low when doing small stuff, but (as someone else jargonized) when the "dots" add up, so does the usage. I took Genesis, as is, and played with the res settings. This is a 64-bit machine with 4 GB of RAM. Three divisions slowed it down a wee bit, but five crashed the machine like all the partying animals in "Hot to Trot" when Bob Goldthwait walks in all of a sudden.
The question is thus: Is the danger zone decided by RAM capacity or CPU? How much RAM can DAZ access? I'll play with this some more. Any experiences related to this are quite welcome.
Comments
How much of each depends on what you're doing.
The application may not need 100% CPU for some things and then need it all for rendering or something else.
How much RAM? As much as it needs. Between the application, the OS, and the physical RAM available on the market at the time of this writing, there is no cap you will realistically reach unless you have hundreds of thousands of GB available to you. Studio is a notorious RAM hog, but that's the nature of this application.
What's the danger zone?
Among other things, subdividing: every time you do, you exponentially raise the number of surfaces on the model, and that means more computing resources are needed to compensate. The question is why you're clicking it; specifically, are you trying to achieve something the model is incapable of at its current poly count?
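To put rough numbers on that exponential growth: Catmull-Clark-style subdivision quadruples the quad count at each level. Here's a back-of-the-envelope sketch; the ~19,000-quad base count for Genesis and the 200-bytes-per-quad working cost are assumptions for illustration, not verified figures:

```python
# Back-of-the-envelope: polygon count and rough memory per sub-d level.
# ASSUMPTIONS (illustrative, not verified): Genesis base mesh ~19,000
# quads, and ~200 bytes of working memory per quad (verts, normals, UVs).
BASE_QUADS = 19_000
BYTES_PER_QUAD = 200

for level in range(6):
    quads = BASE_QUADS * 4 ** level       # each sub-d level quadruples the quads
    mib = quads * BYTES_PER_QUAD / 2**20  # rough working-set size in MiB
    print(f"level {level}: {quads:>12,} quads  ~{mib:,.0f} MiB")
```

Under those assumptions, level 5 balloons the base mesh to roughly 19.5 million quads and several GiB of working set, which would explain why a 4 GB machine falls over there while levels 2 and 3 merely slow down.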
You don't need a gadget to show you what you already have built in: press CTRL+ALT+DEL and bring up the system monitor (I'm blanking on the name; I'm on my Mac) that is already running. There's no need for two of them, especially when you have the one the developers of your OS designed.
As Dr. McCoy once said, "it's a phaser, not a flashlight Jim."
The gadget is really nice for this ready check and uses zilch RAM or CPU of its own. Fewer steps to monitor.
I loaded DAZ again. Takes about 7% of the RAM. Still have quite a bit to spare. I have the mesh for the Genesis figure adjusted to 2, now. That works well and I loaded more than one figure with no noticeable difference, except during the load process.
What I have done thus far is create morphs (that unfortunately relate back to the original Genesis figure... :( ...sigh) which work quite well for a hard-edged realist making characters for a novel I've written (a modernized take on H.G. Wells' "The Time Machine"). The Morlocks, in my reworked version, aren't dressers by nature, thus they let it all hang out. For that, I'd say it's all good, but not for a course on gross anatomy, and that is a concern. Told you that to tell you this:
When I endeavor to create a tiny orifice, the deformer simply refuses to cooperate. I used the higher res, still no cigar. So arose the concern about memory. Does any of that memory restrict us from making tiny morphs or is there simply an inherent limit? If the latter, is there a setting to allow for tiny d-forms?
If you want to know about the CPU and RAM usage of your PC, go to Control Panel > Administrative Tools > Performance Monitor, then click on the blue link called "Open Resource Monitor".
That will tell you more about what your computer is doing and what is being used than any widget thingy.
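If you'd rather poll those same numbers from a script instead of a gadget or Resource Monitor, a minimal sketch using the third-party psutil package (installed with `pip install psutil`) might look like this:

```python
# Minimal CPU/RAM readout using the third-party psutil package.
# Assumes psutil is installed: pip install psutil
import psutil

# Sample CPU utilization over a one-second window.
cpu = psutil.cpu_percent(interval=1)

# virtual_memory() reports physical RAM usage for the whole system.
mem = psutil.virtual_memory()

print(f"CPU: {cpu:.0f}%")
print(f"RAM: {mem.percent:.0f}% of {mem.total / 2**30:.1f} GiB in use")
```

This is just a spot check, not a replacement for the OS tools; Resource Monitor still gives far more detail per process.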
The D-Former tool in Studio: its "precision" work is based on the subdivisions of the figure. Too few subs and not much result. Since all parts of the D-Former can be scaled, rotated, and repositioned, it may require you to manipulate the D-Former to get the desired result.
http://www.youtube.com/watch?v=7SEBV2GHJHA
Thingy... :)
I first heard that term in the Beatles' "Help"
That's a good bit of advice, but the thingy is right there at a glance. I dragged the performance meter next to it JIC I need some in depth stats. Of course it doesn't offer memory addresses, stack sequences and such, but usually if there's a serious problem with the stack you know it rather abruptly when the system reboots itself in mid mouse-move. Had that happen on my other machine yesterday, or at least I suspect that occurred.
Maybe Santa will bring me a nice new 64-bit PC with 16 GB of RAM for Xmas ;)
Hi Strat Dragon,
Is there a way to increase the resolution of only one section, say, the hip? At res 2 for the whole Genesis figure it does okay, but precision is lacking. Same for res 3, but if I set it to res 5, the system slows to a near standstill. The CPU meter approaches the redline. I neglected to check the RAM usage at that moment, being preoccupied with recovering the system resources.
Also, Strat:
I already have that video. It's a good help, but one needs to know that panes are in the window menu, not the former view menu.
She does offer a lot of instruction. I think that video opened up quite a few doors in DAZ.
Fair point dude, I understand the ease of use from a desktop thingy...
It's "what ever gets you through the night" :)
I don't think you can mix subsurf resolutions on a figure, at least I've never tried. As for the D-Former's limitations, you may want to see if it's possible to do it in Sculptris, which is another application I need to revisit since I upgraded to Studio 4.5; I had problems with it in Studio 3.
http://pixologic.com/sculptris/
It's free, and once I "master" Blender I'll try it again.
Next build I do: 32 GB of RAM, at the very least.
If your machine is rebooting without your consent, it could be heat, as in too damn much of it.
32 gigs... a real dream machine. A whole lot of money, or it falls from the sky. I have Sculptris. When I brought an object back in after modifying it, the rigging was gone and so were all the subsurface areas. I do like Sculptris. Maybe I just need to join their community and ask my queries there.
I was able to do precision work on a plane in DAZ. I think I had it broken into 100 x 100 segments. I think the plane was rather large, though. That in itself might be the reason it worked where the precision failed on Genesis. Do you think it might make a difference if I scaled Genesis up to Amazing Colossal size, then scaled back to normal afterward?
I don't think scaling would make any difference, but I've never tried it. I have seen scaled-up objects that look like they were decimated when rendered or in preview. A lot of what you can do is based on doing things you're not supposed to do; that's my workflow anyway.
BTW... that's true, the machine can overheat, but programs that fail to release memory when and, more importantly where, they should can throw a critical exception error and trigger an automatic reboot. Back in the days of DOS .bat files one could program the machine to reboot at the press of a certain key. You'll note perhaps some software will do this for you, but usually after a prompt.
In those days the "work flow" you describe would be adding a mouse handling set of routines to Q-BASIC, where QuickBasic already had them. Thus one was able to do what they weren't supposed to do. I just tried MCasual's SuperLathe and watched the CPU meter run up to 50% while it did the math on a Genesis morph, then return to 2% afterward. RAM changed very little.
Sub-d quadruples the number of polygons at each level; you should never need more than level 2 for Poser/Studio models. Sub-d does not add details, it adds polygons; in fact, because it averages or smooths the new polys, it actually decreases detail and sharpness.
I imagine that I should keep pushing buttons, seeing what happens and customizing with add-ons. I have another query, so I'll begin an appropriate thread for that one...