Writing a custom render engine for Daz3d

I'm planning to design a standalone OpenGL application for non-photorealistic rendering (using C++ and Python).

I'm also looking to see if it can be used as a render engine for Daz3d.

Since I'm inexperienced in writing a render engine for Daz3d, I was hoping to hear from anyone who has insights on key challenges, drawbacks of implementing one versus alternatives (e.g. writing a model exporter), etc.

For example, if I write a render engine now, will it require constant maintenance in future Daz3d versions?

How realistic is it for one person, working in their spare time outside a full-time job, to create a render engine for Daz3d? Is the scope feasible?

What are some design considerations in terms of best practices, etc?

Comments

  • I'm going down this very road right now, and here is what I recommend based on what's working for me. 1) Define your data. What information do you need for your renderer to produce an image? I started with the OBJ and MTL files produced by Studio, and when I have more time, I'll likely be implementing an exporter that can take a 3Delight or Iray shader and export it to a PBRT-compatible material. For now, the basic information produces a decent image, but I intend to build a modular renderer for NPR experiments (my two favorite looks right now are the Tom of Finland pencil sketch, and the early Lisa Frank look [when they were still using a physical airbrush]) and implementing techniques outlined in white papers.

    Whether you'll need to rewrite your renderer down the road depends on the architecture used as much as it does on whether DAZ changes their format. Given their longstanding support of 3Delight, I suspect that things will be stable for quite some time on the DAZ end of things, even if a new material shader format is added.

    It's definitely feasible for one person to build a render engine, especially if you build in a modular fashion. What I did was build my engine in layers. First, I built the basic renderer, using procedural shapes (triangles and spheres, then planes and boxes built from those planes). Then, I added basic diffuse, specular, and reflective materials, and patterns such as noise, checks, and stripes. Finally, adding emissive materials, I had lights. From there I built my OBJ and MTL importer, and used stb_image for reading texture files, including alpha maps. That's where I am right now. I'm using Peter Shirley's "Realistic Ray Tracing" 2nd edition, his Kindle trilogy on building a ray tracer, "Texturing and Modeling: a Procedural Approach" (3rd edition), and Morgan McGuire's "Graphics Codex". When I have exhausted these, I'm moving to "Physically Based Rendering" (3rd edition) for my final offline renderer and "Real-Time Rendering" (4th edition) to build my scene viewer/editor.

    It is incredibly rewarding to build the engine through which imagery and animations can be visualized, and I've found non-photorealistic rendering as fun and challenging as photorealism. I've attached an image rendered with what I've built using just Peter Shirley's writing so far. I'm happy with where it's at quality-wise, so I'm taking some time to experiment with mesh deformation techniques. It's been 10 years or so since I worked with OpenGL, but feel free to PM me if you have any questions on the rendering side of things. If I don't know, I can usually at least point you to a source who will.

    [Attached render: Owen-samplediscard-preview.png, 768 × 1536]
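The "define your data" step the commenter describes — starting from the OBJ and MTL files Studio exports — can be sketched as a minimal MTL reader. This is my own illustrative sketch, not the commenter's code: the `Material` struct and `load_mtl` name are assumptions, and it deliberately parses only the handful of statements a first renderer needs.

```cpp
#include <istream>
#include <map>
#include <sstream>
#include <string>

// Hypothetical material record: just the fields a basic NPR pass needs.
struct Material {
    float kd[3] = {0.8f, 0.8f, 0.8f};  // diffuse color (Kd statement)
    std::string diffuse_map;           // texture file (map_Kd statement)
};

// Parse the few MTL statements a first renderer cares about,
// ignoring everything else (Ks, Ni, illum, ...).
std::map<std::string, Material> load_mtl(std::istream& in) {
    std::map<std::string, Material> mats;
    std::string line, current;
    while (std::getline(in, line)) {
        std::istringstream ss(line);
        std::string tag;
        ss >> tag;
        if (tag == "newmtl") {
            ss >> current;
            mats[current] = Material{};
        } else if (tag == "Kd" && !current.empty()) {
            auto& m = mats[current];
            ss >> m.kd[0] >> m.kd[1] >> m.kd[2];
        } else if (tag == "map_Kd" && !current.empty()) {
            ss >> mats[current].diffuse_map;
        }
    }
    return mats;
}
```

Mapping a 3Delight or Iray shader down to something PBRT-compatible is a bigger job, but a reader like this is enough to get textured diffuse output from a Studio OBJ/MTL export.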
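The layered build order described above (procedural shapes first, materials and lights later) starts with something like a ray–sphere intersection test. A minimal sketch, with my own type names rather than anything from the commenter's engine:

```cpp
#include <cmath>
#include <optional>

// Bare-bones vector type for the sketch.
struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray { Vec3 origin, dir; };          // dir assumed normalized
struct Sphere { Vec3 center; double radius; };

// Solve |origin + t*dir - center|^2 = radius^2 for the nearest
// positive t, or return nothing on a miss.
std::optional<double> hit_sphere(const Sphere& s, const Ray& r) {
    Vec3 oc = sub(r.origin, s.center);
    double b = dot(oc, r.dir);                 // half of the usual b term
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - c;
    if (disc < 0.0) return std::nullopt;       // ray misses the sphere
    double t = -b - std::sqrt(disc);           // nearer root
    if (t < 1e-6) t = -b + std::sqrt(disc);    // origin inside the sphere
    return (t > 1e-6) ? std::optional<double>(t) : std::nullopt;
}
```

Once this works for spheres and triangles, the later layers (diffuse/specular shading, emissive materials as lights, then the OBJ/MTL importer) slot in without touching the intersection code, which is what makes the layered approach practical for one person.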