Tafi Announces Revolutionary Text-to-3D Character Engine

Comments

  • Artini Posts: 9,471

    Very interesting.

    Would be great if they implemented a feature to use only the Daz assets the user has already purchased in the store.

     

  • FirstBastion Posts: 7,762

    It will be interesting to watch this evolve. Hopefully it captures a new market segment.

  • Artini Posts: 9,471

    Yes, I will also watch its development with great interest.

    As a side effect it could help develop language skills.

     

  • ANGELREAPER1972 Posts: 4,509

    Would we have to buy an additional license, along the lines of the editorial one?

    Another question: could we create a scene normally and then use this to enhance the quality and add more effects and other elements to it?

  • hjake Posts: 895
  • AllenArt Posts: 7,169

    I have no use for this. It's a nice concept, I guess. I hope it works out. Now can we have Daz Studio 5 please? LOL

  • marble Posts: 7,500
    edited June 2023

    Been away from the DAZ universe for a few months, just looking in today to see what's new. Very little, it seems. The first thing that greeted me was a Cloudflare 504 error while trying to access the forum. Same old, same old.

    Anyhow, I thought this thread might contain a glimmer of hope, but now I'm discouraged after reading some of the comments - especially from @Padone. Still, NVIDIA Omniverse and USD seem like positive steps, but it seems they will not make things that much easier for those of us who would just like to take characters from DAZ Studio to something else, like Blender, and animate them with more sophisticated tools, without having to wait several weeks of processing time to get a few minutes of footage.

    So maybe I will return to my hibernation and come up for another look in another few months. Maybe we will have DAZ 5 (hah-hah)?

    Post edited by marble on
  • outrider42 Posts: 3,679

    Padone said:

    outrider42 said:

    Unreal Engine 5 dynamically adjusts geometry AND texture sizes on the fly as needed in a given scene, totally nullifying your key points ..

    Sure the shaders are different ..

    Again, that may work for a static object as a prop, but unfortunately not for animation, especially character animation. The culprit is HD morphs and JCMs, especially as used by G9 as well as most HD figures. The game engine can of course use dynamic subdivision and textures; that's not new, 3Delight could do it ages ago, but it only supports the HD shape, not HD morphs. Unless you export Alembic, but that would be a huge memory requirement for HD animations.

    Then there are the shaders, as you agree; that's not a minor point at all.

    People are doing this right now, and have been for a while. Their computers are not exploding under these extremely massive Genesis models that supposedly make game engines explode. Maybe it's magic? Maybe it's Maybelline?

    In this video here, a fellow sends Victoria 9 to Unreal. She is wearing just a bikini, but the point I want to make here is that he does nothing special to her materials during the process. He imports her and applies a basic Unreal skeleton and that's it. Yet somehow she looks fine? She doesn't look the same as she does in Iray, obviously, but that is to be expected, and most importantly she doesn't look like a freak. She looks just like the Victoria 9 we all know. Also, he gives her a hair from Metahuman, a video game hair. So he skips using a Daz hair. You don't have to limit yourself to Daz assets. It is possible to add things in the game engine software. But he got a huge head start on the model by using a Genesis.

    That is like the whole purpose of Daz Studio and Genesis: you don't have to create a model from scratch, but you can also modify and add to the model if you wish. You are missing the boat here complaining that game engines cannot handle Genesis. Clearly they can.

    Daz Studio has some pretty harsh requirements itself, unless you enjoy rendering for hours on end. An older CPU can and will struggle just running the viewport in Daz. The 3d world is simply not too kind to people who lack hardware. It can be done, but painfully. There is no difference here between Daz Studio and other software in this regard. Look how many people in this forum have 3090s, 4090s, or multiple GPUs just to render Daz Studio faster. There is a wide spectrum of users here, and some are only using an old laptop with a CPU. You can do that with video games, too, BTW. Just like Daz, you don't get to do a lot with such hardware, but you can still enjoy a wide variety of games on a fairly junky laptop. I see no difference here. The people rocking high end hardware will obviously be able to take better advantage of this software. 

    But it is worth repeating what Mada said: you don't need a high end PC for this new app. You don't even need a PC. You can do this on a tablet as the software runs in the cloud. Somebody can get their kicks without any decent hardware. Now certainly exporting to a game engine is not going to work very well with just a tablet, but the point is that they can use this text-to-Daz app on anything and make renders.

    As long as the app is not a technical disaster, which frankly is hard to say with Daz-Tafi, it has a lot of potential.

  • Padone Posts: 3,700

    @outrider42 Of course, yes: if you limit the Daz features, then it is possible to export without HD morphs and with approximated materials. That may or may not work well enough depending on the specific figure and the use requirements.
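    To put the Alembic memory concern raised above into rough numbers, here is a minimal back-of-the-envelope sketch in Python. The vertex count, frame rate, and clip length are assumptions chosen for illustration, not measurements of any specific Genesis figure.

    ```python
    # Rough estimate of the raw point-cache size for an HD character animation
    # baked to Alembic. Vertex count, frame rate and clip length are assumptions.
    VERTS = 1_500_000        # assumed vertex count of a subdivided HD mesh
    FPS = 30                 # assumed frame rate
    SECONDS = 10             # assumed clip length
    BYTES_PER_VERT = 3 * 4   # x, y, z stored as 32-bit floats

    frames = FPS * SECONDS
    cache_bytes = VERTS * BYTES_PER_VERT * frames
    print(f"~{cache_bytes / 1024**3:.1f} GiB of raw point data for {frames} frames")
    # Roughly 5 GiB before compression, which is why baking HD morphs to a
    # per-frame cache gets expensive for longer animations.
    ```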

  • CHWT Posts: 1,179
    Sorry if this has been answered before, but can DAZ/Tafi claim the copyright of the final render, even though I actually own all the products 'summoned by AI'? If that's the case I am not going to use it - a similar reason to why I don't post in the DAZ gallery (I remember DAZ can use your renders for promotion if they want to)
  • wolf359 Posts: 3,828
    edited June 2023

    Here is the base Genesis 9 figure in Blender, rerigged with Auto-Rig Pro, fitted with a legacy piece of Poser clothing, with a simple Mixamo animation retargeted to it using the powerful ARP retargeting system.
    About 7 minutes of work.
    From there I could have easily sent her to UE5 or Unity with ARP's powerful game engine exporter.

     

    @outrider42 There already exist the means to get any Genesis figure into a major 3DCC or game engine
    and get it re-rigged and animated (FBX, Daz Bridge, Reallusion AccuRig, Diffeo, Blender ARP).
    However, the HD details etc. that you would have natively in Daz Studio get left behind, making Daz Genesis no different from any other character model from any other source.
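    As a small illustration of the FBX route listed above, the sketch below imports a Daz-exported FBX into Blender and lists what rig and meshes come in. The file path is hypothetical, it assumes Blender's bundled Python (bpy), and it is only the first import step, not the ARP or Diffeomorphic workflow itself.

    ```python
    # Minimal sketch: import a Daz-exported FBX into Blender and inspect its rig.
    # Run inside Blender's Python console or as a script; the path is hypothetical.
    import bpy

    bpy.ops.import_scene.fbx(filepath="/path/to/Genesis9_export.fbx")  # hypothetical path

    # List whatever armatures and meshes the import created.
    for obj in bpy.context.scene.objects:
        if obj.type == 'ARMATURE':
            print("Rig:", obj.name, "bones:", len(obj.data.bones))
        elif obj.type == 'MESH':
            print("Mesh:", obj.name, "verts:", len(obj.data.vertices))
    ```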

    I even did something similar with the vestigial “Jessie” model from the Poser 6 era and dressed her with an outfit from a free “Fortnite” character from Sketchfab.

    This new Tafi AI thing is cool for figure creation (it would be even better if they added voice control), but it only exports USD, so you can only use your game engine or Omniverse to render a pre-animated figure from Daz Studio.

    However, it does not offer anything really new to those who will still have to do the same amount of work to get a Daz character moving in their game engine or 3DCC. In fact, despite the tedious process in the video you posted, the official Daz to UE bridge plugin is BETTER than USD, because you still have the Daz rig for UE5 retargeting.
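    For anyone curious what such a USD export actually contains, here is a minimal sketch using Pixar's pxr Python bindings to open a stage and check whether any skeleton survived the trip. The file name is hypothetical, and this assumes the usd-core package is installed; it is not tied to the Daz/Tafi exporter specifically.

    ```python
    # Minimal sketch: open an exported USD file and list what it contains.
    # Requires the usd-core package (pip install usd-core); the path is hypothetical.
    from pxr import Usd, UsdGeom, UsdSkel

    stage = Usd.Stage.Open("character_export.usd")  # hypothetical export file

    for prim in stage.Traverse():
        if prim.IsA(UsdSkel.Root):
            print("SkelRoot found:", prim.GetPath())   # a rig made it through
        elif prim.IsA(UsdSkel.Skeleton):
            print("Skeleton:", prim.GetPath())
        elif prim.IsA(UsdGeom.Mesh):
            print("Mesh:", prim.GetPath())
    # If only Mesh prims show up, the export is effectively baked geometry,
    # which matches the point above about losing the Daz rig for retargeting.
    ```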

     

    Post edited by wolf359 on
  • Richard Haseltine Posts: 101,017

    CHWT said:

    Sorry if this has been answered before, but can DAZ/Tafi claim the copyright of the final render, even though I actually own all the products 'summoned by AI'? If that's the case I am not going to use it - a similar reason to why I don't post in the DAZ gallery (I remember DAZ can use your renders for promotion if they want to)

    I can't immediately see why the terms would be inherently different from buying the content directly and setting the scene up by hand, but of course we don't know what the license terms will be (especially for renders done by the Daz servers).

  • Handspan Studios

    I am looking forward to it both as a PA and a user. It really is getting hard to find content, exactly what you want, in the store. If you can just type a text prompt and it'll find the relevant items and you can try them out in a scene, that sounds great. I am signed up for it. I am not clear about what the payment model actually is, but I trust Daz to figure out a good system. I am also fascinated by other forms of AI coming out, in addition to being worried about it. My inclination is to "ride the horse in the direction it's going" and learn to use what's useful.

     

  • marble Posts: 7,500

    Bring on the USD ...

  • csaa Posts: 824

    Handspan Studios,

    I'm curious and excited about this too. Right now I can imagine this AI/NLP interface fitting well into the creative workflow:

    (0) Receive project brief from client. Mull over it. Let it sink in.

    (1) Do the research, looking for inspiration (Pinterest/Instagram/museums/books/etc.) and coming up with a mood board.

    (2) Nail down the ideas as words: a mind map or a concept chart.

    From here on, a Daz Studio with an NLP interface à la Midjourney comes in handy. As things are today, relying on the mouse cursor, sliders and radio buttons -- the daz3d.com or Daz Studio user interface -- we break the immersion. The software kicks the brain out of the creative flow that involves words, themes and snatches of visual concepts. But with text prompts ... we stay in the flow, and hashing out the visual representation becomes easier.

    I agree with the others who point out that the new prompt-interface doesn't obviate the hard tasks that follow. We still have to use Daz Studio to set up the scene, light it, customize materials. And if the end platform is something else -- a game engine like Unreal -- then the work remains as challenging as it is today.

    For all the wonders of AI, it's not a silver bullet. There's still yeoman's work to be done by human hands. But for spitballing ideas and concepts, I think this new text interface is a very useful tool.

    Cheers!

    Handspan Studios said:

    I am looking forward to it both as a PA and a user. It really is getting hard to find content, exactly what you want, in the store. If you can just type a text prompt and it'll find the relevant items and you can try them out in a scene, that sounds great. I am signed up for it. I am not clear about what the payment model actually is, but I trust Daz to figure out a good system. I am also fascinated by other forms of AI coming out, in addition to being worried about it. My inclination is to "ride the horse in the direction it's going" and learn to use what's useful.

     

  • ALLIEKATBLUE Posts: 2,970
    Very interested in AI but haven't tried it so far. Would love to try it first using something I'm familiar with. I have signed up and am looking forward to it.
  • NylonGirl Posts: 1,825

    I think some of the people in here are convinced they'll be able to type "Olympia 9 sitting in a chair" and get a scene with Olympia 9 with hair and clothes, properly posed in a seated position in a chair. I think it's much more likely they will get a naked unposed Olympia 9 with no hair, and a message popping up saying "sitting is not found", and if you click that away, a second message will pop up saying "chair is not found". 

  • Mada Posts: 1,991

    I'm one of those people :) Of course you actually have to type exactly what you need, so it would be more like typing ...load Olympia 9, ...add some hair, change it to long hair, show different styles, use style no.4 and make it black....Add a long dress, make it black, change it to a short skirt, and show different colour options. 

    Then you'll move on to ... add a chair, change to a stool, and pose the figure in a sitting pose on the stool ... move her up a bit on the y trans, not that high, a bit lower. Add a bar as the environment... ...add lighting to the scene, ...make it more dim, add a spotlight on her face. 
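    To make the kind of incremental prompting Mada describes a bit more concrete, here is a toy sketch of how such commands might map onto scene operations. Everything in it (the command keywords, the scene dictionary) is invented for illustration; it is not the actual Daz/Tafi implementation.

    ```python
    # Toy sketch: incremental text commands mutating a simple scene description.
    # All names here are invented for illustration, not the real Daz/Tafi API.
    scene = {"figure": None, "hair": None, "wardrobe": [], "props": [], "lights": []}

    def apply_command(cmd: str) -> None:
        """Very naive keyword matching standing in for real intent parsing."""
        text = cmd.lower()
        if text.startswith("load "):
            scene["figure"] = cmd[5:]
        elif "hair" in text:
            scene["hair"] = cmd
        elif "dress" in text or "skirt" in text:
            scene["wardrobe"].append(cmd)
        elif "chair" in text or "stool" in text or "bar" in text:
            scene["props"].append(cmd)
        elif "light" in text or "spotlight" in text or "dim" in text:
            scene["lights"].append(cmd)

    for step in ["load Olympia 9", "add some long black hair",
                 "add a short black skirt", "add a stool",
                 "add a bar environment", "add a dim spotlight on her face"]:
        apply_command(step)

    print(scene)
    ```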

  • Padone Posts: 3,700
    edited June 2023

    As I already explained, and really this is obvious: that is only an AI interface using the controls for you, and there is nothing interesting about it, except maybe for people with disabilities. So instead of rolling a slider you tell the AI to do it for you; what's the big deal? In a moderately complicated project you will spend more time fixing the AI's "smart guessing" than working on the project itself, because this is inevitable: the AI can't possibly be in your head, and you can't possibly explain everything in detail in a single line of text. You will get much better feedback by using the controls yourself and get exactly what you want.

    Again, this AI interface is a design flaw, made only to provide a new toy to play with that's really good for nothing.

    Post edited by Padone on
  • kyoto kid Posts: 41,062

    ...sounds like old Blender, where one had to do everything via the keyboard, which was cumbersome to say the least compared to a pointer-driven UI.

    I can imagine the weird results a dyslexic like myself could end up with if there is no spell check (or not a good one, like here in the forums).

    "No, no, no, not with a vat, I meant with a cat."

    [Spelling and grammar for this post checked in MS Word]

  • Mada Posts: 1,991
    edited June 2023

    Since it's not replacing Daz Studio in any shape or form, I don't see why everyone can't simply continue as before and power-use all they want. Users on iPads and mobile phones will certainly have a different opinion. So will users who don't want to learn Daz Studio. Just because it's not ticking all the boxes for one person doesn't mean it has to be dropped - there are a lot of users out there interested in it - the beta response has been overwhelming, so that certainly indicates some interest from people who would otherwise not be involved with Daz Studio at all. :)

    Post edited by Mada on
  • Gordig Posts: 10,071

    marble said:

    Bring on the USD ...

    I've tried using 3D models in Resolve, but it always crashed before I could do anything interesting with them, or even successfully render something. Maybe I should try it again with USD. Part of the reason I went with C4D as opposed to Maya or Blender is that C4D had both a live link to After Effects, so I could edit a scene in C4D and it would automatically update in AE, and a lite version bundled with AE that I could play around with to get used to it.

  • DustRider Posts: 2,744

    Mada said:

    Since it's not replacing Daz Studio in any shape or form, I don't see why everyone can't simply continue as before and power-use all they want. Users on iPads and mobile phones will certainly have a different opinion. So will users who don't want to learn Daz Studio. Just because it's not ticking all the boxes for one person doesn't mean it has to be dropped - there are a lot of users out there interested in it - the beta response has been overwhelming, so that certainly indicates some interest from people who would otherwise not be involved with Daz Studio at all. :)

    I'm intrigued by it and signed up for the beta hoping to be accepted. I can see a lot of potential uses for it. Hopefully it lives up to my expectations,  or at least some of them. But with that said, I don't really have any concrete expectations,  since this is a totally new approach to DAZ and 3D. The marketing materials give a bit of an idea, but it's all still very abstract to me. I guess I'm more of a glass half full type so I can see a lot of potential good with it. It really could be a lot of fun, and potentially a productivity boost.  Or it could be difficult to use and not something I'm interested in.  The only way I can know if I like it or not is to try it.

  • marble Posts: 7,500

    Gordig said:

    marble said:

    Bring on the USD ...

    I've tried using 3D models in Resolve, but it always crashed before I could do anything interesting with them, or even successfully render something. Maybe I should try it again with USD. Part of the reason I went with C4D as opposed to Maya or Blender is that C4D had both a live link to After Effects, so I could edit a scene in C4D and it would automatically update in AE, and a lite version bundled with AE that I could play around with to get used to it.

    I don't have After Effects (or anything Adobe), nor C4D, nor Maya ... all too rich for my pocket, unfortunately. I do have the paid version of DaVinci Resolve (Studio) and don't regret that, because it is a lifetime license (no subscription) and I get some nice features over and above the free version. For example, the AI engine also helps with upscaling, so animating in Iray with small-dimension renders and then doubling the size while maintaining quality is a big time-saver.
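    The render-small-then-upscale point above is easy to quantify with a rough estimate; the per-frame cost and frame count below are assumptions for illustration only, not Iray benchmarks.

    ```python
    # Rough estimate of the time saved by rendering at half resolution in Iray
    # and upscaling 2x in Resolve. Per-megapixel render cost is an assumption.
    FULL_RES = (1920, 1080)
    HALF_RES = (960, 540)
    FRAMES = 900                    # assumed: 30 s of footage at 30 fps
    SECONDS_PER_MEGAPIXEL = 60      # assumed Iray cost per megapixel per frame

    def render_hours(width, height):
        megapixels = width * height / 1e6
        return megapixels * SECONDS_PER_MEGAPIXEL * FRAMES / 3600

    print(f"Full res: {render_hours(*FULL_RES):.1f} h")
    print(f"Half res + upscale: {render_hours(*HALF_RES):.1f} h")
    # Pixel count drops by 4x, so render time drops roughly 4x,
    # at the cost of whatever detail the upscaler cannot recover.
    ```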

    I hope the bridge add-ons between DAZ Studio and Blender keep improving as well as DAZ adding USD support because whole new workflows are opening up. Pity I'm getting too old to grasp the new technology as easily as I did in my younger years.

  • nonesuch00 Posts: 18,131

    AI should be a boon for disabled folk, letting them be designers using computer AI with all the programmed manual dexterity it gives via voice. I wonder how long before AI-assisted robotic exoskeletons give mobility back to the wheelchair-bound and other folk, like people who have to use walkers and such?

  • Padone Posts: 3,700

    Yes and I wonder how long before AI can do "get a job and make me some money" LOL

  • csaa Posts: 824
    edited June 2023

    nonesuch00 said:

    AI should be a boon for disabled folk, letting them be designers using computer AI with all the programmed manual dexterity it gives via voice. I wonder how long before AI-assisted robotic exoskeletons give mobility back to the wheelchair-bound and other folk, like people who have to use walkers and such?

    nonesuch00,

    The natural language interface certainly lowers the barrier for many. My elderly parents, for example, who've never been comfortable with the GUI paradigm, may find text inputs more accessible. The more AI approximates conversational interaction, the better.

    And the same goes for Daz Studio. Sure, under the hood it's 3D technology (meshes, vectors, PBR, etc.), but there lies the disconnect, as we're working in a conceptual mindset. Words come more naturally as we're trying to rough out the look of the GenesisN character we have in mind. An NLP interface doesn't replace the GUI -- it's a way to lower the barrier as we go from fuzzy concept to visual representation. Afterwards we'll still rely on the Daz Studio GUI to fine-tune the 3D asset we're working with.

    The question I have is: will the NLP be as good for non-English languages? If it falls short, a non-English speaker may stick with the buttons, pull-down menus and mouse clicks of the GUI, finding this sort of interaction more useful.

    I remain curious and excited.

    Cheers!

    Post edited by csaa on
  • DrJoe Posts: 2

    Tell me more!

  • crosswind Posts: 6,999
    edited July 2023

    Have you folks got a beta version or something? I just received an email nearly a month ago, and haven't got any further info since then...

    Post edited by crosswind on
  • FirstBastion Posts: 7,762

    This should have been available two months ago.
