• Telorand@reddthat.com
    8 days ago

    The example videos are pretty good. Only a few had obvious tells (like the Taylor Swift one), and the rest seemed pretty human-like.

    But before everyone goes out and invests in OmniHuman-1 systems, remember that marketing campaigns always show the best they could make, not the average case most people are likely to get. Will it be good enough to trick the average consumer who’s not looking that hard? Maybe. I guess we’ll have to see.

    But if all these generative models are designed to replace the very people whose videos they were trained on, who or what will train the next generation of models, I wonder?

    • rottingleaf@lemmy.world
      8 days ago

      When you are a nation-state, you can find a good amount of money to pay for a video of some unfavorable person committing a crime they never committed. Or of a dead, incapacitated, or unwilling politician giving a speech. Or of an authority figure confirming something they never confirmed.

      When you are an entertainment company, you can find a good amount of money to pay for technology that keeps characters looking consistent a decade after the actors have died.

      When you are a multitude of clueless investors, you can together find a good amount of money to pay for Sun hardware in the near future, and then the dotcom bubble bursts. The same could happen here; it may be just a bubble.

      I don’t think any of the three scenarios is stable. For #1, people already know deepfakes exist, and fiction has prepared us for convincing illusions: Saruman’s voice, charm spells in Harry Potter, holodeck illusions in Star Trek, the Force affecting minds in Star Wars, and so on (which might be why the mainstream dislikes geek culture, or presents it neutered and bland). For #2, the results have to be unbelievably good, and generative models are still not very good at philosophy or writing plots. For #3, I think it’s simply too optimistic.