• GlitterInfection@lemmy.world
    10 months ago

    Oh Kotaku.

    AI has the potential to flesh out immersive worlds in video games in ways that are completely impossible for a team to accomplish today.

    If it’s used to augment scripted characters and stories, it can only turn the soulless NPCs we’re used to into much more interesting characters.

    I welcome, and in fact long for, that treatment in games like The Elder Scrolls.

    There’s absolutely no need for AI to replace Link from The Legend of Zelda, but hells yes it should be used to stop guards from talking about my stolen sweetroll.

    This article and headline are just propaganda.

    • bionicjoey@lemmy.ca
      10 months ago

      The point is that right now language models are only good at generating coherent text. They aren’t at the level where they can control an NPC’s behaviour in a game world. NPCs need to actually interact with the world around them in order to be interesting. The words that come out of their mouths are only part of the equation.

      • warmaster@lemmy.world
        10 months ago

        Yes, language models are good for text. That’s their sole purpose. They can’t control characters. There are other models for that, and they are obviously not language models.

      • kakes@sh.itjust.works
        10 months ago

        Well, they actually can, at least to an extent. All you need to do is encode the worldstate in a way the LLM can understand, then decode the LLM’s response back into that worldstate (most examples I’ve seen use JSON to good effect).

        That doesn’t seem to be the focus of most of these developers though, unfortunately.
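        To make that concrete, here’s a minimal sketch of the round trip (the worldstate fields, prompt wording, and the stubbed model reply are all invented for illustration):

```python
import json

# Hypothetical worldstate for a tavern NPC (fields are illustrative).
worldstate = {
    "npc": "bartender",
    "inventory": ["ale", "mead", "black-briar reserve"],
    "player_request": "break out the good stuff",
}

def build_prompt(state: dict) -> str:
    """Encode the worldstate as JSON so the model can condition on it."""
    return (
        "You are an NPC controller. Given this worldstate, reply with a JSON "
        'object of the form {"action": ..., "item": ..., "say": ...}.\n'
        + json.dumps(state)
    )

def decode_response(text: str) -> dict:
    """Decode the model's JSON reply back into a game-readable action."""
    return json.loads(text)

# In a real game this string would come from the LLM; here it is stubbed.
llm_reply = '{"action": "serve", "item": "black-briar reserve", "say": "On the house."}'
action = decode_response(llm_reply)  # the game engine executes this action
```

        The nice part is that the game engine stays in charge: the model only ever sees and emits structured state, so a malformed reply just fails to parse and can be retried.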

        • bionicjoey@lemmy.ca
          10 months ago

          That assumes the model is trained on a large training set of the worldstate encoding and understands what that worldstate means in the context of its actions and responses. That’s basically impossible with the state of language models we have now.

          • kakes@sh.itjust.works
            10 months ago

            I disagree. Take this paper for example - keeping in mind it’s a year old already (using ChatGPT 3.5-turbo).

            The basic idea is pretty solid, honestly. Representing worldstate for an LLM is essentially the same as how you would represent it for something like a GOAP system anyway, so it’s not a new idea by any stretch.
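
            To show the overlap concretely: the same flat predicate dict you’d serialize as JSON for an LLM can gate a classic GOAP planner’s action set. A toy sketch (the predicates and actions are made up):

```python
# Toy GOAP-style worldstate: flat predicates, exactly the kind of dict
# you could also serialize as JSON for an LLM. Names are invented.
worldstate = {"has_good_stuff": True, "player_at_bar": True, "bar_open": True}

# Each action lists the preconditions that must hold in the worldstate.
actions = {
    "serve_good_stuff": {"has_good_stuff": True, "player_at_bar": True},
    "close_bar": {"bar_open": True, "player_at_bar": False},
}

def applicable(state: dict, preconditions: dict) -> bool:
    """An action is applicable when every precondition matches the state."""
    return all(state.get(k) == v for k, v in preconditions.items())

available = [name for name, pre in actions.items() if applicable(worldstate, pre)]
# available == ["serve_good_stuff"]; close_bar fails because the player is at the bar
```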

      • jerkface@lemmy.ca
        10 months ago

        Right, there’s no possible way actions can be represented by a stream of symbols.

      • Psionicsickness@reddthat.com
        10 months ago

        Did you watch the demo? The player literally told the bartender to break out the good stuff and he did just that…

      • GlitterInfection@lemmy.world
        10 months ago

        They’re a massive and combinatorially exploding part of the equation, though.

        Imagine a world where instead of using AI to undermine writers and artists, we use it to explode their output. A writer could write the details that make a character unique, plus the key and side-quest dialogue they already write today, and all of that could be used to customize a model for that character.

        The player could then have realistic conversations with those characters, which would make everything better. You could ask for directions to something, then follow up with more questions that the NPC should know the answers to. Etc.

        Now inconsequential filler characters, like the ramen shop owner in the example, become something potentially memorable and genuinely useful in a way that could never possibly be hand-crafted.

        This article shits on an impressive early attempt to enable exactly that: it takes the fact that the tech isn’t finished yet, crosses that with the author’s biased opinion, and produces Kotaku-style clickbait.

        • 50gp@kbin.social
          10 months ago

          you do know that quality beats quantity, right? nobody likes Bethesda’s radiant fetch quests, and this is just that, but with exposition-dumping NPCs

          • bionicjoey@lemmy.ca
            10 months ago

            Not even just exposition. An NPC could easily go off script and start talking about stuff that breaks immersion. Like imagine you’re sitting in a tavern in Skyrim and then some NPC comes up and is like “hey, you see any good movies lately?”

    • Kbin_space_program@kbin.social
      10 months ago

      You mention the trick yourself.

      AI can augment a real actor and script. Not replace.

      Skyrim has mods that add AI voices to unvoiced mod NPCs or lines. It works great, but it’s only augmenting what is already there.

    • ram@bookwormstory.social
      10 months ago

      AI has the potential to enshittify what would be immersive worlds in video games. Nobody wants their crafted NPC dialogue turned into ChatGPT garbage. Your comment is just propaganda.

  • blindsight@beehaw.org
    10 months ago

    Just one thing to add re: the quip at the end about industry layoffs:

    Layoffs are happening in the tech sector almost entirely because of interest rates. When interest rates go up, it’s more expensive to invest in growth, so companies scale down their operations. That’s exactly the point; central banks are trying to reduce aggregate demand across the entire economy to reduce the demand-side pressure on inflation.

    It absolutely sucks that corporate greed and regulatory capture have fucked the economy and are screwing over workers, but that’s just the general situation and isn’t the cause of recent layoffs.

    Anyway, I’m not at all surprised to hear that AI chatbots suck in games even more than they suck in text boxes; adding voice and animation makes it an even harder problem.

    LLMs aren’t really capable of creating immersive text, at least not on their own. We’ll need at least some sort of companion software to guide the LLM on what text to create. Sending raw user input into an LLM is never going to work well; that’s just not how the technology works.