Running AI is so expensive that Amazon will probably charge you to use Alexa in the future, says outgoing exec: in an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa.

  • Hazdaz@lemmy.world · 9 months ago

    I don’t understand this. Hasn’t Intel or Nvidia (or someone else) been making claims about their next CPUs having AI functionality built-in?

    • ours@lemmy.film · 9 months ago

      Having “AI functionality” doesn’t mean they can just get rid of the big, expensive models they use now.

      If they are anything like OpenAI’s LLMs, they require very beefy machines with a ton of expensive RAM.
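      As a very rough back-of-envelope sketch (the parameter counts below are illustrative assumptions, not official figures), just holding a model’s weights in memory scales like this:

      ```python
      # Rough sketch: memory needed just to hold model weights.
      # Parameter counts are illustrative assumptions, not official figures.
      def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
          """Memory for the weights alone, assuming fp16 (2 bytes per parameter)."""
          return num_params * bytes_per_param / 1e9

      print(weight_memory_gb(7e9))    # ~14 GB  -- a "small" 7B model
      print(weight_memory_gb(175e9))  # ~350 GB -- a GPT-3-scale model
      ```

      And that’s before you count the working memory inference needs on top of the weights.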

      • Hazdaz@lemmy.world · 9 months ago

        Well, that’s exactly what I was thinking when these companies were making these claims… like, HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? It didn’t make sense.

        EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.

        • ours@lemmy.film · 9 months ago

          “AI” doesn’t use databases per se; it’s a trained model built from large amounts of training data.

          Some models run fine on small devices (like the models phones run to enhance photos), but others, like OpenAI’s LLMs, are huge.
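          To make the “model, not database” point concrete, here’s a purely illustrative toy in Python: the training data is thrown away after fitting, and the “model” that remains is just two numbers.

          ```python
          # Toy illustration: a "model" is learned parameters, not stored data.
          import random

          # Pretend training data: noisy samples of y = 3x + 1.
          data = [(x, 3 * x + 1 + random.uniform(-0.1, 0.1)) for x in range(100)]

          # "Train" with closed-form least squares for a line.
          n = len(data)
          sx = sum(x for x, _ in data)
          sy = sum(y for _, y in data)
          sxx = sum(x * x for x, _ in data)
          sxy = sum(x * y for x, y in data)
          slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
          intercept = (sy - slope * sx) / n

          del data  # the training data is gone; the "model" is two numbers
          print(f"model: y = {slope:.2f}x + {intercept:.2f}")  # roughly 3x + 1
          ```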

          • Hazdaz@lemmy.world · 9 months ago

            training data.

            Wouldn’t that data be stored in some kind of database?

            • ours@lemmy.film · 9 months ago

              No, the data will influence the model.

              Some of the data may surface in the model’s output (e.g. AI-generated images reproducing mangled artist signatures from the original works used during training), but it isn’t stored in the traditional form of a database. You can’t directly retrieve that data in its original form, even if some models can be coerced into producing something similar.

              It’s basically a statistical model built from training data.

              Training these huge models also costs a fortune: allegedly millions of dollars, in a data- and processing-intensive process.

            • TipRing@lemmy.world · 9 months ago

              The training data isn’t stored in the model. You can take an existing model and fine-tune it on a whole bunch of additional data, and the model size won’t change.
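              A toy sketch of that point (plain Python, nothing LLM-specific): fine-tuning updates the values of existing weights in place, so the parameter count, and hence the model size, is unchanged.

              ```python
              # Toy "fine-tuning": weight values change, their number doesn't.
              weights = [0.5, -1.2, 0.3]             # pretend pretrained model
              print("params before:", len(weights))  # 3

              # One gradient-descent step on new data (illustrative numbers).
              gradients = [0.1, -0.05, 0.02]
              lr = 0.01
              weights = [w - lr * g for w, g in zip(weights, gradients)]

              print("params after:", len(weights))   # still 3
              ```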

            • Dark Arc@social.packetloss.gg · 9 months ago

              The other answers are a bit confusing…

              Yes, that’s in a database.

              However, you can think of it like a large library of books on how to best tune a ukulele. There might be a lot of information to go through to figure out how to tune the ukulele, and a lot of skill needed to put all that knowledge to use, but the ukulele, once tuned, is quite small and portable.

        • blazeknave@lemmy.world · 9 months ago

          You’re right. Run an LLM locally, adjacent to your application sandboxes and local user apps, and your office will lower its heating bills.
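          For anyone who wants to try, here’s a minimal local-inference sketch with Hugging Face transformers; gpt2 is used only because it’s tiny enough to run on a CPU, while an assistant-grade model needs far more RAM:

          ```python
          # Minimal local text generation; gpt2 is small enough to run on CPU.
          # Requires: pip install transformers torch
          from transformers import pipeline

          generator = pipeline("text-generation", model="gpt2")
          result = generator("The weather today is", max_new_tokens=20)
          print(result[0]["generated_text"])
          ```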

    • tb_@lemmy.world · 9 months ago

      You can record and edit videos on your own devices, but that doesn’t mean it’s suddenly free for Netflix or YouTube to stream their videos to you.

      Surely a local version of Alexa could be developed, but that development would come with its own costs.
      Some things simply can’t be done locally, such as web search. Route calculations for map applications are also often done in the cloud.