Gemini Nano can’t run on the smaller Pixel 8 due to mysterious “hardware limitations.”

    • BrikoX@lemmy.zipOPM · 12 points · 8 months ago

      Access. If it can truly run fully locally, then companies can basically have their LLM in everyone's pocket.

      As far as desirability goes, it depends on the person, and it's more about what people don't want than what they want. Google has been promising various AI features that people will be able to run on the device, but so far most of them send the data to Google's servers and then return the result.

    • angelsomething@lemmy.one · 5 points · edited · 8 months ago

      Less compute overhead on those giant and expensive datacentres, as well as the ability to work offline.

      • jacksilver@lemmy.world · 2 points · 8 months ago

        That explains why Google wants it, but what do phone owners get? What of value is being offered? I keep hearing all this talk about a wave of AI on phones, but I don't see what it's actually providing.

        • angelsomething@lemmy.one · 1 point · edited · 8 months ago

          Yeah, at present it's mostly a gimmick feature. The next step will be integrating it into the OS so it understands what's going on with your phone: you could ask it, for example, "make my phone secure" and it would execute a series of steps to accomplish that, or "tell me why this app just crashed" and it would review the logs and tell you (or the app developer) what happened. The ultimate goal is to sell you a truly personal AI assistant. The first company that truly achieves this will mint gold.

          • jacksilver@lemmy.world · 2 points · 8 months ago

            Doesn't that require all of those things to be developed manually, though? It's not like the LLM can just access your logs. I guess the idea is that the LLM can better generalize commands, but it seems like a lot of the "AI" there would actually be hand-developed.
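
            As a rough sketch of what that hand-developed layer might look like (every name below is made up, and the model call is stubbed out): the developer writes each capability as ordinary code, and the LLM's only job is to pick which one matches the spoken request.

            ```python
            # Hypothetical tool-routing sketch: the LLM doesn't read logs or change
            # settings itself; a developer hand-writes each tool, and the model only
            # chooses which pre-built tool fits the user's request.

            def get_crash_logs(app_id: str) -> str:
                """Hand-written code that would actually pull an app's crash log."""
                return f"stack trace for {app_id} ..."  # placeholder

            def enable_lockdown() -> str:
                """Hand-written code that would actually tighten security settings."""
                return "screen lock tightened, unknown sources disabled"  # placeholder

            TOOLS = {
                "get_crash_logs": get_crash_logs,
                "enable_lockdown": enable_lockdown,
            }

            def ask_llm(prompt: str) -> dict:
                """Stand-in for an on-device model call; a real model would return
                its tool choice as structured output."""
                return {"tool": "enable_lockdown", "args": {}}

            def handle(user_request: str) -> str:
                choice = ask_llm(f"Pick one tool from {list(TOOLS)} for: {user_request}")
                return TOOLS[choice["tool"]](**choice["args"])

            print(handle("make my phone secure"))
            ```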

        • Tylerdurdon@lemmy.world · 1 point · 8 months ago

          Yes, same here. More compute on my phone to me means one thing: battery drain.

          Also, with companies getting more and more comfortable looking at your searches for their "advertising," what else can they invent with an onboard brain? Nothing we would like, I'm afraid. There are no protections for something like that, and you know what companies do when there are no restrictions…

    • M500@lemmy.ml · 3 points · 8 months ago

      GPT4All already lets you do this. You need at least 8 GB of unused memory to run it, and it can be a bit slow, but it works well for the most part.

      This way you can use a GPT-style model without sharing your data online, and it works without internet access as well.

      The models you run locally can also be uncensored, meaning they won't avoid certain subjects that Google and OpenAI deem unfit to talk about.
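
      A minimal sketch with the GPT4All Python bindings (pip install gpt4all) looks something like this; the model file name is just one example from their catalog, and the first run downloads it, after which everything runs fully offline:

      ```python
      from gpt4all import GPT4All

      # First call downloads the quantized model file; afterwards it runs
      # entirely on-device (CPU by default), no internet required.
      model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf")  # example model name

      with model.chat_session():
          reply = model.generate("Why do on-device LLMs matter for privacy?", max_tokens=200)
          print(reply)
      ```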

      • Tylerdurdon@lemmy.world · 1 point · 8 months ago

        Interesting. I'm just paranoid when the super-brain has been programmed by a corporation "to help you." Corporations are self-serving; that's just business.

        • M500@lemmy.ml · 2 points · 8 months ago

          I get it, but this seems to be an instance where corporate interests and consumer interests align. I'm sure it will still collect a bunch of data from you, but that would happen anyway.

    • AwkwardLookMonkeyPuppet@lemmy.world · 3 points · 8 months ago

      One thing that is important to me is the on-device subtitles. Pixel phones can caption any audio played on the device. That includes music, movies, random internet videos, phone calls, video calls, anything. For someone with hearing issues, this is a huge benefit. I’m not aware of any other phone that can do this.
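
      That captioning feature doesn't expose a public API as far as I know, but for a rough idea of the same kind of fully local transcription, here's a sketch using the open-source Whisper model on a computer as a stand-in (the audio file name is a placeholder):

      ```python
      # Stand-in illustration of on-device captioning: openai-whisper
      # (pip install openai-whisper, requires ffmpeg) transcribes audio locally.
      import whisper

      model = whisper.load_model("base")           # small model, runs on CPU
      result = model.transcribe("phone_call.wav")  # placeholder audio file
      print(result["text"])                        # the generated captions
      ```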

  • anakin78z@lemmy.world · 8 points · 8 months ago

    I mean, that's what you get for buying the cheaper phone! Except I got the 7 Pro, and it feels like it was abandoned as soon as the 8 was announced. As soon as I read that I wasn't getting features that were clearly cloud-based, I realized that Google stopped giving two shits about existing users who don't immediately upgrade to the latest phone. It also sucked at making phone calls, so that was the last Pixel phone for me.

  • Capt. Wolf@lemmy.world · 9 points, 1 downvote · 8 months ago

    Personally, I've been trying Gemini for a couple of weeks now, and last night I decided to disable it. It can't even manage commands like "play music" to open Spotify, or start calls and texts, without the phone being unlocked. The whole point of a hands-free assistant for me is to be hands-free… I use it in my car for calls and texts, literally the primary reason to have a cell phone. Who gives a crap about "creative inspiration" if they're not even prioritizing the basic features of a phone?

    • AwkwardLookMonkeyPuppet@lemmy.world · 2 points · 8 months ago

      Yeah, it's shit. For the most part it won't work with the phone locked; sometimes it will, sometimes it won't. You can't change the voice. It can't change settings on the phone. Half the time it can't even answer basic questions that the regular Assistant has no issue with. I don't know why they'd release this to the public when it's clearly a beta product, if that.