Over just a few months, ChatGPT went from correctly answering a simple math problem 98% of the time to just 2%, a study finds. Researchers found wild fluctuations, called drift, in the technology's ability…

    • Victoria@lemmy.blahaj.zone · 1 year ago

      It was initially presented as the all-problem-solver, mainly by the media. And tbf, it was decently competent in certain fields.

      • MeanEYE@lemmy.world · 1 year ago

        Problem was, it was presented as a problem solver, which it never was; it's a solution presenter. It can't come up with a solution, only something that looks like a solution based on its input data. Ask it to inverse-sort something and it goes nuts.
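        For reference, an inverse (descending) sort is a one-liner in most languages; a minimal Python sketch, with made-up sample values, just to illustrate the task being described:

        ```python
        data = [3, 1, 4, 1, 5, 9, 2, 6]  # hypothetical sample values

        # sorted() with reverse=True returns a new list in descending order,
        # leaving the original list untouched
        descending = sorted(data, reverse=True)
        print(descending)  # [9, 6, 5, 4, 3, 2, 1, 1]
        ```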

      • Lukecis@lemmy.world · 1 year ago

        Once AGI is achieved, and subsequently sentient, superintelligent AI (I can't imagine there not being such a thing), I'd be surprised if it doesn't decide humanity needs to go extinct for its own best self-interest.

    • nani8ot@lemmy.ml · 1 year ago

      I did use it more than half a year ago for a few math problems, partly to help me get started and partly to find out how well it'd go.

      ChatGPT was better than I'd thought and was enough to help me find an actually correct solution. But I also noticed the results got worse and worse, to the point of being actual garbage (as would have been expected).

    • Captain Poofter@lemmy.world · 1 year ago

      Math is a language.

      Mathematical ability and language ability are closely related. The same parts of your brain are used in each task. Words and numbers are essentially both ideas, and language and math are systems used to express and communicate them.

      A language model doing math makes more sense than you’d think!

    • affiliate@lemmy.world · 1 year ago

      it’s pretty useful for explaining high level math concepts, or at least it used to be. before chatgpt 4 launched, it was able to give intuitive descriptions of stuff in algebraic topology and even prove some properties of the structures involved.

    • danwardvs@sh.itjust.works · edited · 1 year ago

      I’m guessing people were entering word problems to generate the right equations and solve it, rather than it being used as a calculator.

    • Fixbeat@lemmy.ml · 1 year ago

      Because it works, or at least it used to. Is there something more appropriate ?

      • bassomitron@lemmy.world · 1 year ago

        I used Wolfram Alpha a lot in college (adult learner, but I graduated about four years ago, so no idea if it's still good). https://www.wolframalpha.com/

        I would say that Wolfram appears to probably be a much more versatile math tool, but I also never used chatgpt for that use case, so I could be wrong.

        • d3Xt3r@lemmy.world · 1 year ago

          There’s an official Wolfram plugin for ChatGPT now, so all math can be handed over to it for solving.

        • TitanLaGrange@lemmy.world · 1 year ago

          How did you learn to talk to WolframAlpha?

          I want to like WA, but the natural language interface is so opaque that I usually give up before I can get any non-trivial calculation out of it.

    • lorcster123@lemmy.world · edited · 1 year ago

      It can be useful to ask it certain questions that are a bit complex. For example, on a plot with a linear y axis and a logarithmic x axis, the equation of a straight line is a little more complicated: it's of the form y = m*log(x) + b, rather than y = m*x + b as on a linear-linear plot.

      ChatGPT is able to work out the correct equation of the line, but it gets the answer wrong a few times along the way… lol
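      To make the semi-log line concrete, here's a small Python sketch; the two points are hypothetical, chosen only to illustrate solving for m and b in y = m*log10(x) + b:

      ```python
      import math

      # two hypothetical points read off a semi-log plot (linear y, logarithmic x)
      x1, y1 = 10.0, 2.0
      x2, y2 = 1000.0, 6.0

      # slope and intercept of the line y = m*log10(x) + b:
      # m is the change in y per decade of x
      m = (y2 - y1) / (math.log10(x2) - math.log10(x1))
      b = y1 - m * math.log10(x1)

      print(m, b)  # m ≈ 2.0, b ≈ 0.0
      ```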

    • Steeve@lemmy.ca · 1 year ago

      And why is it being measured on a single math problem? lol