Today I found the weirdest bug of my life. I was making a chatbot for Signal using Ollama in Rust. I finished a basic demo and tried it, and for any message I would get { }, {}, {} or { }.

Do you know how hard it is to debug something like this???

What was the problem? Not my program. It was an Ollama bug combined with an ollama-rs bug (the Rust library for Ollama). And neither bug is even a bug if you don’t combine them.

Ollama released a new feature yesterday called “Structured outputs”. Basically, you can specify the format of the output using the format field in the JSON request. The format field already existed for something else, but I don’t know for what. In ollama-rs you can either set the format to json or leave it empty; by default it’s empty. So where is the bug?
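To make that concrete, here is a rough sketch of request bodies for Ollama’s /api/generate endpoint, built directly with serde_json rather than through ollama-rs (the model name and prompt are just placeholders): one with no format at all, one with the older "format": "json" mode, and one with the new schema-based structured output.

```rust
use serde_json::json;

fn main() {
    // No "format" key at all: the model answers in free text.
    let plain = json!({
        "model": "llama3.2",
        "prompt": "Say hello",
        "stream": false
    });

    // The older JSON mode: "format": "json" asks for valid JSON output.
    let json_mode = json!({
        "model": "llama3.2",
        "prompt": "Say hello",
        "stream": false,
        "format": "json"
    });

    // The new structured outputs: "format" carries a JSON schema the reply must follow.
    let structured = json!({
        "model": "llama3.2",
        "prompt": "Say hello",
        "stream": false,
        "format": {
            "type": "object",
            "properties": { "greeting": { "type": "string" } },
            "required": ["greeting"]
        }
    });

    println!("{plain}\n{json_mode}\n{structured}");
}
```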

There is a difference between "format": null and not specifying the format at all. Ollama-rs will set format to null if you don’t specify it, and Ollama will interpret that null as a valid format. What happens? THE LLM WILL ACTUALLY GIVE YOU OUTPUT IN THE “NULL” FORMAT - { }!
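This is easy to reproduce with plain serde. The sketch below is not ollama-rs’s actual code, just a minimal illustration of how a request struct ends up on the wire with "format": null unless the optional field is explicitly skipped when it’s None.

```rust
use serde::Serialize;

#[derive(Serialize)]
struct NaiveRequest {
    model: String,
    prompt: String,
    // With plain serde, an unset Option still appears in the JSON as null.
    format: Option<String>,
}

#[derive(Serialize)]
struct FixedRequest {
    model: String,
    prompt: String,
    // Skipping the field when it's None keeps "format" out of the request entirely.
    #[serde(skip_serializing_if = "Option::is_none")]
    format: Option<String>,
}

fn main() {
    let naive = NaiveRequest {
        model: "llama3.2".into(),
        prompt: "Say hello".into(),
        format: None,
    };
    let fixed = FixedRequest {
        model: "llama3.2".into(),
        prompt: "Say hello".into(),
        format: None,
    };

    // Prints {"model":"llama3.2","prompt":"Say hello","format":null}
    println!("{}", serde_json::to_string(&naive).unwrap());
    // Prints {"model":"llama3.2","prompt":"Say hello"}
    println!("{}", serde_json::to_string(&fixed).unwrap());
}
```

In other words, “leave it empty” on the Rust side is not the same as “leave it out” of the JSON, and that gap is exactly where the two non-bugs combine into one.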

  • HeckGazer@programming.dev · 17 days ago
    And neither bug is even a bug if you don’t combine them.

    There is a difference between "format": null and not specifying the format at all.

    Hmmmmm, that sure does sound like a bug

  • Dave.@aussie.zone · 16 days ago (edited)

    Ollama released a new feature yesterday

    Ollama 0.5.1, yesterday: “Fixed issue where Ollama’s API would generate JSON output when specifying "format": null”

    Ollama-rs 0.2.1: released 08/09/24.

    Gee, I wonder why it doesn’t play nicely with the latest Ollama API, which uses new/updated behaviour for an option 🤔