Ollama-Buddy 0.7.1 - Org-mode Chat, Parameter Control and JSON Debugging

Continuing the development of my local ollama LLM client called ollama-buddy

https://github.com/captainflasmr/ollama-buddy

The basic functionality, I think, is now there, and now literally zero configuration is required: if a default model isn’t set, I just pick the first one, so LLM chat can take place immediately.

Now that I’m getting more into this chat client malarkey, my original idea of a very minimal chat client to interface with ollama is starting to skew towards supporting as much of the ollama RESTful API as possible. Hence in this update a more advanced approach is creeping in, including setting various subtle model parameters and providing a debugging window to monitor incoming raw JSON (pretty-printed, of course). Hopefully these features will remain tucked away for advanced users; I’ve done my best to keep them unobtrusive (but not too hidden). The tool is still designed to be a helpful companion for interfacing with ollama through Emacs, just now with more powerful options under the hood.
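For context on what that debug window is showing: Ollama streams its responses as newline-delimited JSON messages, one per chunk. Here is a rough Python sketch of the pretty-printing idea (the package itself is Emacs Lisp; the function name is mine, and the message shape follows Ollama's streaming chat format):

```python
import json

def pretty_print_chunks(raw_stream: str) -> list[str]:
    """Pretty-print each newline-delimited JSON message from an
    Ollama streaming response, roughly as a debug view would show it."""
    pretty = []
    for line in raw_stream.splitlines():
        if line.strip():
            pretty.append(json.dumps(json.loads(line), indent=2))
    return pretty

# Two sample streamed messages in Ollama's NDJSON response format.
sample = (
    '{"model":"llama3","message":{"role":"assistant","content":"Hi"},"done":false}\n'
    '{"model":"llama3","message":{"role":"assistant","content":""},"done":true}\n'
)
for chunk in pretty_print_chunks(sample):
    print(chunk)
```

Each compact one-line message becomes an indented block, which is far easier to scan when you are chasing down a protocol issue.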

Also a note about converting the chat buffer into org-mode. My original intention was to keep the chat buffer as a very simple, almost “no mode” buffer, with just text and nothing else. However, on further consideration, I felt that converting this buffer into org-mode actually held quite a few benefits:

    • outlining
    • org export
    • heading navigation
    • source code fontification

I’m sure there are more, as this list isn’t quite the “quite a few benefits” I was hoping for :(

I have a local keymap defined with some ollama-buddy-specific keybindings, and as yet I haven’t encountered any conflicts with commonly used org-mode bindings, but we shall see how it goes. I think for this package it is important to have a quick chatting mechanism, and what is faster than a good keybind?

Finally, just a note on the pain of implementing a good prompt mechanism. I had a few goes at it and I think I now have an acceptably robust solution; I kept running into annoying little edge cases and ended up having to refactor quite a bit. My original idea for this package involved a simple “mark region and send”, as at the time I had a feeling that implementing a good prompt mechanism would be tough - how right I was! Things got even trickier with the move to org-mode, since each prompt heading should contain meaningful content for clean exports, and I had to implement a mechanism to replace prompts intelligently. For example, if the model is swapped and the previous prompt is blank, it gets replaced, though, of course, even this has its own edge cases - gives a new meaning to prompt engineering! :)
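To illustrate the blank-prompt replacement idea, here is a minimal Python sketch (the heading format and function name are hypothetical; the real package does this in Emacs Lisp directly against the chat buffer):

```python
def swap_prompt_heading(buffer: str, new_model: str) -> str:
    """If the last prompt heading has nothing beneath it, replace it
    (e.g. after a model swap); otherwise append a fresh heading.
    The heading format here is illustrative, not the package's exact one."""
    lines = buffer.rstrip("\n").split("\n") if buffer.strip() else []
    heading = f"* User prompt [{new_model}]"
    # A trailing heading with no content after it counts as a blank prompt.
    if lines and lines[-1].startswith("* User prompt"):
        lines[-1] = heading
    else:
        lines.append(heading)
    return "\n".join(lines) + "\n"
```

So swapping the model on an empty prompt rewrites the heading in place, while swapping after some text has been entered leaves the conversation intact and starts a new heading - which is exactly where the edge cases breed.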

Anyways, listed below are my latest changes, with a slightly deeper dive into the more “interesting” implementations. My next ideas are a little more advanced and are kanban’d into my GitHub README at https://github.com/captainflasmr/ollama-buddy for those who are interested.

<2025-03-11 Tue> 0.7.1

Added debug mode to display raw JSON messages in a debug buffer

<2025-03-11 Tue> 0.7.0

Added comprehensive Ollama parameter management

Introduced parameter management capabilities that give you complete control over your Ollama model’s behavior through the options field in Ollama’s API.

Ollama’s API supports a rich set of parameters for fine-tuning text generation, from controlling creativity with temperature to managing token selection with top_p and top_k. Until now, Ollama Buddy only exposed the temperature parameter, but this update unlocks the full potential of Ollama’s parameter system!

Key Features:

Keyboard Shortcuts

Parameter management is accessible through simple keyboard shortcuts from the chat buffer.

<2025-03-10 Mon> 0.6.1

Refactored prompt handling so that each org heading now always contains a prompt, for cleaner exports

<2025-03-08 Sat> 0.6.0

Chat buffer now in org-mode

Key Features

  1. The chat buffer is now in org-mode, which gives it enhanced readability and structure. Conversations now automatically format user prompts and AI responses with org-mode headings, making them easier to navigate.

  2. Of course, with org-mode you now get the additional benefits for free, such as:

    • outlining
    • org export
    • heading navigation
    • source code fontification

  3. Previously, responses in Ollama Buddy were displayed with Markdown formatting, which wasn’t always ideal for org-mode users. Now, Markdown elements such as bold/italic text, code blocks, and lists can be automatically converted into proper org-mode formatting. This gives you the flexibility to work with Markdown or org-mode as needed.
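A simplified Python sketch of the kind of Markdown-to-org rewriting involved (the real converter lives in Emacs Lisp and handles more cases; the regexes below cover only fenced code blocks, bold text, and inline code):

```python
import re

def markdown_to_org(text: str) -> str:
    """Convert a few common Markdown constructs to org-mode syntax:
    fenced code blocks, bold text, and inline code.
    A simplified sketch of the idea, not the package's converter."""
    # ```lang ... ``` -> #+begin_src lang ... #+end_src
    text = re.sub(r"`{3}(\w*)\n(.*?)`{3}", r"#+begin_src \1\n\2#+end_src",
                  text, flags=re.DOTALL)
    text = re.sub(r"\*\*(.+?)\*\*", r"*\1*", text)  # **bold** -> *bold*
    text = re.sub(r"`([^`]+)`", r"=\1=", text)      # `code`   -> =code=
    return text

# Fenced blocks are rewritten first, so their contents aren't
# mangled by the bold/inline-code passes.
md = "**hint**: run `ls`\n" + "`" * 3 + "sh\necho ok\n" + "`" * 3
print(markdown_to_org(md))
```

Ordering matters: the fenced-block rule runs first so that asterisks and backticks inside code are left alone; getting lists and nested markup right is where a real converter earns its keep.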
