Ollama Buddy Version 0.2.1 - Send the same prompt to multiple LLMs and choose the best answer!
Some improvements to my ollama LLM package…
With the new multishot mode, you can now send a prompt to multiple models in sequence and compare their responses; each result is also stored in a named register for later recall.

Letter-Based Model Shortcuts
Instead of selecting models manually, each available model is now assigned a letter (e.g., (a) mistral, (b) gemini). This allows for quick model selection when sending prompts or initiating a multishot sequence.
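
Under the hood, the letter assignment is just a mapping from consecutive letters onto the model list. Here is a minimal sketch of the idea in Emacs Lisp; the variable and function names are illustrative stand-ins, not ollama-buddy's actual internals:

```elisp
(require 'seq)

;; Illustrative sketch only; `my/ollama-models' stands in for however
;; the package discovers the available models.
(defvar my/ollama-models '("mistral" "gemini" "llama")
  "Available model names, in the order letters are assigned.")

(defun my/ollama-model-letters ()
  "Return an alist mapping a letter character to each available model."
  (seq-map-indexed (lambda (model i) (cons (+ ?a i) model))
                   my/ollama-models))

;; (my/ollama-model-letters) => ((97 . "mistral") (98 . "gemini") (99 . "llama"))
```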
Multishot Execution (C-c C-l)
Ever wondered how different models would answer the same question? With Multishot Mode, you can:
- Send your prompt to a sequence of models in one shot.
- Track progress as responses come in.
- Store each model’s response in a register for easy reference later; each model’s assigned letter corresponds to its register name (see the sketch below).
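
Since registers are a built-in Emacs feature, stashing each response only needs set-register keyed by the model’s letter. A minimal sketch, assuming a hypothetical helper name rather than the package’s real internals:

```elisp
;; Hypothetical helper: not ollama-buddy's actual API.
(defun my/store-multishot-response (letter response)
  "Store RESPONSE text in the register named by LETTER (a character)."
  (set-register letter response))

;; e.g. after model (a) replies:
;; (my/store-multishot-response ?a "mistral's answer ...")
```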
Status Updates
When running a multishot execution, the status now updates dynamically:
- “Multi Start” when the sequence begins.
- “Processing…” during responses.
- “Multi Finished” when all models have responded.
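
For illustration, a status indicator like this can be driven by a single string that the multishot loop updates as it moves through the sequence; the names below are hypothetical, not taken from the package:

```elisp
;; Hypothetical status helper; `my/ollama-status' is an illustrative name.
(defvar my/ollama-status "Idle"
  "Current multishot status string shown to the user.")

(defun my/ollama-set-status (status)
  "Set STATUS and refresh the mode line so the change is visible."
  (setq my/ollama-status status)
  (force-mode-line-update))

;; Typical sequence during a multishot run:
;; (my/ollama-set-status "Multi Start")
;; (my/ollama-set-status "Processing...")  ; while responses stream in
;; (my/ollama-set-status "Multi Finished")
```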
How It Works
- Press C-c C-l to start a multishot session in the chat buffer.
- Type a sequence of model letters (e.g., abc to use the models mistral, gemini, and llama).
- The selected models process the prompt one by one.
- Each response is saved to the register named after that model’s letter, ready to recall later (see below).
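
Because the responses land in ordinary Emacs registers, recalling one uses the standard register commands rather than anything package-specific. For example, to pull model (a)’s answer back into the current buffer:

```elisp
;; Standard Emacs register commands work on the stored responses:
;;   C-x r i a                 ; insert-register, then the letter
;;   M-x view-register RET a   ; inspect without inserting
(insert-register ?a)            ; programmatic equivalent
```

view-register shows the content in a help window, which is handy for comparing answers side by side without inserting them.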
