branch: externals/ellama
commit dac2a7649c92ce9affe9447564ed436f51cffad4
Author: Sergey Kostyaev <[email protected]>
Commit: Sergey Kostyaev <[email protected]>
Documentation fixes
---
README.org | 51 +++++++++++++------------
ellama.info | 125 ++++++++++++++++++++++++++++++------------------------------
2 files changed, 89 insertions(+), 87 deletions(-)
diff --git a/README.org b/README.org
index 033483505a..89b7b62afb 100644
--- a/README.org
+++ b/README.org
@@ -18,14 +18,14 @@ Assistant". Previous sentence was written by Ellama itself.
Just ~M-x~ ~package-install~ @@html:<kbd>@@Enter@@html:</kbd>@@
~ellama~ @@html:<kbd>@@Enter@@html:</kbd>@@. By default it uses
[[https://github.com/jmorganca/ollama][ollama]]
-provider. If you ok with it, you need to install [[https://github.com/jmorganca/ollama][ollama]] and pull
+provider. If you are OK with it, you need to install [[https://github.com/jmorganca/ollama][ollama]] and pull
[[https://ollama.com/models][any ollama model]] like this:
#+BEGIN_SRC shell
ollama pull qwen2.5:3b
#+END_SRC
-You can use ~ellama~ with other model or other llm provider.
+You can use ~ellama~ with other models or another LLM provider.
Without any configuration, the first available ollama model will be used.
You can customize ellama configuration like this:
@@ -34,7 +34,7 @@ You can customize ellama configuration like this:
:ensure t
:bind ("C-c e" . ellama)
;; send last message in chat buffer with C-c C-c
- :hook (org-ctrl-c-ctrl-c-final . ellama-chat-send-last-message)
+ :hook (org-ctrl-c-ctrl-c-hook . ellama-chat-send-last-message)
:init (setopt ellama-auto-scroll t)
:config
;; show ellama context in header line in all buffers
@@ -43,14 +43,14 @@ You can customize ellama configuration like this:
(ellama-session-header-line-global-mode +1))
#+END_SRC
-More sofisticated configuration example:
+More sophisticated configuration example:
#+BEGIN_SRC emacs-lisp
(use-package ellama
:ensure t
:bind ("C-c e" . ellama)
;; send last message in chat buffer with C-c C-c
- :hook (org-ctrl-c-ctrl-c-final . ellama-chat-send-last-message)
+ :hook (org-ctrl-c-ctrl-c-hook . ellama-chat-send-last-message)
:init
;; setup key bindings
;; (setopt ellama-keymap-prefix "C-c e")
@@ -190,7 +190,7 @@ More sofisticated configuration example:
- ~ellama-context-add-selection~: Add selected region to context.
- ~ellama-context-add-info-node~: Add info node to context.
- ~ellama-context-reset~: Clear global context.
-- ~ellama-manage-context~: Manage the global context. Inside context
+- ~ellama-context-manage~: Manage the global context. Inside context
management buffer you can see ellama context elements. Available actions
with key bindings:
- ~n~: Move to the next line.
@@ -200,29 +200,24 @@ More sofisticated configuration example:
- ~a~: Open the transient context menu for adding new elements.
- ~d~: Remove the context element at the current point.
- ~RET~: Preview the context element at the current point.
-- ~ellama-preview-context-element-at-point~: Preview ellama context element at
+- ~ellama-context-preview-element-at-point~: Preview ellama context element at
point. Works inside ellama context management buffer.
-- ~ellama-remove-context-element-at-point~: Remove ellama context element at
+- ~ellama-context-remove-element-at-point~: Remove ellama context element at
point from global context. Works inside ellama context management buffer.
-- ~ellama-chat-translation-enable~: Chat translation enable.
-- ~ellama-chat-translation-disable~: Chat translation disable.
+- ~ellama-chat-translation-enable~: Enable chat translation.
+- ~ellama-chat-translation-disable~: Disable chat translation.
- ~ellama-solve-reasoning-problem~: Solve reasoning problem with Abstraction
- of Thought technique. It uses a chain of multiple messages to LLM and help
+ of Thought technique. It uses a chain of multiple messages to an LLM and helps
it to provide much better answers on reasoning problems. Even small LLMs
- like phi3-mini provides much better results on reasoning tasks using AoT.
+ like phi3-mini provide much better results on reasoning tasks using AoT.
- ~ellama-solve-domain-specific-problem~: Solve domain specific problem with
simple chain. It makes LLMs act like a professional and adds a planning
step.
- ~ellama-community-prompts-select-blueprint~: Select a prompt from the
community prompt collection. The user is prompted to choose a role, and then
a corresponding prompt is inserted into a blueprint buffer.
-- ~ellama-community-prompts-update-variables~: Prompt user for values of
- variables found in current buffer and update them.
-- ~ellama-response-process-method~: Configure how LLM responses are processed.
- Options include streaming for real-time output, async for asynchronous
- processing, or skipping every N messages to reduce resource usage.
-- ~ellama-blueprint-variable-regexp~: Regular expression to match blueprint
- variables like ~{var_name}~.
+- ~ellama-blueprint-fill-variables~: Prompt user for values of variables
+ found in current blueprint buffer and update them.
- ~ellama-tools-enable-by-name~: Enable a specific tool by its name. Use
this command to activate individual tools. Requires the tool name as input.
- ~ellama-tools-enable-all~: Enable all available tools at once. Use this
@@ -260,7 +255,7 @@ Ellama, using the ~ellama-keymap-prefix~ prefix (not set by default):
| "s c" | ellama-summarize-killring | Summarize killring |
| "s l" | ellama-load-session | Session Load |
| "s r" | ellama-session-rename | Session rename |
-| "s d" | ellama-session-delete | Delete delete |
+| "s d" | ellama-session-delete | Session delete |
| "s a" | ellama-session-switch | Session activate |
| "P" | ellama-proofread | Proofread |
| "i w" | ellama-improve-wording | Improve wording |
@@ -309,6 +304,10 @@ There are many supported providers: ~ollama~, ~open ai~, ~vertex~,
automatically during generation. Disabled by default.
- ~ellama-fill-paragraphs~: Option to customize ellama paragraphs
filling behaviour.
+- ~ellama-response-process-method~: Configure how LLM responses are
+ processed. Options include streaming for real-time output, async for
+ asynchronous processing, or skipping every N messages to reduce resource
+ usage.
- ~ellama-name-prompt-words-count~: Count of words in prompt to
generate name.
- Prompt templates for every command.
@@ -330,7 +329,7 @@ argument generated text string.
~ellama-provider~ will be used if not set.
- ~ellama-coding-provider~: LLM coding tasks provider.
~ellama-provider~ will be used if not set.
-- ~ellama-summarization-provider~ LLM summarization provider.
+- ~ellama-summarization-provider~: LLM summarization provider.
~ellama-provider~ will be used if not set.
- ~ellama-show-quotes~: Show quotes content in chat buffer. Disabled
by default.
@@ -355,11 +354,11 @@ argument generated text string.
diverse applications.
- ~ellama-context-posframe-enabled~: Enable showing posframe with
ellama context.
-- ~ellama-manage-context-display-action-function~: Display action
- function for ~ellama-render-context~. Default value
+- ~ellama-context-manage-display-action-function~: Display action
+ function for ~ellama-context-manage~. Default value
~display-buffer-same-window~.
-- ~ellama-preview-context-element-display-action-function~: Display
- action function for ~ellama-preview-context-element~.
+- ~ellama-context-preview-element-display-action-function~: Display
+ action function for ~ellama-context-preview-element~.
- ~ellama-context-line-always-visible~: Make context header or mode line always
visible, even with empty context.
- ~ellama-community-prompts-url~: The URL of the community prompts collection.
@@ -384,6 +383,8 @@ argument generated text string.
blueprints.
- ~ellama-blueprint-file-extensions~: File extensions recognized as blueprint
files.
+- ~ellama-blueprint-variable-regexp~: Regular expression to match blueprint
+ variables like ~{var_name}~.
- ~ellama-skills-global-path~: Path to the global directory containing Agent
Skills.
- ~ellama-skills-local-path~: Project-relative path for local Agent Skills.
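The use-package snippets in the hunks above are cut off by diff context. For reference, a minimal self-contained configuration that pins Ellama to the model pulled during installation might look like this (a sketch, not part of this patch; it assumes the `llm` package and its `llm-ollama` provider are installed):

```emacs-lisp
;; Sketch only (not part of this patch): pin Ellama to a specific
;; ollama model instead of letting it pick the first available one.
(use-package ellama
  :ensure t
  :init
  (require 'llm-ollama)
  (setopt ellama-provider
          ;; Model name taken from the installation example above.
          (make-llm-ollama :chat-model "qwen2.5:3b")))
```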
diff --git a/ellama.info b/ellama.info
index c250274d11..be448f02d4 100644
--- a/ellama.info
+++ b/ellama.info
@@ -122,13 +122,14 @@ File: ellama.info, Node: Installation, Next: Commands, Prev: Top, Up: Top
**************
Just ‘M-x’ ‘package-install’ Enter ‘ellama’ Enter. By default it uses
-ollama (https://github.com/jmorganca/ollama) provider. If you ok with
-it, you need to install ollama (https://github.com/jmorganca/ollama) and
-pull any ollama model (https://ollama.com/models) like this:
+ollama (https://github.com/jmorganca/ollama) provider. If you are OK
+with it, you need to install ollama
+(https://github.com/jmorganca/ollama) and pull any ollama model
+(https://ollama.com/models) like this:
ollama pull qwen2.5:3b
-You can use ‘ellama’ with other model or other llm provider. Without
+You can use ‘ellama’ with other models or another LLM provider. Without
any configuration, the first available ollama model will be used. You
can customize ellama configuration like this:
@@ -136,7 +137,7 @@ can customize ellama configuration like this:
:ensure t
:bind ("C-c e" . ellama)
;; send last message in chat buffer with C-c C-c
- :hook (org-ctrl-c-ctrl-c-final . ellama-chat-send-last-message)
+ :hook (org-ctrl-c-ctrl-c-hook . ellama-chat-send-last-message)
:init (setopt ellama-auto-scroll t)
:config
;; show ellama context in header line in all buffers
@@ -144,13 +145,13 @@ can customize ellama configuration like this:
;; show ellama session id in header line in all buffers
(ellama-session-header-line-global-mode +1))
-More sofisticated configuration example:
+More sophisticated configuration example:
(use-package ellama
:ensure t
:bind ("C-c e" . ellama)
;; send last message in chat buffer with C-c C-c
- :hook (org-ctrl-c-ctrl-c-final . ellama-chat-send-last-message)
+ :hook (org-ctrl-c-ctrl-c-hook . ellama-chat-send-last-message)
:init
;; setup key bindings
;; (setopt ellama-keymap-prefix "C-c e")
@@ -298,7 +299,7 @@ File: ellama.info, Node: Commands, Next: Keymap, Prev: Installation, Up: Top
• ‘ellama-context-add-selection’: Add selected region to context.
• ‘ellama-context-add-info-node’: Add info node to context.
• ‘ellama-context-reset’: Clear global context.
- • ‘ellama-manage-context’: Manage the global context. Inside context
+ • ‘ellama-context-manage’: Manage the global context. Inside context
management buffer you can see ellama context elements. Available
actions with key bindings:
• ‘n’: Move to the next line.
@@ -308,17 +309,17 @@ File: ellama.info, Node: Commands, Next: Keymap, Prev: Installation, Up: Top
• ‘a’: Open the transient context menu for adding new elements.
• ‘d’: Remove the context element at the current point.
• ‘RET’: Preview the context element at the current point.
- • ‘ellama-preview-context-element-at-point’: Preview ellama context
+ • ‘ellama-context-preview-element-at-point’: Preview ellama context
element at point. Works inside ellama context management buffer.
- • ‘ellama-remove-context-element-at-point’: Remove ellama context
+ • ‘ellama-context-remove-element-at-point’: Remove ellama context
element at point from global context. Works inside ellama context
management buffer.
- • ‘ellama-chat-translation-enable’: Chat translation enable.
- • ‘ellama-chat-translation-disable’: Chat translation disable.
+ • ‘ellama-chat-translation-enable’: Enable chat translation.
+ • ‘ellama-chat-translation-disable’: Disable chat translation.
• ‘ellama-solve-reasoning-problem’: Solve reasoning problem with
Abstraction of Thought technique. It uses a chain of multiple
- messages to LLM and help it to provide much better answers on
- reasoning problems. Even small LLMs like phi3-mini provides much
+ messages to an LLM and helps it to provide much better answers on
+ reasoning problems. Even small LLMs like phi3-mini provide much
better results on reasoning tasks using AoT.
• ‘ellama-solve-domain-specific-problem’: Solve domain specific
problem with simple chain. It makes LLMs act like a professional
@@ -327,14 +328,8 @@ File: ellama.info, Node: Commands, Next: Keymap, Prev: Installation, Up: Top
the community prompt collection. The user is prompted to choose a
role, and then a corresponding prompt is inserted into a blueprint
buffer.
- • ‘ellama-community-prompts-update-variables’: Prompt user for values
- of variables found in current buffer and update them.
- • ‘ellama-response-process-method’: Configure how LLM responses are
- processed. Options include streaming for real-time output, async
- for asynchronous processing, or skipping every N messages to reduce
- resource usage.
- • ‘ellama-blueprint-variable-regexp’: Regular expression to match
- blueprint variables like ‘{var_name}’.
+ • ‘ellama-blueprint-fill-variables’: Prompt user for values of
+ variables found in current blueprint buffer and update them.
• ‘ellama-tools-enable-by-name’: Enable a specific tool by its name.
Use this command to activate individual tools. Requires the tool
name as input.
@@ -377,7 +372,7 @@ Keymap Function Description
"s c" ellama-summarize-killring Summarize killring
"s l" ellama-load-session Session Load
"s r" ellama-session-rename Session rename
-"s d" ellama-session-delete Delete delete
+"s d" ellama-session-delete Session delete
"s a" ellama-session-switch Session activate
"P" ellama-proofread Proofread
"i w" ellama-improve-wording Improve wording
@@ -431,6 +426,10 @@ There are many supported providers: ‘ollama’, ‘open ai’, ‘vertex’,
automatically during generation. Disabled by default.
• ‘ellama-fill-paragraphs’: Option to customize ellama paragraphs
filling behaviour.
+ • ‘ellama-response-process-method’: Configure how LLM responses are
+ processed. Options include streaming for real-time output, async
+ for asynchronous processing, or skipping every N messages to reduce
+ resource usage.
• ‘ellama-name-prompt-words-count’: Count of words in prompt to
generate name.
• Prompt templates for every command.
@@ -452,7 +451,7 @@ argument generated text string.
‘ellama-provider’ will be used if not set.
• ‘ellama-coding-provider’: LLM coding tasks provider.
‘ellama-provider’ will be used if not set.
- • ‘ellama-summarization-provider’ LLM summarization provider.
+ • ‘ellama-summarization-provider’: LLM summarization provider.
‘ellama-provider’ will be used if not set.
• ‘ellama-show-quotes’: Show quotes content in chat buffer. Disabled
by default.
@@ -482,11 +481,11 @@ argument generated text string.
diverse applications.
• ‘ellama-context-posframe-enabled’: Enable showing posframe with
ellama context.
- • ‘ellama-manage-context-display-action-function’: Display action
- function for ‘ellama-render-context’. Default value
+ • ‘ellama-context-manage-display-action-function’: Display action
+ function for ‘ellama-context-manage’. Default value
‘display-buffer-same-window’.
- • ‘ellama-preview-context-element-display-action-function’: Display
- action function for ‘ellama-preview-context-element’.
+ • ‘ellama-context-preview-element-display-action-function’: Display
+ action function for ‘ellama-context-preview-element’.
• ‘ellama-context-line-always-visible’: Make context header or mode
line always visible, even with empty context.
• ‘ellama-community-prompts-url’: The URL of the community prompts
@@ -517,6 +516,8 @@ argument generated text string.
project-specific blueprints.
• ‘ellama-blueprint-file-extensions’: File extensions recognized as
blueprint files.
+ • ‘ellama-blueprint-variable-regexp’: Regular expression to match
+ blueprint variables like ‘{var_name}’.
• ‘ellama-skills-global-path’: Path to the global directory
containing Agent Skills.
• ‘ellama-skills-local-path’: Project-relative path for local Agent
@@ -1576,40 +1577,40 @@ their use in free software.
Tag Table:
Node: Top1379
Node: Installation3748
-Node: Commands8756
-Node: Keymap16195
-Node: Configuration19028
-Node: Context Management25863
-Node: Transient Menus for Context Management26771
-Node: Managing the Context28385
-Node: Considerations29160
-Node: Minor modes29753
-Node: ellama-context-header-line-mode31741
-Node: ellama-context-header-line-global-mode32566
-Node: ellama-context-mode-line-mode33286
-Node: ellama-context-mode-line-global-mode34134
-Node: Ellama Session Header Line Mode34838
-Node: Enabling and Disabling35407
-Node: Customization35854
-Node: Ellama Session Mode Line Mode36142
-Node: Enabling and Disabling (1)36727
-Node: Customization (1)37174
-Node: Using Blueprints37468
-Node: Key Components of Ellama Blueprints38108
-Node: Creating and Managing Blueprints38715
-Node: Blueprints files39693
-Node: Variable Management40114
-Node: Keymap and Mode40567
-Node: Transient Menus41503
-Node: Running Blueprints programmatically42049
-Node: MCP Integration42636
-Node: Agent Skills43658
-Node: Directory Structure44021
-Node: Creating a Skill45048
-Node: How it works45423
-Node: Acknowledgments45814
-Node: Contributions46525
-Node: GNU Free Documentation License46911
+Node: Commands8762
+Node: Keymap15839
+Node: Configuration18673
+Node: Context Management25874
+Node: Transient Menus for Context Management26782
+Node: Managing the Context28396
+Node: Considerations29171
+Node: Minor modes29764
+Node: ellama-context-header-line-mode31752
+Node: ellama-context-header-line-global-mode32577
+Node: ellama-context-mode-line-mode33297
+Node: ellama-context-mode-line-global-mode34145
+Node: Ellama Session Header Line Mode34849
+Node: Enabling and Disabling35418
+Node: Customization35865
+Node: Ellama Session Mode Line Mode36153
+Node: Enabling and Disabling (1)36738
+Node: Customization (1)37185
+Node: Using Blueprints37479
+Node: Key Components of Ellama Blueprints38119
+Node: Creating and Managing Blueprints38726
+Node: Blueprints files39704
+Node: Variable Management40125
+Node: Keymap and Mode40578
+Node: Transient Menus41514
+Node: Running Blueprints programmatically42060
+Node: MCP Integration42647
+Node: Agent Skills43669
+Node: Directory Structure44032
+Node: Creating a Skill45059
+Node: How it works45434
+Node: Acknowledgments45825
+Node: Contributions46536
+Node: GNU Free Documentation License46922
End Tag Table
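The renamed context commands documented in both files can also be driven from Lisp; a quick interactive sketch using only command names that appear above (assumes ellama is loaded and a region is active for the first call):

```emacs-lisp
;; Sketch: exercise the renamed context commands from Lisp.
(require 'ellama)
(ellama-context-add-selection)  ; add the active region to the global context
(ellama-context-manage)         ; review elements (n/p/d/RET bindings above)
(ellama-context-reset)          ; clear the global context again
```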