- extra server port configuration for gui backend
- added support for Anthropic Claude 3.5 Sonnet
- added support for a custom Azure model list
- added "Researches" submenu to system tray
- improved CLI options
- improved GUI functionalities
- improved handling of multiple keys in the same backend
- fixed setting max tokens of chatgpt models via API client
- support Gemini 2.0 models via GenAI SDK
- fixed using Azure AI as backends
- fixed image creation tools
- fixed API client error message
- fixed AutoGen tools using GitHub API Keys
- added support for backend OpenAI models via GitHub API keys or Azure API keys
- improved CLI options
- added Desktop Assistant GUI in full version
- changed tool name `execute_computing_task` to `task`
- added setup menu `tmsetup -m`
- Enhanced fabric integration. Added CLI options `--chatpattern` and `--searchpatterns` for ToolMate AI to use a fabric pattern as the chat system message. Read https://github.com/eliranwong/toolmate/blob/main/package/toolmate/docs/Fabric%20Integration.md This feature uses the AI model assigned in the ToolMate AI backend, rather than the default model set up in fabric.
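  As a rough illustration (a minimal sketch; the pattern name and argument forms below are assumptions, so check `tm -h` for the actual usage):

  ```
  # assumption: --chatpattern takes a fabric pattern name to use as the chat system message
  tm --chatpattern summarize "Summarise the following article: ..."
  # assumption: --searchpatterns searches the installed fabric patterns
  tm --searchpatterns
  ```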
- fixed `tm -py`
- enhanced integration of AutoGen agents in tools: proxy, group, agents, captain, examine_files, examine_web_content
- fixed saving autobuilder configs
- support use of Bing API key and Rapid API key with the AutoGen captain agent
- documented captain agent setup
- fixed viewing server information when backend is set to chatgpt
- changed tool name `create_agents` to `agents`
- changed command name `tmteam` to `tmagents`
- changed command name `tminternet` to `tmonline`
- simplified some tool names
- added code execution ability to tool "create_agents"
- added config options for AutoGen parameters
- fixed repetitive download of nltk modules on startup
- support use of image tools with `api_client`
- support non-chatgpt backends to use the autobuild tool
- added downloading nltk packages for running the unstructured package
- added group chat options to the API client CLI
- fixed auto code correction feature
- fixed display of auto code correction results with vertexai as backend
- fixed autogen tools
- fixed selecting Google AI, Vertex AI and xAI models in interactive mode
- support the latest structured output feature offered by Ollama 0.5.0+; users need to upgrade Ollama to version 0.5.0+ to work with ToolMate AI
- unload the previous Ollama model when the Ollama model is changed
- updated dependencies
- added configurable command shortcuts tms1, tms2, tms3, ... tms20, for running with custom chat system messages
- added configurable command shortcuts tmt1, tmt2, tmt3, ... tmt20, for running with particular tools
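  As a rough sketch (the slot assignments are whatever you configure; passing the prompt as a positional argument is an assumption, so check the shortcut options for the actual usage):

  ```
  tms1 "Explain narrative therapy in simple terms"   # runs with the custom chat system message assigned to shortcut 1
  tmt2 "Weather in London tomorrow"                  # runs with the particular tool assigned to shortcut 2
  ```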
- improved support of Ollama
- improved handling of code execution
- implemented risk assessment agent in API client
- added backend and model options to API client
- skipped some plugins if no internet connection
- Support Llama.cpp server on Lite version and running on Android
- Improved API server and client
- Added CLI `tmsetup` for setup; check `tmsetup -h` for options
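  For example:

  ```
  tmsetup -m    # open the setup menu
  tmsetup -h    # list the available setup options
  ```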
- backend `llama.cpp` changed to be an extra module for installation
- enhanced API server and client features; check `toolmate -h`, `tm -h`, `tmc -h` for options
- fixed use of default tool
- fixed running python code
- added an API server, `toolmateserver`
- added an API client, `toolmateclient`; `tm` and `tmc` are aliases of `toolmateclient`; `tmc` enables the chat feature by default
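  A minimal sketch of the server/client pairing (passing a prompt as a positional argument to the client is an assumption; check `tm -h` for the actual options):

  ```
  toolmateserver                        # start the API server
  tm "What is the capital of France?"   # assumption: send a single prompt via the client alias
  tmc                                   # client alias with the chat feature enabled by default
  ```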
- support custom config file when launched, via `-c` argument
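  For example, assuming the `-c` argument is accepted by the main `toolmate` launcher and takes a path to a config file (the path below is hypothetical):

  ```
  toolmate -c ~/configs/toolmate_work.py
  ```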
- `.favourite` action renamed to `.like`
- added command `toolmatelite` to full version
- supports Google AI Studio API key
- renamed package `toolmate_android` to `toolmate_lite`
- added a key combination for inserting the best-liked entry
- a few changes to default key combinations; enter `.keys` for more information
- a few tweaks
- reload plugins after changing the tool-selection agent setting
- improved use of backends Mistral AI and Gemini
- improved use of system messages
- improved a few plugins
- validate extracted parameters for tool calling
- fixed code generation with Mistral AI backend
- a few fixes
- added `@search_conversation` tool running on Android Termux
- added support for Mistral AI API keys, read https://github.com/eliranwong/toolmate/blob/main/package/toolmate/docs/Mistral%20API%20Setup.md
- added support for using Groq cloud API keys for running tool `perplexica`
- added support for using Android built-in text-to-speech
- change directory to the most recently saved conversation when users run the `.open` action
- added tool `@uniquebible_web`
- added action menu item `.last`, to open the previously saved conversation
- fixed loading action menu on Android
- added `my favourite string`, inserted when users press ctrl+b
- added action command `.favourite` for users to customise `my favourite string`
- support customisation of the default tool when the tool-selection agent is not enabled and a tool is not specified in a request; enter `.tools` to customise
- improved integration with the UniqueBible app
- support API to retrieve bible data
- improved Android support
- fixed loading agents with Ollama and Llama.cpp
- updated groqchat
- fixed reflection tools
- automatically saved workflow with `save`, `save as`, and `export`
- added ".workflow" to display the current workflow
- added ".shareworkflow" to share the current workflow on Android
- action item ".workflow" in action menu to display current workflow
- integration with bible tools [optional]
- added a bundle of Android-only tools: @show_location @show_connection @start_recording @stop_recording @phone_call @play_media @search_contacts @take_photo @selfie @read_sms @send_sms @send_email @send_whatsapp @share @share_file
- set `tool_selection_agent` to False by default
- added commentary suggestions
- added a tool `@share` to share generated results with other apps on Android
- added commentary integration in bible tool
- fixed custom system messages suggestion
- fixed clipboard tools on Android
- ".txt" extension added to exported conversations by default
- automatically check if termux:api is enabled
- added Android tools `@share_file`, `@send_email`, `@send_whatsapp`, `@add_calendar_event`, `@open_browser`
- added Android actions `.timer`, `.alarm`
- fixed Dalle tools running on Android
- updated plugin `search searxng`, so that it works on both the full version and the lighter Android version
- added item `.maxonlinesearches` to the action menu, for users to customise the maximum number of online search results to be retrieved
- update input suggestions after changes made to plugin selection
- support access to a local server installed outside a container running on the same machine.
- added tool `perplexica`
- added plugins for audio analysis and transcription
- tweaked plugin `fabric`
- updated help store
- minor tweaks for Android
- added two tools `o1` and `o1_mini` to use reasoning models o1-preview and o1-mini
- added support for single-turn system message for chat conversation
- integrated predefined chat system messages and contexts with tool `@chat`
- updated ollama model list
- updated chatgpt model list
- fixed plugin `create ai assistants`
- added support for optional modules for installation:
  - `gui` installs an additional GUI library for running the GUI system tray and the experimental desktop assistant: `pip install --upgrade toolmate[gui]`
  - `linux` installs additional packages for Linux users, i.e. `flaml[automl]`, `piper-tts`, `pyautogen[autobuild]`: `pip install --upgrade toolmate[linux]`
  - `bible` installs additional packages for working with bible tools: `pip install --upgrade toolmate[bible]`
- support multiple Open Weather Map API keys
- support multiple Elevenlabs API keys
- simplified the plugin `read aloud`
- added support for the Vosk Speech Recognition Toolkit for speech recognition
- added support for edge-tts for speech generation
- supports Python 3.12.x
- updated help store
- added tool `@termux` in `toolmate_lite` version
- support installing the full version `toolmate` on Android
- fixed plugin `search_searxng`
- created Android package `toolmate_lite`
- updated documentation
- added instructions to install Ollama on Android
- tweaked plugin `search searxng`
Added tool aliases for `@search_searxng` categories:
- `@apps` Search for information online in the 'apps' category.
- `@files` Search for information online in the 'files' category.
- `@general` Search for information online in the 'general' category.
- `@images` Search for information online in the 'images' category.
- `@it` Search for information online in the 'it' category.
- `@lyrics` Search for information online in the 'lyrics' category.
- `@map` Search for information online in the 'map' category.
- `@music` Search for information online in the 'music' category.
- `@news` Search for information online in the 'news' category.
- `@packages` Search for information online in the 'packages' category.
- `@qna` Search for information online in the 'questions_and_answers' category.
- `@radio` Search for information online in the 'radio' category.
- `@repos` Search for information online in the 'repos' category.
- `@science` Search for information online in the 'science' category.
- `@scientific_publications` Search for information online in the 'scientific_publications' category.
- `@social_media` Search for information online in the 'social_media' category.
- `@software_wikis` Search for information online in the 'software_wikis' category.
- `@translate` Search for information online in the 'translate' category.
- `@videos` Search for information online in the 'videos' category.
- `@web` Search for information online in the 'web' category.
- `@wikimedia` Search for information online in the 'wikimedia' category.
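For example, each alias can be used as the tool prefix of a prompt (the sample queries are illustrative):

```
@news Latest developments in renewable energy
@videos Python tutorial for beginners
@scientific_publications retrieval-augmented generation
```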
- support SearXNG categories syntax for searching online with tool `internet` or `@search_searxng`. Read https://github.com/eliranwong/toolmate/blob/main/package/toolmate/docs/Perplexica%20and%20SearXNG%20Integration.md#searxng-setup
- fixed searching help store
- updated help store
- added an alias `internet` to point to `@search_searxng`
- updated Ollama model list
- updated documentation
- minor tweaks
- added plugin `search searxng`
- updated help store
Updated Groq model lists
Added plugins:
- `analyze images with Groq`
- `ask tavily`
- `search tavily`
- unload Llama.cpp model on exit
- support Llama.cpp to use an additional chat model
- updated help store
- fixed dynamic token count feature for chatgpt and letmedoit mode
- added UI to specify context window size and GPU layers
- tweaked using Ollama models
- fixed loading Ollama models
- added ToolMate AI icon
- fixed image-related plugins
- fixed plugin "ask gemini"
- rebuilt help vector store
- added a help vector store for searching documentation
- added a tool `@help` to search offline documentation
- prompt for confirmation before changing the embedding model
- fixed loading Ollama models for embedding
- support RAG utilities in GUI
- support Ollama models for embedding
- updated RAG utilities
- updated dependencies
- added risk assessment agent to safeguard against running harmful system commands.
- added gui support to a few plugins
- unload Ollama model when a chat session is finished or the app exits
- updated two Ollama plugins
- enhanced Tool Selection Agent; read the Tool Selection Agent documentation
- Added an option for tool selection configurations, i.e.: "Would you like to inform the Tool Selection Agent of each tool's requirements? Doing so could improve the selection outcome, but it will consume more tokens and processing power."
- maintain backward compatibility with LetMeDoIt mode
- optimised tool `@recommend_tool`
- removed old tool selection code to make launching faster
- implemented the new tool selection agent for all other backends
- checked backward compatibility of letmedoit mode
- brand new tool selection agent
- implemented the new tool selection agent for the Groq backend
- updated Ollama model list
- GUI development in progress
- testing GUI in developer mode:
  - Run `toolmate`
  - Enable `Developer Mode` by entering `.toggledeveloper`
  - Run `toolmateai`
  - Select `Desktop Assistant [experimental]` from the system tray.
  This is a raw build, not yet ready for production.
- added tool `@create_image_flux` to create images with Flux.1. Running Flux models locally requires GPU support; read GPU Acceleration
- added tool `@create_image_imagen3` to create images with Google Imagen 3 via Vertex AI. For setup of Vertex AI credentials, read https://github.com/eliranwong/toolmate/blob/main/package/toolmate/docs/Google%20Cloud%20Service%20Credential%20Setup.md
- Improved tool descriptions
- added tool `@recommend_tool` to help users find an appropriate tool
- added special entry: enter `@` to read brief descriptions of all enabled tools
- fixed fabric plugin
- fixed chat feature
- added save as and export features
- support workflows, read https://github.com/eliranwong/toolmate/blob/main/package/toolmate/docs/Workflows.md
- custom tool system message and chat system message
- fixed installation on macOS, where pysqlite3 (required by pyautogen[autobuild]) failed to install
- fixed loading `reflection` plugin
- fixed dependency versions for installation
- fixed link to Reflection Agents documentation
- added tool `@deep_reflection`
- added documentation on Reflection Agents at https://github.com/eliranwong/toolmate/blob/main/package/toolmate/docs/Reflection%20Agents.md
- added tool `@reflection` to mimic the reflection feature performed by the LLM Reflection:70b; this tool is created to work with any LLM
- upgraded a few dependencies
- downgraded `elevenlabs` version to 1.5.0, to avoid a pydantic warning
- renamed old data directory `freegenius` to `toolmate`, to facilitate migration
FreeGenius AI has been renamed to ToolMate AI. For the latest development, read https://github.com/eliranwong/toolmate
- removed triton==2.3.0 from requirements.txt, for Windows users, read
- updated a few package versions
- added item `.read` to Action Menu, to read the assistant's previous response with a text-to-speech utility
- added option to manage code execution risk, read https://github.com/eliranwong/freegenius/blob/main/package/freegenius/docs/Risk%20Management%20Agent.md
- tool `execute_python_code` now works with `config.toolTextOutput` for retrieval of text output
- tweaked tool `improve_writing`
- changed ollama default models
- support nested input suggestions via plugins
- added fabric patterns to nested input suggestions, read https://github.com/eliranwong/freegenius/blob/main/package/freegenius/docs/Fabric%20Integration.md
- added tool `@context` to work with predefined contexts, read https://github.com/eliranwong/freegenius/blob/main/package/freegenius/docs/Predefined%20Contexts.md
- Updated documentation
- Added an option `config.enable_tool_screening_agent` to enable / disable the tool-screening agent
- Added an option `config.tool_selection_agent` to enable / disable the tool-selection agent
- Added Gemini model options: "gemini-1.0-pro-001", "gemini-1.0-pro-002", "gemini-1.5-flash-001", "gemini-1.5-pro-001 (Default)"
- Sorted auto suggestions
- Fixed running tool calling with Gemini models
- Changed default models for backend `llamacpp`:
  - tool - MaziyarPanahi/WizardLM-2-7B-GGUF/WizardLM-2-7B.Q4_K_M.gguf
  - chat - bartowski/Meta-Llama-3.1-8B-Instruct-GGUF/Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf
- Updating documentation ...
- Improved relative date conversion
- Updated Ollama model list
Fixed tools `@command`, `@append_command`, `@fabric` and `@append_fabric`
Added three tools:
- `convert_relative_datetime` to convert relative dates and times in writing
- `add_google_calendar_event` to add Google calendar event
- `add_outlook_calendar_event` to add Outlook calendar event
Removed tool `add_calendar_event`
Added two tools:
- `copy_to_clipboard` copy text to the system clipboard
- `paste_from_clipboard` paste text from the system clipboard
Removed items `.code` and `.run` from action menu.
Added two tools:
- `extract_python_code` extract python code, enclosed by ```
- `execute_python_code` extract and run python code, enclosed by ```
Fixed input suggestion plugin
Removed item `.content` from action menu.
Added a tool `list_current_directory_contents` to list current directory contents.
Support tool without given prompt.
Fixed built-in text editor
Fixed startup
Now able to save changes after editing assistant previous response.
- Improved plugin `fabric`
- Added config item `fabricPath`. Users can customise fabric path by editing its value in `config.py`.
- Added two new tools:
  - `@append_instruction` - append assistant previous response to the newly given prompt
  - `@improve_writing` - improve writing of the given prompt
- Added two new tools:
  - `@command` Execute the given command
  - `@append_command` Append assistant previous response to the given command and execute.
  @command echo "Hello World!"
  @append_command echo
  These new tools work with multiple tools in a single prompt. For example, to integrate `fabric` with other FreeGenius AI tools, you may do something like this:
  @command /home/ubuntu/go/bin/fabric -m gemini-1.5-pro -p write_essay "What is machine learning?"
  @append_command /home/ubuntu/go/bin/fabric -m llama3.1:latest -p extract_wisdom
  @append_command /home/ubuntu/go/bin/fabric -m mistral-large:123b -p summarize
  gemini Explain it to a five-year-old kid
  chatgpt Translate it into Chinese
- Created two aliases:
  - `@fabric` -> `@command fabric`
  - `@append_fabric` -> `@append_command fabric`
  The aliases were added in the plugin fabric.py:
  `from freegenius import config`
  `config.aliases["@fabric"] = "@command fabric"`
  `config.aliases["@append_fabric"] = "@append_command fabric"`
  `config.inputSuggestions += ["@fabric", "@append_fabric"]`
  Users may further customise, e.g. changing the fabric path, etc.
Fixed `improve input entry` feature.
- Added plugins `ask_chatgpt`, `ask_codey`, `ask_gemini`, `ask_groq`, `ask_llama3_1`, `ask_llamacpp`, `ask_llamacppserver`, `ask_ollama`, `ask_palm2` to call different chatbots for collaboration. For example, with support of running `Multiple Tools in Single Prompt`, you can do something like:
  @chat What is the future of AI development?
  chatgpt What is your opinion?
  gemini What do you disagree with?
  Or:
  llama3_1 Write code to extract mp3 audio from YouTube video
  codey Review the code generated above
- Features suspended in the previous version now resume:
  - `Let me Translate` feature with pre-defined context
  - `improved writing` feature
  - forcing the app to always `search_google`
- Added initial support for multiple-step actions in a single prompt.
  Examples of use cases:
  To guide your chosen LLM to provide you with a step-by-step response, for example:
  @chat What is narrative therapy?
  @chat How does it compare to other popular counselling approaches?
  @chat Tell me pros and cons of this approach?
  @chat Give me theories that support this approach in detail.
  @chat Any controversies about it?
  @chat Give me a summary of all your findings above.
  To guide FreeGenius AI to perform multiple computing tasks in order, for example, download two more songs from YouTube and play all downloaded mp3 files with the VLC player:
  @download_youtube_audio https://youtu.be/KBD18rsVJHk?si=PhfzNCOBIj7o_Bdy
  @download_youtube_audio https://www.youtube.com/watch?v=gCGs6t3tOCU
  @task Play all the mp3 files in folder `/home/ubuntu/freegenius/audio` with command `vlc`
  To integrate multiple tools and chat features in a single prompt, for example:
  @search_google Latest updates about OpenAI in 2024
  @chat Give me a summary
  @send_gmail Email your findings to [email protected] in detail
- The following features are temporarily suspended to facilitate the development of the `Multiple Tools` feature:
  - `Let me Translate` feature with pre-defined context
  - `improved writing` feature
  - calling different chatbots from the main session
  - forcing the app to always `search_google`
  They may be added back or changed in coming updates.
- Special entry `@none`, introduced in the last version, is now changed to `@chat`. It is meant for the chat-only feature, without using a tool.
- Plugin `send_email` is changed to two separate plugins `send_gmail` and `send_outlook`. Their corresponding entries are:
  - `@send_gmail`
  - `@send_outlook`
- Added two plugins `download_youtube_video` and `download_youtube_audio`, previously integrated into plugin `download_web_content`. Their corresponding entries are:
  - `@download_youtube_video`
  - `@download_youtube_audio`
- Use `@` to call a particular tool (inspired by the Google Gemini App).
  Changed the tool calling pattern from `[TOOL_{tool_name}]` to `@{tool_name}`.
  Currently supported tools:
  @search_google @add_calendar_event @examine_audio @examine_files @examine_images @examine_web_content @correct_python_code @chat @build_agents @create_image @create_map @create_qrcode @create_statistical_graphics @datetimes @download_web_content @edit_text @task @install_python_package @save_memory @search_memory @modify_images @open_browser @pronunce_words @remove_image_background @search_conversations @load_conversations @search_finance @search_news @search_sqlite @search_weather @send_email @send_tweet
  To disable tools for a single turn, use `@none`.
  Tips: Enter `@` to get input suggestions of available tools.
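  For example (the sample prompts are illustrative; the tool names and `@none` come from the list above):

  ```
  @search_weather What will the weather be like in London tomorrow?
  @datetimes What date is it next Friday?
  @none Just chat with me without calling any tool
  ```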
- Removed the `improved writing` feature temporarily; it will be added back as a separate tool in the next update.