Chat with multiple models at once using @ mentions.

For example:

Setup: above the chat box, instead of picking a single model, allow picking a model group (groups are defined in the settings).

Use: type a prompt in the prompt box with @modelname to get an answer from each mentioned model:

hello who are you? what base model are you? @mistral @llama2 @mixtral

or

@mistral @llama2 @mixtral hello, who are you? what base model are you?

Then each model answers:

mistral: I am Mistral, how can I help you today?
llama2: I am Llama 2, is there anything else you want to know about my existence?
mixtral: I am Mixtral, a mixture-of-experts model trained across various experts. Is there anything more you would like to know?
Hey, thank you for the suggestion. It is possible to add, but it will be slow, since Ollama would need to load each mentioned model in turn for every message.
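The mention-parsing part of this could be sketched roughly as follows. This is a hypothetical illustration, not code from the project: `parse_mentions` and `ask_models` are made-up names, and `query_fn` stands in for a real backend call (e.g. a POST to Ollama's `/api/generate` with `{"model": ..., "prompt": ...}`). The loop is deliberately sequential, matching the note above that Ollama loads one model at a time.

```python
import re

# Hypothetical helper: extract @modelname mentions from a prompt and
# return the mentioned models plus the prompt with the mentions removed.
MENTION_RE = re.compile(r"@([\w.:-]+)")

def parse_mentions(prompt):
    models = MENTION_RE.findall(prompt)
    clean = MENTION_RE.sub("", prompt).strip()
    clean = re.sub(r"\s{2,}", " ", clean)  # collapse leftover double spaces
    return models, clean

def ask_models(prompt, query_fn):
    """Send the cleaned prompt to each mentioned model, one after another.

    query_fn(model, prompt) stands in for the actual Ollama request;
    it is called sequentially because only one model is loaded at a time.
    """
    models, clean = parse_mentions(prompt)
    return {model: query_fn(model, clean) for model in models}

# Example with a stubbed backend instead of a live Ollama server:
replies = ask_models(
    "hello who are you? @mistral @llama2",
    query_fn=lambda model, p: f"{model} received: {p}",
)
```

With this shape, the UI could render one labeled reply per model, in mention order, as in the example transcript above.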