
mini.llm plugin for ai integration (new google gemini pro, chatgpt apis etc) #624

Open
2 tasks done
bassamsdata opened this issue Dec 19, 2023 · 4 comments
Labels
feature-request Feature request

Comments

@bassamsdata

Contributing guidelines

Module(s)

mini.llm

Description

Request overview

I'd like to propose a new mini plugin that integrates seamlessly with the new Google Gemini Pro API, and possibly others as well (ChatGPT, Mistral, ...). This plugin aims to enhance capabilities in chat, diagnostics, and text generation.

Reasons for the request

Ease of use and mini modules integration: utilizing established APIs and the already existing UI/UX of mini modules (like mini.pick) ensures straightforward integration without extra complexity (excluding local LLMs).

Currently, the primary well-matured AI plugin for Neovim is chatgpt.nvim (excluding Copilot or Codeium, which are designed for code completion). However, with the new Google Gemini Pro API offering 60 free queries per minute, there is a significant opportunity to create a valuable plugin.

Thank you.

@bassamsdata bassamsdata added the feature-request Feature request label Dec 19, 2023
@echasnovski
Owner

Thanks for the suggestion!

I am not a fan of providing any functionality that is tailored to a particular provider and requires significant upkeep (reacting to API changes, etc.). Plus, I don't think those "60 free queries per minute" are here to stay for long.

I like the name, though, and will think about the module, but currently I am skeptical.

@bassamsdata
Author

Appreciate your thoughts!
I understand the maintenance worries around APIs (probably the companies should make such plugins 🙂), but I'll certainly enjoy the 60 QPM while it lasts 😄, and the prices are still better than GPT-4 Turbo.

Since I use AI daily, I might create a small wrapper to stay in Neovim.

Thanks again!

@gallo-s-chingon

I was going to second this, but what if something "else" did the heavy lifting? I'm a prose writer and was going to post another issue, but I searched for it and found this one.

What about something along the lines of Ollama / gen.nvim? I haven't set it up myself, but my use case is prose writing: getting different perspectives on my characters, or just having AI rewrite a character's lines in that character's "voice".

At any rate, mini.deps makes "seeing" Lua better.

@echasnovski
Owner

I don't have any concrete plans on integrating LLMs into 'mini.nvim'. Yet.

My current idea is something along the lines of a general purpose 'mini.repl' for REPLs (Read-Eval-Print Loops) with configurable backends. This way it can be used for both interactive "programming" (IPython, R, Julia, etc.) and "chatting" (conversational LLMs) purposes.
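To make the "configurable backends" idea concrete, a setup for such a module might look like the sketch below. This is purely hypothetical: 'mini.repl' does not exist, and every name here (`backends`, `cmd`, `send`, `default_backend`) is an assumption for illustration, not an actual 'mini.nvim' API.

```lua
-- Hypothetical sketch only: 'mini.repl' and all option names below are
-- assumptions, not an existing 'mini.nvim' API.
require('mini.repl').setup({
  backends = {
    -- "Programming" backend: spawn a process and pipe stdin/stdout.
    ipython = { cmd = { 'ipython', '--no-banner' } },
    -- "Chatting" backend: instead of a process, send each prompt to an
    -- HTTP API and call back with the response text.
    gemini = {
      send = function(prompt, on_response)
        -- The provider-specific HTTP request would go here.
      end,
    },
  },
  default_backend = 'ipython',
})
```

The appeal of this shape is that the module itself only manages the buffer/window UX and the prompt/response loop, while each backend decides whether "evaluate" means writing to a subprocess or calling a remote API.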
