Is it possible to connect Vector's Knowledge Graph to a local LLM/instruct model like Mixtral through Wirepod?
Yes, you could. In this post: https://www.learnwitharobot.com/p/supporting-knowledge-graphs-in-vector , we explored how Wirepod integrates with the Together AI service. Essentially, Wirepod invokes the Together AI API, and Together supports Mixtral. If you want to use a local LLM, you would have to do something similar, i.e., expose an API that runs Mixtral which Wirepod can call. I haven't tried the local route myself because I lack a GPU, but it sounds like a cool project.
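To make the shape of that integration concrete, here is a minimal sketch (not Wirepod's actual code) of the kind of chat-completion request that gets sent to Together AI's OpenAI-compatible endpoint. The model name and prompt are just examples, and you would substitute your own API key:

```python
# Minimal sketch of a chat-completion request to Together AI's
# OpenAI-compatible REST endpoint. Not Wirepod's actual code.
import requests

TOGETHER_API_KEY = "your-api-key-here"  # assumption: replace with your own key

response = requests.post(
    "https://api.together.xyz/v1/chat/completions",
    headers={"Authorization": f"Bearer {TOGETHER_API_KEY}"},
    json={
        "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",  # example model name
        "messages": [
            {"role": "user", "content": "What is the tallest mountain on Earth?"}
        ],
    },
    timeout=60,
)
response.raise_for_status()

# The reply text lives in the first choice's message, OpenAI-style.
print(response.json()["choices"][0]["message"]["content"])
```

Because the request/response format follows the OpenAI convention, swapping in a local server is mostly a matter of changing the base URL.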
Are there any local LLM clients (like LM Studio or others) which expose an API very similar to the OpenAI or Together AI ones? (I'm not really a developer, and I don't know much about AI or local LLM clients, but I can try to make something in my free time.)
There should be easy options if you want to run your own LLM and expose it as an API. I will need to do some research and get back to you.
I think LocalAI (https://github.com/mudler/LocalAI) is the resource you are looking for. It seems to be a very popular and well-maintained open source project. I'm not sure if it supports Mixtral, but it supports most of the other popular open source models, such as Llama 2. The project also mentions that you can try it without a GPU.
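As a rough illustration of why LocalAI fits here: it mimics the OpenAI REST API, so the same kind of request shown above can be pointed at a local server instead. This sketch assumes a LocalAI instance running on its default port (8080) with a model already installed; the model name "llama-2-7b-chat" is hypothetical and depends on your local setup:

```python
# Sketch: the same OpenAI-style chat-completion request, pointed at a
# local LocalAI server instead of Together AI. Assumes LocalAI is
# running on localhost:8080 with a model installed locally.
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",  # LocalAI's default address
    json={
        "model": "llama-2-7b-chat",  # hypothetical: use whatever model you installed
        "messages": [
            {"role": "user", "content": "What is the tallest mountain on Earth?"}
        ],
    },
    timeout=120,  # local CPU inference can be slow, so allow extra time
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

No API key is needed by default since the server runs on your own machine.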
I will try this out and write a post about my evaluation.
Thank you very much!