6 Comments
Jan 15

Is it possible to connect Vector's Knowledge Graph to a local LLM/instruct model like Mixtral through Wirepod?

author

Yes, you could. In this post: https://www.learnwitharobot.com/p/supporting-knowledge-graphs-in-vector , we explored how Wirepod is integrated with the Together AI service. Essentially, Wirepod invokes the Together AI API, and Together supports Mixtral. If you want to use a local LLM, you would have to do something similar, i.e. offer an API that runs Mixtral which Wirepod can call. I haven't tried the local route myself for lack of a GPU, but it sounds like it would be a cool project.
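To make the idea concrete: calling a local model "the same way" mostly comes down to POSTing an OpenAI-style chat-completions request to whatever server you run. Here is a minimal sketch, assuming a hypothetical local server at localhost:8080 exposing a /v1/chat/completions route; the URL, model name, and prompts are placeholders, not Wirepod's actual code:

```python
import json
import urllib.request

# Assumed endpoint: servers such as LocalAI expose an OpenAI-compatible
# /v1/chat/completions route. The host, port, and model name below are
# placeholders for whatever your local setup actually runs.
LOCAL_API_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(question: str, model: str = "mixtral") -> dict:
    """Build an OpenAI-style chat-completion payload for the robot's query."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "Answer briefly for a small robot."},
            {"role": "user", "content": question},
        ],
    }

def ask_local_llm(question: str) -> str:
    """POST the payload to the local server and return the answer text."""
    payload = json.dumps(build_chat_request(question)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the reply under choices[0].message.
    return body["choices"][0]["message"]["content"]
```

Because the request and response shapes match what Together AI uses, pointing Wirepod at a server like this should only require changing the base URL and model name.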


Are there any local LLM clients (like LM Studio or others) which use an API very similar to OpenAI's or Together AI's? (I'm not really a developer, and I don't know much about AI or local LLM clients, but I can try to make something in my free time.)

author

There should be easy options if you want to run your own LLM and expose it as an API. I will need to do some research and get back to you.

author

I think LocalAI (https://github.com/mudler/LocalAI) is the resource you are looking for. It seems to be a popular and well-maintained open-source project. I'm not sure whether it supports Mixtral, but it seems to support most of the other popular open-source models, such as Llama 2. The project also mentions that you can try it without a GPU.

I will try this and write a post about my evaluation.


Thank you very much!
