If you own a Vector robot, you have probably heard of Wirepod. Wirepod is pretty much the main way to keep a Vector robot functioning independently. We have written a few articles about Wirepod, including an introduction and a recent evaluation of how it works with different flavors of Vector robots.
Implementation of Vector’s Knowledge Graph in Wirepod
In this article, we will explore how Wirepod integrates with OpenAI for the Knowledge Graph. For those unfamiliar with it, the Knowledge Graph is what Vector uses to answer questions you ask it. As an example, take a look at the following video, where I ask Vector the distance between San Francisco and Los Angeles:
In the video above, the response to my question comes from the Knowledge Graph; Vector narrates the answer and tells me the distance. Wirepod currently integrates with Houndify and OpenAI for the Knowledge Graph, and in the rest of this article we study the code behind it.
Let's Dive Deep into the Code
Let's now look at the code in Wirepod that interfaces with OpenAI for the Knowledge Graph.
func openaiRequest(transcribedText string) string {
	sendString := "You are a helpful robot called " + vars.APIConfig.Knowledge.RobotName + ". You will be given a question asked by a user and you must provide the best answer you can. It may not be punctuated or spelled correctly. Keep the answer concise yet informative. Here is the question: " + "\\" + "\"" + transcribedText + "\\" + "\"" + " , Answer: "
	logger.Println("Making request to OpenAI...")
	url := "https://api.openai.com/v1/completions"
	formData := `{
		"model": "text-davinci-003",
		"prompt": "` + sendString + `",
		"temperature": 0.7,
		"max_tokens": 256,
		"top_p": 1,
		"frequency_penalty": 0.2,
		"presence_penalty": 0
	}`
	req, _ := http.NewRequest("POST", url, bytes.NewBuffer([]byte(formData)))
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+vars.APIConfig.Knowledge.Key)
	client := &http.Client{}
	resp, err := client.Do(req)
	....
The routine openaiRequest() is responsible for replying to the user's question. Looking through the code, you can see the prompt Wirepod uses to ask OpenAI the question (see how the variable sendString is set up). An interesting statement in the prompt is: "It may not be punctuated or spelled correctly". This is likely needed because the speech-to-text transcription in Wirepod can contain errors.
Poking around OpenAI
The OpenAI model used is text-davinci-003. You can change the model to something newer, such as GPT-4 (the full list of models supported by OpenAI is available here). Be careful about costs, though: the bigger, more recent models are pricey. The free tier of OpenAI (which offers a credit of $18 to be used within 3 months) supports only GPT-3.5.
You can also play with the max_tokens parameter, which roughly bounds the length of the answer. Note that one token is not one word: longer words are made up of multiple tokens, and a common rule of thumb for English is about 0.75 words per token. As one example, the OpenAI model gpt-4-32k can reply with an answer of up to 32,000 tokens (on the order of 24,000 words). That is long enough for a nice bedtime story, so you could potentially have Vector read you a bedtime story and put you to sleep.
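To get a feel for how a token budget translates into answer length, here is a tiny sketch using the rule of thumb above. The 0.75 ratio is an approximation for typical English text, not an exact figure, and the function name is my own.

```go
package main

import "fmt"

// approxWords gives a rough word-count estimate for a token budget,
// using the common ~0.75 words-per-token rule of thumb for English.
func approxWords(maxTokens int) int {
	return int(float64(maxTokens) * 0.75)
}

func main() {
	for _, budget := range []int{256, 4096, 32000} {
		fmt.Printf("max_tokens=%d -> roughly %d words\n", budget, approxWords(budget))
	}
}
```

So Wirepod's default of 256 tokens buys you a short paragraph, while a 32,000-token budget is genuinely story-length.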
Other services which support Knowledge Graphs
You are also not restricted to OpenAI; you can use another service for your Knowledge Graph (note that Wirepod supports Houndify and OpenAI for now). For example, you can use Together.ai, which hosts implementations of many open-source models. You can also build your own Knowledge Graph, and it could be very simple, such as answering a fixed set of questions for you. As an example, in the following video, I modified the Knowledge Graph to read me bedtime stories. We will explore how to integrate other Knowledge Graphs with Vector in future posts.
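To give a taste of how simple a homegrown Knowledge Graph could be, here is a hypothetical sketch that matches the transcribed question against a few keywords and returns a canned answer. The function name and answer table are my own inventions for illustration, not part of Wirepod's interface.

```go
package main

import (
	"fmt"
	"strings"
)

// cannedAnswers maps a keyword in the question to a fixed reply.
var cannedAnswers = map[string]string{
	"bedtime story": "Once upon a time, a little robot rolled off its charger...",
	"your name":     "My name is Vector.",
}

// simpleKnowledgeGraph is a minimal stand-in for a Knowledge Graph
// backend: it scans the question for known keywords and falls back
// to an apology when nothing matches.
func simpleKnowledgeGraph(question string) string {
	q := strings.ToLower(question)
	for keyword, answer := range cannedAnswers {
		if strings.Contains(q, keyword) {
			return answer
		}
	}
	return "Sorry, I don't know the answer to that."
}

func main() {
	fmt.Println(simpleKnowledgeGraph("Can you tell me a bedtime story?"))
	fmt.Println(simpleKnowledgeGraph("What is your name?"))
}
```

A real integration would sit behind whatever hook Wirepod exposes for its Knowledge Graph backends, but the core idea, text in, answer out, really can be this small.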
In Summary
I hope you enjoy tweaking Wirepod's Knowledge Graph. Playing with these Large Language Models (LLMs) is a lot of fun, and you will learn a lot along the way. If you have experience playing with Vector's Knowledge Graph, please leave a comment. Also, please subscribe and look out for future articles on this topic.