Getting Started

Building AI/LLM applications in Go

This tutorial walks you through building an AI/LLM application with LinGoose: a question-answering system over a knowledge-base document, using Retrieval Augmented Generation (RAG).

Set up Project

Create a directory for your project, and initialise it as a Go module:

mkdir lingoose-qa
cd lingoose-qa
go mod init github.com/[username]/lingoose-qa

Next, create a main.go file as follows; we will build on it as we implement the application.

package main

import "fmt"

func main() {
	fmt.Println("LinGoose rulez!")
}

To automatically add the dependencies your code imports to your go.mod, run:

go mod tidy

By default you’ll be using the latest version of LinGoose. If you want to pin a particular version, use go get, replacing VERSION with the desired release:

go get github.com/henomis/lingoose@VERSION

Project Implementation

Create an AI assistant

The assistant package provides a high-level interface for building AI/LLM applications. It is designed to be easy to use and flexible, allowing you to build a wide range of applications, from simple chatbots to complex AI systems.

To create an AI assistant, create an instance of assistant.Assistant and configure it with the necessary components. At a minimum, an assistant needs an LLM in order to run.

myAssistant := assistant.New(openai.New().WithTemperature(0))

In this example, we create an AI assistant backed by an OpenAI LLM. We also set the temperature to 0, which makes the model favour the most likely tokens, so its responses are essentially deterministic.
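
For reference, here is this step as a runnable sketch with the imports it needs (the import paths are the same ones used in the full example at the end of this page). One assumption, flagged in the comment, is that the OpenAI client picks up its credentials from the OPENAI_API_KEY environment variable.

package main

import (
	"github.com/henomis/lingoose/assistant"
	"github.com/henomis/lingoose/llm/openai"
)

func main() {
	// Assumption: openai.New() reads the API key from the OPENAI_API_KEY
	// environment variable, so export it before running this program.
	myAssistant := assistant.New(openai.New().WithTemperature(0))
	_ = myAssistant // the assistant is configured further in the next steps
}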

Attach a Thread

The thread package provides a simple interface for managing conversations. You can create a new thread, add messages to it, and then run the thread to get the AI assistant’s response.

myAssistant := assistant.New(openai.New().WithTemperature(0)).WithThread(
  thread.New().AddMessages(
    thread.NewUserMessage().AddContent(
      thread.NewTextContent("what is the purpose of NATO?"),
    ),
  ),
)

In this example, we create a new thread and add a user message to it. We then attach the thread to the AI assistant, so that it can process the user message and generate a response.

Try your first AI response!

err := myAssistant.Run(context.Background())
if err != nil {
  panic(err)
}

fmt.Println(myAssistant.Thread())

This will print the response generated by the AI assistant. You can now use this response to continue the conversation, or to perform any other action in your application.
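
To continue the conversation, you can append further messages to the same thread and run the assistant again. Here is a minimal sketch, assuming Thread() returns the thread configured above and supports the same AddMessages chaining used earlier; the follow-up question is just an illustration.

// Append a follow-up user message to the existing thread and run again.
myAssistant.Thread().AddMessages(
  thread.NewUserMessage().AddContent(
    thread.NewTextContent("when was it founded?"),
  ),
)

err = myAssistant.Run(context.Background())
if err != nil {
  panic(err)
}

fmt.Println(myAssistant.Thread())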

Attach a RAG

The rag package provides a high-level interface for building Retrieval Augmented Generation (RAG) models. You can create an instance of rag.RAG, and then configure it with the necessary components.

myRAG := rag.New(
  index.New(
    jsondb.New().WithPersist("db.json"),
    openaiembedder.New(openaiembedder.AdaEmbeddingV2),
  ),  
).WithTopK(3)

In this example, we create a RAG instance with a JSON index and an OpenAI embedder. We also set the topK parameter to 3, which means that the RAG will return the top 3 most relevant documents from the knowledge base.

Now you can add sources to the RAG to build your knowledge base.

_, err := os.Stat("db.json")
if os.IsNotExist(err) {
  err = myRAG.AddSources(context.Background(), "state_of_the_union.txt")
  if err != nil {
    panic(err)
  }
}

The os.Stat check ensures the document is only indexed once: on later runs the persisted db.json is reused instead of re-embedding the source. Adding the source ingests the contents of the state_of_the_union.txt file (which must be present in your project directory) into the knowledge base, so the assistant’s answers will be grounded in that file.
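
Finally, attach the RAG to the assistant so that the retrieved passages are used to ground its answers. This is the same wiring used in the full example below:

myAssistant := assistant.New(
  openai.New().WithTemperature(0),
).WithRAG(myRAG)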

Full Example

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/henomis/lingoose/assistant"
	openaiembedder "github.com/henomis/lingoose/embedder/openai"
	"github.com/henomis/lingoose/index"
	"github.com/henomis/lingoose/index/vectordb/jsondb"
	"github.com/henomis/lingoose/llm/openai"
	"github.com/henomis/lingoose/rag"
	"github.com/henomis/lingoose/thread"
)

func main() {
	myRAG := rag.New(
		index.New(
			jsondb.New().WithPersist("db.json"),
			openaiembedder.New(openaiembedder.AdaEmbeddingV2),
		),		
	).WithTopK(3)

	_, err := os.Stat("db.json")
	if os.IsNotExist(err) {
		err = myRAG.AddSources(context.Background(), "state_of_the_union.txt")
		if err != nil {
			panic(err)
		}
	}

	myAssistant := assistant.New(
		openai.New().WithTemperature(0),
	).WithRAG(myRAG).WithThread(
		thread.New().AddMessages(
			thread.NewUserMessage().AddContent(
				thread.NewTextContent("what is the purpose of NATO?"),
			),
		),
	)

	err = myAssistant.Run(context.Background())
	if err != nil {
		panic(err)
	}

	fmt.Println("----")
	fmt.Println(myAssistant.Thread())
	fmt.Println("----")
}