Struct ::exo::ai::Llm

Overview

The interface for interacting with large language models.

Methods

fn new() -> Llm

Creates a new Llm connection.

Example

let llm = exo::ai::Llm::new()
           .temperature(0.7)
           .system_msg("You are a helpful assistant.");
let res = llm.send("Hello").await?;
assert!(res is String);
fn clone(self) -> Llm

Clones the Llm and returns a new, independent one.

Example

let llm0 = exo::ai::Llm::new();
let llm1 = llm0.clone();
assert_eq!(llm0, llm1);
llm1.temperature = 0.25;
assert_ne!(llm0, llm1);
fn temperature(self, temperature: f64) -> Llm

Set the temperature for the LLM. The higher the value, the more creative the LLM's answers will be. This function consumes the Llm and returns it updated.

Example

let llm = exo::ai::Llm::new().temperature(0.25);
assert_eq!(llm.temperature, 0.25);
fn model(self, model: LlmTyp) -> Llm

Set the model to use. This function consumes the Llm and returns it updated.

Example

let llm = exo::ai::Llm::new().model(exo::ai::LlmTyp::Mixtral_8x22b);
assert_eq!(llm.model, exo::ai::LlmTyp::Mixtral_8x22b);
fn system_msg(self, msg: String) -> Llm

Set the system message of the chat conversation. This function consumes the Llm and returns it updated.

Example

let llm = exo::ai::Llm::new().system_msg("You are a helpful assistant.");
assert_eq!(llm.system_msg, "You are a helpful assistant.");
async fn chat(self, question: String) -> Result

Sends a message to the AI and returns the answer. It also attaches the question and response to the current chat conversation, which allows a complete conversation to take place.

Example

let llm = exo::ai::Llm::new().system_msg("You are a helpful assistant.");
let res = llm.chat("Hello").await?;
assert!(res is String);
async fn send(self, question: String) -> Result

Sends a message to the AI and returns the answer. It does not attach the question or answer to the conversation, which allows asking another question against the same context as before.

Example

let llm = exo::ai::Llm::new()
   .model(exo::ai::LlmTyp::Mixtral_8x22b)
   .system_msg("You are a helpful assistant.");
let res = llm.send("Hello").await?;
assert!(res is String);

Protocols

protocol get model
let output = value.model

Allows a get operation to work.

protocol set model
value.model = input

Allows a set operation to work.
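
A minimal sketch of reading and writing the model through these protocols (assuming the set protocol accepts the same LlmTyp values as the model builder method):

Example

let llm = exo::ai::Llm::new();
llm.model = exo::ai::LlmTyp::Mixtral_8x22b;
assert_eq!(llm.model, exo::ai::LlmTyp::Mixtral_8x22b);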

protocol get temperature
let output = value.temperature

Allows a get operation to work.

protocol set temperature
value.temperature = input

Allows a set operation to work.
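
A minimal sketch of reading and writing the temperature through these protocols (mirroring the temperature builder example above):

Example

let llm = exo::ai::Llm::new();
llm.temperature = 0.25;
assert_eq!(llm.temperature, 0.25);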

protocol get tokens_used
let output = value.tokens_used

Allows a get operation to work.

protocol set tokens_used
value.tokens_used = input

Allows a set operation to work.
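
A minimal sketch of reading and writing tokens_used through these protocols (assuming tokens_used is a numeric counter; resetting it to 0 here is illustrative):

Example

let llm = exo::ai::Llm::new();
llm.tokens_used = 0;
assert_eq!(llm.tokens_used, 0);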

protocol partial_eq
if value == b { }

Allows for partial equality operations to work.
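
A minimal sketch of comparing two Llm values (mirroring the clone example above):

Example

let llm0 = exo::ai::Llm::new();
let llm1 = llm0.clone();
assert_eq!(llm0, llm1);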

protocol string_debug
println("{:?}", value)

Allows the value to be debug printed.
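
A minimal sketch of debug printing an Llm (the exact output format is unspecified here):

Example

let llm = exo::ai::Llm::new();
println("{:?}", llm);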

protocol string_display
println("{}", value)

Allows the value to be display printed.
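
A minimal sketch of display printing an Llm (the exact output format is unspecified here):

Example

let llm = exo::ai::Llm::new();
println("{}", llm);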

protocol get system_msg
let output = value.system_msg

Get the system message of the chat conversation.

Example

let llm = exo::ai::Llm::new();
llm.system_msg = "You are a helpful assistant.";
assert_eq!(llm.system_msg, "You are a helpful assistant.");
protocol set system_msg
value.system_msg = input

Set the system message of the chat conversation.

Example

let llm = exo::ai::Llm::new();
llm.system_msg = "You are a helpful assistant.";
assert_eq!(llm.system_msg, "You are a helpful assistant.");