1 stable version
1.0.0 | August 20, 2023
ChatGPT-rs
This crate is an asynchronous Rust wrapper over the OpenAI ChatGPT API. It supports conversations, message persistence and ChatGPT functions.
About ChatGPT functions
The functions API (available since v1.2.0) is currently experimental and may not work as expected. If you experience any issues or undefined behaviour, please open an issue in this repository!
MSRV
The minimum supported Rust version for this crate is 1.71.1.
Usage
Here is a simple example of using the API to get a completion for a single message. You can find more practical examples in the examples directory.
use chatgpt::prelude::*;
use chatgpt::types::CompletionResponse;
use std::env::args;

#[tokio::main]
async fn main() -> Result<()> {
    // Getting the API key from the first command-line argument
    let key = args().nth(1).unwrap();
    // Creating a new ChatGPT client.
    // Note that it requires an API key, and uses
    // tokens from your OpenAI API account balance.
    let client = ChatGPT::new(key)?;
    // Sending a message and getting the completion
    let response: CompletionResponse = client
        .send_message("Describe in five words the Rust programming language.")
        .await?;
    println!("Response: {}", response.message().content);
    Ok(())
}
Streaming responses
If you wish to build the response message incrementally, you can use the crate's streams feature (not enabled by default) and the dedicated methods to request a streamed response.
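Optional capabilities such as streaming are gated behind Cargo features. A minimal sketch of what the dependency declaration might look like; the package name and versions here are assumptions (the library is imported as chatgpt), so check crates.io for the exact values:

```toml
[dependencies]
# Package name and versions are assumptions; verify on crates.io
chatgpt_rs = { version = "1", features = ["streams"] }
tokio = { version = "1", features = ["full"] }
futures-util = "0.3"
```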
Here is an example:
// Note that the `futures_util` crate is required for most
// stream-related utility methods
use futures_util::StreamExt;
use std::io::{stdout, Write};

// Acquiring a streamed response
let stream = client
    .send_message_streaming("Could you name me a few popular Rust backend server frameworks?")
    .await?;
// Iterating over stream contents
stream
    .for_each(|each| async move {
        if let ResponseChunk::Content {
            delta,
            response_index: _,
        } = each
        {
            // Printing part of the response without a trailing newline
            print!("{delta}");
            // Manually flushing standard output, as the `print!` macro does not do that
            stdout().lock().flush().unwrap();
        }
    })
    .await;
Note that the returned stream has no utility methods of its own, so you will have to use the StreamExt methods from the async library of your choice (such as futures-util or tokio).
Conversations
Conversations are threads in which ChatGPT can analyse previous messages and chain its thoughts. They also automatically store the entire message history.
Here is an example:
// Creating a new conversation
let mut conversation: Conversation = client.new_conversation();
// Sending messages to the conversation
let response_a: CompletionResponse = conversation
    .send_message("Could you describe the Rust programming language in 5 words?")
    .await?;
let response_b: CompletionResponse = conversation
    .send_message("Now could you do the same, but for Kotlin?")
    .await?;
// You can also access the message history itself
for message in &conversation.history {
    println!("{message:#?}")
}
This way of creating a conversation uses a default introduction message, which reads roughly as follows: You are ChatGPT, an AI model developed by OpenAI. Answer as concisely as possible. Today is: {today's date}.
However, you can specify your own introduction message like this:
let mut conversation: Conversation = client.new_conversation_directed("You are RustGPT, when answering any questions, you always shift the topic of the conversation to the Rust programming language.");
// Continue with the new conversation
Streaming in conversations
Conversations also support returning streamed responses (with the streams feature).
Note: streamed responses do not automatically save the returned message to the history, so you have to save it manually.
Here is an example:
// Note that the `futures_util` crate is required for most
// stream-related utility methods
// Acquiring a streamed response
let mut stream = conversation
    .send_message_streaming("Could you name me a few popular Rust backend server frameworks?")
    .await?;
// Iterating over the stream and collecting the results into a vector
let mut output: Vec<ResponseChunk> = Vec::new();
while let Some(chunk) = stream.next().await {
    match chunk {
        ResponseChunk::Content {
            delta,
            response_index,
        } => {
            // Printing part of the response without a trailing newline
            print!("{delta}");
            // Manually flushing standard output, as the `print!` macro does not do that
            stdout().lock().flush().unwrap();
            output.push(ResponseChunk::Content {
                delta,
                response_index,
            });
        }
        // We don't really care about other chunk types, other than parsing them into a ChatMessage later
        other => output.push(other),
    }
}
// Parsing a ChatMessage from the response chunks and saving it to the conversation history
let messages = ChatMessage::from_response_chunks(output);
conversation.history.push(messages[0].to_owned());
Function calling
ChatGPT-rs supports the function calling API. It requires the functions feature.
You can define functions with the gpt_function attribute macro, as follows:
use chatgpt::prelude::*;

/// Says hello to a user
///
/// * user_name - Name of the user to greet
#[gpt_function]
async fn say_hello(user_name: String) {
    println!("Hello, {user_name}!")
}

// ... within your conversation, before sending the first message
let mut conversation = client.new_conversation();
// note that you need to call the function when adding it
conversation.add_function(say_hello());
let response = conversation
    .send_message_functions("Could you greet user with name `maxus`?")
    .await?;
// At this point, if a function call was issued, it has already been processed
// and the subsequent response was sent
As you can see, GPT functions must have a description, so the model knows when to call them and what they do. In ChatGPT-rs, function descriptions take the form of plain Rust doc comments. Each parameter is documented in the following format: * {parameter name} - {parameter description}. Function arguments are parsed from JSON; as long as they implement schemars::JsonSchema and serde::Deserialize, they will be parsed correctly.
By default, ChatGPT-rs uses minimal schemars features. Enable the functions_extra feature to add support for uuid, chrono, url and either, or define your own structs and derive schemars::JsonSchema and serde::Deserialize:
use schemars::JsonSchema;
use serde::Deserialize;

#[derive(JsonSchema, Deserialize)]
struct Args {
    /// Name of the user
    user_name: String,
    /// New age of the user
    user_age: u16,
}

/// Wishes happy birthday to the user
///
/// * args - Arguments
#[gpt_function]
async fn happy_birthday(args: Args) {
    println!("Hello, {}, You are now {}!", args.user_name, args.user_age);
}
Functions can also return any data (as long as it implements serde::Serialize), which is then passed back to the model:
/// Does some heavy computations and returns result
///
/// * input - Input data as vector of floats
#[gpt_function]
async fn do_heavy_computation(input: Vec<f64>) -> Vec<f64> {
    let output: Vec<f64> = // ... Do something with the input ...
    return output;
}
By default, functions are only sent to the API via the send_message_functions method. If you would like functions to be sent automatically with every message, you can set the always_send_functions property of the Conversation to true.
The current limitations of functions are as follows:
- They must be async.
- Since they count towards tokens, you may want to limit how often functions are sent and/or the length of their descriptions.
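As a sketch, enabling automatic function sending might look like the following. This assumes always_send_functions is a public boolean field on Conversation, based purely on the description above; check the crate documentation for the exact API.

```rust
let mut conversation = client.new_conversation();
conversation.add_function(say_hello());
// Assumption: `always_send_functions` is a public field on `Conversation`.
// Once set, registered functions are sent with every message, so plain
// `send_message` calls can trigger function calls too.
conversation.always_send_functions = true;
let response = conversation
    .send_message("Could you greet the user `maxus`?")
    .await?;
```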
Function call validation
As noted in the official ChatGPT documentation, ChatGPT may hallucinate functions that do not exist or provide invalid JSON. To mitigate this, ChatGPT-rs provides FunctionValidationStrategy. If it is set to Strict in the client's model configuration, a corrective system message is sent to the model whenever it fails to call a function correctly.
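A sketch of what opting into strict validation might look like; the builder method name used here is an assumption, so consult the crate documentation for the exact setter:

```rust
let client = ChatGPT::new_with_config(
    key,
    ModelConfigurationBuilder::default()
        // Assumed setter name for FunctionValidationStrategy
        .function_validation_strategy(FunctionValidationStrategy::Strict)
        .build()
        .unwrap(),
)?;
```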
Conversation persistence
You can currently store a conversation's messages in two formats: JSON or postcard. They can be toggled on and off with the json and postcard features, respectively.
Since the ChatMessage struct implements serde's Serialize and Deserialize traits, you can also use any serde-compatible serialization library, as the history field and the Conversation::new_with_history() method of the Conversation struct are public.
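For example, a manual round-trip through serde_json might look roughly like this. The serde_json dependency and the exact signature of Conversation::new_with_history() are assumptions here; verify them against the crate docs before relying on this:

```rust
use chatgpt::types::ChatMessage;

// Serialize the public `history` field with any serde-compatible library
let serialized = serde_json::to_string_pretty(&conversation.history)?;
tokio::fs::write("my-conversation.json", &serialized).await?;

// Later, deserialize the messages and rebuild a conversation from them.
// Assumption: `new_with_history` takes the client and the restored history.
let history: Vec<ChatMessage> =
    serde_json::from_str(&tokio::fs::read_to_string("my-conversation.json").await?)?;
let restored = Conversation::new_with_history(client, history);
```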
Persistence with JSON
Requires the json feature (enabled by default):
// Create a new conversation here
let mut conversation: Conversation = ...;
// ... send messages to the conversation ...
// Saving the conversation
conversation.save_history_json("my-conversation.json").await?;
// You can later read this conversation history again
let mut restored = client
    .restore_conversation_json("my-conversation.json")
    .await?;
Persistence with Postcard
Requires the postcard feature (disabled by default):
// Create a new conversation here
let mut conversation: Conversation = ...;
// ... send messages to the conversation ...
// Saving the conversation
conversation.save_history_postcard("my-conversation.bin").await?;
// You can later read this conversation history again
let mut restored = client
    .restore_conversation_postcard("my-conversation.bin")
    .await?;
Advanced configuration
You can configure your model further with the ModelConfigurationBuilder, which also allows using a proxy:
// Getting the API key here
let key = args().nth(1).unwrap();
// Creating a new ChatGPT client with extra settings.
// Note that it might not require an API key depending on proxy
let client = ChatGPT::new_with_config(
    key,
    ModelConfigurationBuilder::default()
        .api_url("https://api.pawan.krd/v1/chat/completions")
        .temperature(1.0)
        .engine(ChatGPTEngine::Gpt4_32k)
        .build()
        .unwrap(),
)?;