
clust

An unofficial Rust client for the Anthropic/Claude API.

Installation

Run the following Cargo command in your project directory:

cargo add clust

Or add the following line to your Cargo.toml:

[dependencies]
clust = "0.9.0"

Supported APIs

Feature flags

  • macros: Enables the clust::attributes::clust_tool attribute macro, which generates clust::messages::Tool / clust::messages::AsyncTool implementations.
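The macro support is opt-in; it can be enabled in Cargo.toml by turning on the macros feature flag:

```toml
[dependencies]
clust = { version = "0.9.0", features = ["macros"] }
```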

Usage

API key and client

First, you need to create a new API client. Create a clust::Client with the Anthropic API key loaded from the environment variable ANTHROPIC_API_KEY:

use clust::Client;

let client = Client::from_env().unwrap();

Or specify the API key directly:

use clust::Client;
use clust::ApiKey;

let client = Client::from_api_key(ApiKey::new("your-api-key"));

If you want to customize the client, you can use the builder pattern with clust::ClientBuilder.

use clust::ClientBuilder;
use clust::ApiKey;
use clust::Version;

let client = ClientBuilder::new(ApiKey::new("your-api-key"))
    .version(Version::V2023_06_01)
    .client(reqwest::ClientBuilder::new().timeout(std::time::Duration::from_secs(10)).build().unwrap())
    .build();

Model and max tokens

You can specify the model via clust::messages::ClaudeModel.

use clust::messages::ClaudeModel;
use clust::messages::MessagesRequestBody;

let model = ClaudeModel::Claude3Sonnet20240229;

let request_body = MessagesRequestBody {
    model,
    ..Default::default()
};

Because the maximum number of tokens for text generation, clust::messages::MaxTokens, depends on the model, you need to create a clust::messages::MaxTokens together with the model.

use clust::messages::ClaudeModel;
use clust::messages::MaxTokens;
use clust::messages::MessagesRequestBody;

let model = ClaudeModel::Claude3Sonnet20240229;
let max_tokens = MaxTokens::new(1024, model).unwrap();

let request_body = MessagesRequestBody {
    model,
    max_tokens,
    ..Default::default()
};

Prompt

You can specify the system prompt via clust::messages::SystemPrompt; there is no "system" role in messages.

use clust::messages::SystemPrompt;
use clust::messages::MessagesRequestBody;

let system_prompt = SystemPrompt::new("You are an excellent AI assistant.");

let request_body = MessagesRequestBody {
    system: Some(system_prompt),
    ..Default::default()
};

Messages and content

Build messages as a vector of clust::messages::Message.

use clust::messages::Role;
use clust::messages::Content;

/// The message.
pub struct Message {
    /// The role of the message.
    pub role: Role,
    /// The content of the message.
    pub content: Content,
}

A message for each role can be created as follows:

use clust::messages::Message;

let message = Message::user("Hello, Claude!");
let message = Message::assistant("Hello, user!");

and the content via clust::messages::Content:

use clust::messages::ContentBlock;

/// The content of the message.
pub enum Content {
    /// The single text content.
    SingleText(String),
    /// The multiple content blocks.
    MultipleBlocks(Vec<ContentBlock>),
}

Multiple blocks are a vector of content blocks, clust::messages::ContentBlock:

use clust::messages::TextContentBlock;
use clust::messages::ImageContentBlock;

/// The content block of the message.
pub enum ContentBlock {
    /// The text content block.
    Text(TextContentBlock),
    /// The image content block.
    Image(ImageContentBlock),
}

Content can be created as follows:

use clust::messages::Content;
use clust::messages::ContentBlock;
use clust::messages::TextContentBlock;
use clust::messages::ImageContentBlock;
use clust::messages::ImageContentSource;
use clust::messages::ImageMediaType;

// Single text content
let content = Content::SingleText("Hello, Claude!".to_string());
// or use `From` trait
let content = Content::from("Hello, Claude!");

// Multiple content blocks
let content = Content::MultipleBlocks(vec![
    ContentBlock::Text(TextContentBlock::new("Hello, Claude!")),
    ContentBlock::Image(ImageContentBlock::new(ImageContentSource::base64(
        ImageMediaType::Png,
        "Base64 encoded image data",
    ))),
]);
// or use `From` trait for `String` or `ImageContentSource`
let content = Content::from(vec![
    ContentBlock::from("Hello, Claude!"),
    ContentBlock::from(ImageContentSource::base64(
        ImageMediaType::Png,
        "Base64 encoded image data",
    )),
]);

Request body

The request body is defined by clust::messages::MessagesRequestBody.

See MessagesRequestBody for other options.

use clust::messages::MessagesRequestBody;
use clust::messages::ClaudeModel;
use clust::messages::Message;
use clust::messages::MaxTokens;
use clust::messages::SystemPrompt;

let request_body = MessagesRequestBody {
    model: ClaudeModel::Claude3Sonnet20240229,
    messages: vec![Message::user("Hello, Claude!")],
    max_tokens: MaxTokens::new(1024, ClaudeModel::Claude3Sonnet20240229).unwrap(),
    system: Some(SystemPrompt::new("You are an excellent AI assistant.")),
    ..Default::default()
};

You can also use the builder pattern with clust::messages::MessagesRequestBuilder:

use clust::messages::MessagesRequestBuilder;
use clust::messages::ClaudeModel;
use clust::messages::Message;
use clust::messages::SystemPrompt;

let request_body = MessagesRequestBuilder::new_with_max_tokens(
    ClaudeModel::Claude3Sonnet20240229,
    1024,
).unwrap()
.messages(vec![Message::user("Hello, Claude!")])
.system(SystemPrompt::new("You are an excellent AI assistant."))
.build();

API calling

Call the API with the request body via clust::Client::create_a_message.

use clust::Client;
use clust::messages::MessagesRequestBody;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = Client::from_env()?;
    let request_body = MessagesRequestBody::default();

    // Call the async API.
    let response = client
        .create_a_message(request_body)
        .await?;

    // You can extract the text content from `clust::messages::MessagesResponseBody.content.flatten_into_text()`.
    println!("Content: {}", response.content.flatten_into_text()?);

    Ok(())
}

Streaming

When you want to stream responses incrementally, you can use clust::Client::create_a_message_stream with the stream option StreamOption::ReturnStream:

use clust::Client;
use clust::messages::MessagesRequestBody;
use clust::messages::StreamOption;

use tokio_stream::StreamExt;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = Client::from_env()?;
    let request_body = MessagesRequestBody {
        stream: Some(StreamOption::ReturnStream),
        ..Default::default()
    };

    // Call the async API and get the stream.
    let mut stream = client
        .create_a_message_stream(request_body)
        .await?;

    // Poll the stream.
    while let Some(chunk) = stream.next().await {
         // Handle the chunk.
    }

    Ok(())
}

Tool use

Tool use is supported in two ways:

1. Using the clust_tool attribute macro with a Rust function

When you define a Rust function as a tool with doc comments like the following:

/// Get the current weather in a given location
///
/// ## Arguments
/// - `location` - The city and state, e.g. San Francisco, CA
fn get_weather(location: String) -> String {
    "15 degrees".to_string() // Dummy response
}

you can generate code with the clust::clust_macros::clust_tool attribute macro, enabled by the macros feature flag:

/// Get the current weather in a given location
///
/// ## Arguments
/// - `location` - The city and state, e.g. San Francisco, CA
#[clust_tool] // <- Generate `clust::messages::Tool` for this function
fn get_weather(location: String) -> String {
    "15 degrees".to_string() // Dummy response
}

Then create an instance of the generated clust::messages::Tool implementation, named ClustTool_{function_name}, from the function:

let tool = ClustTool_get_weather {};

Get the tool definition from clust::messages::Tool for the API request:

let tool_definition = tool.definition();

and call the tool with the tool use obtained from the API response:

let tool_result = tool.call(tool_use);

See the tool use example and clust_tool for details.

2. Manually implementing clust::messages::Tool or clust::messages::AsyncTool

You can manually implement clust::messages::Tool or clust::messages::AsyncTool for your tools.
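The exact trait signatures are in the crate's documentation. As a rough, self-contained sketch of the shape such an implementation takes (SketchTool and its methods below are illustrative stand-ins, not clust's actual API), it pairs a tool definition for the request with a call entry point for the response:

```rust
// Illustrative stand-in for clust's Tool trait (not the crate's actual API):
// a manual tool exposes a definition for the API request and a call entry point.
trait SketchTool {
    /// The tool definition sent with the API request.
    fn definition(&self) -> String;
    /// Invoked with the input extracted from the model's tool-use response.
    fn call(&self, input: &str) -> String;
}

/// Get the current weather in a given location (dummy implementation).
struct GetWeather;

impl SketchTool for GetWeather {
    fn definition(&self) -> String {
        // In practice this would be a JSON schema describing the tool.
        r#"{"name":"get_weather","description":"Get the current weather in a given location"}"#
            .to_string()
    }

    fn call(&self, _input: &str) -> String {
        "15 degrees".to_string() // Dummy response, mirroring the macro example above
    }
}

fn main() {
    let tool = GetWeather;
    println!("{}", tool.call("San Francisco, CA"));
}
```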

Examples

Create a message

An example of creating a message, with the API key loaded from the environment variable ANTHROPIC_API_KEY:

ANTHROPIC_API_KEY={your-api-key}

is as follows:

use clust::messages::ClaudeModel;
use clust::messages::MaxTokens;
use clust::messages::Message;
use clust::messages::MessagesRequestBody;
use clust::messages::SystemPrompt;
use clust::Client;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // 1. Create a new API client with the API key loaded from the environment variable: `ANTHROPIC_API_KEY`.
    let client = Client::from_env()?;

    // 2. Create a request body.
    let model = ClaudeModel::Claude3Sonnet20240229;
    let messages = vec![Message::user(
        "Where is the capital of France?",
    )];
    let max_tokens = MaxTokens::new(1024, model)?;
    let system_prompt = SystemPrompt::new("You are an excellent AI assistant.");
    let request_body = MessagesRequestBody {
        model,
        messages,
        max_tokens,
        system: Some(system_prompt),
        ..Default::default()
    };

    // 3. Call the API.
    let response = client
        .create_a_message(request_body)
        .await?;

    println!("Result:\n{}", response);

    Ok(())
}

Streaming messages with the tokio backend

An example of creating a message stream, with the API key loaded from the environment variable ANTHROPIC_API_KEY:

ANTHROPIC_API_KEY={your-api-key}

using tokio-stream, is as follows:

use clust::messages::ClaudeModel;
use clust::messages::MaxTokens;
use clust::messages::Message;
use clust::messages::MessagesRequestBody;
use clust::messages::SystemPrompt;
use clust::messages::StreamOption;
use clust::messages::StreamChunk;
use clust::Client;

use tokio_stream::StreamExt;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // 1. Create a new API client with the API key loaded from the environment variable: `ANTHROPIC_API_KEY`.
    let client = Client::from_env()?;

    // 2. Create a request body with `stream` option.
    let model = ClaudeModel::Claude3Sonnet20240229;
    let messages = vec![Message::user(
        "Where is the capital of France?",
    )];
    let max_tokens = MaxTokens::new(1024, model)?;
    let system_prompt = SystemPrompt::new("You are an excellent AI assistant.");
    let request_body = MessagesRequestBody {
        model,
        messages,
        max_tokens,
        system: Some(system_prompt),
        stream: Some(StreamOption::ReturnStream),
        ..Default::default()
    };

    // 3. Call the streaming API.
    let mut stream = client
        .create_a_message_stream(request_body)
        .await?;

    let mut buffer = String::new();

    // 4. Poll the stream.
    // NOTE: `tokio_stream::StreamExt` runs on the `tokio` runtime.
    while let Some(chunk) = stream.next().await {
        match chunk {
            | Ok(chunk) => {
                println!("Chunk:\n{}", chunk);
                match chunk {
                    | StreamChunk::ContentBlockDelta(content_block_delta) => {
                        // Buffer message delta.
                        buffer.push_str(&content_block_delta.delta.text);
                    }
                    | _ => {}
                }
            }
            | Err(error) => {
                eprintln!("Chunk error:\n{:?}", error);
            }
        }
    }

    println!("Result:\n{}", buffer);

    Ok(())
}

Create a message with vision

See the vision example.

Conversation

See the conversation example.

Tool use

See the tool use example.

Other examples

See the examples directory for more examples.

Changelog

See the CHANGELOG.

License

Licensed under either of the Apache License, Version 2.0 or the MIT license, at your option.
