#llm-chain #llm #chain #langchain #ChatGPT

langchain-rust

LangChain for Rust, the easiest way to write LLM-based programs in Rust

11 stable releases

4.3.0 June 23, 2024
4.1.4 May 27, 2024
3.3.2 April 15, 2024
3.1.1 March 25, 2024
0.1.9 February 27, 2024

#29 in Machine learning


735 downloads per month
Used in 2 crates

MIT license

395KB
10K SLoC

🦜️🔗 LangChain Rust


⚡ Build LLM applications through composability, in Rust! ⚡

Discord Docs: Tutorial

🤔 What is this?

This is the Rust language implementation of LangChain.

Current features

  • LLMs

  • Embeddings

  • VectorStores

  • Chain

  • Agents

  • Tools

  • Semantic Routing

  • Document Loaders

    • PDF

      use futures_util::StreamExt;
      use langchain_rust::document_loaders::{LoPdfLoader, Loader};
      
      #[tokio::main]
      async fn main() {
          let path = "./src/document_loaders/test_data/sample.pdf";
      
          let loader = LoPdfLoader::from_path(path).expect("Failed to create PdfLoader");
      
          let docs = loader
              .load()
              .await
              .unwrap()
              .map(|d| d.unwrap())
              .collect::<Vec<_>>()
              .await;
      
          println!("Loaded {} documents", docs.len());
      }
      
    • Pandoc

      use futures_util::StreamExt;
      use langchain_rust::document_loaders::{InputFormat, Loader, PandocLoader};
      
      #[tokio::main]
      async fn main() {
          let path = "./src/document_loaders/test_data/sample.docx";
      
          let loader = PandocLoader::from_path(InputFormat::Docx.to_string(), path)
              .await
              .expect("Failed to create PandocLoader");
      
          let docs = loader
              .load()
              .await
              .unwrap()
              .map(|d| d.unwrap())
              .collect::<Vec<_>>()
              .await;
      
          println!("Loaded {} documents", docs.len());
      }
      
    • HTML

      use futures_util::StreamExt;
      use langchain_rust::document_loaders::{HtmlLoader, Loader};
      use url::Url;
      
      #[tokio::main]
      async fn main() {
          let path = "./src/document_loaders/test_data/example.html";
          let html_loader = HtmlLoader::from_path(path, Url::parse("https://example.com/").unwrap())
              .expect("Failed to create html loader");
      
          let documents = html_loader
              .load()
              .await
              .unwrap()
              .map(|x| x.unwrap())
              .collect::<Vec<_>>()
              .await;
      
          println!("Loaded {} documents", documents.len());
      }
      
    • CSV

      use futures_util::StreamExt;
      use langchain_rust::document_loaders::{CsvLoader, Loader};
      
      #[tokio::main]
      async fn main() {
          let path = "./src/document_loaders/test_data/test.csv";
          let columns = vec![
              "name".to_string(),
              "age".to_string(),
              "city".to_string(),
              "country".to_string(),
          ];
          let csv_loader = CsvLoader::from_path(path, columns).expect("Failed to create csv loader");
      
          let documents = csv_loader
              .load()
              .await
              .unwrap()
              .map(|x| x.unwrap())
              .collect::<Vec<_>>()
              .await;
      
          println!("Loaded {} documents", documents.len());
      }
      
    • Git Commits

      use futures_util::StreamExt;
      use langchain_rust::document_loaders::{GitCommitLoader, Loader};
      
      #[tokio::main]
      async fn main() {
          let path = "/path/to/git/repo";
          let git_commit_loader = GitCommitLoader::from_path(path).expect("Failed to create git commit loader");
      
          let documents = git_commit_loader
              .load()
              .await
              .unwrap()
              .map(|x| x.unwrap())
              .collect::<Vec<_>>()
              .await;
      
          println!("Loaded {} documents", documents.len());
      }
      
    • Source Code

      
      use futures_util::StreamExt;
      use langchain_rust::document_loaders::{DirLoaderOptions, Loader, SourceCodeLoader};
      
      #[tokio::main]
      async fn main() {
          let loader_with_dir =
              SourceCodeLoader::from_path("./src/document_loaders/test_data".to_string())
                  .with_dir_loader_options(DirLoaderOptions {
                      glob: None,
                      suffixes: Some(vec!["rs".to_string()]),
                      exclude: None,
                  });
      
          let stream = loader_with_dir.load().await.unwrap();
          let documents = stream.map(|x| x.unwrap()).collect::<Vec<_>>().await;
      
          println!("Loaded {} documents", documents.len());
      }
      

Installation

This library relies on serde_json in order to work.

Step 1: Add serde_json

First, make sure serde_json is added to your Rust project.

cargo add serde_json

Step 2: Add langchain-rust

Then, you can add langchain-rust to your Rust project.

Simple install

cargo add langchain-rust

With Sqlite

cargo add langchain-rust --features sqlite

Download the additional sqlite_vss library from https://github.com/asg017/sqlite-vss

With Postgres

cargo add langchain-rust --features postgres

With SurrealDB

cargo add langchain-rust --features surrealdb

With Qdrant

cargo add langchain-rust --features qdrant

Remember to replace the feature flag (sqlite, postgres, or surrealdb) according to your specific use case.

This will add both serde_json and langchain-rust as dependencies in your Cargo.toml file. Now, when you build your project, both dependencies will be fetched and compiled, and will be available in your project.

Remember that serde_json is a required dependency, while sqlite, postgres, and surrealdb are optional features that can be added according to your project's needs.
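
After the two `cargo add` steps above, the dependency section of your Cargo.toml should look roughly like the sketch below (version numbers are illustrative, and `postgres` stands in for whichever feature flag you chose):

```toml
[dependencies]
serde_json = "1"
# Enable only the vector-store backend you actually need.
langchain-rust = { version = "4.3.0", features = ["postgres"] }
```

Editing Cargo.toml by hand like this is equivalent to running the `cargo add` commands with the matching `--features` flag.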

Quick Start: Conversational Chain

use langchain_rust::{
    chain::{Chain, LLMChainBuilder},
    fmt_message, fmt_placeholder, fmt_template,
    language_models::llm::LLM,
    llm::openai::{OpenAI, OpenAIModel},
    message_formatter,
    prompt::HumanMessagePromptTemplate,
    prompt_args,
    schemas::messages::Message,
    template_fstring,
};

#[tokio::main]
async fn main() {
    //We can then initialize the model:
    //If you'd prefer not to set an environment variable, you can pass the API key in directly when configuring the OpenAI client:
    // let open_ai = OpenAI::default()
    //     .with_config(
    //         OpenAIConfig::default()
    //             .with_api_key("<your_key>"),
    //     ).with_model(OpenAIModel::Gpt35.to_string());
    let open_ai = OpenAI::default().with_model(OpenAIModel::Gpt35.to_string());


    //Once you've installed and initialized the LLM of your choice, we can try using it! Let's ask it what LangSmith is - this is something that wasn't present in the training data so it shouldn't have a very good response.
    let resp = open_ai.invoke("What is rust").await.unwrap();
    println!("{}", resp);

    // We can also guide its response with a prompt template. Prompt templates are used to convert raw user input into a better input for the LLM.
    let prompt = message_formatter![
        fmt_message!(Message::new_system_message(
            "You are world class technical documentation writer."
        )),
        fmt_template!(HumanMessagePromptTemplate::new(template_fstring!(
            "{input}", "input"
        )))
    ];

    //We can now combine these into a simple LLM chain:

    let chain = LLMChainBuilder::new()
        .prompt(prompt)
        .llm(open_ai.clone())
        .build()
        .unwrap();

    //We can now invoke it and ask the same question. It still won't know the answer, but it should respond in a more proper tone for a technical writer!

    match chain
        .invoke(prompt_args! {
        "input" => "Quien es el escritor de 20000 millas de viaje submarino",
           })
        .await
    {
        Ok(result) => {
            println!("Result: {:?}", result);
        }
        Err(e) => panic!("Error invoking LLMChain: {:?}", e),
    }

    //If you want to prompt to have a list of messages you could use the `fmt_placeholder` macro

    let prompt = message_formatter![
        fmt_message!(Message::new_system_message(
            "You are world class technical documentation writer."
        )),
        fmt_placeholder!("history"),
        fmt_template!(HumanMessagePromptTemplate::new(template_fstring!(
            "{input}", "input"
        ))),
    ];

    let chain = LLMChainBuilder::new()
        .prompt(prompt)
        .llm(open_ai)
        .build()
        .unwrap();
    match chain
        .invoke(prompt_args! {
        "input" => "Who is the writer of 20,000 Leagues Under the Sea, and what is my name?",
        "history" => vec![
                Message::new_human_message("My name is: luis"),
                Message::new_ai_message("Hi luis"),
                ],

        })
        .await
    {
        Ok(result) => {
            println!("Result: {:?}", result);
        }
        Err(e) => panic!("Error invoking LLMChain: {:?}", e),
    }
}

Dependencies

~40–83MB
~1.5M SLoC