#api #api-client #ai-api #api-bindings #novelai

novelai_api

A Rust interface for the NovelAI API

4 releases

0.2.2  Apr 14, 2024
0.2.1  Apr 13, 2024
0.2.0  Apr 13, 2024
0.1.0  Apr 9, 2024

#614 in Network programming


144 downloads per month

GPL-2.0 license

24KB
475 lines

NovelAI API

A Rust API client for the NovelAI API

Overview

This API client gives you access to the NovelAI API, with the following endpoints implemented:

  • Current NovelAI API version: 1.0

Installation

Add the following to [dependencies] in your Cargo.toml:

novelai_api = "0.2"

Or run:

cargo add novelai_api
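
The usage examples below run under the Tokio async runtime. If your project does not already depend on it, a [dependencies] section along these lines would work (the exact tokio feature flags are an assumption of this sketch, not a requirement stated by the crate):

novelai_api = "0.2"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }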

Documentation

The documentation can be found on docs.rs.

Example usage

Generating text

use novelai_api::{api::ai_generate_text, model::*};

#[tokio::main]
async fn main() {
    let mut conf = Configuration::new();
    conf.bearer_access_token =
        Some("Your Token".to_string());

    let prompt = "Tell me about the lost world!".to_string();
    let response = ai_generate_text(
        &conf,
        AiGenerateRequest::new(
            prompt.clone(),
            novelai_api::model::TextModel::KayraV1,
            AiGenerateParameters::new(),
        ),
    )
    .await.unwrap();

    println!("{}{}", prompt, response.output);
    /*
    Tell me about the lost world! I urged him. "I'm going there in two days' time."
    "What? Are you mad?" He sounded incredulous.
    "Not yet, but I am getting that way," I told him, with a grin. "The 
    expedition was a last-minute addition to my schedule. They have been trying 
    to reach a group of natives called the Chirpsithra for years. The Chirps 
    have asked me to join the party as a liaison."
    */
}
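
Rather than hard-coding the token, you may prefer to read it from the environment. Below is a minimal sketch of the same request; the NOVELAI_TOKEN variable name is purely illustrative and not something the crate defines:

use novelai_api::{api::ai_generate_text, model::*};

#[tokio::main]
async fn main() {
    // NOVELAI_TOKEN is an arbitrary variable name chosen for this sketch
    let token = std::env::var("NOVELAI_TOKEN").expect("NOVELAI_TOKEN is not set");

    let mut conf = Configuration::new();
    conf.bearer_access_token = Some(token);

    let prompt = "Tell me about the lost world!".to_string();
    let response = ai_generate_text(
        &conf,
        AiGenerateRequest::new(prompt.clone(), TextModel::KayraV1, AiGenerateParameters::new()),
    )
    .await
    .unwrap();

    println!("{}{}", prompt, response.output);
}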

Generating speech (text-to-speech)

use novelai_api::{api::*, model::Configuration};
use std::fs;

#[tokio::main]
async fn main() {
    let prompt = "Hello world!";

    // The API has a limit of 1000 characters per request,
    // so we split the string at the nearest vocal pause
    // close to 1000 characters
    let prompt: Vec<String> = process_string_for_voice_generation(prompt);

    let mut conf = Configuration::new();
    conf.bearer_access_token = Some("Your Token".to_string());

    for (i, chunk) in prompt.iter().enumerate() {
        let output = ai_generate_voice(&conf, chunk, "Aini", -1.0, true, "v2")
            .await
            .unwrap();

        // `output` now holds the contents of a .webm audio file,
        // which we write to disk
        fs::write(format!("./{}_output.webm", i), output).unwrap();
    }
}
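
Calling unwrap aborts the whole loop on the first failed request. If you would rather keep going, you can match on each result instead; a minimal sketch, assuming only that the error type implements Debug:

use novelai_api::{api::*, model::Configuration};
use std::fs;

#[tokio::main]
async fn main() {
    let mut conf = Configuration::new();
    conf.bearer_access_token = Some("Your Token".to_string());

    let chunks: Vec<String> = process_string_for_voice_generation("Hello world!");

    for (i, chunk) in chunks.iter().enumerate() {
        // Log a failed request and continue with the next chunk
        match ai_generate_voice(&conf, chunk, "Aini", -1.0, true, "v2").await {
            Ok(audio) => fs::write(format!("./{}_output.webm", i), audio).unwrap(),
            Err(e) => eprintln!("voice generation failed for chunk {}: {:?}", i, e),
        }
    }
}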

Custom AI parameters

TODO: create example

use novelai_api::{api::ai_generate_text, model::*};

#[tokio::main]
async fn main() {
    let model = novelai_api::model::TextModel::KayraV1;
    /* Model Options
        Variant2Period7B,
        Variant6Bv4,
        EuterpeV2,
        GenjiPython6b,
        GenjiJp6b,
        GenjiJp6bV2,
        KrakeV2,
        Hypebot,
        Infillmodel,
        Cassandra,
        Sigurd2Period9bV1,
        Blue,
        Red,
        Green,
        Purple,
        ClioV1,
        KayraV1,
    */

    let generation_parameters = AiGenerateParameters {
        temperature: Some(2.8),
        min_length: 50,
        max_length: 300,
        top_a: Some(1.0),
        // (many more settings are available)
        ..Default::default()
    };

    let request_settings = AiGenerateRequest::new(
        "Your Prompt".to_string(),
        model,
        generation_parameters
    );

    let mut conf = Configuration::new();
    conf.bearer_access_token =
        Some("Your Token".to_string());

    let output = ai_generate_text(&conf, request_settings).await.unwrap();
    println!("{}", output.output);
}
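
If you issue many requests with the same settings, it can be convenient to wrap the call in a small helper. The generate function below is purely illustrative and not part of the crate's API; it is a rough sketch built from the same calls shown above:

use novelai_api::{api::ai_generate_text, model::*};

// Hypothetical helper (not part of the crate): build a request and return the generated text
async fn generate(conf: &Configuration, prompt: &str, model: TextModel) -> String {
    let request = AiGenerateRequest::new(
        prompt.to_string(),
        model,
        AiGenerateParameters {
            temperature: Some(1.0),
            ..Default::default()
        },
    );

    ai_generate_text(conf, request).await.unwrap().output
}

#[tokio::main]
async fn main() {
    let mut conf = Configuration::new();
    conf.bearer_access_token = Some("Your Token".to_string());

    let text = generate(&conf, "Your Prompt", TextModel::KayraV1).await;
    println!("{}", text);
}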

Dependencies

~6–19MB
~290K SLoC