NeuroFlow is a fast neural network (deep learning) Rust crate. It relies on three pillars: speed, reliability, and speed again.

...if this library was to become the second PyTorch of the Rust world. However, this repository has found its place in education, and can be used by young Rustaceans to enter the world of neural networks.

How to use

Let's try to approximate a very simple function: 0.5*sin(e^x) - cos(e^(-x))

extern crate neuroflow;

use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;


fn main() {
    /*
        Define a neural network with 1 neuron in the input layer. The network
        contains 4 hidden layers. And since our function returns a single value,
        it is reasonable to have 1 neuron in the output layer.
    */
    let mut nn = FeedForward::new(&[1, 7, 8, 8, 7, 1]);
    
    /*
        Define a DataSet.

        DataSet is a type that significantly simplifies work with neural networks.
        The majority of its functionality is still under development :(
    */
    let mut data: DataSet = DataSet::new();
    let mut i = -3.0;
    
    // Push the data to DataSet (method push accepts two slices: input data and expected output)
    while i <= 2.5 {
        data.push(&[i], &[0.5*(i.exp().sin()) - (-i.exp()).cos()]);
        i += 0.05;
    }
    
    // Here, we set the necessary parameters and train the neural network on our DataSet for 50,000 iterations
    nn.activation(Tanh)
        .learning_rate(0.01)
        .train(&data, 50_000);

    let mut res;
    
    // Let's check the result
    i = 0.0;
    while i <= 0.3 {
        res = nn.calc(&[i])[0];
        println!("for [{:.3}], [{:.3}] -> [{:.3}]", i, 0.5*(i.exp().sin()) - (-i.exp()).cos(), res);
        i += 0.07;
    }
}

Expected output

for [0.000], [-0.120] -> [-0.119]
for [0.070], [-0.039] -> [-0.037]
for [0.140], [0.048] -> [0.050]
for [0.210], [0.141] -> [0.141]
for [0.280], [0.240] -> [0.236]

But we don't want our trained network to be lost so easily. Therefore, there are functions for saving a neural network to a file and restoring it from a file.


    /*
        To save a neural network to a file, call the save function from the
        neuroflow::io module.

        The first argument is a mutable reference to the neural network being saved;
        the second argument is the path to the file.
    */
    neuroflow::io::save(&mut nn, "test.flow").unwrap();
    
    /*
        After we have saved the neural network to a file, we can restore it by
        calling the load function from the neuroflow::io module.

        We must specify the type of the new_nn variable.
        The only argument of the load function is the path to the file containing
        the neural network.
    */
    let mut new_nn: FeedForward = neuroflow::io::load("test.flow").unwrap();
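To sanity-check a restore, you can compare the original and the loaded networks on the same input. This is a minimal sketch (it assumes nn and new_nn from the snippets above are still in scope; the input value is arbitrary):

    // Sketch: the original and the restored network should give the same output
    let x = [0.14];
    let y_original = nn.calc(&x)[0];
    let y_restored = new_nn.calc(&x)[0];
    println!("original: {:.3}, restored: {:.3}", y_original, y_restored);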

The classic XOR problem (without the classic way of feeding in data)

Create a file called TerribleTom.csv in the root of your project. The file should have the following contents:

0,0,-,0
0,1,-,1
1,0,-,1
1,1,-,0

where - is the separator dividing an input vector from its desired output vector.

extern crate neuroflow;

use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;


fn main() {
    /*
        Define a neural network with 2 neurons in the input layer,
        1 hidden layer (with 2 neurons),
        and 1 neuron in the output layer
    */
    let mut nn = FeedForward::new(&[2, 2, 1]);
    
    // Here we load data for XOR from the file `TerribleTom.csv`
    let mut data = DataSet::from_csv("TerribleTom.csv");
    
    // Set parameters and train the network
    nn.activation(Tanh)
        .learning_rate(0.1)
        .momentum(0.15)
        .train(&data, 20_000);

    let mut res;
    let mut d;
    for i in 0..data.len() {
        d = data.get(i);
        res = nn.calc(d.0)[0];
        println!("for [{:.3}, {:.3}], [{:.3}] -> [{:.3}]", d.0[0], d.0[1], d.1[0], res);
    }
}

Expected output

for [0.000, 0.000], [0.000] -> [0.000]
for [1.000, 0.000], [1.000] -> [1.000]
for [0.000, 1.000], [1.000] -> [1.000]
for [1.000, 1.000], [0.000] -> [0.000]
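If you'd rather not create a CSV file, the same XOR data can be built in code with the push method shown in the first example. A minimal sketch that could replace the from_csv call:

    // Sketch: build the XOR DataSet programmatically instead of loading TerribleTom.csv
    let mut data = DataSet::new();
    data.push(&[0.0, 0.0], &[0.0]);
    data.push(&[0.0, 1.0], &[1.0]);
    data.push(&[1.0, 0.0], &[1.0]);
    data.push(&[1.0, 1.0], &[0.0]);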

Installation

Insert the following lines into your project's Cargo.toml:

[dependencies]
neuroflow = "~0.2"

And then, in your project's root file:

extern crate neuroflow;
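The extern crate declaration is only required on the 2015 edition of Rust, which these examples target. On the 2018 edition and later you can omit it and import items directly:

// Rust 2018+: no extern crate line needed
use neuroflow::FeedForward;
use neuroflow::data::DataSet;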

License

MIT License

Attribution

The origami bird in the logo was made by Freepik
