
aicaramba

a simple neural network implementation from the perspective of linear algebra.

the library is developed mostly for my own recreational and educational purposes. its only dependency is the rand crate, used to randomize newly created weight and bias matrices.

for a usage example, see src/bin/xor.rs, which simulates an XOR logic gate using a small neural network.
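to see why XOR needs a hidden layer at all (it is not linearly separable), here is a hand-wired 2-2-1 ReLU network that computes XOR exactly. this is not the library's API, just a standalone sketch of the function the trained network ends up approximating:

```rust
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

// a fixed 2-2-1 network computing XOR: two hidden ReLU units, one linear output
fn xor_net(x1: f64, x2: f64) -> f64 {
    let h1 = relu(x1 + x2); // fires when at least one input is set
    let h2 = relu(x1 + x2 - 1.0); // fires only when both inputs are set
    h1 - 2.0 * h2 // cancels the "both set" case back to zero
}

fn main() {
    assert_eq!(xor_net(0.0, 0.0), 0.0);
    assert_eq!(xor_net(1.0, 0.0), 1.0);
    assert_eq!(xor_net(0.0, 1.0), 1.0);
    assert_eq!(xor_net(1.0, 1.0), 0.0);
    println!("xor ok");
}
```

training the real network just searches for weights equivalent to these.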


features

currently available features of the library:

  • ReLU and Sigmoid activation functions
  • the MSE loss function
  • a single, down-to-earth struct that contains the whole network
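the building blocks above are small enough to sketch directly. these are free-function versions of the standard definitions; the crate's actual signatures and matrix types may differ:

```rust
// ReLU: clamp negative values to zero
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

// sigmoid: squash any real number into (0, 1)
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// mean squared error over paired predictions and targets
fn mse(pred: &[f64], target: &[f64]) -> f64 {
    let n = pred.len() as f64;
    pred.iter()
        .zip(target)
        .map(|(p, t)| (p - t).powi(2))
        .sum::<f64>()
        / n
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((mse(&[0.0, 1.0], &[1.0, 1.0]) - 0.5).abs() < 1e-12);
    println!("features ok");
}
```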

roadmap

what might happen down the road:

  • BCE loss function (requires a sigmoid activation on the output layer, so not a trivial addition)
  • serde (de-)serialization to easily store checkpoints/training progress.
  • perhaps an MNIST example (?)
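for the BCE item, the coupling to sigmoid is visible in the formula itself: binary cross-entropy takes logarithms of the prediction and of one minus the prediction, so predictions must lie strictly in (0, 1). a hedged sketch (not the crate's API):

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// binary cross-entropy; only well-defined for predictions strictly in (0, 1),
// which is why the output layer must squash its values through sigmoid first
fn bce(pred: &[f64], target: &[f64]) -> f64 {
    let n = pred.len() as f64;
    pred.iter()
        .zip(target)
        .map(|(p, t)| -(t * p.ln() + (1.0 - t) * (1.0 - p).ln()))
        .sum::<f64>()
        / n
}

fn main() {
    let p = sigmoid(0.0); // 0.5
    let loss = bce(&[p], &[1.0]); // -ln(0.5) = ln(2)
    assert!((loss - std::f64::consts::LN_2).abs() < 1e-12);
    // a raw linear output like 1.2 would feed ln(1 - 1.2) and produce NaN,
    // hence "not a trivial addition" without the sigmoid constraint
    println!("bce ok");
}
```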