Overview
Erlang has been discussed as a natural fit for problems involving neural networks. The structure of a perceptron (neuron) can be represented as a single Erlang process that holds its weights and its connections to neighboring perceptrons. Communication between perceptrons is realized through message passing.
During the feedforward operation, each perceptron takes only the outputs of the previous layer as its inputs and applies its own weights to compute its output. This implies that the perceptrons within a layer do not share data with each other, so the output of each perceptron can be computed concurrently.
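To make the process model concrete, here is a minimal sketch of one perceptron as an Erlang process. It is illustrative only, not the library's actual implementation: the module name, the {input, X} message format, and the pairing of inputs with weights in arrival order are all simplifying assumptions.

    %% perceptron.erl: one perceptron as a process (illustrative sketch).
    -module(perceptron).
    -export([start/2, loop/3]).

    %% Weights: [float()], Outputs: [pid()] of the next layer's perceptrons.
    start(Weights, Outputs) ->
        spawn(?MODULE, loop, [Weights, Outputs, []]).

    %% Collect one input per weight, then send the activation onward.
    loop(Weights, Outputs, Inputs) ->
        receive
            {input, X} ->
                Collected = [X | Inputs],
                case length(Collected) =:= length(Weights) of
                    true ->
                        %% All inputs received: fire and reset.
                        Out = sigmoid(dot(Weights, Collected)),
                        [Pid ! {input, Out} || Pid <- Outputs],
                        loop(Weights, Outputs, []);
                    false ->
                        loop(Weights, Outputs, Collected)
                end
        end.

    %% For simplicity, inputs are paired with weights in arrival order;
    %% a real implementation would tag each input with the sender's Pid.
    dot(Ws, Xs) -> lists:sum([W * X || {W, X} <- lists:zip(Ws, Xs)]).

    sigmoid(X) -> 1.0 / (1.0 + math:exp(-X)).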
A neural network library for Erlang, erlann, is then introduced that follows an approach similar to the Fast Artificial Neural Network (FANN) library. The library automates the creation and execution of the neural network and uses a dedicated Erlang process to monitor the network's configuration (input, hidden, and output layers).
Installation
Clone the repository from GitHub: $ git clone https://github.com/Perroquiet/erlann.git
You may view the full documentation of the library by opening /edoc/index.html in a browser.
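Once cloned, the modules can be compiled with the standard Erlang compiler. The commands below assume the conventional src/ and ebin/ directory layout, which this page does not confirm:

    $ cd erlann
    $ erlc -o ebin src/*.erl
    $ erl -pa ebin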
Library Specifications
- Data Pre-processing: Log Normalization
- Activation Function: Sigmoid Function
- Learning Algorithm: Backpropagation
- Training Sequence: Training -> Testing -> Error Evaluation
- Evaluation Criteria: Mean Squared Prediction Error (MSPE); see the sketch after this list
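The sigmoid activation and the MSPE criterion translate directly into short formulas, sketched below in Erlang. The module name erlann_math_sketch is hypothetical, not the library's API, and the exact log-normalization form the library applies is assumed here to be the natural logarithm:

    %% Illustrative formulas only; not the library's actual API.
    -module(erlann_math_sketch).
    -export([log_norm/1, sigmoid/1, mspe/2]).

    %% Log normalization (assumed form): compresses the range of a positive value.
    log_norm(X) when X > 0 -> math:log(X).

    %% Sigmoid activation: maps any real input into (0, 1).
    sigmoid(X) -> 1.0 / (1.0 + math:exp(-X)).

    %% Mean squared prediction error over paired predicted/actual lists.
    mspe(Predicted, Actual) ->
        N = length(Predicted),
        lists:sum([math:pow(P - A, 2)
                   || {P, A} <- lists:zip(Predicted, Actual)]) / N.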
References
- Erlang and Neural Networks by Wil Chung
- From Telecom Networks to Neural Networks; Erlang, as the unintentional Neural Network Programming Language by Gene Sher
- Fast Artificial Neural Network (FANN) Library
License
GNU GPL v3
Support or Contact
MSU - Iligan Institute of Technology
Andres Bonifacio Avenue, Tibanga, 9200 Iligan City, Philippines
daniellitojr.padayhag@g.msuiit.edu.ph
kevin.reyes@g.msuiit.edu.ph
ibrahim.gamoranao@g.msuiit.edu.ph