# synadart 0.4.2+1
A simple-to-grasp, complete and fully documented Neural Network library, written from scratch in Dart.

## 0.4.2+1
- Updated package description.

## 0.4.2

- Updated `sprint` version from `1.0.0+1` to `1.0.2+3`.
- Replaced the now-discontinued `pedantic` with the `words` lint ruleset.
- Reversed the order of versions in `CHANGELOG.md` from ascending to descending.

## 0.4.1+1

- Refactored code.
- Removed `logger.dart` in favour of the `sprint` package.

## 0.4.1
- Updated documentation.

## 0.4.0
- Organised code.
- Replaced network types such as `feed-forward` or `deep feed-forward` with a single class, `Sequential` (sketched below).
- Moved focus from `Network` to `Layer`, so that different layers can be added to a `Network`, rather than creating new types of networks and limiting the user to a preset model.
- Updated `example.dart` and `README.md`.
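
To illustrate the `Sequential` + `Layer` design, below is a minimal sketch of assembling a network from stacked layers. The layer type and constructor parameters shown (`Dense`, `size`, `activation`, `ActivationAlgorithm`, `learningRate`) are assumed names for illustration and may not match this version's API exactly.

```dart
import 'package:synadart/synadart.dart';

void main() {
  // A minimal sketch (not verbatim API): rather than choosing a preset
  // network type, layers are stacked inside a single Sequential network.
  // `Dense`, `size`, `activation` and `ActivationAlgorithm` are assumed names.
  final network = Sequential(
    learningRate: 0.3,
    layers: [
      Dense(size: 15, activation: ActivationAlgorithm.sigmoid),
      Dense(size: 5, activation: ActivationAlgorithm.sigmoid),
      Dense(size: 1, activation: ActivationAlgorithm.sigmoid),
    ],
  );

  // Query the network with a 15-element input, e.g. a flattened 3x5 glyph.
  print(network.process(List<double>.filled(15, 1.0)));
}
```

Composing layers this way means new architectures come from stacking `Layer`s rather than from defining new kinds of `Network`.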

## 0.3.2
- Added a simple feed-forward network model.

## 0.3.1

- Added 5 new activation functions: `SeLU`, `Softplus`, `Softsign`, `Swish` and `Gaussian` (reference formulas below).
- Renamed the 'logistic' function to 'sigmoid'.
- Created the function `abs()` for obtaining the absolute value of a variable.
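
For reference, the standard formulas behind these five functions are shown below as standalone Dart functions; they are illustrative and not the package's own implementations.

```dart
import 'dart:math' as math;

// Reference formulas only, not the package's code.
double softplus(double x) => math.log(1 + math.exp(x));
double softsign(double x) => x / (1 + x.abs());
double swish(double x) => x / (1 + math.exp(-x)); // x * sigmoid(x)
double gaussian(double x) => math.exp(-x * x);

// SeLU uses the constants proposed by Klambauer et al. (2017).
double selu(double x) {
  const alpha = 1.6732632423543772;
  const scale = 1.0507009873554805;
  return scale * (x > 0 ? x : alpha * (math.exp(x) - 1));
}
```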

## 0.3.0

- Updated documentation of `Logger`, `Backpropagation` and `ValueGenerator`.
- Created the `/examples` directory with a file `example.dart` that displays the network being used to recognise the number '5'.

## 0.2.5
- Renamed 'Multilayer Perceptron' to 'Deep Feed-Forward', since 'deep feed-forward' is a broader concept than 'multilayer perceptron'.

## 0.2.4

- Updated the documentation of `activation.dart`, adding explanations for the different activation functions.

## 0.2.3

- Updated documentation of `Network`.
- Replaced `process()` in `Layer` with an `output` getter, simplifying how each `Neuron`'s output is obtained (see the sketch below).
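
The getter change can be pictured with the simplified stand-ins below; the class bodies are illustrative, not the library's actual code.

```dart
// Simplified stand-ins to illustrate the getter change; not the actual classes.
class Neuron {
  double output = 0;
}

class Layer {
  final List<Neuron> neurons = [];

  // Before: outputs were collected with a method call, e.g. layer.process().
  // After: the same values are exposed through a getter, read as layer.output.
  List<double> get output =>
      neurons.map((neuron) => neuron.output).toList();
}
```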

## 0.2.2

- Updated documentation of `Layer` and removed a chunk of dead code.

## 0.2.1
- Removed the feed-forward network and simple perceptrons in favour of an upcoming simpler implementation of networks, through the use of a single network model.
- Added [learningRate] as a parameter, and removed the hard-coded value of `0.2` (see the sketch below).
- Updated documentation of `Neuron`.
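
A minimal sketch of the change, with illustrative names: the learning rate is received as a parameter instead of being baked into the weight update as `0.2`.

```dart
// Illustrative only: a neuron that takes its learning rate as a parameter
// instead of relying on a hard-coded constant.
class Neuron {
  Neuron({required this.learningRate});

  final double learningRate;
  final List<double> weights = [];

  // Nudge each weight against its gradient, scaled by the learning rate
  // (previously this scale was the fixed value 0.2).
  void adjust(List<double> gradients) {
    for (var i = 0; i < weights.length; i++) {
      weights[i] -= learningRate * gradients[i];
    }
  }
}
```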

## 0.2.0

- Added a feed-forward network and simple perceptrons.
- Added `LReLU`, `eLU` and `tanh` activation functions.
- Renamed 'sigmoid' to 'logistic'.

## 0.1.1

- Added `README.md` and updated formatting.

## 0.1.0
- Implemented a multilayer perceptron and a basic algorithm for backpropagation.
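
For context, the heart of that backpropagation step for a single sigmoid output neuron is the textbook delta rule sketched below; this is a generic illustration, not the package's implementation.

```dart
import 'dart:math' as math;

double sigmoid(double x) => 1 / (1 + math.exp(-x));

/// Textbook delta rule for one sigmoid output neuron: the error term is
/// (target - output) * sigmoid'(net), and each weight moves along its input
/// scaled by that error and the learning rate.
double updatedWeight({
  required double weight,
  required double input,
  required double output, // sigmoid(net) produced on the forward pass
  required double target,
  double learningRate = 0.2,
}) {
  final delta = (target - output) * output * (1 - output);
  return weight + learningRate * delta * input;
}
```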