perl-AI-NeuralNet-BackProp-0.89-alt1.noarch.rpm



Description

perl-AI-NeuralNet-BackProp - A simple back-prop neural net that uses the Delta rule and Hebb's rule

Property Value
Distribution ALT Linux Sisyphus
Repository Autoimports noarch
Package name perl-AI-NeuralNet-BackProp
Package version 0.89
Package release alt1
Package architecture noarch
Package type rpm
Installed size 117.89 KB
Download size 117.89 KB
Official Mirror ftp.altlinux.org
AI::NeuralNet::BackProp implements a neural network similar to a feed-forward,
back-propagation network, learning via a mix of a generalization
of the Delta rule and a dissection of Hebb's rule. The actual
neurons of the network are implemented via the AI::NeuralNet::BackProp::neuron package.
You construct a new network via the new() constructor:
my $net = new AI::NeuralNet::BackProp(2,3,1);   # 2 layers, 3 neurons per layer, 1 output
The new() constructor accepts two required arguments, $layers and $size, plus an
optional third, $outputs (in this example, $layers is 2, $size is 3, and $outputs is 1).
$layers specifies the number of layers, including the input
and the output layer, to use in each neural grouping. A new
neural grouping is created for each pattern learned. Layers
is typically set to 2. Each layer has $size neurons in it.
Each neuron's output is connected to one input of every neuron
in the layer below it.
This diagram illustrates a simple network, created with a call
to "new AI::NeuralNet::BackProp(2,2,2)" (2 layers, 2 neurons/layer, 2 outputs).
input
/  \
O    O
|\  /|
| \/ |
| /\ |
|/  \|
O    O
\  /
mapper
In this diagram, each neuron is connected to one input of every
neuron in the layer below it, but there are not connections
between neurons in the same layer. Weights of the connection
are controlled by the neuron it is connected to, not the connecting
neuron. (E.g. the connecting neuron has no idea how much weight
its output has when it sends it, it just sends its output and the
weighting is taken care of by the receiving neuron.) This is the
method used to connect cells in every network built by this package.
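As a rough illustration of that weight-ownership design (a hypothetical sketch for this page only, not the actual AI::NeuralNet::BackProp::neuron internals):
package ToyNeuron;   # hypothetical package, for illustration only
sub new { bless { weights => [], inputs => [] }, shift }
sub register_input {
    # The receiving neuron owns the weight of each incoming connection.
    my ($self, $weight) = @_;
    push @{$self->{weights}}, $weight;
    return $#{$self->{weights}};     # index the sender will use
}
sub receive {
    # Senders pass a raw value; weighting happens on the receiving end.
    my ($self, $index, $value) = @_;
    $self->{inputs}[$index] = $value * $self->{weights}[$index];
}
1;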
Input is fed into the network via a call like this:
use AI::NeuralNet::BackProp;
my $net = new AI::NeuralNet::BackProp(2,2);
my @map = (0,1);
my $result = $net->run(\@map);
Now, this call would probably not give what you want, because
the network hasn't "learned" any patterns yet. But this
illustrates the call. run() now also allows strings to be used as
input. See run() for more information.
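For example, a minimal sketch of string input, assuming run() crunch()es the string into a numeric map internally as the note above describes:
use AI::NeuralNet::BackProp;
my $net = new AI::NeuralNet::BackProp(2,3);
# A plain string is accepted; run() converts it to a numeric
# map internally before running the network.
my $result = $net->run("Hello");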
run() returns an array reference with $size elements. (Remember $size? $size
is what you passed as the second argument to the network
constructor.) This array contains the results of the mapping. If
you ran the example exactly as shown above, $result would probably
contain (1,1) as its elements.
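Since run() hands back an array reference, dereference it to read the individual elements:
my $result = $net->run(\@map);
# @$result holds the $size result elements; per the text above,
# the untrained example would probably print "1,1".
print join(',', @$result), "\n";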
To make the network learn a new pattern, you simply call the learn()
method with a sample input and the desired result, both array
references of length $size. Example:
use AI::NeuralNet::BackProp;
my $net = new AI::NeuralNet::BackProp(2,2);
my @map = (0,1);
my @res = (1,0);
$net->learn(\@map,\@res);
my $result = $net->run(\@map);
Now $result will contain (1,0), effectively flipping the input pattern
around. Obviously, the larger $size is, the longer it will take
to learn a pattern. learn() returns a string of the form
Learning took X loops and X wallclock seconds (X.XXX usr + X.XXX sys = X.XXX CPU).
with the X's replaced by the time and loop values for that call. So,
to view the learning stats for every learn() call, you can just:
print $net->learn(\@map,\@res);
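Putting the pieces together, here is a minimal sketch (pattern values are illustrative) that trains two flip patterns and prints the stats line for each learn() call:
use AI::NeuralNet::BackProp;
my $net = new AI::NeuralNet::BackProp(2,2);
# One learn() call per [input, target] pair; the module creates
# a new neural grouping for each pattern learned.
my @patterns = ( [ [0,1], [1,0] ],
                 [ [1,0], [0,1] ] );
for my $p (@patterns) {
    print $net->learn($p->[0], $p->[1]), "\n";
}
print join(',', @{ $net->run([0,1]) }), "\n";   # should print 1,0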
If you call "$net->debug(4)" with $net being the
reference returned by the new() constructor, you will get benchmarking
information for the learn() function, as well as plenty of other information output.
See the notes on debug() in the METHODS section, below.
If you do call $net->debug(1), it is a good
idea to redirect your script's output to a file, as a lot of information is output. I often
use this command line:
$ perl some_script.pl > .out
Then I can simply open the file in emacs or any other text editor and read the output at my leisure,
rather than having to wait or page through it with 'more' as it scrolls by on the screen.
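A short sketch of that workflow (debug level 4 per the note above, which includes the learn() benchmarks):
use AI::NeuralNet::BackProp;
my $net = new AI::NeuralNet::BackProp(2,2);
$net->debug(4);   # verbose output, including learn() benchmarking
print $net->learn([0,1], [1,0]), "\n";
Run it with the redirect shown above and read the captured output afterwards.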

Alternatives

Package Version Architecture Repository
perl-AI-NeuralNet-BackProp - - -

Requires

Name Value
/usr/share/perl5 -
perl(Benchmark.pm) -
rpmlib(PayloadIsLzma) -

Provides

Name Value
perl(AI/NeuralNet/BackProp.pm) = 0.890
perl-AI-NeuralNet-BackProp = 0.89-alt1

Download

Type URL
Binary Package perl-AI-NeuralNet-BackProp-0.89-alt1.noarch.rpm
Source Package perl-AI-NeuralNet-BackProp-0.89-alt1.src.rpm

Install Howto

  1. Add the following line to /etc/apt/sources.list:
    
    rpm [Sisyphus] http://ftp.altlinux.org/pub/distributions/ALTLinux/autoimports/Sisyphus noarch autoimports
    
  2. Update the package index:
    # sudo apt-get update
  3. Install perl-AI-NeuralNet-BackProp rpm package:
    # sudo apt-get install perl-AI-NeuralNet-BackProp

Files

Path
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/Changes
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/README
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/add.dat
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_add.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_add2.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_alpha.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_bmp.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_bmp2.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_crunch.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_dow.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_pat.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_pcx.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_pcxl.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_sub.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/ex_synop.pl
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/letters.dat
/usr/share/doc/perl-AI-NeuralNet-BackProp-0.89/examples/sub.dat
/usr/share/perl5/AI/
/usr/share/perl5/AI/NeuralNet/BackProp.pm

See Also

Package Description
perl-AI-NeuralNet-Hopfield-0.1-alt1.noarch.rpm A simple Hopfield Network Implementation
perl-AI-NeuralNet-Kohonen-0.142-alt1.noarch.rpm perl module AI-NeuralNet-Kohonen
perl-AI-NeuralNet-Kohonen-Demo-RGB-0.123-alt1.noarch.rpm Colour-based demo
perl-AI-NeuralNet-Kohonen-Visual-0.3-alt1.noarch.rpm Tk-based Visualisation
perl-AI-NeuralNet-Mesh-0.44-alt1.noarch.rpm An optimized, accurate neural network Mesh
perl-AI-NeuralNet-SOM-0.07-alt1.noarch.rpm perl module AI-NeuralNet-SOM
perl-AI-PSO-0.86-alt1.noarch.rpm Module for running the Particle Swarm Optimization algorithm
perl-AI-ParticleSwarmOptimization-1.006-alt1.noarch.rpm OO Perl implementation of Particle Swarm Optimization
perl-AI-Pathfinding-AStar-0.10-alt1.noarch.rpm perl module AI-Pathfinding-AStar
perl-AI-Pathfinding-OptimizeMultiple-0.0.15-alt1.noarch.rpm optimize path finding searches for a large set of initial conditions (for better average performance)
perl-AI-Pathfinding-OptimizeMultiple-scripts-0.0.15-alt1.noarch.rpm AI-Pathfinding-OptimizeMultiple scripts
perl-AI-Pathfinding-SMAstar-0.07-alt1.noarch.rpm Simplified Memory-bounded A* Search
perl-AI-Perceptron-1.0-alt1.noarch.rpm perl module AI-Perceptron
perl-AI-Prolog-0.741-alt1.noarch.rpm Perl extension for logic programming
perl-AI-Prolog-scripts-0.741-alt1.noarch.rpm AI-Prolog scripts