### SCRA Analysis: initial results!

nataraj chakraborty <natarajchakraborty <at> gmail.com>

2006-08-29 17:52:24 GMT

Hi ANNEvolve,

This is about SCRA, which I have left aside for quite some time. The original definition of SCRA can be found in the attached SCRA.txt file by Mitchel Timin.

The other attached file is Nets_5_30.piz, which has to be renamed to Nets_5_30.zip; it contains a text file listing the SCRA topologies for networks of size 5 to 30. As can be seen, many networks reduce down to an FCBA, and in many others a few of the output nodes are left unconnected (pruned). The goal of this analysis is to find a formula that tells us, before constructing the network, whether the result will be an FCBA, a pruned network, or a perfect, desirable network.

So I call upon the maths people among you for some help.

The code for this is a single C++ file, not yet updated. The program also writes two other files containing statistics of the nets. They need some bug fixing, so I have not yet uploaded them to SVN. I guess I'll need SVN help pretty soon.

Nataraj Chakraborty

SCRA - Sparsely Connected Recurrent ANN
by M. Timin, April, 2006
The SCRA does not have to be a binary ANN; the same concept works for an ANN
using neurons with any activation function, but for our present purpose we
will assume that the activation function is the unit step function, hence we
will have a binary ANN, with all outputs being 0 or 1.
The diagram, Fig. 1, shows an example with six neurons, but the actual neuron
count is unlimited. Several dozen would be a typical number for many of the
more difficult applications, although simple functions can be done with as few
as 2 neurons. (illustrated in VizANN, downloadable from
http://sourceforge.net/projects/annevolve)
The same diagram, if we added all possible connections, would then illustrate
the FCBA (Fully Connected Binary ANN). There would then be 36 connection
lines in the diagram, as compared to the 12 lines in this SCRA example.
The motivation for the SCRA is to avoid the weight explosion that occurs with
the FCBA when the neuron count is increased. The FCBA has at least N*N
weights, whereas the SCRA may have a small fraction of that. (N = neuron
count) For either the SCRA or FCBA, N is also the number of bits of active
memory, AKA scratchpad memory, where the net can store and retrieve data
items. For applications requiring many bits of scratchpad RAM, the SCRA can
provide them with far fewer weights. This is important because the weight
count is the primary factor influencing computation time, as well as storage
for the array that describes the ANN.
Notice in the example, beginning with the first connection, from the output of
neuron 0 to the input of neuron 0, that 2 possible connections are skipped
before the next connection, to the input of neuron 3. The 2 here is a
parameter of the net, i.e., a SCRA can be made with a skip count of 2, or 3 or
whatever. If the skip count is 0 then we would have an FCBA, so the FCBA is a
special case of the SCRA. A skip count of 1 would connect each neuron to
every other input. So skip count, in addition to N, is a parameter that is
needed to specify a particular net. The process of skipping some possible
inputs wraps around to the beginning, and continues until all neuron inputs
have been considered.
Notice that in the example neuron 0 connects to neuron 0, which is itself, but
neuron 1 connects first to neuron 2. We have advanced the relative position of
the first connection by 1. This 1 is another parameter of the net.
Similarly, the first connection of neuron 2 connects to neuron 4. Why do we
need this parameter? Suppose it were zero; if that were the case then each
neuron would input to itself. We have no reason to believe that every neuron
should have direct feedback from itself, so the "advancement" parameter needs
to be at least 1.
Let's call the three parameters N, S & A, for Neurons, Skip, and Advance.
I am not sure if there is benefit to A being larger than 1, but intuitively
I'm guessing that there is. If we first build a system where N, S & A can
evolve, then we will find out. If the best nets all have an A of 1, then we
can simplify the code to have that value built into it.
My object in designing the net in this manner is to enable it to be coded for
very fast execution. I envision the code as being similar to the updateANN()
routine in 4Play. Of course it will have to be somewhat more complex because
of the use of S and A in the code. My hope is that mostly we will just be
adding S and/or A to a pointer instead of simply incrementing it.
Recurrency:
The SCRA has several levels of feedback built into it. In the example, 0
feeds into itself directly. 1 feeds into 2 which feeds back into 1. 2 feeds
into 4 which feeds back into 2. In fact, every neuron has a 2-step feedback
path to itself, but only neurons 0 & 3 have direct feedback. I have not
analyzed this carefully, but I hope that different values of A will result in
an assortment of feedback paths, or that A and S together will do so.
If processing time were not an issue we could use a random matrix of
connections, and encode it into a chromosome to guide our connections, but I'm
pretty sure that such an implementation would be much slower computationally.
(But we could consider it.)
So I'm seeking a sparse net which is very fast to compute, can be quite sparse
if necessary, and has a broad spectrum of feedback levels.
The A parameter does not affect the length of the chromosome, but N and S do.
We are liable to get chromosomes of very different lengths, and it is not
clear how to "mate" them, i.e., perform the crossover operation. We might
restrict mating to chromosomes that are within, say, 10% of each other's
length. We might even have 3-way sex, where 2 short chromosomes mate with one
long one to produce 3 offspring. (Bizarre, eh!)
How many weights:
We need to calculate how many weights are in a chromosome, given N, S, and the
number of inputs. In 4play, that was calculated with this formula:
#define CHROMSIZE(M,N) (N)*(1+(N)+(M)) /* a weight for every connection */
The 1 in the formula is for the biases, one for each neuron. Now for the
SCRA, there is one additional parameter that affects the weight count, and that
is S. If you can devise a formula, fine, but it will be tricky because the
number of output lines from a neuron is not exactly N/(S+1). It's
approximately N/(S+1), but not exactly because that division can yield a
non-integer result, whereas the correct answer is an integer. However, I'm
sure you can invent a formula, or short algorithm to compute it.
There is another way, and it will be useful to have both, as a check that
they give the same result. You may not need that in the final code, but it
will be useful during development. The other way, assuming you have created
code that updates the SCRA, is to make a version of it that counts how many
weights have been considered. The SCRA update code considers every weight
and decides whether or not to add it to some sum that is being accumulated,
so it is an easy thing to count the number of such weights. Since it is a
bit tricky to get the SCRA update code correct, it is a good check to see
whether the number of weights it wants to use matches your formula for the
number of weights.
-~~fin~~-
