This is a long but basic question; no knowledge of neural networks is required. Posted a pic of me to offer an incentive.

http://img811.imageshack.us/img811/2...6765612379.png

So I'm currently working on implementing a neural network. It should all work... but obviously it doesn't. The project has 3 classes:

-NeuralNetwork, which is "where all the magic happens"

-NeuralLayer, which just keeps track of the nodes in a layer

-NeuralNode, which keeps track of a node's value, the weights it's sending out, and its bias (none of that is really important)

Just thought I'd point out that there are other classes as well.
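For context, here is a minimal sketch of what NeuralNode presumably looks like. The real class isn't shown in the post, so the fields and method bodies below are inferred from the calls made elsewhere (getValue, setWeights, setBias, getWeightTo) and are only an assumption:

```java
// Hypothetical sketch of NeuralNode -- inferred from usage, not the real class.
public class NeuralNode {
    private double value;      // the node's activation
    private double[] weights;  // one outgoing weight per node in the next layer
    private double bias;

    public NeuralNode(double value) {
        this.value = value;    // note: weights stays null until setWeights is called
    }

    public double getValue()           { return value; }
    public double getBias()            { return bias; }
    public void setBias(double bias)   { this.bias = bias; }
    public void setWeights(double[] w) { this.weights = w; }

    // Weight this node sends to node n of the next layer; throws a
    // NullPointerException if weights was never initialized.
    public double getWeightTo(int n) { return weights[n]; }
}
```

The detail worth keeping in mind is that a freshly constructed node has a null weights array until someone calls setWeights on it.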

In the main I have, among other things:

Code:
net.runAll(15);

where 15 represents the number of times to run the network.

Here's the runAll method. The important thing to note is the call to initializeRandomWeights in the first line:

Code:
public void runAll(int numTimes) //number of iterations for running the network
{
    initializeRandomWeights();
    int count = 0;
    trainingRate = .1;
    while (count < numTimes)
    {
        forward();
        calcOutputDelta();
        for (int L = layers.length - 1; L > 0; L--)
        {
            updateWeights(L - 1);
            updateBias(L);
            calcLayerDelta(L - 1);
        }
        count++;
        trainingRate *= .9999;
    }
}

Here are those two methods. initializeRandomWeights should go through each layer, excluding the last/output layer, and put random weights into all the nodes in those layers (weights are being sent out by nodes, so the last/output layer wouldn't send any):

Code:
public void initializeRandomWeights() //set random weights and bias
{
    NeuralLayer nextLayer; //layer of nodes receiving the weights
    double[] rWeights; //random weights
    double rBias;
    for (int L = 0; L < layers.length - 1; L++)
    {
        for (int n = 0; n < layers[L].getSize(); n++)
        {
            //I initially had these statements after "nextLayer = layers[l+1];",
            //but I realized they have to be in this loop, or else it would send
            //the same array to all the nodes
            rWeights = randomWeights(L);
            rBias = Math.random() * 2;
            layers[L].getNodeAt(n).setBias(rBias);
            layers[L].getNodeAt(n).setWeights(rWeights);
        }
    }
}

public double[] randomWeights(int layer)
{
    int size = layers[layer + 1].getSize();
    double[] w = new double[size];
    for (int i = 0; i < size; i++)
        w[i] = Math.random() * 2;
    return w;
}
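As a side note on the comment in initializeRandomWeights: each call to randomWeights really does hand back a fresh array sized to the next layer, which is why it belongs inside the inner loop. Here's a standalone sketch exercising just that pattern (the layer sizes are invented for illustration, not taken from the post):

```java
// Standalone check of the randomWeights pattern: every call returns a
// *fresh* array (so nodes never share weights), with one entry per node
// in the next layer and values in [0, 2). Layer sizes are made up here.
public class RandomWeightsDemo {
    static final int[] LAYER_SIZES = {3, 4, 2}; // hypothetical network shape

    public static double[] randomWeights(int layer) {
        int size = LAYER_SIZES[layer + 1];  // one weight per next-layer node
        double[] w = new double[size];
        for (int i = 0; i < size; i++)
            w[i] = Math.random() * 2;       // Math.random() is in [0, 1)
        return w;
    }

    public static void main(String[] args) {
        double[] a = randomWeights(0);
        double[] b = randomWeights(0);
        System.out.println(a.length == 4);  // one weight per node in layer 1
        System.out.println(a != b);         // distinct arrays on each call
    }
}
```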

Also in the runAll method I have a call to the method forward, which just calculates the values for all the nodes based on their weights, going forward. Here's that code:

Code:
public void forward()
{
    NeuralNode node;
    for (int L = 1; L < layers.length; L++)
    {
        for (int n = 0; n < layers[L].getSize(); n++)
        {
            node = new NeuralNode(calcNode(L, n));
            layers[L].setNodeAt(n, node);
        }
    }
}

and the calcNode method is:

Code:
public double calcNode(int layer, int node)
{
    double sum = 0.0;
    for (int i = 0; i < layers[layer - 1].getSize(); i++)
    {
        sum += layers[layer - 1].getNodeAt(i).getValue() * getWeightFrom(layer - 1, i, node);
    }
    sum += layers[layer].getNodeAt(node).getBias();
    return sigmoid(sum);
}

When I run the main, I get a NullPointerException:

Code:
java.lang.NullPointerException
    at NeuralNode.getWeightTo(NeuralNode.java:25)
    at NeuralNetwork.getWeightFrom(NeuralNetwork.java:79)
    at NeuralNetwork.calcNode(NeuralNetwork.java:97)
    at NeuralNetwork.forward(NeuralNetwork.java:110)
    at NeuralNetwork.runAll(NeuralNetwork.java:180)
    at NeuralNetwork.main(NeuralNetwork.java:206)

I'm assuming it means one of the weights it's trying to reference has not been initialized, but initializeRandomWeights should initialize them all. Is there any way that initializeRandomWeights is not getting called?
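For completeness, two pieces of NeuralNetwork referenced above but never shown are getWeightFrom (the frame just above the NPE in the stack trace) and sigmoid. The sketch below is a guess reconstructed from their call sites, with minimal stand-in classes just so it compiles; it is not the actual code from the project:

```java
// Hypothetical reconstructions of two methods the post references but
// doesn't show; both are inferred from their call sites only.
public class NeuralNetworkHelpers {
    // Minimal stand-ins for the post's classes, just enough to compile.
    public static class NeuralNode {
        public double[] weights;                         // null until initialized
        public double getWeightTo(int n) { return weights[n]; }
    }
    public static class NeuralLayer {
        public NeuralNode[] nodes;
        public NeuralNode getNodeAt(int n) { return nodes[n]; }
    }

    public NeuralLayer[] layers;

    // Weight sent from node `from` in `layer` to node `to` in the next layer.
    // If that node's weights array is null, the NullPointerException surfaces
    // inside getWeightTo -- matching the top frame of the stack trace.
    public double getWeightFrom(int layer, int from, int to) {
        return layers[layer].getNodeAt(from).getWeightTo(to);
    }

    // Standard logistic sigmoid, squashing any input into (0, 1).
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }
}
```

Under this reading, the node object itself exists (otherwise getNodeAt would have failed), so it is specifically the weights array inside some node that is null when getWeightTo runs.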

I realize this is a lengthy question, and I thank anyone who has taken the time to read it.