Welcome to the Java Programming Forums



Thread: Neural Network Programming - Wrong result after training

  1. #1
    Junior Member
    Join Date
    Jun 2013
    Posts
    7
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Neural Network Programming - Wrong result after training

    I am attempting to write a Java implementation of a neural network. This consists of:

    A "Network" class
    An abstract "Neuron" class
    A "HiddenNeuron" class, which extends Neuron
    An "OutputNeuron" class, which extends Neuron
    A "Layer" class, which holds HiddenNeuron objects in an ArrayList
    A "Link" class, which creates the link between two neurons
    A "RunNet" class used to run the network

    The main Network class essentially has methods to a) train and b) run the network based on a series of inputs/outputs.

    My code:

    Network.java

     import java.util.ArrayList;
     
     
        public class Network {
     
     
    	OutputNeuron outputNeuron = new OutputNeuron();
     
    	ArrayList<Layer> networkLayers = new ArrayList<Layer>();
     
    	public void AddLayer(Layer layer) {
     
    		networkLayers.add(layer);
     
    	}
     
    	public void BridgeNetwork() {
     
    		for(int i = 0; i+1 < networkLayers.size(); i++) {
     
    			Layer currentLayer = networkLayers.get(i);
    			Layer nextLayer = networkLayers.get(i+1);
     
    			for(HiddenNeuron currentLayerNeuron : currentLayer.getLayerNeurons()) {
     
    				for(HiddenNeuron nextLayerNeuron : nextLayer.getLayerNeurons()) {
     
    					Link link = new Link(currentLayerNeuron, nextLayerNeuron);
     
    					currentLayerNeuron.addForwardLink(link);
    					nextLayerNeuron.addBackLink(link);
     
    				}
     
    			}
     
    			//Now taking care of linking last hidden layer to output neuron
    			ArrayList<HiddenNeuron> lastLayer = (networkLayers.get(networkLayers.size()-1)).getLayerNeurons();
     
    			for(HiddenNeuron currentNeuron : lastLayer) {
     
    				Link link = new Link(currentNeuron, outputNeuron);
     
    				currentNeuron.addForwardLink(link);
    				outputNeuron.addBackLink(link);
     
    			}
     
    		}
     
    	}
     
    	public boolean SetupInputs(ArrayList<Double> inputs) {
     
    		//Setting up inputs		
    		ArrayList<HiddenNeuron> firstLayerNeurons = (networkLayers.get(0)).getLayerNeurons();
     
    		if(firstLayerNeurons.size() != inputs.size()) {
     
    			System.out.println("ERROR! Number of inputs must equal number of neurons in first layer!");
    			return false;
     
    		}
     
    		for(int i = 0; i < firstLayerNeurons.size(); i++) {
     
    			(firstLayerNeurons.get(i)).SetOutput(inputs.get(i));
     
     
    		}
     
    		return true;
     
    	}
     
    	public double FeedForward() {
     
     
    		//Feeding forward!
    		for(Layer currentLayer : networkLayers) {
     
    			for(HiddenNeuron currentNeuron : currentLayer.getLayerNeurons()) {
     
    				currentNeuron.FeedForward();
     
    			}
     
    		}
     
    		//Alright! We got to the output neuron!
    		outputNeuron.ComputeOutput();
     
    		return outputNeuron.getOutput();
     
     
    	}
     
    	public void BackPropagate() {
     
    		outputNeuron.BackPropagate();
     
    		for(int i = networkLayers.size()-1; i >= 0; i--) {
     
    			ArrayList<HiddenNeuron> currentLayerNeurons = (networkLayers.get(i)).getLayerNeurons();
     
    			for(HiddenNeuron currentNeuron : currentLayerNeurons) {
     
    				currentNeuron.BackPropagate();
     
    			}
     
    		}
     
    	}
     
    	public void TrainNet(ArrayList<ArrayList<Double>> myTemplates, int epochs) {
     
    		for(int a = 0; a < myTemplates.size(); a++) {
     
    			System.out.println("Training through testSet " + a);
     
    			ArrayList<Double> currentTemplate = myTemplates.get(a);
     
    			outputNeuron.setDesired(currentTemplate.get(0));
    			currentTemplate.remove(0);
    			SetupInputs(currentTemplate);
     
    			for(int i = 0; i < epochs; i++) {
    				System.out.println("Epoch " + i + " of testSet " + a);
     
    				FeedForward();
    				BackPropagate();
     
    			}
     
     
    		}
     
    	}
     
    	public double RunNet(ArrayList<Double> inputs) {
    		System.out.println("Executing net");
    		SetupInputs(inputs);
    		return FeedForward();
     
    	}
     
        }

    Layer.java

    import java.util.ArrayList;
     
     
    public class Layer {
     
    	ArrayList<HiddenNeuron> layerNeurons = new ArrayList<HiddenNeuron>();
     
    	public void addNeuron(HiddenNeuron neuron) {
     
    		layerNeurons.add(neuron);
     
    	}
     
    	public ArrayList<HiddenNeuron> getLayerNeurons() {
     
    		return layerNeurons;
     
    	}
     
        }

    Link.java


    public class Link {
     
    	HiddenNeuron neuronA = null;
    	Neuron neuronB = null;
     
    	double weight = 0;
     
    	public Link(HiddenNeuron neuronA, Neuron neuronB) {
     
    		this.neuronA = neuronA;
    		this.neuronB = neuronB;
     
    		double weight = Math.random();
     
    		if(weight > 0.5)
    			this.weight = weight;
    		else
    			this.weight = -1*weight;
     
    	}
     
    	protected void BackPropagate(double error) {
     
    		weight += error;
    		neuronA.receiveError(error*weight);
     
    	}
     
    	protected void FeedForward(double input) {
     
    		neuronB.ReceiveInput(weight*input);
     
    	}
     
        }
    Neuron.java

    import java.util.ArrayList;
     
     
        public abstract class Neuron {
     
    	double output = 0;
    	double derivativeOutput = 0;
    	double error = 0;
     
    	ArrayList<Double> inputs = new ArrayList<Double>();
    	ArrayList<Link> backLinks = new ArrayList<Link>();
     
    	public void addBackLink(Link link) {
     
    		backLinks.add(link);
     
    	}
     
    	protected void ComputeOutput() {
     
    		double partialSum = 0;
     
    		//Summing all inputs
    		for(double currentInput : inputs) {
     
    			partialSum += currentInput;
     
    		}
     
    		//Setting output to be result of sigmoid function taking sum of inputs as parameter
    		output = 1/(1+Math.pow(Math.E, (-1*partialSum)));
     
    		ComputeDerivative();
     
    	}
     
    	private void ComputeDerivative() {
     
    		derivativeOutput = output*(1-output);
     
    	}
     
    	protected void ReceiveInput(double input) {
     
    		inputs.add(input);
     
    	}
     
    	protected void BackPropagate() {
     
    		ComputeError();
     
    		for(Link currentLink : backLinks) {
     
    			currentLink.BackPropagate(error);
     
    		}
     
    	}
     
    	protected abstract void ComputeError();
     
     
     
        }

    HiddenNeuron.java

     import java.util.ArrayList;
     
     
        public class HiddenNeuron extends Neuron {
     
    	ArrayList<Double> myErrors = new ArrayList<Double>();
    	ArrayList<Link> forwardLinks = new ArrayList<Link>();
     
    	public void addForwardLink(Link link) {
     
    		forwardLinks.add(link);
     
    	}
     
    	public void SetOutput(double output) {
     
    		super.output = output;
     
    	}
     
    	protected void receiveError(double error) {
     
    		myErrors.add(error);
     
    	}
     
    	protected void ComputeError() {
     
    		double partialSum = 0;
     
    		for(double currentError : myErrors) {
     
    			partialSum += currentError;
     
    		}
     
    		super.error = partialSum*super.derivativeOutput;
     
    	}
     
    	protected void FeedForward() {
     
    		super.ComputeOutput();
     
    		for(Link currentLink : forwardLinks) {
     
    			currentLink.FeedForward(super.output);
     
    		}
     
    	}
     
        }

    OutputNeuron.java


    public class OutputNeuron extends Neuron {
     
    	double desired = 0.0;
     
    	public double getOutput() {
     
    		return super.output;
     
    	}
     
    	public void setDesired(double desired) {
     
    		this.desired = desired;
     
    	}
     
    	protected void ComputeError() {
     
    		super.ComputeOutput();
     
    		super.error = (desired-super.output)*super.derivativeOutput;
     
    	}
     
        }

    RunNet.java (used to train the network for an "OR" gate)

    import java.util.ArrayList;
     
     
        public class RunNet {
     
    	public static void main(String[] args){
     
    		ArrayList<Double> testA = new ArrayList<Double>();
    		testA.add(1.0);
    		testA.add(1.0);
    		testA.add(1.0);
     
    		ArrayList<Double> testB = new ArrayList<Double>();
    		testB.add(1.0);
    		testB.add(0.0);
    		testB.add(1.0);
     
    		ArrayList<Double> testC = new ArrayList<Double>();
    		testC.add(0.0);
    		testC.add(1.0);
    		testC.add(1.0);
     
    		ArrayList<Double> testD = new ArrayList<Double>();
    		testD.add(0.0);
    		testD.add(0.0);
    		testD.add(0.0);
     
    		ArrayList<ArrayList<Double>> myTemplates = new ArrayList<ArrayList<Double>>();
    		myTemplates.add(testA);
    		myTemplates.add(testB);
    		myTemplates.add(testC);
    		myTemplates.add(testD);
     
    		Network myNet = new Network();
     
    		Layer first = new Layer();
    		Layer second = new Layer();
     
    		first.addNeuron(new HiddenNeuron());
    		first.addNeuron(new HiddenNeuron());
     
    		second.addNeuron(new HiddenNeuron());
    		second.addNeuron(new HiddenNeuron());
     
    		myNet.AddLayer(first);
    		myNet.AddLayer(second);
     
    		myNet.BridgeNetwork();
     
    		ArrayList<Double> testE = new ArrayList<Double>();
    		testE.add(0.0);
    		testE.add(0.0);
     
    		myNet.TrainNet(myTemplates, 10000);
    		System.out.println(myNet.RunNet(testD));
     
    	}
     
        }

    However, when I then run the network with both A and B of the expression A OR B set to false, the network returns true... Any suggestions for improvement? :S

    Thanks


  2. #2
    Super Moderator copeg

    Re: Neural Network Programming - Wrong result after training

    I'm not sure I get the training data: don't testA and testC conflict with each other (presuming the first index is output and latter 2 the input)? What is the input layer?

  3. #3
    Junior Member

    Re: Neural Network Programming - Wrong result after training

    Quote Originally Posted by copeg View Post
    I'm not sure I get the training data: don't testA and testC conflict with each other (presuming the first index is output and latter 2 the input)? What is the input layer?
    Hi copeg,

    You are right, testC was indeed wrong. I have fixed it to:

    ArrayList<Double> testC = new ArrayList<Double>();
    		testC.add(1.0);
    		testC.add(1.0);
    		testC.add(0.0);

    Which is basically saying that for A = true and B = false the output is true. However, once trained, the network still returns true for A = false and B = false.

    There are no proper input neurons; instead, the outputs of the neurons in the first layer are set directly to the various input values:

    public boolean SetupInputs(ArrayList<Double> inputs) {
     
    		//Setting up inputs		
    		ArrayList<HiddenNeuron> firstLayerNeurons = (networkLayers.get(0)).getLayerNeurons();
     
    		if(firstLayerNeurons.size() != inputs.size()) {
     
    			System.out.println("ERROR! Number of inputs must equal number of neurons in first layer!");
    			return false;
     
    		}
     
    		for(int i = 0; i < firstLayerNeurons.size(); i++) {
     
    			(firstLayerNeurons.get(i)).SetOutput(inputs.get(i));
     
     
    		}
     
    		return true;
     
    	}

    Perhaps this isn't the wisest decision?
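    One alternative I've been toying with (purely hypothetical, not part of my classes yet) is a dedicated pass-through input neuron, so hidden-layer outputs never need to be overwritten from outside. A standalone sketch:

```java
// Hypothetical sketch, not part of the network above: an input "neuron"
// that simply hands its raw value forward, with no sigmoid applied.
public class InputNeuronDemo {

    static class InputNeuron {
        private final double value;

        InputNeuron(double value) {
            this.value = value;
        }

        // Inputs pass straight through; only hidden/output neurons squash.
        double getOutput() {
            return value;
        }
    }

    public static void main(String[] args) {
        InputNeuron a = new InputNeuron(1.0);
        InputNeuron b = new InputNeuron(0.0);
        System.out.println(a.getOutput());
        System.out.println(b.getOutput());
    }
}
```

    That way SetupInputs would build an input layer instead of poking values into HiddenNeuron outputs.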

  4. #4
    Member angstrem

    Re: Neural Network Programming - Wrong result after training

    I advise you to prototype all your machine learning software in languages like MATLAB or Octave and only then start programming in Java. They are well tuned for this stuff and, hence, you can spot errors much more easily.

  5. #5
    Junior Member

    Re: Neural Network Programming - Wrong result after training

    Quote Originally Posted by angstrem View Post
    I advise you to prototype all your machine learning software in languages like MATLAB or Octave and only then start programming in Java. They are well tuned for this stuff and, hence, you can spot errors much more easily.
    Hi angstrem, in what ways are they better? I'm not trying to sound skeptical, I'm just not very familiar with them.

  6. #6
    Member angstrem

    Re: Neural Network Programming - Wrong result after training

    They natively support "heavy math": linear algebra, statistics, operations on functions (like minimization) - all the stuff necessary for machine learning. Hence, a thing of that kind that would take hundreds of lines of code in Java or C++ (or a deep search for appropriate (and efficient) math libraries) can easily be done in just a few lines of code in Octave or MATLAB. This makes machine learning algorithms much simpler to implement. You can spend minutes realizing your algorithm in MATLAB, see whether it works or not, and then spend hours implementing it in Java or C++.

  7. #7
    Super Moderator copeg

    Re: Neural Network Programming - Wrong result after training

    The goal of writing a neural network hasn't been explicitly stated in this thread, but my guess is that this is an exercise, not exploratory data mining/machine learning. So while I do agree with angstrem that exploratory research is much easier in a language/program geared towards this type of analysis (R, MATLAB, etc.) before moving (if necessary) to a lower-level production language, as an exercise in creating networks and learning the nuts and bolts I don't see a problem in continuing with your project. Just be careful not to bite off more than you can chew.

  8. #8
    Junior Member

    Re: Neural Network Programming - Wrong result after training

    Hi copeg,

    Yes, this is simply a challenge I set myself. I would like to stress that it is not academic work that will earn me a grade, just something I decided to do. That said, I've been reading about neural net basics for the past few days, but I can't seem to understand where I'm going wrong. Would you have any suggestions?

  9. #9
    Super Moderator copeg

    Re: Neural Network Programming - Wrong result after training

    First, look at your inputs and errors ArrayList&lt;Double&gt;s... the way they seem to be used, they are never reset during learning and only accumulate values through the process. Second, given the complexity of the algorithm, I'd recommend using something to help you debug, be it a debugger, lots of printlns, or some visual tool, to assess step by step how the network is learning.
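    To see the accumulation problem in isolation, here is a standalone sketch (not your classes, just the same ArrayList pattern):

```java
import java.util.ArrayList;

// Standalone illustration of the bug pattern: an inputs list that is
// never cleared keeps stale values from earlier feed-forward passes.
public class AccumulationDemo {

    static double sum(ArrayList<Double> xs) {
        double s = 0;
        for (double x : xs) s += x;
        return s;
    }

    public static void main(String[] args) {
        ArrayList<Double> inputs = new ArrayList<Double>();

        // Pass 1: two fresh inputs arrive.
        inputs.add(0.25);
        inputs.add(0.75);
        System.out.println(sum(inputs)); // 1.0

        // Pass 2 without clearing: the old values are still summed.
        inputs.add(0.25);
        inputs.add(0.75);
        System.out.println(sum(inputs)); // 2.0 -- stale accumulation

        // Clearing between passes keeps each pass independent.
        inputs.clear();
        inputs.add(0.25);
        inputs.add(0.75);
        System.out.println(sum(inputs)); // 1.0 again
    }
}
```

    Your ReceiveInput calls do the adds, but nothing does the clear.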

  10. #10
    Junior Member

    Re: Neural Network Programming - Wrong result after training

    Hi Copeg,

    Admittedly, I was forgetting to reset the lists of inputs and errors. I added the following:

    public void ResetNetwork() {
     
    		for(int i=0; i < networkLayers.size(); i++) {
     
    			ArrayList<HiddenNeuron> currentLayerNeurons = networkLayers.get(i).getLayerNeurons();
     
    			for(HiddenNeuron currentNeuron : currentLayerNeurons) {
     
    				if(i > 0)
    					currentNeuron.ResetInputs();
     
    				currentNeuron.ResetErrors();
     
    			}
     
    		}
     
    	}

    Which is called at each training step, after BackPropagate(). Still, though, I see no improvement. Can you suggest a visual tool? I'm using the Eclipse debugger, but with hundreds of cycles it isn't very useful...
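    To keep the expected results straight while debugging, I also jotted down the OR truth table as a tiny standalone check to diff the net's outputs against (plain Java, nothing from my network):

```java
// Standalone reference: the truth table the trained net should reproduce.
public class OrTable {

    static boolean or(boolean a, boolean b) {
        return a || b;
    }

    public static void main(String[] args) {
        boolean[] vals = {false, true};
        for (boolean a : vals) {
            for (boolean b : vals) {
                System.out.println(a + " OR " + b + " = " + or(a, b));
            }
        }
    }
}
```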
