
DeepLearning4J Problems with INDArray

public void playFullGame(MultiLayerNetwork m1, MultiLayerNetwork m2) {
    boolean player = false;
    while (!this.isOver) {
        float[] f = Main.rowsToInput(this.rows);
        System.out.println(f.length); // prints 42
        INDArray input = Nd4j.create(f);
        this.addChip(Main.getHighestOutput(player ? m1.output(input) : m2.output(input)), player);
        player = !player;
    }
}

I use INDArray input = Nd4j.create(f); to create the INDArray, but m1.output(input) throws the following exception:

Exception in thread "AWT-EventQueue-0" org.deeplearning4j.exception.DL4JInvalidInputException: Input size (63 columns; shape = [1, 63]) is invalid: does not match layer input size (layer # inputs = 42) (layer name: layer2, layer index: 2, layer type: OutputLayer)

I do not understand why the created INDArray is two-dimensional, or where the 63 is coming from.

Edit: The MultiLayerNetwork Configuration:

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(randSeed).optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .updater(new Nesterovs(0.1, 0.9)).list()
            .layer(new DenseLayer.Builder().nIn(numRows * numColums).nOut(63).activation(Activation.RELU)
                    .weightInit(WeightInit.XAVIER).build())
            .layer(new DenseLayer.Builder().nIn(63).nOut(63).activation(Activation.RELU)
                    .weightInit(WeightInit.XAVIER).build())
            .layer(new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD).nIn(numRows * numColums).nOut(7)
                    .activation(Activation.SOFTMAX).weightInit(WeightInit.XAVIER).build())
            .build();


Answer

The 63 is coming from the neural network itself: you have a shape mismatch between the declared numbers of inputs and outputs of your layers.

If you’re new to neural networks, the key point is that in a basic feed-forward network each layer has a declared number of inputs and outputs, and each layer’s number of inputs must match the number of outputs of the previous layer.

For dense layers, the number of inputs of the first layer needs to match the number of columns in your dataset, so make sure the first layer’s nIn equals your dataset’s column count.
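
To see where the 63 comes from, trace the declared sizes through the configuration in the question (numRows * numColums is inferred to be 42, matching the printed input length and the "layer # inputs = 42" in the exception):

    // layer 0 (DenseLayer):  nIn = 42 -> nOut = 63
    // layer 1 (DenseLayer):  nIn = 63 -> nOut = 63   (so 63 values flow out of layer 1)
    // layer 2 (OutputLayer): nIn = 42 -> nOut = 7    (but it receives 63 values from layer 1, hence the exception)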

Note that setting the number of inputs and outputs by hand for every layer of a neural network is error prone. Instead, set only the number of outputs for each layer and use DL4J’s setInputType API with the number of input columns. In your case, add InputType.feedForward (https://github.com/eclipse/deeplearning4j/blob/master/deeplearning4j/deeplearning4j-nn/src/main/java/org/deeplearning4j/nn/conf/inputs/InputType.java#L107):

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(randSeed).optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .updater(new Nesterovs(0.1, 0.9)).list()
            .layer(new DenseLayer.Builder().nOut(63).activation(Activation.RELU)
                    .weightInit(WeightInit.XAVIER).build())
            .layer(new DenseLayer.Builder().nOut(63).activation(Activation.RELU)
                    .weightInit(WeightInit.XAVIER).build())
            .layer(new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD).nOut(7)
                    .activation(Activation.SOFTMAX).weightInit(WeightInit.XAVIER).build())
            .setInputType(InputType.feedForward(numRows * numColums))
            .build();
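
As a minimal usage sketch (assuming numRows * numColums is 42 and using the classes already shown above), you can then build the network and pass it an explicitly shaped [1, 42] row vector. Reshaping explicitly also sidesteps the question of whether Nd4j.create(float[]) returns a 1-D or 2-D array, which has varied across ND4J versions:

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    float[] f = Main.rowsToInput(this.rows);              // 42 values, as printed in the question
    INDArray input = Nd4j.create(f).reshape(1, f.length); // explicit shape [1, 42]: one example, 42 features
    INDArray output = net.output(input);                  // shape [1, 7]: one softmax probability per output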

More examples (mainly CNNs) here: https://github.com/eclipse/deeplearning4j-examples/search?q=setInputType
