Fig. 1 | Genetics Selection Evolution

From: Deep learning versus parametric and ensemble methods for genomic prediction of complex phenotypes

a Representation of a multilayer perceptron (MLP) network. Each unit is connected to the units of the previous layer by a weighted linear summation, here represented by weight matrices Wi, followed by an activation function. Redrawn from: http://www.texample.net/tikz/examples/neural-network/. b Representation of a convolutional neural network (CNN). (i) The input layer consists of the SNP markers. (ii) The convolution layer consists of k filters, which capture the information in the input layer by moving horizontally with a stride of "s" SNPs. (iii) The pooling layer consists of filters that combine the outputs of the previous convolution layer at certain locations into a single neuron. (iv) Fully connected layers connect every neuron in the previous layer to every neuron in the next layer. 'ReLU' indicates the rectified linear unit; softReLU indicates the smooth rectified linear unit; Dropout indicates the dropout layer
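The CNN layers described in panel b can be sketched numerically. The following is a minimal, illustration-only sketch with numpy; all sizes (number of SNPs p, filter count k, filter width w, stride s) are hypothetical choices, not values from the article, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only (not from the article).
p = 100          # (i) input layer: number of SNP markers
k = 8            # (ii) number of convolution filters
w = 5            # filter width: SNPs covered by one filter window
s = 2            # stride: each filter moves horizontally by s SNPs

x = rng.integers(0, 3, size=p).astype(float)   # SNP genotypes coded 0/1/2

# (ii) Convolution layer: k filters slide over the SNP sequence with stride s.
filters = rng.normal(size=(k, w))
n_windows = (p - w) // s + 1
conv = np.array([[filters[j] @ x[i * s : i * s + w] for i in range(n_windows)]
                 for j in range(k)])
conv = np.maximum(conv, 0.0)                   # ReLU activation

# (iii) Pooling layer: combine adjacent conv outputs (here, max over pairs).
pooled = conv[:, : (n_windows // 2) * 2].reshape(k, -1, 2).max(axis=2)

# (iv) Fully connected layer: every pooled neuron feeds the output neuron.
flat = pooled.ravel()
W = rng.normal(size=(1, flat.size))
y_hat = float(W @ flat)                        # predicted phenotype (scalar)

print(conv.shape, pooled.shape)                # layer output dimensions
```

With these sizes, the convolution produces k = 8 feature maps of (100 − 5) // 2 + 1 = 48 values each, and pairwise max-pooling halves that to 24 per filter before the fully connected layer collapses everything to one prediction.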
