Fig. 1 | BMC Genomics

From: Architectures and accuracy of artificial neural network for disease classification from omics data

The basic architectures for the MLP (A) and CNN (B). Because the basic MLP and CNN architectures both had a single hidden/convolution layer of 16 units or kernels, both were coded as "1L_16U." Starting from 1L_16U, variant architectures with an increasing number of units per hidden layer and/or additional hidden layers were added to the testing panel (Additional file 1: Table S1). Although not shown in the plot, each architecture by default includes a dropout layer with a dropout rate of 0.5 immediately before the output layer
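
A minimal sketch of the 1L_16U baseline architectures described in the caption, written with Keras. The input dimensionality (n_features), the number of disease classes (n_classes), and the convolution kernel size are illustrative assumptions and are not specified in the figure; only the single 16-unit/16-kernel layer and the 0.5 dropout before the output layer come from the caption.

# Assumed shapes; not given in the figure caption
from tensorflow import keras
from tensorflow.keras import layers

n_features = 1000   # assumed number of omics features per sample
n_classes = 2       # assumed number of disease classes

# MLP "1L_16U": one hidden layer of 16 units, dropout 0.5 before the output layer
mlp_1L_16U = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(16, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(n_classes, activation="softmax"),
])

# CNN "1L_16U": one 1-D convolution layer with 16 kernels, dropout 0.5 before the output layer
cnn_1L_16U = keras.Sequential([
    layers.Input(shape=(n_features, 1)),
    layers.Conv1D(16, kernel_size=3, activation="relu"),  # kernel size is an assumption
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(n_classes, activation="softmax"),
])

Deeper or wider variants in the testing panel would follow the same pattern, stacking additional Dense or Conv1D layers and/or increasing the unit/kernel count before the final dropout and output layer.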
