Fig. 1 | BMC Genomics


From: AMPlify: attentive deep learning model for discovery of novel antimicrobial peptides effective against WHO priority pathogens


Model architecture of AMPlify. Residues of a peptide sequence are one-hot encoded and passed through three hidden layers in order: the bidirectional long short-term memory (Bi-LSTM) layer, the multi-head scaled dot-product attention (MHSDPA) layer, and the context attention (CA) layer. The output layer generates the probability that the input sequence is an AMP.
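The sketch below illustrates, in TensorFlow/Keras, how the layer stack described in the caption could be wired together. It is not the published AMPlify implementation: the maximum length, one-hot alphabet size, hidden units, number of heads, and the Dense-plus-softmax form of the context attention pooling are all illustrative assumptions.

```python
# Minimal sketch of the architecture described in the caption (assumed
# hyperparameters; padding/masking handling omitted for brevity).
import tensorflow as tf
from tensorflow.keras import layers, Model

MAX_LEN = 200   # assumed maximum peptide length
VOCAB = 20      # one-hot dimension: 20 standard amino acids
HIDDEN = 64     # assumed LSTM units per direction
HEADS = 4       # assumed number of attention heads

inputs = layers.Input(shape=(MAX_LEN, VOCAB))  # one-hot encoded residues

# Bi-LSTM layer: returns one hidden vector per residue position.
x = layers.Bidirectional(layers.LSTM(HIDDEN, return_sequences=True))(inputs)

# MHSDPA layer: multi-head scaled dot-product self-attention over positions.
x = layers.MultiHeadAttention(num_heads=HEADS,
                              key_dim=(2 * HIDDEN) // HEADS)(x, x)

# Context attention (one possible formulation): learn a scalar score per
# position, normalize with softmax, and pool the sequence into one vector.
scores = layers.Dense(1, activation="tanh")(x)
weights = layers.Softmax(axis=1)(scores)
context = tf.reduce_sum(weights * x, axis=1)

# Output layer: probability that the input sequence is an AMP.
outputs = layers.Dense(1, activation="sigmoid")(context)

model = Model(inputs, outputs)
model.summary()
```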
