A Survey on Deep Learning in Medical Image Analysis
Geert Litjens, Thijs Kooi, Babak Ehteshami Bejnordi, Arnaud Arindra Adiyoso Setio, Francesco Ciompi, Mohsen Ghafoorian, Jeroen A.W.M. van der Laak, Bram van Ginneken, Clara I. Sánchez
Diagnostic Image Analysis Group, Radboud University Medical Center, Nijmegen, The Netherlands
https://www.sciencedirect.com/science/article/pii/S1361841517301135?via%3Dihub
"Currently, the most popular models are trained end-
to-end in a supervised fashion, greatly simplifying
the training process. The most popular architectures
are convolutional neural networks (CNNs) and recur-
rent neural networks (RNNs). CNNs are currently
most widely used in (medical) image analysis, although
RNNs are gaining popularity. "
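As a minimal PyTorch sketch of the two architecture families named above (the layer sizes, input shapes, and sequence length are illustrative assumptions, not taken from the survey):

import torch
import torch.nn as nn

# CNN: shared convolutional filters slide over the spatial dimensions of an image.
cnn = nn.Sequential(nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU())
image = torch.randn(1, 1, 64, 64)    # (batch, channels, height, width)
print(cnn(image).shape)              # torch.Size([1, 8, 64, 64])

# RNN: the same cell is applied step by step along a sequence
# (e.g. a series of measurements or image-derived features).
rnn = nn.GRU(input_size=16, hidden_size=32, batch_first=True)
sequence = torch.randn(1, 20, 16)    # (batch, time steps, features)
outputs, hidden = rnn(sequence)
print(outputs.shape)                 # torch.Size([1, 20, 32])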
https://en.wikipedia.org/wiki/Softmax_function
In mathematics, the softmax function, or normalized exponential function, is a generalization of the logistic function that "squashes" a K-dimensional vector of arbitrary real values to a K-dimensional vector of real values, where each entry is in the range (0, 1], and all the entries add up to 1.
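As a quick illustration of that definition, here is a minimal NumPy sketch (the example vector is made up):

import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability;
    # this does not change the result because softmax is shift-invariant.
    exps = np.exp(z - np.max(z))
    return exps / exps.sum()

logits = np.array([2.0, 1.0, -1.0])   # arbitrary real values
probs = softmax(logits)
print(probs)                          # approx. [0.705 0.259 0.035], each in (0, 1]
print(probs.sum())                    # 1.0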
The second key difference between CNNs and MLPs is the typical incorporation of pooling layers in CNNs, where pixel values of neighborhoods are aggregated using a permutation-invariant function, typically the max or mean operation. This induces a certain amount of translation invariance and again reduces the number of parameters in the network. At the end of the convolutional stream of the network, fully-connected layers (i.e. regular neural network layers) are usually added, where weights are no longer shared. Similar to MLPs, a distribution over classes is generated by feeding the activations in the final layer through a softmax function, and the network is trained using maximum likelihood.
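Putting the pieces of that description together, the following is a hedged PyTorch sketch of such a network: convolutions with shared weights, max-pooling, a fully-connected head, and maximum-likelihood training via cross-entropy (which applies log-softmax internally). The input size, channel counts, and number of classes are illustrative assumptions:

import torch
import torch.nn as nn

# Convolutional stream: shared filters plus max-pooling for translation invariance.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                     # aggregates 2x2 neighborhoods with max
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),           # fully-connected layer: weights no longer shared
)

# Maximum-likelihood training: CrossEntropyLoss = log-softmax + negative log-likelihood.
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(8, 1, 28, 28)            # dummy batch of 1-channel 28x28 images
y = torch.randint(0, 10, (8,))           # dummy class labels
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# At inference time, an explicit softmax yields the distribution over classes.
probs = torch.softmax(model(x), dim=1)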