Advances in Morphological Neural Networks: Training, Pruning and Enforcing Shape Constraints
In this paper, we study an emerging class of neural networks, morphological neural networks, from several modern perspectives. Our approach draws on ideas from tropical geometry and mathematical morphology. First, we formulate the training of a binary morphological classifier as a Difference-of-Convex optimization problem and extend the method to multiclass tasks. We then focus on general morphological networks trained with gradient descent variants and show, quantitatively via pruning schemes as well as qualitatively, that they yield sparser representations than feedforward networks with ReLU activations; we also examine how the choice of training optimizer affects such compression techniques. Finally, we show how morphological networks can be employed to guarantee monotonicity, and we present a softened version of a known architecture, based on Maslov dequantization, which alleviates the gradient propagation issues associated with its "hard" counterparts and moderately improves performance.
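As a brief illustration of the softening idea mentioned in the abstract (a sketch under our own assumptions, not code from the paper): a morphological dilation layer computes y_j = max_i (x_i + w_ij), and Maslov dequantization replaces the hard max with a temperature-scaled log-sum-exp, h * log sum_i exp((x_i + w_ij) / h), which recovers the hard max as h -> 0 while giving nonzero gradients to all weights rather than only the argmax. The module name `SoftDilation` and the temperature parameter `h` below are hypothetical.

```python
import torch
import torch.nn as nn


class SoftDilation(nn.Module):
    """Max-plus (dilation) layer softened via Maslov dequantization.

    Hard layer:     y_j = max_i (x_i + w_ij)
    Softened layer: y_j = h * logsumexp_i ((x_i + w_ij) / h),
    which tends to the hard max as h -> 0 but keeps gradients
    nonzero for every weight, not just the maximizing one.
    """

    def __init__(self, in_features: int, out_features: int, h: float = 0.1):
        super().__init__()
        self.weight = nn.Parameter(torch.zeros(in_features, out_features))
        self.h = h

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in) -> scores: (batch, in, out) via broadcasting
        scores = x.unsqueeze(-1) + self.weight.unsqueeze(0)
        if self.h == 0.0:
            # hard morphological dilation (max-plus matrix-vector product)
            return scores.max(dim=1).values
        # softened dilation: temperature-scaled log-sum-exp over inputs
        return self.h * torch.logsumexp(scores / self.h, dim=1)


# Minimal usage example
layer = SoftDilation(in_features=8, out_features=4, h=0.1)
y = layer(torch.randn(2, 8))  # shape: (2, 4)
```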
Accession number: WOS:000704288404017
Date: 2021-01-01
ISBN: 978-1-7281-7605-5
Place: New York
Pages: 3825-3829
Status: REVIEWED
Event name | Event place | Event date |
 | ELECTR NETWORK | Jun 06-11, 2021 |