Title: Advances in Morphological Neural Networks: Training, Pruning and Enforcing Shape Constraints
Authors: Dimitriadis, Nikolaos; Maragos, Petros
Date issued: 2021-01-01
Date available: 2021-12-04
DOI: 10.1109/ICASSP39728.2021.9415123
Handle: https://infoscience.epfl.ch/handle/20.500.14299/183520
Web of Science ID: WOS:000704288404017
Type: text::conference output::conference proceedings::conference paper

Abstract: In this paper, we study an emerging class of neural networks, the morphological neural networks, from some modern perspectives. Our approach utilizes ideas from tropical geometry and mathematical morphology. First, we state the training of a binary morphological classifier as a Difference-of-Convex optimization problem and extend this method to multiclass tasks. We then focus on general morphological networks trained with gradient descent variants and show, quantitatively via pruning schemes as well as qualitatively, the sparsity of the resulting representations compared to feedforward networks with ReLU activations, as well as the effect the training optimizer has on such compression techniques. Finally, we show how morphological networks can be employed to guarantee monotonicity, and we present a softened version of a known architecture, based on Maslov dequantization, which alleviates the gradient-propagation issues associated with its "hard" counterparts and moderately improves performance.

Subjects: Acoustics; Computer Science, Artificial Intelligence; Computer Science, Software Engineering; Engineering, Electrical & Electronic; Imaging Science & Photographic Technology; Computer Science; Engineering
Keywords: tropical geometry; morphological neural networks; monotonicity; pruning; Maslov dequantization
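To illustrate the kind of layer the abstract refers to, here is a minimal sketch, not the paper's exact architecture: a max-plus (dilation) layer computing y_j = max_i (x_i + W[j, i]), together with a hypothetical log-sum-exp softening in the spirit of Maslov dequantization, which recovers the hard max as the temperature parameter t grows.

```python
import numpy as np

def maxplus_layer(x, W):
    """Hard max-plus (dilation) layer: y_j = max_i (x_i + W[j, i])."""
    return np.max(x[None, :] + W, axis=1)

def soft_maxplus_layer(x, W, t=10.0):
    """Softened max-plus via a scaled log-sum-exp (Maslov dequantization idea):
    y_j = (1/t) * log sum_i exp(t * (x_i + W[j, i])), which tends to the hard
    max as t -> infinity while keeping nonzero gradients for every input."""
    z = t * (x[None, :] + W)
    m = z.max(axis=1, keepdims=True)  # subtract the max for numerical stability
    return (m.squeeze(1) + np.log(np.exp(z - m).sum(axis=1))) / t

# Toy example (illustrative values, not from the paper)
x = np.array([0.0, 1.0, -2.0])
W = np.array([[0.5, 0.0, 3.0],
              [1.0, -1.0, 0.0]])
hard = maxplus_layer(x, W)               # -> [1.0, 1.0]
soft = soft_maxplus_layer(x, W, t=50.0)  # slightly above the hard output
```

The soft layer always upper-bounds the hard one (log-sum-exp >= max), and the gap shrinks as t increases, which is why the softened version can ease gradient propagation during training.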