Meet Mish — New State of the Art AI Activation Function. The successor to ReLU?

Source: Deep Learning on Medium

A new paper by Diganta Misra, titled “Mish: A Self Regularized Non-Monotonic Neural Activation Function”, introduces the AI world to a new activation function.
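As the paper's title states, Mish is defined as x·tanh(softplus(x)). A minimal NumPy sketch of that formula (the function names here are illustrative, not from the paper's code):

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish activation: x * tanh(softplus(x)).
    # Smooth and non-monotonic: it dips slightly below zero
    # for negative inputs before saturating toward zero.
    return x * np.tanh(softplus(x))
```

For large positive inputs Mish behaves like the identity (much as ReLU does), while for negative inputs it allows a small, bounded negative response instead of a hard zero.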