The '''ramp function''' is a [[unary function|unary]] [[real function]] whose [[graph of a function|graph]] is shaped like a [[ramp]]. It can be expressed by numerous [[#Definitions|definitions]]: for example, it outputs 0 for negative inputs and returns the input itself for non-negative inputs. The term "ramp" can also be applied to other functions obtained by [[scaling and shifting]]; the function in this article is the ''unit'' ramp function (slope 1, starting at 0).
In [[machine learning]], it is commonly known as a [[Rectifier_(neural_networks)|ReLU]] [[activation function]]<ref name='brownlee'>{{cite web |last1=Brownlee |first1=Jason |title=A Gentle Introduction to the Rectified Linear Unit (ReLU) |url=https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/ |website=Machine Learning Mastery |access-date=8 April 2021 |date=8 January 2019}}</ref><ref name="medium-relu">{{cite web |last1=Liu |first1=Danqing |title=A Practical Guide to ReLU |url=https://medium.com/@danqing/a-practical-guide-to-relu-b83ca804f1f7 |website=Medium |access-date=8 April 2021 |language=en |date=30 November 2017}}</ref> or a [[Rectifier (neural networks)|rectifier]] in analogy to [[half-wave rectification]] in [[electrical engineering]]. In [[statistics]] (when used as a [[likelihood function]]) it is known as a [[tobit model]].
This function has numerous [[#Applications|applications]] in mathematics and engineering, and goes by various names, depending on the context.
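As a brief illustration (a minimal sketch; the function name and the use of plain Python are choices of this example, not part of any standard), the definition above can be written directly:

```python
def ramp(x):
    """Unit ramp function: returns 0 for negative x, x itself otherwise."""
    return max(x, 0.0)
```

This is the same computation performed by the ReLU activation in machine-learning libraries, applied here to a single scalar.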