This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to several WikiProjects.
What is the purpose of the column "when"? It's unclear what the expression means, as I wrote in the {{clarify}} template. What I don't understand are the following:
These ambiguities apparently lead to some disagreement about what the truth value of that expression should actually be (yes or no?). I think we need to answer these questions; otherwise, I don't know whether it makes any sense to keep the column. — Kri ( talk) 11:56, 5 May 2016 (UTC)
Take a look at these two links:
Is it possible to add these to the table? — Preceding unsigned comment added by 183.179.55.44 ( talk) 16:54, 21 January 2020 (UTC)
The terminology in this subsection has little support in the literature. The embedded link to fold functions is not particularly relevant either. This section should be removed or updated. — Preceding unsigned comment added by 88.94.64.26 ( talk) 16:39, 17 October 2022 (UTC)
On July 7, 2020, many of the existing activation functions were removed from this Wikipedia article. The full list is available at: https://en.wikipedia.org/?title=Activation_function&oldid=966536154 — Preceding unsigned comment added by 185.107.13.4 ( talk) 20:49, 26 July 2020 (UTC)
The user who removed these activation functions gave the number of citations as their criterion for removal. Is it clear how citations were counted? Is 20 a reasonable threshold? Kaylimekay ( talk) 12:18, 20 September 2020 (UTC)
There are hundreds of different activation functions, most with minimal traction. Usage is a better criterion than citation count: https://paperswithcode.com/methods/category/activation-functions Even now, there are probably too many irrelevant activations listed; are there any state-of-the-art models that use sinc, atan, or sin as activation functions? Another good heuristic is this: which activations are included in PyTorch, JAX, TensorFlow, or MXNet? User:Ringdongdang 9 November 2020
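For anyone who wants to apply that framework heuristic, here is a minimal sketch of how one might check it for PyTorch (assuming a recent PyTorch install; the internal layout of torch.nn.modules is an implementation detail and may change between versions):

<syntaxhighlight lang="python">
# Rough check of which activation functions ship in PyTorch's core nn module,
# one of the "has traction" heuristics suggested above.
import inspect
import torch.nn as nn

mod = nn.modules.activation
shipped = sorted(
    name for name, obj in vars(mod).items()
    if inspect.isclass(obj)
    and issubclass(obj, nn.Module)
    and obj.__module__ == mod.__name__  # defined in this file, not merely imported
)
print(f"{len(shipped)} classes bundled in torch.nn.modules.activation:")
print(", ".join(shipped))  # e.g. CELU, ELU, GELU, GLU, Hardshrink, ...
# Caveat: a couple of non-activation classes (e.g. MultiheadAttention) also
# live in this file, so the list is a heuristic, not an exact inventory.
</syntaxhighlight>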
I removed sinc, sin, and atan, since those aren't activation functions in any SOTA architecture, nor are they used in common neural networks. https://paperswithcode.com/methods/category/activation-functions Someone added "squashing functions", referencing papers from October 2020 with 0 citations; once the paper has more traction (e.g., being implemented in the core libraries of TensorFlow and PyTorch), that person can feel free to re-add the activation. It may be useful to add activations that people actually use, like GLU. User:Ringdongdang 20 November 2020 — Preceding unsigned comment added by Ringdongdang ( talk • contribs) 23:49, 20 November 2020 (UTC)
The removed activation functions (AFs) still see some usage, even if minimal. If an AF is rarely used now, that does not necessarily mean it will not be used in the future. If we need some criterion for an AF to stay on the list, let's add a note like "top 20 most used AFs". Track their scores like in sport, and pick the most used AFs every year.
Someone keeps adding the "Growing Cosine Unit", which does not even have five citations at the time of writing. They keep adding it, along with text that serves to advertise the activation function and a figure discussing it. When I removed it, the removal was labeled "possible vandalism", but spending inordinate time on an activation that someone just proposed (and that has not caught on) is closer to vandalism. — Preceding unsigned comment added by Ringdongdang ( talk • contribs) 22:41, 23 November 2021 (UTC)
They added back their three-citation paper for a third time. — Preceding unsigned comment added by Ringdongdang ( talk • contribs) 19:07, 13 December 2021 (UTC)
They've tried adding it back I think five times now. Can we require that people have an account to edit this page? Researchers keep adding their activation functions that do not have traction. — Preceding unsigned comment added by Ringdongdang ( talk • contribs) 00:45, 26 December 2021 (UTC)
The list at the link provided is more comprehensive than this wiki page; updates are worth considering. https://github.com/digantamisra98/Mish#significance-level — Preceding unsigned comment added by 183.179.53.41 ( talk) 04:15, 21 October 2021 (UTC)
What is the derivative of the ELU function when alpha != 1.0 and x == 0.0? Someone who knows the answer might want to update the page. — Preceding unsigned comment added by 2003:E5:2724:4EA3:784E:FE9D:6EC8:85C3 ( talk) 15:20, 29 November 2021 (UTC)
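For what it's worth, here is a sketch assuming the standard ELU definition, <math>f(x) = x</math> for <math>x > 0</math> and <math>f(x) = \alpha(e^x - 1)</math> for <math>x \le 0</math>. The one-sided derivatives at zero are

:<math>\lim_{x \to 0^+} f'(x) = 1 \quad \text{and} \quad \lim_{x \to 0^-} f'(x) = \lim_{x \to 0^-} \alpha e^x = \alpha,</math>

so the derivative at <math>x = 0</math> exists only when <math>\alpha = 1</math>. For <math>\alpha \ne 1</math> the function is not differentiable there, and implementations typically just report one of the one-sided values (e.g., <math>\alpha</math>, from the <math>x \le 0</math> branch).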
It would be great if the IPs edit warring at this page would review WP:COI and WP:RS - Wikipedia isn't a place to promote yourself by posting links to your arXiv preprints. Also have a look at WP:POINT - you are not going to get your way by disrupting Wikipedia. Reverting my edits at random is just going to ensure you keep getting blocked and will get this article locked down so IP editors won't be able to change it. MrOllie ( talk) 19:07, 20 October 2022 (UTC)