Mathematics desk
< April 23 | April 24 | April 25 >
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
Hello, while studying neural networks I came upon what was described by the lecturer as a "math trick" to solve a particular type of optimization problem using gradient descent. Basically, when optimizing a neural net that requires two parameters to be equal, you can replace the partial derivative for each constrained parameter by the sum of the partials with respect to each constrained parameter. So if you have $f(x, y)$ with the constraint $x = y$, i.e. $g(w) = f(w, w)$, then the derivative of $g$ with respect to $w$ is the same as the sum of the partial derivatives of $f$ with respect to $x$ and $y$, each evaluated at $(w, w)$. So far I have not been able to find a counterexample, but I also do not know how to prove it. If anyone has pointers, clues, or proofs, please help! Sorry if it's obvious, and thanks! Brusegadi ( talk) 01:09, 24 April 2017 (UTC)
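The identity is the multivariable chain rule applied to the tied function $g(w) = f(w, w)$: writing $x(w) = w$ and $y(w) = w$, we get $g'(w) = \frac{\partial f}{\partial x}\frac{dx}{dw} + \frac{\partial f}{\partial y}\frac{dy}{dw} = f_x(w, w) + f_y(w, w)$. Here is a minimal numerical check in Python; the function, evaluation point, and step size below are illustrative assumptions, not taken from the question:

```python
# Numerical check of the weight-tying identity (multivariable chain rule):
# for g(w) = f(w, w), g'(w) = f_x(w, w) + f_y(w, w).
import math

def f(x, y):
    # A made-up smooth f(x, y); any differentiable function works here.
    return x * math.sin(y) + x**2 * y

def f_x(x, y):  # partial derivative of f with respect to x
    return math.sin(y) + 2 * x * y

def f_y(x, y):  # partial derivative of f with respect to y
    return x * math.cos(y) + x**2

w, h = 0.7, 1e-6
# Derivative of the tied function g(w) = f(w, w), by central differences.
g_prime = (f(w + h, w + h) - f(w - h, w - h)) / (2 * h)
# Sum of the partials, each evaluated at the tied point (w, w).
summed = f_x(w, w) + f_y(w, w)
print(g_prime, summed)  # the two values agree to roughly 1e-9
```

This is also why backpropagation through a shared (tied) weight accumulates, i.e. sums, the gradient contributions from every place the weight is used.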
How about a slightly different situation involving the total derivative of a function $G$ of constrained independent variables $x_i$ with a constant sum, for instance $\sum_i x_i = 1$. Can the partial derivative with respect to $x_i$ exist by keeping the other $x_j$ constant, even if only the sum of the $x_i$ is constant, not every $x_i$? Is this due to the fact that the $dx_i$ are around zero? Thanks.-- 82.137.9.214 ( talk) 23:49, 24 April 2017 (UTC)
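One way to make the constrained case precise: on the surface $\sum_i x_i = 1$ you cannot vary $x_i$ while holding all the other $x_j$ fixed, since the sum would then change; but after eliminating one variable the remaining partials are well defined. A minimal sketch, assuming a made-up $G$ and the substitution $x_3 = 1 - x_1 - x_2$ (both are illustrative assumptions):

```python
# Sketch of a partial derivative on the constraint surface sum(x_i) = 1,
# obtained by eliminating x3 via substitution: x3 = 1 - x1 - x2.
import math

def G(x1, x2, x3):
    # A made-up smooth G; any differentiable function works here.
    return x1**2 + math.exp(x2) * x3

def G_restricted(x1, x2):
    # On the constraint surface, x3 is determined by x1 and x2.
    return G(x1, x2, 1.0 - x1 - x2)

x1, x2, h = 0.2, 0.3, 1e-6
# A well-defined partial derivative of the *restricted* function:
# x2 is held constant, while x3 moves to keep the sum equal to 1.
dG_dx1 = (G_restricted(x1 + h, x2) - G_restricted(x1 - h, x2)) / (2 * h)
# By the chain rule this equals G_x1 - G_x3 on the surface (dx3/dx1 = -1),
# which is generally different from the unconstrained partial G_x1.
print(dG_dx1)
```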
Are Johnson's SU-distribution, the "shepherd's crook", and the Johnson Curve all the same thing? Thanks. Anna Frodesiak ( talk) 06:25, 24 April 2017 (UTC)
I want to know because of the edits [1] and [2] that I made. Anna Frodesiak ( talk) 10:24, 24 April 2017 (UTC)
When we count, for example, coins or banknotes by their face value rather than by their actual quantity expressed as a natural number, are those totals still natural numbers or some other kind? Thanks.-- 212.180.235.46 ( talk) 16:59, 24 April 2017 (UTC)
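Counting by face value is just a weighted sum: if each count $n_i$ and each face value $v_i$ is a natural number, the total $\sum_i n_i v_i$ is again a natural number, because the naturals are closed under addition and multiplication. A tiny illustrative sketch (the denominations below are made up):

```python
# Counting by face value as a weighted sum of natural numbers.
counts = {1: 4, 5: 3, 10: 2}  # face value -> number of coins, all naturals
total = sum(value * count for value, count in counts.items())
print(total)  # 4*1 + 3*5 + 2*10 = 39, still a natural number
```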