Issue

The gradient of the Logistic Loss implemented in FastTree uses the LambdaRank-style sigmoid parameter, which is set to the learning rate. This quashes the gradients for small learning rates. While this works well for classification tasks, it prevents the Generalized Additive Model (GAM) trainer from learning with small learning rates. However, the GAM learning-by-boosting technique implemented here requires small learning rates to be stable.
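For reference, the LambdaRank-style parameterization in question is the logistic loss L(y, s) = log(1 + exp(-sigma * y * s)), whose gradient with respect to the raw score s carries a multiplicative factor of sigma. The following is a minimal Python sketch (not ML.NET code; the function name and sample values are illustrative) of how tying sigma to the learning rate quashes the gradient:

```python
import math

def logistic_loss_gradient(label, score, sigmoid_param):
    """Gradient of L(y, s) = log(1 + exp(-sigma * y * s)) with respect to
    the score s, where y is the label in {-1, +1} and sigma is the
    LambdaRank-style sigmoid parameter."""
    return -sigmoid_param * label / (1.0 + math.exp(sigmoid_param * label * score))

# A clearly misclassified example: label +1, negative score.
label, score = 1.0, -2.0

# With sigma = 1, the gradient keeps its full magnitude.
print(logistic_loss_gradient(label, score, sigmoid_param=1.0))    # ~ -0.88

# With sigma tied to a small learning rate, the gradient collapses
# toward zero, so each boosting step carries almost no signal.
print(logistic_loss_gradient(label, score, sigmoid_param=0.002))  # ~ -0.001
```

Since each new tree's contribution is also shrunk by the learning rate when it is added to the ensemble, the effective update plausibly scales with roughly the square of the learning rate, which would explain why GAM training stalls at the small rates it needs for stability.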