
Commit 4311552

PR feedback plus more cleanup.
1 parent eac024b commit 4311552

File tree: 2 files changed (+6 -6 lines)

src/Microsoft.ML.Mkl.Components/MklComponentsCatalog.cs (+2 -2)

@@ -69,7 +69,7 @@ public static OlsTrainer Ols(
 }

 /// <summary>
-/// Create an <see cref="SymbolicSgdLogisticRegressionBinaryTrainer"/> with advanced options, which predicts a target using a linear binary classification model trained over boolean label data.
+/// Create <see cref="SymbolicSgdLogisticRegressionBinaryTrainer"/>, which predicts a target using a linear binary classification model trained over boolean label data.
 /// Stochastic gradient descent (SGD) is an iterative algorithm that optimizes a differentiable objective function.
 /// The <see cref="SymbolicSgdLogisticRegressionBinaryTrainer"/> parallelizes SGD using <a href="https://www.microsoft.com/en-us/research/project/project-parade/#!symbolic-execution">symbolic execution</a>.
 /// </summary>

@@ -102,7 +102,7 @@ public static SymbolicSgdLogisticRegressionBinaryTrainer SymbolicSgdLogisticRegr
 }

 /// <summary>
-/// Create an<see cref= "SymbolicSgdLogisticRegressionBinaryTrainer" />, which predicts a target using a linear binary classification model trained over boolean label data.
+/// Create <see cref="SymbolicSgdLogisticRegressionBinaryTrainer"/> with advanced options, which predicts a target using a linear binary classification model trained over boolean label data.
 /// Stochastic gradient descent (SGD) is an iterative algorithm that optimizes a differentiable objective function.
 /// The <see cref="SymbolicSgdLogisticRegressionBinaryTrainer"/> parallelizes SGD using <a href="https://www.microsoft.com/en-us/research/project/project-parade/#!symbolic-execution">symbolic execution</a>.
 /// </summary>
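
For orientation, here is a minimal sketch of how the two overloads distinguished by these summaries are typically called; the MLContext setup, column names, and option values are illustrative assumptions, not taken from this commit:

    // Usage sketch (not part of this commit): the simple and advanced-options
    // overloads of SymbolicSgdLogisticRegression from MklComponentsCatalog.cs.
    // Requires the Microsoft.ML and Microsoft.ML.Mkl.Components NuGet packages.
    using Microsoft.ML;
    using Microsoft.ML.Trainers;

    var mlContext = new MLContext(seed: 0);

    // Simple overload: defaults everywhere, only column names specified.
    var simpleTrainer = mlContext.BinaryClassification.Trainers.SymbolicSgdLogisticRegression(
        labelColumnName: "Label",
        featureColumnName: "Features");

    // Advanced-options overload: the one whose <summary> now reads "with advanced options".
    var advancedTrainer = mlContext.BinaryClassification.Trainers.SymbolicSgdLogisticRegression(
        new SymbolicSgdLogisticRegressionBinaryTrainer.Options
        {
            LabelColumnName = "Label",
            FeatureColumnName = "Features",
            NumberOfIterations = 25, // illustrative value, not a recommendation
            LearningRate = 0.1f      // illustrative value, not a recommendation
        });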

src/Microsoft.ML.Mkl.Components/SymSgdClassificationTrainer.cs (+4 -4)

@@ -50,13 +50,13 @@ namespace Microsoft.ML.Trainers
 /// | Required NuGet in addition to Microsoft.ML | Microsoft.ML.Mkl.Components |
 ///
 /// ### Training Algorithm Details
-/// The symbolic SGD is a classification algorithm that makes its predictions by finding a separating hyperplane.
+/// Symbolic stochastic gradient descent is a classification algorithm that makes its predictions by finding a separating hyperplane.
 /// For instance, with feature values $f_0, f_1, ..., f_{D-1}$, the prediction is given by determining what side of the hyperplane the point falls into.
-/// That is the same as the sign of the feautures' weighted sum, i.e. $\sum_{i = 0}^{D-1} (w_i * f_i)$, where $w_0, w_1,..., w_{D-1}$ are the weights computed by the algorithm.
+/// That is the same as the sign of the features' weighted sum, i.e. $\sum_{i = 0}^{D-1} (w_i * f_i)$, where $w_0, w_1, ..., w_{D-1}$ are the weights computed by the algorithm.
 ///
-/// While most of SGD algorithms is inherently sequential - at each step, the processing of the current example depends on the parameters learned from previous examples.
+/// Most stochastic gradient descent algorithms are inherently sequential: at each step, the processing of the current example depends on the parameters learned from previous examples.
 /// This algorithm instead trains local models in separate threads, together with a probabilistic model combiner that allows the local models to be combined
-/// to produce the same result as what a sequential SGD would have produced, in expectation.
+/// to produce the same result as what a sequential stochastic gradient descent would have produced, in expectation.
 ///
 /// For more information see [Parallel Stochastic Gradient Descent with Sound Combiners](https://arxiv.org/abs/1705.08030).
 /// ]]>
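
To make the prediction rule in the hunk above concrete (the predicted class is the sign of the features' weighted sum $\sum_{i=0}^{D-1} w_i f_i$), here is a small self-contained sketch; the weight and feature values are made up for illustration, not output of the trainer:

    // Sketch: linear binary prediction as the sign of the features' weighted sum.
    // All values below are illustrative only.
    using System;
    using System.Linq;

    class WeightedSumSignDemo
    {
        // True when the point falls on the positive side of the hyperplane,
        // i.e. when sum_{i=0}^{D-1} w_i * f_i > 0.
        static bool Predict(float[] weights, float[] features) =>
            weights.Zip(features, (w, f) => w * f).Sum() > 0;

        static void Main()
        {
            var w = new[] { 0.8f, -1.2f, 0.3f }; // weights a linear trainer might learn
            var f = new[] { 1.0f, 0.5f, 2.0f };  // feature values f_0 .. f_{D-1}
            Console.WriteLine(Predict(w, f));    // True: 0.8 - 0.6 + 0.6 = 0.8 > 0
        }
    }

What symbolic SGD adds on top of this standard linear rule is the training scheme the hunk describes: per-thread local models plus a sound probabilistic combiner, so that, per the linked paper, the parallel run matches sequential SGD in expectation.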
