Commit 6eef331

updating the copyright and adding tests

1 parent 89462b7

19 files changed: +3085 -8 lines

src/Microsoft.ML.StandardLearners/FactorizationMachine/FactorizationMachine.cs

Lines changed: 3 additions & 5 deletions
@@ -1,8 +1,6 @@
-//------------------------------------------------------------------------------
-// <copyright company="Microsoft Corporation">
-// Copyright (c) Microsoft Corporation. All rights reserved.
-// </copyright>
-//------------------------------------------------------------------------------
+// Licensed to the .NET Foundation under one or more agreements.
+// The .NET Foundation licenses this file to you under the MIT license.
+// See the LICENSE file in the project root for more information.
 
 using System;
 using System.Collections.Generic;

src/Microsoft.ML.StandardLearners/FactorizationMachine/FactorizationMachineInterface.cs

Lines changed: 5 additions & 1 deletion
@@ -1,4 +1,8 @@
-using Microsoft.ML.Runtime.Internal.CpuMath;
+// Licensed to the .NET Foundation under one or more agreements.
+// The .NET Foundation licenses this file to you under the MIT license.
+// See the LICENSE file in the project root for more information.
+
+using Microsoft.ML.Runtime.Internal.CpuMath;
 using Microsoft.ML.Runtime.Internal.Utilities;
 using System.Runtime.InteropServices;
 

src/Microsoft.ML.StandardLearners/Microsoft.ML.StandardLearners.csproj

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-<Project Sdk="Microsoft.NET.Sdk">
+<Project Sdk="Microsoft.NET.Sdk">
 
   <PropertyGroup>
     <TargetFramework>netstandard2.0</TargetFramework>
@@ -0,0 +1,77 @@
+maml.exe CV tr=FieldAwareFactorizationMachine{d=5 shuf- norm-} col[Feature]=DupFeatures threads=- norm=No dout=%Output% data=%Data% seed=1 xf=Copy{col=DupFeatures:Features} xf=MinMax{col=Features col=DupFeatures}
+Not adding a normalizer.
+Warning: Skipped 8 examples with bad label/weight/features in training set
+Not training a calibrator because it is not needed.
+Not adding a normalizer.
+Warning: Skipped 8 examples with bad label/weight/features in training set
+Not training a calibrator because it is not needed.
+Warning: The predictor produced non-finite prediction values on 8 instances during testing. Possible causes: abnormal data or the predictor is numerically unstable.
+TEST POSITIVE RATIO: 0.3785 (134.0/(134.0+220.0))
+Confusion table
+          ||======================
+PREDICTED || positive | negative | Recall
+TRUTH     ||======================
+ positive ||      122 |       12 | 0.9104
+ negative ||        4 |      216 | 0.9818
+          ||======================
+Precision ||   0.9683 |   0.9474 |
+OVERALL 0/1 ACCURACY: 0.954802
+LOG LOSS/instance: 0.259660
+Test-set entropy (prior Log-Loss/instance): 0.956998
+LOG-LOSS REDUCTION (RIG): 72.867233
+AUC: 0.984973
+Warning: The predictor produced non-finite prediction values on 8 instances during testing. Possible causes: abnormal data or the predictor is numerically unstable.
+TEST POSITIVE RATIO: 0.3191 (105.0/(105.0+224.0))
+Confusion table
+          ||======================
+PREDICTED || positive | negative | Recall
+TRUTH     ||======================
+ positive ||       92 |       13 | 0.8762
+ negative ||        2 |      222 | 0.9911
+          ||======================
+Precision ||   0.9787 |   0.9447 |
+OVERALL 0/1 ACCURACY: 0.954407
+LOG LOSS/instance: 0.260480
+Test-set entropy (prior Log-Loss/instance): 0.903454
+LOG-LOSS REDUCTION (RIG): 71.168362
+AUC: 0.967049
+
+OVERALL RESULTS
+---------------------------------------
+AUC:                0.976011 (0.0090)
+Accuracy:           0.954605 (0.0002)
+Positive precision: 0.973489 (0.0052)
+Positive recall:    0.893319 (0.0171)
+Negative precision: 0.946025 (0.0013)
+Negative recall:    0.986445 (0.0046)
+Log-loss:           0.260070 (0.0004)
+Log-loss reduction: 72.017798 (0.8494)
+F1 Score:           0.931542 (0.0069)
+AUPRC:              0.974115 (0.0054)
+
+---------------------------------------
+Physical memory usage(MB): %Number%
+Virtual memory usage(MB): %Number%
+%DateTime% Time elapsed(s): %Number%
+
+--- Progress log ---
+[1] 'Normalize' started.
+[1] (%Time%) 337 examples
+[1] 'Normalize' finished in %Time%.
+[2] 'Training' started.
+[2] (%Time%) 1 iterations, 329 examples Training-loss: 0.371414389819699
+[2] (%Time%) 2 iterations, 329 examples Training-loss: 0.225137821503565
+[2] (%Time%) 3 iterations, 329 examples Training-loss: 0.197323119398265
+[2] (%Time%) 4 iterations, 329 examples Training-loss: 0.183649426646222
+[2] (%Time%) 5 iterations, 329 examples Training-loss: 0.174400635825405
+[2] 'Training' finished in %Time%.
+[3] 'Normalize #2' started.
+[3] (%Time%) 362 examples
+[3] 'Normalize #2' finished in %Time%.
+[4] 'Training #2' started.
+[4] (%Time%) 1 iterations, 354 examples Training-loss: 0.35872800705401
+[4] (%Time%) 2 iterations, 354 examples Training-loss: 0.239609312114266
+[4] (%Time%) 3 iterations, 354 examples Training-loss: 0.210775498912242
+[4] (%Time%) 4 iterations, 354 examples Training-loss: 0.19625903089058
+[4] (%Time%) 5 iterations, 354 examples Training-loss: 0.187121580244397
+[4] 'Training #2' finished in %Time%.
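
Note: the headline numbers in this new baseline can be cross-checked against the confusion tables and log-loss lines above. A worked derivation for the first fold follows (this is an explanatory check, not part of the committed file; RIG is taken here to mean the relative reduction in log-loss over the prior entropy, which is consistent with the figures shown):

\begin{align*}
\text{Recall}_{+}    &= \frac{TP}{TP + FN} = \frac{122}{122 + 12} \approx 0.9104 \\
\text{Precision}_{+} &= \frac{TP}{TP + FP} = \frac{122}{122 + 4} \approx 0.9683 \\
\text{Accuracy}      &= \frac{TP + TN}{TP + FP + FN + TN} = \frac{122 + 216}{354} \approx 0.954802 \\
\text{RIG}           &= 100 \cdot \frac{H_{\text{prior}} - \text{LogLoss}}{H_{\text{prior}}}
                      = 100 \cdot \frac{0.956998 - 0.259660}{0.956998} \approx 72.8672
\end{align*}

The same arithmetic on the second fold (log-loss 0.260480 against a prior entropy of 0.903454) reproduces its reported RIG of 71.168, and the OVERALL RESULTS block is consistent with the mean of the two folds, with half the fold-to-fold difference in parentheses (e.g. AUC: (0.984973 + 0.967049)/2 = 0.976011, spread 0.0090).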
@@ -0,0 +1,4 @@
+FieldAwareFactorizationMachine
+AUC Accuracy Positive precision Positive recall Negative precision Negative recall Log-loss Log-loss reduction F1 Score AUPRC /d /norm /shuf Learner Name Train Dataset Test Dataset Results File Run Time Physical Memory Virtual Memory Command Line Settings
+0.976011 0.954605 0.973489 0.893319 0.946025 0.986445 0.26007 72.0178 0.931542 0.974115 5 - - FieldAwareFactorizationMachine %Data% %Output% 99 0 0 maml.exe CV tr=FieldAwareFactorizationMachine{d=5 shuf- norm-} col[Feature]=DupFeatures threads=- norm=No dout=%Output% data=%Data% seed=1 xf=Copy{col=DupFeatures:Features} xf=MinMax{col=Features col=DupFeatures} /d:5;/norm:-;/shuf:-
+
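For context on the /d, /norm, /shuf settings recorded in this summary row: they mirror the trainer arguments in the command line, where d=5 is the latent vector dimensionality and shuf-/norm- disable example shuffling and in-trainer normalization. As a sketch of what the latent dimension controls, a standard field-aware factorization machine score takes the form below (a general formulation for illustration, not lifted from the committed code; the field assignment f(j) here presumably corresponds to the two feature columns, Features and DupFeatures):

\[
\hat{\phi}(\mathbf{x}) = \langle \mathbf{w}, \mathbf{x} \rangle
  + \sum_{j_1 < j_2} \big\langle \mathbf{v}_{j_1, f(j_2)},\, \mathbf{v}_{j_2, f(j_1)} \big\rangle\, x_{j_1} x_{j_2},
\qquad \mathbf{v}_{j,f} \in \mathbb{R}^{d},\ d = 5,
\]

with the predicted probability for binary classification obtained by passing \(\hat{\phi}(\mathbf{x})\) through a sigmoid.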
