Data.Analysis : float/double values don't convert properly with . decimals in csv file #5652
Comments
Apparently you need to add [...]. So a potential fix could be to check whether the type of the column is a double/float and call [...].
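A minimal sketch of what the suggested check might look like. This is not the actual Microsoft.Data.Analysis code; `CsvValueParser` and `ParseValue` are hypothetical names, and the assumption is that the fix passes `CultureInfo.InvariantCulture` for floating-point columns:

```csharp
using System;
using System.Globalization;

// Hypothetical helper, not part of Microsoft.Data.Analysis.
static class CsvValueParser
{
    public static object ParseValue(string raw, Type targetType)
    {
        // For floating-point columns, force the invariant culture so that
        // "20.7" always means twenty point seven, regardless of the
        // current thread culture's decimal/group separators.
        if (targetType == typeof(float) || targetType == typeof(double))
        {
            return Convert.ChangeType(raw, targetType, CultureInfo.InvariantCulture);
        }
        return Convert.ChangeType(raw, targetType);
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(CsvValueParser.ParseValue("20.7", typeof(double)));
    }
}
```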
We may also think about accepting a [...]. @chriss2401 - if you have a fix and test cases, feel free to send a PR and @pgovind and I can take a look. Note also that dotnet/corefxlab#2927 is re-doing how LoadCsv is implemented, so we may want to consider that as well.
@eerhardt Good idea. I can implement this in a couple of days, unless you guys decide to add it to dotnet/corefxlab#2927, in which case I can just close the issue.
I think taking a [...].
Sounds good @pgovind, I'll push this and dotnet/corefxlab#2902 in dotnet/corefxlab#2927 in the next couple of days. I think I will also clean up some duplicate code I saw in that PR :)
Will this ever be fixed? For people outside of the US, the [...].
I wouldn't mind taking another look at this at some point, I just don't know exactly when that would be :)
If I have a CSV file that looks like this:
Date;Value
12-04-1989;20,7
03-11-1990;22,1
Then my two float values will be loaded properly when I call DataFrame.LoadCsv (with values of 20.7 and 22.1). But if my separator is the default one (',') and my floats use a dot instead, the dots get ignored and I get 207 and 221 as values.
The issue comes from here:
https://github.com/dotnet/corefxlab/blob/master/src/Microsoft.Data.Analysis/DataFrame.cs#L488
Since if you write
object value = Convert.ChangeType("20.7", typeof(double));
you will get 207 as a result (when the current thread culture treats '.' as a group separator rather than a decimal separator).
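The behavior described above can be reproduced directly with Convert.ChangeType. A small sketch, assuming a culture such as de-DE where ',' is the decimal separator and '.' is the group separator; passing CultureInfo.InvariantCulture is the suggested workaround:

```csharp
using System;
using System.Globalization;

class Program
{
    static void Main()
    {
        // Under a culture where '.' is the group (thousands) separator,
        // "20.7" is parsed as 207 -- the bug described above.
        object wrong = Convert.ChangeType("20.7", typeof(double),
                                          new CultureInfo("de-DE"));
        Console.WriteLine((double)wrong); // 207

        // With the invariant culture, '.' is the decimal separator,
        // so the intended value comes back.
        object right = Convert.ChangeType("20.7", typeof(double),
                                          CultureInfo.InvariantCulture);
        Console.WriteLine((double)right);
    }
}
```

Note that the two-argument overload used in DataFrame.cs implicitly uses the current thread culture, which is why the result depends on the machine's locale.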