Incorrect type inference of tuple when underlying array is extended #13941
Comments
Using
But the block does execute and the code compiles without warnings or errors, even with
Thanks for the reference to the proposal — that is indeed a little closer to the behavior I would have expected.
@anfedorov Ah, I see your point now. I misread it as
Yup,
Unfortunately there is no way to do this correctly in the general case, e.g.:
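A hedged sketch of the kind of general case being alluded to (the identifiers here are illustrative, not taken from the thread): once a tuple is aliased as a plain array, mutations made through the alias are invisible to the tuple's declared type, so the compiler cannot track the tuple's real length.

```typescript
// Illustrative sketch: a tuple aliased as a plain array. Pushes through
// the alias mutate the same underlying array, but the tuple type [number,
// string] still claims a length of 2.
let pair: [number, string] = [1, "one"];
const alias: Array<number | string> = pair; // tuples are assignable to arrays

alias.push(2); // extends the shared underlying array

// Statically, pair.length has the literal type 2; at runtime it is now 3.
const runtimeLength: number = pair.length;
console.log(runtimeLength); // 3
```

Because `alias` and `pair` are the same object, no local analysis of `pair` alone can prove its length still matches its tuple type.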
Copying tuple behavior from Python seems like one way to handle it, i.e. this seems to work fine in the playground: https://www.typescriptlang.org/play/index.html#src=let%20x%3A%20%5Bnumber%2C%20string%2C%20undefined%2C%20number%5D%3B%0D%0Ax%20%3D%20%5B1%2C%20'1'%2C%20undefined%2C%202%5D%3B%0D%0A%0D%0Ax%5B3%5D%20%3D%202%0D%0Ax%5B2%5D%20%3D%203%0D%0A (the link encodes `let x: [number, string, undefined, number]; x = [1, '1', undefined, 2]; x[3] = 2; x[2] = 3`). regardless,
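If the goal is Python-style immutable tuples, one hedged approximation (not something this thread proposes as the fix, and added to the language well after this issue, in TypeScript 3.4) is a `readonly` tuple type, which removes the mutating array methods entirely:

```typescript
// Sketch: a readonly tuple (TypeScript 3.4+) has no push/pop/splice,
// so the underlying array cannot be extended through this binding.
const point: readonly [number, string] = [1, "one"];

// point.push(2);
// ^ compile error: Property 'push' does not exist on
//   type 'readonly [number, string]'

const second = point[1];
console.log(second); // "one"
```

With mutation ruled out, the declared tuple length and the runtime length can no longer diverge through this binding.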
Automatically closing this issue for housekeeping purposes. The issue labels indicate that it is unactionable at the moment or has already been addressed.
TypeScript Version: 2.1.5
Code
Expected behavior:
Before reading the docs, I would expect the type checker to find an error on line 4.
Given the feature set defined in the docs, I would expect the type at line 4 to be `let y: undefined`.
Actual behavior:
Type is incorrectly inferred as `let y: never`.