Remove inference tip cache #1832
Conversation
Pull Request Test Coverage Report for Build 3245692888
💛 - Coveralls
I'm not sure this is the right call. It looks like we don't fully understand the consequences of removing the cache yet.
I was going to ask you for a test on this. It does indeed seem like the cache serves a purpose!
Would you mind sharing the false-positive location? I'd like to dig into that a bit.
Regarding the performance impact, it's expected that removing the cache would lower performance. Slightly lower performance is acceptable if we fix a bug that happens often, but +10% is not slight. We'd need to replace the buggy cache with a working cache. What was the false positive in home-assistant, @cdce8p?
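For illustration, a working cache would presumably need to include the inference context in its key rather than the node alone. Here is a rough, hypothetical sketch of that idea (not astroid's actual code, and not necessarily the approach taken in the follow-up work):

```python
from typing import Any, Callable, Dict, Iterator, Tuple

# Hypothetical sketch: memoize on (node, context) so a result inferred while
# calling through subclass A is not replayed when inferring a call through
# subclass B. A real implementation would also need a sensible notion of
# context equality and cache eviction.
_cache: Dict[Tuple[Any, Any], list] = {}

def cached_inference_tip(tip: Callable[..., Iterator]) -> Callable[..., Iterator]:
    def wrapper(node: Any, context: Any = None) -> Iterator:
        key = (node, context)  # the context is part of the key
        if key not in _cache:
            _cache[key] = list(tip(node, context))
        return iter(_cache[key])
    return wrapper
```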
I'm not able to see the false positive on my side. On my slow machine, the …
Haven't had time to double check. Hopefully tomorrow. |
This seems to be enough to reproduce it.
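A minimal reproduction along the lines described in this thread might look like the sketch below (class and method names are illustrative, and the actual snippet may differ):

```python
import astroid

# A base class method returns type(self)(); inferring the call through two
# different subclasses should give two different instance types, but a
# context-free cache can replay the first answer for the second call.
source = """
class Base:
    def copy(self):
        return type(self)()

class A(Base):
    pass

class B(Base):
    pass

A().copy()  #@
B().copy()  #@
"""

call_a, call_b = astroid.extract_node(source)
print(call_a.inferred())  # ideally an instance of A
print(call_b.inferred())  # ideally an instance of B, not a replay of A's cached result
```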
I gave up on running it locally a long time ago 😅 My laptop just isn't powerful enough.
+15% roughly matches what I've seen. IMO that's quite a lot, so I'm really not sure it's worth it to remove the whole cache.
Thanks for moving forward by tackling the cache key issue @jacobtylerwalls! |
ChangeLog
Fixed the return type being inconsistently inferred for base class methods returning type(self).
Description
Remove the inference tip cache. It lacks the context needed to properly evaluate the return type of a base class method when that type depends on the subclass instance calling the method.
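To illustrate the failure mode with a toy model (this is not astroid's code; the names are made up): if results are memoized on the method node alone, the answer computed for the first caller is replayed for every later caller, whichever subclass instance the call actually goes through.

```python
# Toy model: the context is ignored by the cache key, so the second
# context never gets its own answer computed.
_results = {}

def infer_return_type(method_node: str, context: str) -> str:
    if method_node not in _results:  # keyed on the node only
        _results[method_node] = f"instance of {context}"
    return _results[method_node]

print(infer_return_type("Base.copy", context="A"))  # "instance of A"
print(infer_return_type("Base.copy", context="B"))  # still "instance of A" (stale)
```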
Type of Changes
Related Issue
Closes #1828