--trace-compile ignores inferred const-return Methods #58482
Comments
I think it makes sense for …

I believe we deprecated …

I'm not sure. It doesn't help you to distinguish whether you are dispatching to existing code or compiling brand new code, which is very useful, e.g., for hunting down invalidations and caching bugs.
There's actually an additional wrinkle here, which is that these signatures may not be "compileable". These work fine:

```julia
julia> precompile(Tuple{typeof(Base.tail), Tuple{Int}})
true

julia> precompile(Tuple{typeof(Base.isempty), Tuple{Base.SubString{String}}})
true

julia> precompile(Tuple{typeof(Base.haslength), Base.Dict{Int, Nothing}})
true
```

but these do not:

```julia
julia> precompile(Tuple{typeof(Base.tail), Tuple{Any}})
false

julia> precompile(Tuple{typeof(Base.isempty), Tuple{Vararg{Base.SubString{String}}}})
false

julia> precompile(Tuple{typeof(Base.haslength), Base.Dict{_A, Nothing} where _A})
false
```

The irony is that the signatures are, in fact, easily compiled to the most precise inference / performant codegen that we have (a side-effect-free function w/ a constant return). That said, even if …
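To illustrate that irony: inference itself has no trouble with those abstract signatures, even though `precompile` rejects them. A hedged sketch (the exact IR returned varies by Julia version):

```julia
# Inference runs fine on the abstract signature that precompile refused:
julia> code_typed(Base.tail, (Tuple{Any},))  # returns the typed IR for this signature
```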
Yes, leaving those out makes sense to me, since presumably they will be compiled (and cached?) as part of compiling whatever other functions reached them, during those other functions' compilations?
Yeah, that sounds right to me. In principle, we only need to log the compiled functions that were also the target of dispatches (I think the most compact log would be an intersection of the `--trace-compile` and `--trace-dispatch` outputs).
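As a sketch of that intersection idea: both trace flags emit one `precompile(...)` statement per line, so a line-wise set intersection of the two logs works. The file names and sample contents below are hypothetical placeholders:

```shell
# Hypothetical trace logs (in practice produced by
# --trace-compile=compile.log and --trace-dispatch=dispatch.log)
printf 'precompile(Tuple{typeof(Main.a)})\nprecompile(Tuple{typeof(Main.b)})\n' > compile.log
printf 'precompile(Tuple{typeof(Main.b)})\nprecompile(Tuple{typeof(Main.c)})\n' > dispatch.log

# comm -12 prints only lines common to both sorted inputs:
# signatures that were both freshly compiled AND dispatch targets
sort -u compile.log > compile.sorted
sort -u dispatch.log > dispatch.sorted
comm -12 compile.sorted dispatch.sorted
# -> precompile(Tuple{typeof(Main.b)})
```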
Agreed, except that I think …

Right, I think this PR just makes …
For some history, the printing of this evolved to the … Now, the point of that is that the only reason …

I think at this point, all parts of all those PRs related to trace-compile have been removed due to the problems with trace-compile.
I don't think that's the case - this ticket is making me realize that I can't precisely say what the definition of … is:

```shell
$ julia +nightly --trace-dispatch=stderr -e "Core.invoke(Base.getindex, Tuple{Vector{Int}, Int}, Int[1,2,3], 1)"
precompile(Tuple{typeof(Base.getindex), Array{Int64, 1}, Int64})

$ julia +nightly --trace-compile=stderr -e "Core.invoke(Base.getindex, Tuple{Vector{Int}, Int}, Int[1,2,3], 1)"
# no output
```

It also does not report some dynamic dispatches, depending on where they are found in the cache:

```shell
$ julia +nightly --trace-dispatch=stderr -e "v = Int[1,2,3]; (foo() = v[1]); foo()"
precompile(Tuple{typeof(Main.foo)})
# no entry for: precompile(Tuple{typeof(getindex), Vector{Int}, Int})
```

despite having a dynamic dispatch:

```shell
$ julia +nightly -e "using InteractiveUtils; v = Int[1,2,3]; (foo() = v[1]); display(@code_typed foo())"
CodeInfo(
1 ─ %1 = Main.v::Any
│   %2 = dynamic Base.getindex(%1, 1)::Any
└── return %2
) => Any
```

That's not an issue for pre-compilation lists / reducing compilation latency, but it could be considered a bug if you are using …
I wouldn't say "all" :P (see julia/contrib/generate_precompile.jl, lines 33 to 238 at 953903b)
Huh, thanks Cody. Yeah, I think with --trace-dispatch I would expect to find all the dynamic dispatches 👍
If you write a method that ends up being inferred to a Const(...) return value, it will not be included in the --trace-compile output. Even though we ran our compilation pipeline on foo() for the first time (generating a new CodeInstance), it ended up not needing to compile via LLVM, so it is excluded from the output.

A user may hope to use the output of --trace-compile to determine whether they should add precompile(foo, ()) to their pre-compilation workload. If we had reported this MethodInstance, it would have saved them that inference time.
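The report's original snippets were not captured in this extraction; a minimal hypothetical reproduction (the name `foo` and the constant `1` are placeholders) might look like:

```julia
# Hypothetical repro sketch: a method whose return value is inferred
# as Const(1), so the first call needs no LLVM codegen
julia> foo() = 1
foo (generic function with 1 method)

julia> foo()
1
```

Under `julia --trace-compile=stderr`, the expectation described above is that no `precompile(Tuple{typeof(Main.foo)})` line is printed for this first call, even though a fresh CodeInstance was created for it.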