Commit e8dc353

Fix caching tutorial order (#3278)
1 parent 707cfcf

1 file changed: +5 −4 lines changed

Diff for: recipes_source/torch_compile_caching_tutorial.rst

@@ -62,16 +62,17 @@ Consider the following example. First, compile and save the cache artifacts.
 
     artifacts = torch.compiler.save_cache_artifacts()
 
-    # Now, potentially store these artifacts in a database
+    assert artifacts is not None
+    artifact_bytes, cache_info = artifacts
+
+    # Now, potentially store artifact_bytes in a database
+    # You can use cache_info for logging
 
 Later, you can jump-start the cache by the following:
 
 .. code-block:: python
 
     # Potentially download/fetch the artifacts from the database
-    assert artifacts is not None
-    artifact_bytes, cache_info = artifacts
-
     torch.compiler.load_cache_artifacts(artifact_bytes)
 
 This operation populates all the modular caches that will be discussed in the next section, including ``PGO``, ``AOTAutograd``, ``Inductor``, ``Triton``, and ``Autotuning``.
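
For context, the two APIs touched by this diff form a save/restore pair. Below is a minimal end-to-end sketch, not part of the tutorial diff above, assuming a recent PyTorch build where ``torch.compiler.save_cache_artifacts`` and ``torch.compiler.load_cache_artifacts`` are available; the tiny model and the ``compile_cache.bin`` path are hypothetical choices for illustration:

    import torch

    # Compile and run a small model so the compiler caches get populated.
    model = torch.nn.Linear(8, 8)
    compiled = torch.compile(model)
    compiled(torch.randn(4, 8))

    # Save the accumulated cache artifacts; returns None if there is nothing to save.
    artifacts = torch.compiler.save_cache_artifacts()
    assert artifacts is not None
    artifact_bytes, cache_info = artifacts

    # Persist the bytes somewhere durable (a file here; a database blob works too).
    # "compile_cache.bin" is a hypothetical path used only for this sketch.
    with open("compile_cache.bin", "wb") as f:
        f.write(artifact_bytes)

    # In a later process, jump-start the caches before compiling again.
    with open("compile_cache.bin", "rb") as f:
        torch.compiler.load_cache_artifacts(f.read())

The artifact bytes are opaque: store and transport them as a blob, and let ``load_cache_artifacts`` repopulate the modular caches on the target machine.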
