@@ -63,19 +63,19 @@ this repository often has more up to date algorithms for complex contractions.

The following capabilities are enabled by `opt_einsum`:

-* Inspect [detailed information](https://dgasmith.github.io/opt_einsum/paths/introduction.html) about the path chosen.
-* Perform contractions with [numerous backends](https://dgasmith.github.io/opt_einsum/getting_started/backends.html), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
-* Generate [reusable expressions](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths.html), potentially with [constant tensors](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths.html#specifying-constants), that can be compiled for greater performance.
-* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](https://dgasmith.github.io/opt_einsum/examples/large_expr_with_greedy.html).
-* Share [intermediate computations](https://dgasmith.github.io/opt_einsum/getting_started/sharing_intermediates.html) among multiple contractions.
+* Inspect [detailed information](https://dgasmith.github.io/opt_einsum/paths/introduction) about the path chosen.
+* Perform contractions with [numerous backends](https://dgasmith.github.io/opt_einsum/getting_started/backends), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
+* Generate [reusable expressions](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths), potentially with [constant tensors](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths#specifying-constants), that can be compiled for greater performance.
+* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](https://dgasmith.github.io/opt_einsum/examples/large_expr_with_greedy).
+* Share [intermediate computations](https://dgasmith.github.io/opt_einsum/getting_started/sharing_intermediates) among multiple contractions.
* Compute gradients of tensor contractions using [autograd](https://github.com/HIPS/autograd) or [jax](https://github.com/google/jax)

-Please see the [documentation](https://dgasmith.github.io/opt_einsum/index.html) for more features!
+Please see the [documentation](https://dgasmith.github.io/opt_einsum/index) for more features!

## Installation

`opt_einsum` can either be installed via `pip install opt_einsum` or from conda `conda install opt_einsum -c conda-forge`.
-See the installation [documentation](https://dgasmith.github.io/opt_einsum/getting_started/install.html) for further methods.
+See the installation [documentation](https://dgasmith.github.io/opt_einsum/getting_started/install) for further methods.
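
The path-inspection and reusable-expression features linked in the list above can be sketched as follows. This is a minimal illustrative example, not taken from the diff itself: the equation string, operand shapes, and the `greedy` optimizer choice are assumptions, while `contract_path` and `contract_expression` are the documented `opt_einsum` helpers.

```python
import numpy as np
import opt_einsum as oe

# Toy operands for a chained contraction (shapes are arbitrary).
a = np.random.rand(10, 20)
b = np.random.rand(20, 30)
c = np.random.rand(30, 5)

# Inspect the chosen contraction path: contract_path returns the path itself
# plus a printable summary (scaling, FLOP estimate, intermediates).
path, path_info = oe.contract_path("ij,jk,kl->il", a, b, c, optimize="greedy")
print(path)       # e.g. [(0, 1), (0, 1)]
print(path_info)  # human-readable breakdown of the contraction order

# Build a reusable expression once from the shapes, then call it repeatedly
# on operands of those shapes; the path search is performed only once.
expr = oe.contract_expression("ij,jk,kl->il", a.shape, b.shape, c.shape)
result = expr(a, b, c)
```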
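The intermediate-sharing feature listed above can be sketched in the same spirit. Again, the two contractions below are made-up examples; `shared_intermediates` is the documented context manager, and any intermediates the contractions happen to have in common are cached and reused rather than recomputed.

```python
import numpy as np
import opt_einsum as oe

x = np.random.rand(8, 8)
y = np.random.rand(8, 8)
z = np.random.rand(8, 8)

# Contractions evaluated inside shared_intermediates() cache their
# intermediate results, so overlapping work between the two calls is reused.
with oe.shared_intermediates():
    partial = oe.contract("ab,bc,cd->ad", x, y, z)
    total = oe.contract("ab,bc,cd->", x, y, z)
```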