
Commit acfb0ec

Update README.md (#201)
* Update README.md: fix documentation links
* Update paths.py: remove `# type: ignore`

1 parent: 3824e4f

File tree

2 files changed: +10 -10 lines


README.md

Lines changed: 9 additions & 9 deletions

```diff
@@ -28,7 +28,7 @@ information.
 
 ## Example usage
 
-The [`opt_einsum.contract`](https://dgasmith.github.io/opt_einsum/api_reference.html#opt_einsumcontract)
+The [`opt_einsum.contract`](https://dgasmith.github.io/opt_einsum/api_reference#opt_einsumcontract)
 function can often act as a drop-in replacement for `einsum`
 functions without further changes to the code while providing superior performance.
 Here, a tensor contraction is performed with and without optimization:
```
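The benchmark snippet this hunk leads into sits between the diff hunks and is unchanged by the commit. As context, a small runnable sketch of that kind of contraction (the dimension `N` and the index string are illustrative assumptions, not taken from this diff):

```python
import numpy as np
import opt_einsum as oe

# Illustrative operands; the README benchmark uses a larger N.
N = 4
C = np.random.rand(N, N)
I = np.random.rand(N, N, N, N)

# Unoptimized: np.einsum evaluates the whole expression at once.
naive = np.einsum('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)

# Optimized: oe.contract first finds a cheap pairwise contraction order.
fast = oe.contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)

assert np.allclose(naive, fast)
```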
```diff
@@ -50,7 +50,7 @@ I = np.random.rand(N, N, N, N)
 
 In this particular example, we see a ~3000x performance improvement which is
 not uncommon when compared against unoptimized contractions. See the [backend
-examples](https://dgasmith.github.io/opt_einsum/getting_started/backends.html)
+examples](https://dgasmith.github.io/opt_einsum/getting_started/backends)
 for more information on using other backends.
 
 ## Features
```
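The speedup discussed in this hunk comes from the contraction order `opt_einsum` chooses; that order can be examined with the library's public `contract_path` function (the array sizes below are illustrative assumptions):

```python
import numpy as np
import opt_einsum as oe

N = 8
C = np.random.rand(N, N)
I = np.random.rand(N, N, N, N)

# contract_path returns the pairwise contraction order plus a cost
# summary, without performing the contraction itself.
path, info = oe.contract_path('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)

print(path)  # list of (i, j) pairs: which two operands to contract at each step
print(info)  # FLOP counts, scaling per step, and largest intermediate
```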
```diff
@@ -63,19 +63,19 @@ this repository often has more up to date algorithms for complex contractions.
 
 The following capabilities are enabled by `opt_einsum`:
 
-* Inspect [detailed information](https://dgasmith.github.io/opt_einsum/paths/introduction.html) about the path chosen.
-* Perform contractions with [numerous backends](https://dgasmith.github.io/opt_einsum/getting_started/backends.html), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
-* Generate [reusable expressions](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths.html), potentially with [constant tensors](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths.html#specifying-constants), that can be compiled for greater performance.
-* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](https://dgasmith.github.io/opt_einsum/examples/large_expr_with_greedy.html).
-* Share [intermediate computations](https://dgasmith.github.io/opt_einsum/getting_started/sharing_intermediates.html) among multiple contractions.
+* Inspect [detailed information](https://dgasmith.github.io/opt_einsum/paths/introduction) about the path chosen.
+* Perform contractions with [numerous backends](https://dgasmith.github.io/opt_einsum/getting_started/backends), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
+* Generate [reusable expressions](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths), potentially with [constant tensors](https://dgasmith.github.io/opt_einsum/getting_started/reusing_paths#specifying-constants), that can be compiled for greater performance.
+* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](https://dgasmith.github.io/opt_einsum/examples/large_expr_with_greedy).
+* Share [intermediate computations](https://dgasmith.github.io/opt_einsum/getting_started/sharing_intermediates) among multiple contractions.
 * Compute gradients of tensor contractions using [autograd](https://github.com/HIPS/autograd) or [jax](https://github.com/google/jax)
 
-Please see the [documentation](https://dgasmith.github.io/opt_einsum/index.html) for more features!
+Please see the [documentation](https://dgasmith.github.io/opt_einsum/index) for more features!
 
 ## Installation
 
 `opt_einsum` can either be installed via `pip install opt_einsum` or from conda `conda install opt_einsum -c conda-forge`.
-See the installation [documentation](https://dgasmith.github.io/opt_einsum/getting_started/install.html) for further methods.
+See the installation [documentation](https://dgasmith.github.io/opt_einsum/getting_started/install) for further methods.
 
 ## Citation
```
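The "reusable expressions" bullet in the feature list refers to the public `contract_expression` function, which caches the contraction path once for repeated use. A minimal sketch (the equation and shapes are illustrative assumptions):

```python
import numpy as np
import opt_einsum as oe

# Build the expression once from the equation and operand shapes;
# the contraction path is found and stored at construction time.
expr = oe.contract_expression('ij,jk,kl->il', (10, 20), (20, 30), (30, 40))

a = np.random.rand(10, 20)
b = np.random.rand(20, 30)
c = np.random.rand(30, 40)

# Evaluate the prebuilt expression against concrete arrays;
# this chained contraction is equivalent to a @ b @ c.
result = expr(a, b, c)
assert result.shape == (10, 40)
```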

opt_einsum/paths.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -1248,7 +1248,7 @@ def __call__(
         # the set of n tensors is represented by a bitmap: if bit j is 1,
         # tensor j is in the set, e.g. 0b100101 = {0,2,5}; set unions
         # (intersections) can then be computed by bitwise or (and);
-        x: List[Any] = [None] * 2 + [dict() for j in range(len(g) - 1)]  # type: ignore
+        x: List[Any] = [None] * 2 + [dict() for j in range(len(g) - 1)]
         x[1] = OrderedDict((1 << j, (inputs[j], 0, inputs_contractions[j])) for j in g)
 
         # convert set of tensors g to a bitmap set:
```
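The comment preserved in this hunk explains the bitmap set encoding: each tensor index maps to one bit, so set operations become single integer instructions. A standalone illustration of that idea (not code from `paths.py`):

```python
# Encode {0, 2, 5} as a bitmap: set bit j for each member j.
s1 = (1 << 0) | (1 << 2) | (1 << 5)
assert s1 == 0b100101

s2 = (1 << 2) | (1 << 3)  # {2, 3}

union = s1 | s2         # {0, 2, 3, 5} via bitwise or
intersection = s1 & s2  # {2} via bitwise and

assert union == 0b101101
assert intersection == 0b000100

# Membership test for tensor j: check bit j.
assert (s1 >> 2) & 1 == 1
```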
