
Commit c18add6 ("Added license"), 1 parent: 848605f

File tree

3 files changed: +30 -2 lines changed


html/licenses.html

+28-1
@@ -184,7 +184,7 @@ <h2><a href="https://github.com/pharmapsychotic/clip-interrogator/blob/main/LICE
 </pre>
 
 <h2><a href="https://github.com/JingyunLiang/SwinIR/blob/main/LICENSE">SwinIR</a></h2>
-<small>Code added by contirubtors, most likely copied from this repository.</small>
+<small>Code added by contributors, most likely copied from this repository.</small>
 
 <pre>
                                  Apache License
@@ -390,3 +390,30 @@ <h2><a href="https://github.com/JingyunLiang/SwinIR/blob/main/LICENSE">SwinIR</a
 limitations under the License.
 </pre>
 
+<h2><a href="https://github.com/AminRezaei0x443/memory-efficient-attention/blob/main/LICENSE">Memory Efficient Attention</a></h2>
+<small>The sub-quadratic cross attention optimization uses modified code from the Memory Efficient Attention package that Alex Birch optimized for 3D tensors. This license is updated to reflect that.</small>
+<pre>
+MIT License
+
+Copyright (c) 2023 Alex Birch
+Copyright (c) 2023 Amin Rezaei
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+</pre>
+
modules/sd_hijack_optimizations.py

+1
@@ -216,6 +216,7 @@ def split_cross_attention_forward_invokeAI(self, x, context=None, mask=None):
 
 
 # Based on Birch-san's modified implementation of sub-quadratic attention from https://github.com/Birch-san/diffusers/pull/1
+# The sub_quad_attention_forward function is under the MIT License listed under Memory Efficient Attention in the Licenses section of the web UI interface
 def sub_quad_attention_forward(self, x, context=None, mask=None):
     assert mask is None, "attention-mask not currently implemented for SubQuadraticCrossAttnProcessor."
 
modules/sub_quadratic_attention.py

+1-1
@@ -1,7 +1,7 @@
 # original source:
 # https://github.com/AminRezaei0x443/memory-efficient-attention/blob/1bc0d9e6ac5f82ea43a375135c4e1d3896ee1694/memory_efficient_attention/attention_torch.py
 # license:
-# unspecified
+# MIT License (see Memory Efficient Attention under the Licenses section in the web UI interface for the full license)
 # credit:
 # Amin Rezaei (original author)
 # Alex Birch (optimized algorithm for 3D tensors, at the expense of removing bias, masking and callbacks)
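
The files licensed above implement sub-quadratic (chunked) attention: instead of materializing the full query-key score matrix, attention is accumulated over key/value chunks with a numerically stable streaming softmax. As a rough illustration only, and not the repository's actual Torch implementation, here is a minimal NumPy sketch of the core idea; the function name `chunked_attention` and the accumulator names are illustrative assumptions.

```python
import numpy as np

def chunked_attention(q, k, v, kv_chunk=64):
    # Computes softmax(q @ k.T / sqrt(d)) @ v one key/value chunk at a time,
    # so the full (n_q, n_k) score matrix is never held in memory at once.
    scale = q.shape[-1] ** -0.5
    out_num = np.zeros((q.shape[0], v.shape[-1]))       # weighted-value numerator
    denom = np.zeros((q.shape[0], 1))                   # softmax denominator
    running_max = np.full((q.shape[0], 1), -np.inf)     # for stable exponentials
    for start in range(0, k.shape[0], kv_chunk):
        k_c = k[start:start + kv_chunk]
        v_c = v[start:start + kv_chunk]
        scores = (q @ k_c.T) * scale                    # (n_q, chunk) scores only
        new_max = np.maximum(running_max, scores.max(axis=-1, keepdims=True))
        correction = np.exp(running_max - new_max)      # rescale old accumulators
        weights = np.exp(scores - new_max)
        out_num = out_num * correction + weights @ v_c
        denom = denom * correction + weights.sum(axis=-1, keepdims=True)
        running_max = new_max
    return out_num / denom
```

The running-max correction term is what lets partial softmax results from earlier chunks be combined exactly with later ones, which is why the chunked result matches a naive full-matrix attention up to floating-point error.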
