novelai-storage / Stable Diffusion Webui
Commit 26b459a3
authored Oct 08, 2022 by C43H66N12O12S2, committed by GitHub on Oct 08, 2022
default to split attention if cuda is available and xformers is not
parent d0e85873
Showing 1 changed file with 2 additions and 2 deletions:

modules/sd_hijack.py (+2 / -2)
modules/sd_hijack.py  View file @ 26b459a3

@@ -21,12 +21,12 @@ diffusionmodules_model_AttnBlock_forward = ldm.modules.diffusionmodules.model.At
 def apply_optimizations():
     ldm.modules.diffusionmodules.model.nonlinearity = silu
 
-    if not cmd_opts.disable_opt_xformers_attention and not (cmd_opts.opt_split_attention or torch.version.hip):
+    if not cmd_opts.disable_opt_xformers_attention and not (cmd_opts.opt_split_attention or torch.version.hip or shared.xformers_available):
         ldm.modules.attention.CrossAttention.forward = sd_hijack_optimizations.xformers_attention_forward
         ldm.modules.diffusionmodules.model.AttnBlock.forward = sd_hijack_optimizations.xformers_attnblock_forward
     elif cmd_opts.opt_split_attention_v1:
         ldm.modules.attention.CrossAttention.forward = sd_hijack_optimizations.split_cross_attention_forward_v1
-    elif cmd_opts.opt_split_attention:
+    elif cmd_opts.opt_split_attention or torch.cuda.is_available():
         ldm.modules.attention_CrossAttention_forward = sd_hijack_optimizations.split_cross_attention_forward
         ldm.modules.diffusionmodules.model.AttnBlock.forward = sd_hijack_optimizations.cross_attention_attnblock_forward
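After this change, the fallback order in apply_optimizations() is: xformers attention when available and not disabled, then the v1 split attention if explicitly requested, and finally split (chunked) cross-attention whenever it is requested or CUDA is present, which makes it the default on CUDA machines without xformers. The split_cross_attention_forward implementation itself lives in modules/sd_hijack_optimizations.py and is not part of this diff; the sketch below only illustrates the general query-chunking idea behind such optimizations. The function name, chunk_size parameter, and tensor shapes are illustrative assumptions, not the repository's actual code.

import torch

def chunked_cross_attention(q, k, v, chunk_size=1024):
    # Illustrative sketch only: q is (batch, q_len, dim), k and v are
    # (batch, kv_len, dim), with q and v sharing the same head dim.
    # The full (q_len, kv_len) score matrix is never materialized;
    # peak memory scales with chunk_size * kv_len instead of q_len * kv_len.
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for i in range(0, q.shape[1], chunk_size):
        q_chunk = q[:, i:i + chunk_size]                        # (batch, <=chunk_size, dim)
        scores = torch.bmm(q_chunk, k.transpose(1, 2)) * scale  # (batch, <=chunk_size, kv_len)
        out[:, i:i + chunk_size] = torch.bmm(scores.softmax(dim=-1), v)
    return out

A smaller chunk_size lowers peak memory at the cost of more kernel launches, which is the memory/speed trade-off the --opt-split-attention path accepts; the repository's real implementation is more involved than this fixed-size loop, so treat the above purely as a conceptual illustration.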