novelai-storage / Stable Diffusion Webui

Commit c2d5b290
Authored Sep 29, 2022 by Jairo Correa
Parent: c938679d

Move silu to sd_hijack
Showing 2 changed files with 3 additions and 12 deletions (+3 -12):

modules/sd_hijack.py  +3 -9
webui.py              +0 -3
modules/sd_hijack.py

@@ -12,6 +12,7 @@ from ldm.util import default
 from einops import rearrange
 import ldm.modules.attention
 import ldm.modules.diffusionmodules.model
+from torch.nn.functional import silu
 
 # see https://github.com/basujindal/stable-diffusion/pull/117 for discussion

@@ -100,14 +101,6 @@ def split_cross_attention_forward(self, x, context=None, mask=None):
     return self.to_out(r2)
 
-def nonlinearity_hijack(x):
-    # swish
-    t = torch.sigmoid(x)
-    x *= t
-    del t
-
-    return x
-
 def cross_attention_attnblock_forward(self, x):
     h_ = x
     h_ = self.norm(h_)

@@ -245,11 +238,12 @@ class StableDiffusionModelHijack:
         m.cond_stage_model = FrozenCLIPEmbedderWithCustomWords(m.cond_stage_model, self)
         self.clip = m.cond_stage_model
 
+        ldm.modules.diffusionmodules.model.nonlinearity = silu
+
         if cmd_opts.opt_split_attention_v1:
             ldm.modules.attention.CrossAttention.forward = split_cross_attention_forward_v1
         elif not cmd_opts.disable_opt_split_attention and (cmd_opts.opt_split_attention or torch.cuda.is_available()):
             ldm.modules.attention.CrossAttention.forward = split_cross_attention_forward
-            ldm.modules.diffusionmodules.model.nonlinearity = nonlinearity_hijack
             ldm.modules.diffusionmodules.model.AttnBlock.forward = cross_attention_attnblock_forward
 
     def flatten(el):
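For reference: the deleted nonlinearity_hijack is a hand-rolled swish, x * sigmoid(x), which is exactly the function PyTorch ships as torch.nn.functional.silu (SiLU and swish are the same activation). A minimal sketch checking the equivalence:

import torch
from torch.nn.functional import silu

def nonlinearity_hijack(x):
    # swish, as in the function removed above; written out-of-place
    # here so the comparison below still sees the original input
    t = torch.sigmoid(x)
    return x * t

x = torch.randn(4, 8)
assert torch.allclose(silu(x), nonlinearity_hijack(x))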
webui.py

@@ -22,10 +22,7 @@ import modules.txt2img
 import modules.img2img
 import modules.swinir as swinir
 import modules.sd_models
 
-from torch.nn.functional import silu
-import ldm
-ldm.modules.diffusionmodules.model.nonlinearity = silu
 
 modules.codeformer_model.setup_codeformer()
 modules.gfpgan_model.setup_gfpgan()
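Both hunks rely on the same technique: rebinding a module-level attribute so that code inside ldm which looks up nonlinearity through its module namespace picks up the replacement at call time. A minimal sketch of that pattern, with a hypothetical fake_model module standing in for ldm.modules.diffusionmodules.model:

import types
import torch
from torch.nn.functional import silu

# hypothetical stand-in for ldm.modules.diffusionmodules.model
fake_model = types.ModuleType("fake_model")
fake_model.nonlinearity = lambda x: x * torch.sigmoid(x)  # original swish

def attn_forward(h):
    # like ldm's forward methods, resolves `nonlinearity` through the
    # module namespace at call time, so a later rebind takes effect
    return fake_model.nonlinearity(h)

fake_model.attn_forward = attn_forward

# the hijack: one assignment redirects every subsequent call
fake_model.nonlinearity = silu

h = torch.randn(2, 3)
assert torch.allclose(fake_model.attn_forward(h), silu(h))

This late binding is also why the assignment can move from webui.py into sd_hijack without changing behaviour, as long as it runs before the model is used.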