novelai-storage / Stable Diffusion Webui / Commits

Commit ef5dac77, authored Jul 17, 2023 by AUTOMATIC1111

    fix

Parent: c2297b89
Showing 2 changed files with 1 addition and 3 deletions:

    extensions-builtin/Lora/network_hada.py   +0 -3
    extensions-builtin/Lora/networks.py       +1 -0
extensions-builtin/Lora/network_hada.py

@@ -27,9 +27,6 @@ class NetworkModuleHada(network_lyco.NetworkModuleLyco):
         self.t1 = weights.w.get("hada_t1")
         self.t2 = weights.w.get("hada_t2")
 
-        self.alpha = weights.w["alpha"].item() if "alpha" in weights.w else None
-        self.scale = weights.w["scale"].item() if "scale" in weights.w else None
-
     def calc_updown(self, orig_weight):
         w1a = self.w1a.to(orig_weight.device, dtype=orig_weight.dtype)
         w1b = self.w1b.to(orig_weight.device, dtype=orig_weight.dtype)
extensions-builtin/Lora/networks.py

@@ -271,6 +271,7 @@ def network_apply_weights(self: Union[torch.nn.Conv2d, torch.nn.Linear, torch.nn
                     updown = torch.nn.functional.pad(updown, (0, 0, 0, 0, 0, 5))
 
                 self.weight += updown
+                continue
 
         module_q = net.modules.get(network_layer_name + "_q_proj", None)
         module_k = net.modules.get(network_layer_name + "_k_proj", None)
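The `torch.nn.functional.pad(updown, (0, 0, 0, 0, 0, 5))` call zero-pads the input-channel dimension of a conv weight delta from 4 to 9 channels, presumably so a network trained against a regular model can be applied to an inpainting model, whose first conv takes 9 input channels. `pad`'s tuple lists (before, after) pairs from the last dimension backwards, so `(0, 0, 0, 0, 0, 5)` leaves both kernel dims alone and appends 5 zero channels to dim 1. An equivalent sketch with NumPy (the concrete shape is an assumption for illustration):

```python
import numpy as np

# Conv weight delta for a regular model: (out_ch, in_ch=4, kh, kw).
updown = np.ones((320, 4, 3, 3))

# F.pad(updown, (0, 0, 0, 0, 0, 5)) pads dims last-to-first:
# kw: (0, 0), kh: (0, 0), in_ch: (0, 5) -> 4 + 5 = 9 channels.
padded = np.pad(updown, ((0, 0), (0, 5), (0, 0), (0, 0)))
assert padded.shape == (320, 9, 3, 3)
assert padded[:, 4:].sum() == 0  # the appended channels are all zeros
```

The `module_q` / `module_k` lookups that follow handle attention layers whose q/k/v projections are stored as separate network modules rather than one fused weight.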