novelai-storage / Stable Diffusion Webui

Commit 62ce77e2
authored Sep 08, 2022 by AUTOMATIC
support for sd-concepts as alternatives for textual inversion #151
parent f5001246

Showing 2 changed files with 17 additions and 6 deletions (+17 -6)
.gitignore (+2 -1)
modules/sd_hijack.py (+15 -5)
.gitignore @ 62ce77e2

@@ -9,4 +9,5 @@ __pycache__
 /outputs
 /config.json
 /log
-webui.settings.bat
\ No newline at end of file
+/webui.settings.bat
+/embeddings
modules/sd_hijack.py @ 62ce77e2

@@ -73,11 +73,21 @@ class StableDiffusionModelHijack:
        name = os.path.splitext(filename)[0]
        data = torch.load(path)

        # textual inversion embeddings
        if 'string_to_param' in data:
            param_dict = data['string_to_param']
            if hasattr(param_dict, '_parameters'):
                param_dict = getattr(param_dict, '_parameters')  # fix for torch 1.12.1 loading saved file from torch 1.11
            assert len(param_dict) == 1, 'embedding file has multiple terms in it'
            emb = next(iter(param_dict.items()))[1]
        elif type(data) == dict and type(next(iter(data.values()))) == torch.Tensor:
            assert len(data.keys()) == 1, 'embedding file has multiple terms in it'
            emb = next(iter(data.values()))

        if len(emb.shape) == 1:
            emb = emb.unsqueeze(0)

        self.word_embeddings[name] = emb.detach()
        self.word_embeddings_checksums[name] = f'{const_hash(emb.reshape(-1))&0xffff:04x}'
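For context on the two branches above: a classic textual-inversion embedding wraps its tensor in a 'string_to_param' dict, while an sd-concepts embedding (the diffusers-style files distributed through the Hugging Face concepts library) is a plain {token: tensor} dict with a single entry. The following is a minimal sketch, not part of the commit, that fabricates one file of each format and runs the same branch logic over both; the file names and the 768-dimension embedding size are illustrative assumptions.

    import torch

    # Textual-inversion format: tensor nested under 'string_to_param'.
    torch.save({'string_to_param': {'*': torch.randn(1, 768)}}, 'my-style.pt')

    # sd-concepts format: a plain one-entry {token: tensor} dict, tensor is 1-D.
    torch.save({'<my-concept>': torch.randn(768)}, 'learned_embeds.bin')

    for path in ('my-style.pt', 'learned_embeds.bin'):
        data = torch.load(path)
        if 'string_to_param' in data:            # textual inversion branch
            emb = next(iter(data['string_to_param'].values()))
        elif type(data) == dict and type(next(iter(data.values()))) == torch.Tensor:
            emb = next(iter(data.values()))      # sd-concepts branch
        if len(emb.shape) == 1:
            emb = emb.unsqueeze(0)               # promote 1-D vector to (1, dim)
        print(path, tuple(emb.shape))            # both normalize to (1, 768)

In both cases the loader ends up with a tensor of shape (1, dim), which is why a 1-D sd-concepts vector and a 2-D textual-inversion tensor can share the same word_embeddings table.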