Commit Graph

485 Commits

Author SHA1 Message Date
cmdr2
d143e85760 Use the local profile folder for dev console 2023-07-10 12:37:49 +05:30
cmdr2
2612c274d3 sdkit 1.0.119 - another fix for live preview 2023-07-10 10:50:10 +05:30
cmdr2
764ad1b8db sdkit 1.0.118 - fix live preview in img2img 2023-07-10 10:40:00 +05:30
cmdr2
b1dcfbd017 sdkit 1.0.117 - fix directml/tensorrt for diffusers 0.18.1 2023-07-10 09:56:28 +05:30
cmdr2
2b9f5eb627 sdkit 1.0.116 - diffusers 0.18.1 upgrade; compel upgrade; embeddings support; AMD-on-Windows and TensorRT acceleration - preliminary support; faster sdkit import time to improve startup time of ED's UI 2023-07-09 20:45:11 +05:30
cmdr2
2dfa482b24 Update get_config.py 2023-06-30 18:33:16 +05:30
cmdr2
d023fd07b0 Move config.yaml to the root folder of ED 2023-06-30 16:36:24 +05:30
cmdr2
45f99ab48a Include comments in config.yaml even when converting from config.json 2023-06-30 11:53:38 +05:30
cmdr2
26042b1e26 Don't use YAML as a singleton, seems to be stateful; Use ruamel in get_config for consistency 2023-06-30 10:37:25 +05:30
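As an aside, the non-singleton pattern referenced in the commit above can be sketched as follows; the `load_config` name and file path are illustrative, not the repo's actual `get_config.py`:

```python
# Minimal sketch: create a fresh ruamel.yaml.YAML() per call instead of
# sharing one module-level instance, since the YAML object keeps internal
# state between loads/dumps.
from pathlib import Path
from ruamel.yaml import YAML

def load_config(path="config.yaml"):
    yaml = YAML()                 # new instance per call, no shared state
    yaml.preserve_quotes = True   # keep formatting details when re-dumping
    p = Path(path)
    if not p.exists():
        return {}
    with p.open("r", encoding="utf-8") as f:
        return yaml.load(f) or {}
```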
cmdr2
cc475f26f4 sdkit 1.0.115 - check for upcasting precision only if using half-precision (e.g. skip for cpu) 2023-06-28 11:31:52 +05:30
cmdr2
f252ca75e9 temp rollback 2023-06-28 10:52:54 +05:30
cmdr2
05b608831c sdkit 1.0.113 - check for upcasting precision only if using half-precision (e.g. skip for cpu) 2023-06-28 10:33:41 +05:30
cmdr2
913550295c Install ruamel.yaml 0.17.21 2023-06-26 17:01:01 +05:30
cmdr2
af7073d9b6 Merge branch 'beta' into yaml 2023-06-26 16:57:35 +05:30
cmdr2
13056f87d3 Merge pull request #1342 from ManInDark/beta
Prevent UI freeze caused by intensive prompts
2023-06-26 16:25:18 +05:30
cmdr2
817436b65c Merge pull request #1361 from JeLuF/patch-28
Run dev console in ED directory
2023-06-26 15:53:38 +05:30
JeLuF
c9a5ad9c3a Update Developer Console.cmd 2023-06-26 12:20:34 +02:00
cmdr2
c480b615ce Merge pull request #1363 from JeLuF/comspec
Show COMSPEC variable in logs
2023-06-26 15:41:13 +05:30
cmdr2
c74be07c33 sdkit 1.0.112 - fix broken inpainting in low vram mode 2023-06-24 15:46:03 +05:30
cmdr2
4dd1a46efa sdkit 1.0.111 - don't apply a negative lora when testing a newly loaded SD model 2023-06-24 15:21:13 +05:30
cmdr2
d9bddffc42 sdkit 1.0.110 - don't offload latent upscaler to the CPU if not running on a GPU 2023-06-23 21:42:11 +05:30
JeLuF
a5898aaf3b Show COMSPEC variable in logs 2023-06-22 23:54:45 +02:00
JeLuF
7811929b5b Run dev console in ED directory 2023-06-22 01:15:07 +02:00
cmdr2
aac9acf068 sdkit 1.0.109 - auto-set fp32 attention precision in diffusers if required 2023-06-20 10:49:34 +05:30
cmdr2
2a5b3040e2 sdkit 1.0.108 - potential fix for multi-gpu bug while rendering - the sampler instances weren't thread-local 2023-06-19 19:58:17 +05:30
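The thread-local pattern this fix refers to looks roughly like the sketch below; `make_sampler` is a hypothetical factory, not sdkit's actual code:

```python
# Illustrative only: give each render thread (e.g. one per GPU) its own
# sampler instance, so parallel threads don't share and corrupt state.
import threading

_thread_data = threading.local()

def get_sampler(make_sampler):
    if not hasattr(_thread_data, "sampler"):
        _thread_data.sampler = make_sampler()   # created once per thread
    return _thread_data.sampler
```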
cmdr2
2c4cd21c8f sdkit 1.0.107 - fix a bug where low VRAM usage mode wasn't working with multiple GPUs 2023-06-16 16:46:32 +05:30
ManInDark
ed59972b03 Changed all links as mentioned in #1339 2023-06-14 11:57:06 +02:00
cmdr2
eb96bfe8a4 sdkit 1.0.106 - fix errors with multi-gpu in low vram mode 2023-06-13 13:39:23 +05:30
cmdr2
3037cceab3 Merge branch 'beta' of github.com:cmdr2/stable-diffusion-ui into beta 2023-06-12 17:22:29 +05:30
cmdr2
9a81d17d33 Fix for multi-gpu bug in codeformer 2023-06-12 16:57:36 +05:30
JeLuF
f83af28e42 Set PYTHONNOUSERSITE=y in dev console
Make behaviour consistent with on_env_start.sh
2023-06-11 21:12:22 +02:00
cmdr2
48edce72a9 Log the version numbers of only a few important modules 2023-06-07 16:38:15 +05:30
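A minimal sketch of logging just a handful of key package versions instead of a full dependency dump (the module list is illustrative):

```python
from importlib.metadata import PackageNotFoundError, version

for pkg in ("torch", "torchvision", "sdkit", "diffusers"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```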
cmdr2
9a0031c47b Don't copy check_models.py, it doesn't exist anymore 2023-06-07 15:21:16 +05:30
cmdr2
0d8e73b206 sdkit 1.0.104 - Not all pipelines have vae slicing 2023-06-07 15:10:57 +05:30
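A guard like the one sketched below handles pipelines that don't expose VAE slicing; the helper name is illustrative:

```python
def enable_vae_slicing_if_available(pipe):
    # Not every diffusers pipeline class implements enable_vae_slicing(),
    # so check before calling it.
    if hasattr(pipe, "enable_vae_slicing"):
        pipe.enable_vae_slicing()
```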
cmdr2
4b36ca75cb Merge pull request #1313 from JeLuF/cloudflared
Share ED via Cloudflare's ArgoTunnel
2023-06-05 16:20:40 +05:30
cmdr2
b14653cb9e sdkit 1.0.103 - Pin the versions of diffusers models used; Use cpu offloading for balanced and low while upscaling using latent upscaler 2023-06-05 16:11:48 +05:30
cmdr2
a10aa92634 Fix a bug where the realesrgan model would get unloaded after the first request in a batch while using Codeformer with upscaling of faces 2023-06-05 15:08:57 +05:30
cmdr2
dd95df8f02 Refactor the default model download code, remove check_models.py, don't check the legacy paths since those are already migrated during initialization; Download CodeFormer's model only when it's used for the first time 2023-06-02 16:34:29 +05:30
cmdr2
0860e35d17 sdkit 1.0.101 - CodeFormer as an option to improve faces 2023-06-01 16:50:01 +05:30
JeLuF
2080d6e27b Share ED via Cloudflare's ArgoTunnel
Shares the Easy Diffusion instance via https://try.cloudflare.com/
2023-05-28 00:50:23 +02:00
cmdr2
a0b3b5af53 sdkit 1.0.98 - seamless tiling 2023-05-25 15:36:27 +05:30
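One common way to implement seamless tiling (not necessarily sdkit's exact approach) is to switch the pipeline's convolution layers to circular padding:

```python
import torch.nn as nn

def make_seamless(pipe, enabled=True):
    # Circular padding makes the image wrap around at its edges, so the
    # output tiles without visible seams.
    mode = "circular" if enabled else "zeros"
    for model in (pipe.unet, pipe.vae):
        for module in model.modules():
            if isinstance(module, nn.Conv2d):
                module.padding_mode = mode
```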
cmdr2
29ec34169c Merge branch 'beta' of github.com:cmdr2/stable-diffusion-ui into beta 2023-05-22 18:07:00 +05:30
cmdr2
d60cb61e58 sdkit 1.0.97 - flatten arguments sent to latent upscaler 2023-05-22 18:06:38 +05:30
JeLuF
0f6caaec33 get_config: return default value if conf file is corrupted 2023-05-22 10:21:19 +02:00
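The fallback behaviour can be sketched as below (function signature and default value are illustrative):

```python
from ruamel.yaml import YAML

def get_config(path="config.yaml", default=None):
    default = {} if default is None else default
    try:
        with open(path, "r", encoding="utf-8") as f:
            return YAML().load(f) or default
    except Exception:
        # missing or corrupted config file: fall back to the default
        return default
```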
cmdr2
bdf36a8dab sdkit 1.0.96 - missing xformers import 2023-05-19 18:36:37 +05:30
cmdr2
107323d8e7 sdkit 1.0.95 - lower vram usage for high mode 2023-05-19 17:42:47 +05:30
cmdr2
83557d4b3c Merge branch 'beta' of github.com:cmdr2/stable-diffusion-ui into beta 2023-05-19 17:29:20 +05:30
cmdr2
415213878d sdkit 1.0.94 - vram optimizations - perform softmax in half precision 2023-05-19 17:28:54 +05:30
JeLuF
b77036443f Fail gracefully if proc access isn't possible 2023-05-18 16:04:28 +02:00
cmdr2
7562a882f4 sdkit 1.0.93 - lower vram usage for balanced mode, by using attention slice of 1 2023-05-16 16:02:20 +05:30
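In diffusers, attention slicing is toggled per pipeline; the mode-to-slice mapping below is only illustrative, not sdkit's exact mapping:

```python
def set_attention_slicing(pipe, mode):
    if mode == "low":
        pipe.enable_attention_slicing(1)       # smallest slices, least VRAM, slowest
    elif mode == "balanced":
        pipe.enable_attention_slicing("auto")  # library-chosen slice size
    else:
        pipe.disable_attention_slicing()       # full attention, most VRAM
```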
cmdr2
45db4bb036 sdkit 1.0.92 - more vram optimizations for low,balanced,high - reduces VRAM usage by 20% (especially with larger images) 2023-05-12 16:49:13 +05:30
cmdr2
add05228bd sdkit 1.0.91 - use slice size 1 for low vram usage mode, to reduce VRAM usage 2023-05-11 16:30:06 +05:30
cmdr2
566a83ce3f sdkit 1.0.89 - use half precision in test diffusers for low vram usage mode 2023-05-11 14:49:15 +05:30
cmdr2
2d1be6186e sdkit 1.0.88 - Fix LoRA in low VRAM mode 2023-05-10 20:19:17 +05:30
cmdr2
64cfd55065 sdkit 1.0.87 - typo 2023-05-04 16:31:40 +05:30
cmdr2
f9cfe1da45 sdkit 1.0.86 - don't use cpu offload for mps/mac, doesn't make sense since the memory is shared between GPU/CPU 2023-05-04 16:09:28 +05:30
cmdr2
b27a14b1b4 sdkit 1.0.85 - torch.Generator fix for mps/mac 2023-05-04 16:04:45 +05:30
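The usual workaround at the time was to seed generators on the CPU when running on Apple's MPS backend; the helper below is a sketch, not necessarily sdkit's exact fix:

```python
import torch

def make_generator(seed, device):
    # Older torch builds couldn't create a torch.Generator on "mps",
    # so fall back to a CPU generator there.
    gen_device = "cpu" if device == "mps" else device
    return torch.Generator(device=gen_device).manual_seed(seed)
```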
cmdr2
75f0780bd1 sdkit 1.0.84 - VRAM optimizations for the diffusers version 2023-05-03 16:12:11 +05:30
cmdr2
07f52c38ef sdkit 1.0.83 - formatting 2023-04-28 16:35:30 +05:30
cmdr2
a46ff731d8 sdkit 1.0.82 - VAE slicing for pytorch 2.0, don't fail to hash files smaller than 3 MB 2023-04-28 16:03:35 +05:30
cmdr2
6a6ea5009a Merge pull request #1182 from JeLuF/get_config
Don't write config.bat and config.sh any more
2023-04-26 16:35:52 +05:30
cmdr2
24d0e7566f Copy get_config.py in on_sd_start for the first run, when on_env_start hasn't yet been updated 2023-04-26 16:34:27 +05:30
cmdr2
fe8c208e7c Copy get_config.py in on_sd_start for the first run, when on_env_start hasn't yet been updated 2023-04-26 16:33:43 +05:30
JeLuF
9399fb5371 Don't use python packages from the user's home directory
PYTHONNOUSERSITE is required to ignore packages installed to `/home/user/.local/`. Since these folders are outside of our control, they can cause conflicts in ED's python env.

https://discord.com/channels/1014774730907209781/1100375010650103808

Fixes #1193
2023-04-25 21:02:36 +02:00
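Setting the variable before launching the interpreter is enough; a minimal sketch of the idea (the script path is illustrative):

```python
import os
import subprocess
import sys

# Any non-empty value makes Python skip the user site-packages
# (e.g. /home/user/.local/lib/pythonX.Y/site-packages).
env = dict(os.environ, PYTHONNOUSERSITE="y")
subprocess.run([sys.executable, "ui/server.py"], env=env, check=True)
```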
cmdr2
3ae851ab1f Revert "Revert "Stop messing with %USERPROFILE%"" 2023-04-24 14:32:18 +05:30
cmdr2
6fbb24ae3d Revert "Stop messing with %USERPROFILE%" 2023-04-24 14:30:52 +05:30
JeLuF
bb607927d0 Stop messing with %USERPROFILE%
Set HF_HOME, so that the models don't get downloaded again.
2023-04-23 12:54:20 +02:00
cmdr2
5acf5949a6 sdkit 1.0.81 - use tf32 = True for ampere GPUs 2023-04-22 15:42:24 +05:30
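The TF32 switch amounts to two PyTorch flags; a minimal sketch:

```python
import torch

# TF32 uses Ampere+ tensor cores for float32 matmuls/convolutions:
# noticeably faster, with a small precision trade-off.
if torch.cuda.is_available():
    torch.backends.cuda.matmul.allow_tf32 = True
    torch.backends.cudnn.allow_tf32 = True
```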
cmdr2
3d740555c3 Force mac to downgrade from torch 2.0 2023-04-22 14:54:52 +05:30
cmdr2
eb16296873 Restrict AMD cards on Linux to torch 1.13.1 and ROCm 5.2. Avoids black images on some AMD cards. Temp hack until AMD works properly on torch 2.0 2023-04-21 19:08:51 +05:30
cmdr2
1864921d1d Don't copy bootstrap.bat unnecessarily 2023-04-21 16:09:32 +05:30
cmdr2
2e84a421f3 Show sdkit installation progress during the first run 2023-04-21 15:49:38 +05:30
cmdr2
fea77e97a0 actually fix the img2img error in the new diffusers version 2023-04-21 15:26:14 +05:30
cmdr2
e1b6cc2a86 typo 2023-04-21 15:13:29 +05:30
cmdr2
0921573644 sdkit 1.0.78 - fix the 'astype' error with the new diffusers version 2023-04-21 15:11:26 +05:30
JeLuF
5eec05c0c4 Don't write config.bat and config.sh any more 2023-04-21 00:09:27 +02:00
cmdr2
526fc989c1 Allow any version of torch/torchvision 2023-04-20 18:40:45 +05:30
cmdr2
023b78d1c9 Allow rocm5.2 2023-04-20 17:46:34 +05:30
cmdr2
670410b539 sdkit 1.0.77 - fix inpainting bug on diffusers 2023-04-20 17:42:12 +05:30
cmdr2
76e379d7e1 Don't install xformers, it downgrades the torch version. Still need to fix this 2023-04-20 17:07:10 +05:30
cmdr2
6c148f1791 Don't install xformers for AMD on Linux; changelog 2023-04-20 16:48:38 +05:30
cmdr2
534bb2dd84 Use xformers 0.0.16 to speed up image generation 2023-04-20 16:44:06 +05:30
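With the xformers package installed, diffusers pipelines can opt into its memory-efficient attention kernels; a hedged sketch (the helper name is illustrative):

```python
def try_enable_xformers(pipe):
    try:
        pipe.enable_xformers_memory_efficient_attention()
    except Exception:
        # xformers missing or incompatible: keep the default attention
        pass
```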
cmdr2
d0f4476ba5 Suggest downloading a model in the troubleshooting steps. Thanks JeLuF 2023-04-20 16:22:42 +05:30
cmdr2
6287bcd00a sdkit 1.0.76 - use 256 as the tile size for realesrgan, instead of 128. slightly more VRAM, but faster upscaling 2023-04-20 16:17:27 +05:30
cmdr2
cb527919a2 sdkit 1.0.75 - upgrade to diffusers 0.15.1 2023-04-19 16:45:28 +05:30
cmdr2
83c34ea52f Remove unnecessary hotfix 2023-04-19 16:31:04 +05:30
cmdr2
35c75115de Log errors during module and model initialization 2023-04-19 16:20:08 +05:30
cmdr2
7c75a61700 Typo 2023-04-19 16:15:15 +05:30
cmdr2
34ea49147c Update the check_models.py script during startup 2023-04-19 16:13:29 +05:30
cmdr2
c1e8637a9f Re-implement the code for downloading models in python. Save some eyeballs from bleeding 2023-04-19 16:11:16 +05:30
cmdr2
becbef4fac Include ROCm in the list of allowed versions 2023-04-18 17:36:52 +05:30
cmdr2
bf3df097b8 Don't use ROCm on Linux if an NVIDIA card is present 2023-04-18 17:14:24 +05:30
cmdr2
30a133bad9 Allow torch 1.11 to continue being installed 2023-04-18 16:10:46 +05:30
cmdr2
d8d44c579c Typo 2023-04-18 15:43:56 +05:30
cmdr2
80384e6ee1 Install PyTorch 2.0 by default, but allow existing PyTorch 1.13.1 installations to continue running; Unify and streamline the installation of dependencies 2023-04-18 15:42:33 +05:30
cmdr2
0898f98355 Merge branch 'beta' of github.com:cmdr2/stable-diffusion-ui into beta 2023-04-18 15:05:53 +05:30
Diana
e7dc41e271 Automatic AMD GPU detection on Linux (#1078)
* Automatic AMD GPU detection on Linux

Automatically detects AMD GPUs and installs the ROCm version of PyTorch instead of the cuda one

A later improvement may be to detect the GPU ROCm version and handle GPUs that don't work on upstream ROCm, either because they're too old and need a special patched version, or too new and need `HSA_OVERRIDE_GFX_VERSION=10.3.0` added, possibly check through `rocminfo`?

* Address stdout suppression and download failure

* If any NVIDIA GPU is found, always use it

* Use /proc/bus/pci/devices to detect GPUs

* Fix comparisons

`-eq` and `-ne` only work for numbers

* Add back -q

---------

Co-authored-by: JeLuF <jf@mormo.org>
2023-04-18 15:02:39 +05:30
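The detection itself lives in the shell installer; a rough Python equivalent of the idea, assuming PCI vendor IDs 0x10de (NVIDIA) and 0x1002 (AMD) and the usual /proc/bus/pci/devices field layout:

```python
def detect_gpu_vendor(path="/proc/bus/pci/devices"):
    vendors = set()
    try:
        with open(path) as f:
            for line in f:
                fields = line.split()
                if len(fields) > 1:
                    vendors.add(fields[1][:4].lower())  # vendor id (4 hex digits)
    except OSError:
        return "unknown"   # /proc not readable; fail gracefully
    if "10de" in vendors:
        return "nvidia"    # prefer NVIDIA/CUDA if any NVIDIA GPU is present
    if "1002" in vendors:
        return "amd"       # install the ROCm build of torch instead
    return "none"
```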
cmdr2
0f0f475241 Merge branch 'beta' of github.com:cmdr2/stable-diffusion-ui into beta 2023-04-18 14:44:23 +05:30
cmdr2
127ee68486 Merge pull request #1171 from JeLuF/p0417
Don't download 1.4 if other models are available
2023-04-18 14:43:45 +05:30
JeLuF
44824fb5f9 Don't download 1.4 if other models are available 2023-04-17 23:22:44 +02:00