Hello everyone. I am Dr. Furkan Gözükara, PhD Computer Engineer. SECourses is a dedicated YouTube channel for the following topics: Tech, AI, News, Science, Robotics, Singularity, ComfyUI, SwarmUI, ML, Artificial Intelligence, Humanoid Robots, Wan 2.2, FLUX, Krea, Qwen Image, VLMs, Stable Diffusion
Regardless of why you pulled the model, it's probably best to verify the hashes below so that people don't end up downloading anything malicious. Below is the version I dis...
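For anyone who wants to check, here is a minimal Python sketch for computing a file's SHA-256 so you can compare it against the hash the uploader published (the file name is just a placeholder):

```python
import hashlib

def sha256_of(path, chunk_size=1024 * 1024):
    # Read the file in chunks so large checkpoints don't need to fit in RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# "model.safetensors" is a placeholder; point this at the file you downloaded.
print(sha256_of("model.safetensors"))
```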
@Dr. Furkan Gözükara Hi, I have followed your tutorial "Easiest Way to Install & Run Stable Diffusion Web UI on PC by Using Open Source Automatic Installer". The web UI boots up, but when I try to generate an image, I get the error: safetensors_rust.SafetensorError: Error while deserializing header: MetadataIncompleteBuffer
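That MetadataIncompleteBuffer error usually means the .safetensors checkpoint did not finish downloading and is truncated. A quick sketch to confirm, assuming the safetensors package is installed (the path below is a placeholder for your model file):

```python
from safetensors import safe_open

try:
    # Only reads the header; a complete file will list its tensor names.
    with safe_open("model.safetensors", framework="pt", device="cpu") as f:
        print("Header OK,", len(f.keys()), "tensors listed")
except Exception as exc:
    # A truncated download fails here with the same deserialization error;
    # the fix is to delete the file and download it again.
    print("File looks incomplete or corrupt, re-download it:", exc)
```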
Hey! Thanks for the tutorial on running Kohya on RunPod. I'm getting the ModuleNotFoundError: No module named 'tkinter' error. When I follow the further instructions, I get these errors:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
clean-fid 0.1.29 requires requests==2.25.1, but you have requests 2.28.2 which is incompatible.
open-clip-torch 2.7.0 requires protobuf==3.20.0, but you have protobuf 3.19.6 which is incompatible.
tb-nightly 2.14.0a20230511 requires google-auth-oauthlib<1.1,>=0.5, but you have google-auth-oauthlib 0.4.6 which is incompatible.
tb-nightly 2.14.0a20230511 requires tensorboard-data-server<0.8.0,>=0.7.0, but you have tensorboard-data-server 0.6.1 which is incompatible.
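Those pip conflicts are often harmless warnings; the missing tkinter is the actual blocker. A small sketch to check whether the interpreter that launches the Kohya GUI can actually see tkinter (the python3-tk package name comes from the reply below):

```python
# Run this with the same Python interpreter that launches the Kohya GUI.
try:
    import tkinter
    print("tkinter OK, Tcl/Tk version:", tkinter.TkVersion)
except ImportError as exc:
    # On Ubuntu-based pods the usual fix is the OS package python3-tk
    # (apt-get update && apt-get install -y python3-tk), then relaunch the GUI.
    print("tkinter is missing from this interpreter:", exc)
```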
Thanks for getting back! When I run the suggested command I get "python3-tk is already the newest version (3.8.10-0ubuntu1~20.04). 0 upgraded, 0 newly installed, 0 to remove and 46 not upgraded." BUT, when I start the Kohya GUI I don't get any errors any longer. When I click on the local URL I get "Unable to connect" in all browsers. Any further tips?
It runs on port 7861, right? You will need to expose that port. You can either give your pod a public IP and expose 7861 as a TCP port, then connect on the public IP and public port (7861 is only the internal port), or you can set it as an HTTP port and use the RunPod proxy to access it. Note that you will need to create a new pod if you didn't expose the port when your pod was created.
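For context, the "Unable to connect" happens because the Gradio local URL points at 127.0.0.1 inside the pod, which your own browser can't reach. A minimal, hypothetical Gradio sketch (not the Kohya launcher itself) showing the binding that matters: listening on 0.0.0.0 on the exposed port is what lets the RunPod TCP mapping or HTTP proxy reach the app.

```python
import gradio as gr

# Hypothetical demo app. server_name="127.0.0.1" would only be reachable
# from inside the pod; "0.0.0.0" on the exposed port (7861 here) is reachable
# through RunPod's public TCP mapping or its HTTP proxy.
demo = gr.Interface(fn=lambda text: text, inputs="text", outputs="text")
demo.launch(server_name="0.0.0.0", server_port=7861)
```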