Hello everyone. I am Dr. Furkan Gözükara, PhD Computer Engineer. SECourses is a YouTube channel dedicated to the following topics: Tech, AI, News, Science, Robotics, Singularity, ComfyUI, SwarmUI, ML, Artificial Intelligence, Humanoid Robots, Wan 2.2, FLUX, Krea, Qwen Image, VLMs, Stable Diffusion
Why still use SDXL and SD 1.5? FLUX is here now. I'm waiting for someone to make FLUX as optimized as SD 1.5 and SDXL so that I can get similar generation speeds to SD 1.5 and SDXL...
If you want the best local model, use Flux Dev, but not the FP8 version! Pair it with the T5 FP16 text encoder, as Alex said above. It takes a little while until everything is loaded into memory and you can generate images, but after that it is actually quite fast. If you change something in the prompt, the first generation takes a little longer again. I run it this way with WebUI Forge and SwarmUI. We just need more VRAM.
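For anyone who prefers scripting this instead of a UI, here is a minimal sketch of the same idea in Python with the diffusers library (my assumption; the comment above is about Forge/SwarmUI): load FLUX.1-dev in full bf16 with no FP8 quantization, so the T5 text encoder also stays in half precision rather than FP8.

```python
# Minimal sketch (assumes the diffusers library, not Forge/SwarmUI):
# FLUX.1-dev loaded in bf16, no FP8 cast on the transformer or the T5 encoder.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,  # full-precision-ish weights, no FP8
)
pipe.enable_model_cpu_offload()  # optional: trades speed for lower VRAM use

image = pipe(
    "a photo of a red fox in the snow",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("fox.png")
```

The first call is slow while everything loads, which matches the behavior described above; subsequent generations with the same prompt setup are much faster.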
What's the fastest way to swap between loras for use in Flux dev generations? Like Pieter Levels's photoai or danny postmaa with headshotpro, how are they swapping between the loras for all their thousands of users on the fly? I've read in the past that Pieter at least uses replicate. Are they spinning up an instance, loading something like a comfyui environment, and downloading the lora to the server every single time? There's gotta be a faster/more efficient way
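I don't know what PhotoAI or HeadshotPro actually run behind Replicate, but one common pattern (an assumption, not a claim about their stack) is to keep the base model resident in GPU memory and hot-swap only the LoRA weights per request, which is cheap compared to reloading the whole pipeline. A rough diffusers sketch:

```python
# Sketch of per-request LoRA hot-swapping on a resident FLUX pipeline.
# Assumes diffusers with the PEFT backend; paths and names are illustrative.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

def generate_for_user(prompt: str, lora_path: str, adapter_name: str):
    # Load this user's LoRA, generate, then unload so the next request starts clean.
    pipe.load_lora_weights(lora_path, adapter_name=adapter_name)
    pipe.set_adapters([adapter_name], adapter_weights=[1.0])
    image = pipe(prompt, num_inference_steps=28, guidance_scale=3.5).images[0]
    pipe.unload_lora_weights()
    return image

img = generate_for_user(
    "portrait photo of TOK person",        # TOK = whatever trigger token the LoRA was trained with
    "loras/user_123.safetensors",          # hypothetical per-user LoRA file
    "user_123",
)
img.save("user_123.png")
```

With a setup like this the only per-request cost is loading a small safetensors file, not downloading models or spinning up a fresh environment.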
Have you guys seen this? It trains on an image of a product so it can place that product into another image. I think it can only do this if the image used was generated by their model.
I haven't personally tested it, but I figure the logic would be something like using a dataset of the product with the backgrounds removed: plain backgrounds, various angles of the product, and no faces whatsoever in any of the images.
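If someone wanted to try building that kind of dataset, here is a rough prep sketch. The rembg library is my assumption as the background-removal tool; nothing in the discussion names a specific one.

```python
# Dataset-prep sketch: strip backgrounds from a folder of product photos and
# composite each cutout onto a plain white background.
from pathlib import Path
from PIL import Image
from rembg import remove

src = Path("product_photos")   # various angles of the product, no faces
dst = Path("product_dataset")
dst.mkdir(exist_ok=True)

for path in src.glob("*.jpg"):
    img = Image.open(path)
    cutout = remove(img)                          # RGBA image with background removed
    flat = Image.new("RGB", cutout.size, "white")
    flat.paste(cutout, mask=cutout.split()[-1])   # paste using the alpha channel as mask
    flat.save(dst / f"{path.stem}.png")
```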
I haven't tried it either. However, what about giving a rare token word (xmycompanylogo) a try: train on 10 images of just the logo you want to see. After training, type a prompt like "a woman wears a green shirt with xmycompanylogo on it." You may also want to extract the training result as a LoRA.
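For the captioning side of that rare-token idea, a tiny sketch like the one below could help: most LoRA trainers pick up the trigger token from per-image caption files. The folder layout, file naming, and caption wording here are illustrative assumptions, not a fixed convention.

```python
# Write one .txt caption per logo image containing the rare trigger token.
from pathlib import Path

trigger = "xmycompanylogo"
dataset = Path("logo_dataset")  # ~10 images of just the logo

for img in dataset.glob("*.png"):
    caption = f"{trigger}, a plain logo on a white background"
    img.with_suffix(".txt").write_text(caption)
```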