Deffo! I run my models on a 3070 with ComfyUI in low-VRAM mode so it uses my DRAM as well. You need a good amount of DRAM if you're doing it that way, though; I have 64 gigs and still get OOM errors when offloading AI models to DRAM.
The 4070’s 12 gigs of VRAM should cut it though!
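For reference, low-VRAM mode is just a launch flag on a normal ComfyUI install. A rough sketch, assuming you're starting it from a git checkout with the requirements already installed:

    # run from inside the ComfyUI folder; --lowvram splits/offloads model weights to system RAM
    python main.py --lowvram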
Do I have to use this… Docker thing…? I have like zero experience with it.
I highly recommend using Docker; it's probably the easiest way to set it up if you're a Linux or Intel Mac user.
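Roughly what that looks like, as a sketch only: the image name below is a placeholder (swap in whichever ComfyUI image you end up using), the in-container models path depends on that image, GPU passthrough assumes the NVIDIA Container Toolkit is installed, and 8188 is ComfyUI's default port:

    # placeholder image; mount your models folder and expose the default web UI port
    docker run --gpus all -p 8188:8188 -v "$(pwd)/models:/app/models" <comfyui-image>

The web UI is then at http://localhost:8188.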
Alternatively: Comfy.org has the Windows and Apple Silicon versions as executables.
Good to know. Thanks!