Deffo! I run my models on a 3070 with ComfyUI in low-VRAM mode so it spills over into system RAM. You need a good amount of RAM if you're doing it that way, though; I have 64 GB and still hit OOM errors when offloading AI models to it.
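For anyone wanting to replicate that setup, a quick sketch of the launch flags (assuming a standard git checkout of ComfyUI; exact offloading behavior can vary between versions):

```shell
# Launch ComfyUI in low-VRAM mode: model weights sit in system RAM
# and are streamed to the GPU as needed.
cd ComfyUI
python main.py --lowvram

# If you still OOM, --novram keeps everything in system RAM (much slower).
python main.py --novram
```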
The 4070's 12 GB of VRAM should cut it, though!
I highly recommend Docker; it's probably the easiest way to set it up if you're on Linux or an Intel Mac.
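A rough sketch of what the Docker route looks like (the image name below is a placeholder, not an official image; swap in whichever community ComfyUI image you pick, and note `--gpus all` needs the NVIDIA Container Toolkit installed on the host):

```shell
# 8188 is ComfyUI's default web UI port; the volume mount keeps
# your model checkpoints on the host so they survive the container.
docker run --rm -it --gpus all -p 8188:8188 \
  -v "$PWD/models:/app/models" \
  your-comfyui-image:latest
```

Then open http://localhost:8188 in your browser.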
Alternatively, Comfy.org ships the Windows and Apple Silicon versions as executables.