crackpnt69
DF Vagrant
I'm really hoping it gets support and can utilize the new specs. 24GB seems like a nice jump over my current 1080 Ti when producing my... "art"
HyperactivePear said: As a relative newbie to deepfakes, I'm curious: what impact does more graphics memory have on the quality and speed of the fakes? And do the Tensor Cores on RTX cards make any difference yet to the AI training?
Also, is the 10GB 3080 likely to be a good card for this? I'm planning on upgrading my gaming rig to that this year. I get the feeling that spending twice the price for a 3090 is complete overkill. (To be honest, any modern card is going to beat the 970 I'm currently rocking.)
SPT said:
More VRAM helps with both quality and speed. You can train a higher-resolution model, or use a bigger batch size; how you balance the two is up to you, but both are limited by VRAM.
The number of Tensor Cores and the clock frequency only determine how long one iteration takes, while batch size and maximum resolution depend on VRAM.
That's why a 10GB card isn't such a big improvement if you previously had an 8GB GPU, even one from two generations ago.
Personally, I'm waiting for the rumored Ti versions. Hopefully there will be a 3070 or a 3080 with 16GB of VRAM, which would still be less expensive than a 3090 and almost as future-proof.
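As a rough illustration of the tradeoff (a toy sketch with made-up numbers, not measurements: it assumes per-sample memory grows with the square of the face resolution, and the 90 MB-per-sample baseline at 128px is purely illustrative; real usage depends heavily on model architecture and settings):

```python
def fits_in_vram(vram_gb, batch_size, resolution, per_sample_mb_at_128=90):
    """Toy check: does a training batch fit in VRAM?

    Assumes per-sample memory scales with (resolution / 128) ** 2.
    The 90 MB baseline at 128px is an illustrative guess, not a measurement.
    """
    per_sample_mb = per_sample_mb_at_128 * (resolution / 128) ** 2
    return batch_size * per_sample_mb / 1024 <= vram_gb

# Largest batch that "fits" at 256px under these toy numbers:
max_batch_10gb = max(b for b in range(1, 129) if fits_in_vram(10, b, 256))
max_batch_24gb = max(b for b in range(1, 129) if fits_in_vram(24, b, 256))
print(max_batch_10gb, max_batch_24gb)
```

Under these toy numbers, the 24GB card allows a much larger batch at the same resolution; equivalently, you could spend that headroom on a higher resolution at a smaller batch instead.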
HyperactivePear said:
Unfortunately I have the 4GB version, but I understand your point. Thanks.
Would it be fair to say, then, that a 3080 could trade off speed for a higher-resolution model and get similar results to a 3090 at the cost of more training time? Or would you say 10GB is a bit too restrictive compared to the 16/24GB cards?