MrDeepFakes Forums



I'm thinking a 3090 might be a game changer, but we shall see.

crackpnt69

DF Vagrant
I'm really hoping it gets support and can utilize the new specs. 24GB seems like a nice jump over my current 1080 Ti when producing my... "art"
 

HyperactivePear

DF Vagrant
As a relative newbie to deepfakes I'm curious: what impact does more graphics memory have on the quality/speed of the fakes? And do the Tensor Cores on RTX cards make any difference yet to the AI training?

Also, is the 10GB 3080 likely to be a good card for this, as I'm planning on upgrading my gaming rig to that this year? I get the feeling that spending twice the price for a 3090 is complete overkill. (To be honest, any modern card is going to beat the 970 I'm currently rocking.)
 

SPT

Moderator
Staff member
Verified Video Creator
HyperactivePear said:
As a relative newbie to deepfakes I'm curious: what impact does more graphics memory have on the quality/speed of the fakes? And do the Tensor Cores on RTX cards make any difference yet to the AI training?

Also, is the 10GB 3080 likely to be a good card for this, as I'm planning on upgrading my gaming rig to that this year? I get the feeling that spending twice the price for a 3090 is complete overkill. (To be honest, any modern card is going to beat the 970 I'm currently rocking.)

More VRAM helps with both quality and speed. It's your choice: you can run a higher-res model or use a bigger batch size, and how you balance the two is up to you, but both depend on VRAM.
The number of Tensor cores and the clock frequency only determine how long one iteration takes; batch size and maximum resolution depend on VRAM.
That's why a 10GB card isn't such a big improvement if you previously had an 8GB GPU, even if it was from two generations ago.
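To put that trade-off in rough numbers, here's a back-of-envelope sketch (this is not DFL's actual memory model; the per-pixel activation constant is invented purely for illustration) of how the largest batch size that fits shrinks as model resolution grows:

```python
# Toy estimate: activation memory in autoencoder-style face-swap training
# scales roughly with batch_size * resolution^2, so doubling resolution
# roughly quarters the batch that fits in the same VRAM.

def max_batch(vram_gb, res, act_bytes_per_pixel=4800):
    """Largest batch that fits, under made-up assumptions.

    act_bytes_per_pixel is a hypothetical constant standing in for
    'activation bytes per input pixel'; real usage depends on the model.
    """
    usable = vram_gb * 1e9 * 0.8               # leave ~20% headroom for weights etc.
    per_sample = res * res * act_bytes_per_pixel
    return max(1, int(usable // per_sample))

for vram in (8, 10, 24):
    print(f"{vram} GB ->", {r: max_batch(vram, r) for r in (128, 256, 448)})
```

The exact numbers are meaningless, but the shape of the result matches the point above: at high resolutions a 10GB card drops to tiny batches while a 24GB card still has room.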

Personally, I'm waiting for the rumored Ti versions. Hopefully there will be a 3070 or a 3080 with 16GB of VRAM, which would still be less expensive than a 3090 and almost as future-proof.
 

HyperactivePear

DF Vagrant
SPT said:
More VRAM helps with both quality and speed. It's your choice: you can run a higher-res model or use a bigger batch size, and how you balance the two is up to you, but both depend on VRAM.
The number of Tensor cores and the clock frequency only determine how long one iteration takes; batch size and maximum resolution depend on VRAM.
That's why a 10GB card isn't such a big improvement if you previously had an 8GB GPU, even if it was from two generations ago.

Personally, I'm waiting for the rumored Ti versions. Hopefully there will be a 3070 or a 3080 with 16GB of VRAM, which would still be less expensive than a 3090 and almost as future-proof.

Unfortunately I have the 4GB version, but I understand your point. Thanks :)

Would it be fair to say, then, that a 3080 could trade speed for a higher-res model and get similar results to a 3090 at the cost of more training time? Or would you say 10GB is a bit too restrictive compared to the 16/24GB cards?
 

angeloshredder

DF Admirer
RTX 3080

[benchmark screenshot: hl2dj4Zh.png]

RTX 3090

[benchmark screenshot: wiKdfnmh.png]
 

SPT

Moderator
Staff member
Verified Video Creator
HyperactivePear said:
Unfortunately I have the 4GB version, but I understand your point. Thanks :)

Would it be fair to say, then, that a 3080 could trade speed for a higher-res model and get similar results to a 3090 at the cost of more training time? Or would you say 10GB is a bit too restrictive compared to the 16/24GB cards?

Yes, that's fair to say, though at a certain point it breaks down: if you're trying a very high resolution, like 450+, you won't realistically be able to train on your card, because it will be stuck at batch 1 or 2 and take a full week or more, whereas the 3090 will still train at batch 6-10 or so thanks to the beefier VRAM. But besides that, a 3080 with 10GB will be a massive improvement over your 4GB card (I thought DFL required 6GB cards at minimum?). Anyway, I want to keep the same card for at least 5 years; that's why I want a 16GB one. A 10GB card, for me, would last more like 2-3 years before you'd think about buying a new one. Which is not bad at all.
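As a rough illustration of the point about batch size and wall-clock time (all numbers here are invented; real per-iteration times vary by model, resolution, and settings), here's how batch size changes the time needed to show the model the same total number of faces:

```python
# Back-of-envelope: time to reach a fixed number of samples seen.
# A bigger batch usually costs somewhat more per iteration, but far
# less than linearly, so large-VRAM cards finish sooner overall.

def days_to_target(samples_target, batch, secs_per_iter):
    """Days of training to show the model samples_target faces."""
    iters = samples_target / batch
    return iters * secs_per_iter / 86400

target = 5_000_000  # arbitrary round number of faces seen

# Hypothetical timings: batch 2 at 0.30 s/iter vs batch 8 at 0.55 s/iter.
print("batch 2:", round(days_to_target(target, 2, 0.30), 1), "days")
print("batch 8:", round(days_to_target(target, 8, 0.55), 1), "days")
```

Under these made-up timings the batch-8 card finishes in roughly half the wall-clock time, which is the gap being described between a VRAM-starved 3080 and a 3090 at very high resolutions.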
 

SlickPWraith

DF Vagrant
I need to buy a card. I sold my 1660 Super in anticipation of the 3070, but now I'm hearing DFL doesn't currently work with the 3000 series, and it's unknown if/when there will be an update. Should I just get a 2070 Super?
 

ssjenforcer

DF Admirer
If you can find a used 2080 Ti cheap, maybe try that? Otherwise the 2070 Super seems like a good choice. Check the benchmark forum for the settings people can get out of that card.

It's unfortunate that TensorFlow doesn't work with the 3080/3090 yet.
 