MrDeepFakes Forums



DFL Google colab version on a TPU instead of a GPU

super_bleu

DF Vagrant
Is this possible? If so, is there anybody here who can modify the existing notebook so that it will run on the TPU?
This sounds like it would let us train models at a much, much faster rate. It seems like the obvious next step for the Google Colab fork of DFL.


Here's some info on Google's TPU:

"The TPU has the ability to process 65,536 multiply-and-adds for 8-bit integers every cycle.

As a comparison, consider this:

A CPU can handle tens of operations per cycle
A GPU can handle tens of thousands of operations per cycle
A TPU can handle up to 128,000 operations per cycle
The TPU is several-fold faster than a GPU for neural network computations"
 

deleted1

Guest
I think the devs already thought about it; the problem is the TPU isn't free... it's actually very expensive ($1-3 per hour?)
 

super_bleu

DF Vagrant
Kyuri said:
I think the devs already thought about it; the problem is the TPU isn't free... it's actually very expensive ($1-3 per hour?)

No, it's $free.99. You have the option of running on the GPU or the TPU in Google Colab: just go to Runtime -> Change runtime type -> TPU. The DFL notebook isn't coded to run on the TPU, though, but the option itself is available for free from Google.
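For anyone who wants to verify which runtime they picked, here is a minimal sketch of how a Colab notebook could check for a TPU. It assumes Colab's `COLAB_TPU_ADDR` environment variable, which the (older, TF 1.x-era) TPU runtimes set when you switch the runtime type; the helper function name is mine:

```python
import os

def detect_colab_runtime():
    """Return 'tpu' if a Colab TPU runtime appears to be attached.

    Colab's classic TPU runtimes expose the TPU worker's gRPC address
    in the COLAB_TPU_ADDR environment variable; when it is absent we
    assume a CPU or GPU runtime.
    """
    if os.environ.get("COLAB_TPU_ADDR"):
        return "tpu"
    return "cpu/gpu"

print(detect_colab_runtime())
```

Detecting the TPU is the easy part; the DFL notebook would still need its training code ported to a TPU-capable backend before this buys anything.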
 

deleted1

Guest
But the devs went offline a long time ago... btw I will move to Shadow PC, so ...
 