You must be on CPU. Honestly, because I have to pause training, it normally takes me a week and a half, and again, I have to constantly interrupt training. Also, if you're on CPU, try a smaller batch size; right now 64 seems to be the sweet spot for me. I'll normally do 128 until I get close to a loss of 0.04, then use 64 until the model is trained. Still learning and playing with it, but on my system, for CPU training, 64 gives the best balance of detail and training speed.
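The batch-size schedule above can be sketched as a tiny helper. This is just a minimal illustration of the idea, and the threshold value (0.04) and batch sizes (128/64) are the ones from my post, not anything built into a library; in practice you'd rebuild your data loader with the new batch size when it changes.

```python
# Sketch of the two-stage batch-size schedule described above.
# Assumed values from the post: batch 128 while loss is still high,
# then drop to 64 once loss approaches 0.04.
LOSS_THRESHOLD = 0.04

def pick_batch_size(current_loss: float) -> int:
    """Use the larger batch early on, the smaller one near convergence."""
    return 128 if current_loss > LOSS_THRESHOLD else 64

# Example: check the batch size at a few loss values.
for loss in (0.50, 0.05, 0.04, 0.01):
    print(loss, "->", pick_batch_size(loss))
```

When the returned size changes between epochs, that's the point where you'd pause, switch the batch size, and resume training.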