Why such high dims? They're not really needed as much anymore. I have a SAEHD 12.29 model that's 192 res with default dims. However, both 12.26 and 12.29 seem to crash frequently with GPU issues regardless of batch size (RAM used), so it's taking forever to train; I'm only at 44k iterations. And it just crashed again as I was typing this.