cuda-convnet - issue #8

Getting nan values while using response normalization layer


Posted on May 8, 2013 by Happy Panda

The network trains fine without any contrast normalization layer (of any type), but once I add a contrast normalization layer, the net produces NaN values after several iterations. I tried different values for the size, scale, and pow parameters, and tried placing the layer both before and after the pooling layer.
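
For reference, here is a rough sketch of the kind of layer definition and parameter entries I mean, following the usual cuda-convnet split between layers.cfg and layer-params.cfg. The section and input names ([rnorm1], conv1) and the numeric values are only placeholders, not my exact config:

```
# layers.cfg -- layer definition (placeholder names/values)
[rnorm1]
type=cmrnorm        # also tried type=rnorm and type=cnorm
inputs=conv1
channels=32
size=5              # normalization neighborhood size

# layer-params.cfg -- normalization parameters for the same layer
[rnorm1]
scale=0.0001        # the "scale" value I varied
pow=0.75            # the "pow" (exponent) value I varied
```

The NaNs appear with configurations along these lines regardless of which values I pick for size, scale, and pow.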

Comment #1

Posted on Sep 13, 2013 by Swift Bird

I have a similar issue. Does it still work when you remove the contrast normalization layer?

Status: New

Labels:
Type-Defect Priority-Medium