roywei commented on a change in pull request #16532: fix dropout gpu seed URL: https://github.com/apache/incubator-mxnet/pull/16532#discussion_r339257797
########## File path: src/operator/rnn-inl.h ##########

@@ -1360,15 +1360,16 @@ class RNNOp {
   // Create Dropout descriptors
   if (param_.p > 0) {
     ctx.requested[rnn_enum::kCuDNNDropoutDescSpace].get_cudnn_dropout_desc
-        (&dropout_desc_, s, 1.0f - param_.p, seed_);
+        (&dropout_desc_, s, 1.0f - param_.p);
   }
   // Only update the probability by passing in a null dropout_states ptr
   DType* dropout_states = NULL;
   size_t dropout_bytes = 0;
+  // use dummy seed as state is null
   CUDNN_CALL(cudnnSetDropoutDescriptor(dropout_desc_,
                                        s->dnn_handle_,
                                        param_.p,  // discard probability
                                        dropout_states, dropout_bytes,
-                                       seed_));
+                                       0));

Review comment:
   OK, and I found another issue: the seed is fixed during resource initialization. So once a dropout layer is created, even if we don't fix the seed, every dropout result will be the same. If we want each forward pass to produce a different result, and that result must still respect the MXNet random seed, the only solution is to fetch the MXNet seed on each forward. Is that right?
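The seeding behavior described in the comment can be sketched in plain Python (a toy model only, not the actual cuDNN/MXNet code; the `DropoutLayer` class and its fields are hypothetical). A layer whose RNG seed is fixed once at construction, as with a dropout descriptor seeded during resource initialization, regenerates the same mask on every forward; drawing a fresh seed from the framework RNG on each forward yields a different mask each time:

```python
import random

class DropoutLayer:
    """Toy dropout layer (hypothetical, for illustration).

    If `seed` is fixed at construction -- mimicking a cuDNN dropout
    descriptor whose RNG state is initialized once -- every forward
    pass regenerates the same mask. With seed=None, a fresh seed is
    drawn from the global RNG on each forward, so masks differ.
    """
    def __init__(self, p, seed=None):
        self.p = p        # drop probability
        self.seed = seed  # fixed seed => deterministic masks

    def forward(self, n):
        # Re-seeding with the same value on each call reproduces the
        # identical mask, analogous to reusing a once-seeded descriptor.
        seed = self.seed if self.seed is not None else random.randrange(2**31)
        rng = random.Random(seed)
        return [0 if rng.random() < self.p else 1 for _ in range(n)]

fixed = DropoutLayer(p=0.5, seed=42)
assert fixed.forward(100) == fixed.forward(100)   # identical masks every time

fresh = DropoutLayer(p=0.5)                       # new seed per forward
assert fresh.forward(100) != fresh.forward(100)   # differ with high probability
```

Under this toy model, making the masks both non-repeating and reproducible under a user-set framework seed requires deriving the per-forward seed from the framework RNG, which is the fix the comment proposes.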