guliashvili opened a new issue #11560: custom loss function
URL: https://github.com/apache/incubator-mxnet/issues/11560

Hi, I'm fine-tuning ResNet following the https://mxnet.incubator.apache.org/faq/finetune.html tutorial. The tutorial modifies ResNet like this:

```
all_layers = symbol.get_internals()
net = all_layers[layer_name+'_output']
net = mx.symbol.FullyConnected(data=net, num_hidden=num_classes, name='fc1')
net = mx.symbol.SoftmaxOutput(data=net, name='softmax')
new_args = dict({k:arg_params[k] for k in arg_params if 'fc1' not in k})
return (net, new_args)
```

However, I need a different kind of loss function, which works as follows:

1) Apply `softmax` (just `softmax`, not `SoftmaxOutput`) over 100 classes.
2) Compute the dot product between the softmax probabilities and the vector [1, 2, 3, ..., 100]. (This vector should not change during training.)
3) Use `LinearRegressionOutput`.

My implementation, which does not work, is the following:

```
all_layers = symbol.get_internals()
net = all_layers[layer_name+'_output']
net = mx.symbol.FullyConnected(data=net, num_hidden=num_classes, name='fc1')
net = mx.symbol.softmax(data=net, name='softmax_intermediate')
net = mx.symbol.dot(lhs=net, rhs=mx.nd.array(range(1, 101)), name="givi")
net = mx.symbol.LinearRegressionOutput(net, name='softmax')
```

I'm getting this error:

> Traceback (most recent call last):
>   File "modify-resnet-50-base-custom-average.py", line 41, in <module>
>     main(parse_arguments(sys.argv[1:]))
>   File "modify-resnet-50-base-custom-average.py", line 26, in main
>     (new_sym, new_args) = get_fine_tune_model(sym, arg_params, args.num)
>   File "modify-resnet-50-base-custom-average.py", line 19, in get_fine_tune_model
>     net = mx.symbol.dot(lhs=net, rhs= mx.nd.array(range(1,101)), name = "givi")
>   File "<string>", line 71, in dot
> AssertionError: Argument rhs must be Symbol instances, but got
> [   1.    2.    3.    4.    5.    6.    7.    8.    9.   10.   11.   12.
>    13.   14.   15.   16.   17.   18.   19.   20.   21.   22.   23.   24.
>    25.   26.   27.   28.   29.   30.   31.   32.   33.   34.   35.   36.
>    37.   38.   39.   40.   41.   42.   43.   44.   45.   46.   47.   48.
>    49.   50.   51.   52.   53.   54.   55.   56.   57.   58.   59.   60.
>    61.   62.   63.   64.   65.   66.   67.   68.   69.   70.   71.   72.
>    73.   74.   75.   76.   77.   78.   79.   80.   81.   82.   83.   84.
>    85.   86.   87.   88.   89.   90.   91.   92.   93.   94.   95.   96.
>    97.   98.   99.  100.]
> <NDArray 100 @cpu(0)>

Any help would be appreciated. Thank you.
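A possible fix, sketched below: the assertion says `mx.symbol.dot` requires both operands to be Symbols, so the constant vector [1, 2, ..., 100] has to enter the graph as a Symbol rather than an NDArray. One way to do that is with MXNet's `mx.symbol.arange` operator, which produces a constant symbol with no learnable inputs, so the vector stays fixed during training. This is a minimal sketch, not a verified implementation; the `layer_name='flatten0'` default is taken from the finetune tutorial and is an assumption here.

```
import mxnet as mx

def get_fine_tune_model(symbol, arg_params, num_classes, layer_name='flatten0'):
    all_layers = symbol.get_internals()
    net = all_layers[layer_name + '_output']
    net = mx.symbol.FullyConnected(data=net, num_hidden=num_classes, name='fc1')
    net = mx.symbol.softmax(data=net, name='softmax_intermediate')
    # Build the constant vector [1, 2, ..., num_classes] as a Symbol, not an
    # NDArray. arange takes no inputs, so its values never change in training.
    class_values = mx.symbol.arange(start=1, stop=num_classes + 1)
    # (batch, num_classes) . (num_classes,) -> (batch,): the expected class
    # index under the softmax distribution.
    net = mx.symbol.dot(lhs=net, rhs=class_values, name='givi')
    net = mx.symbol.LinearRegressionOutput(data=net, name='softmax')
    new_args = {k: arg_params[k] for k in arg_params if 'fc1' not in k}
    return net, new_args
```

Since `arange` has no inputs, no gradient flows into it and nothing like `BlockGrad` should be needed to keep the vector fixed.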
