anirudh2290 commented on a change in pull request #15118: Conversion from FP32
model to Mixed Precision model
URL: https://github.com/apache/incubator-mxnet/pull/15118#discussion_r293172522
##########
File path: python/mxnet/contrib/amp/amp.py
##########
@@ -342,3 +349,320 @@ def unscale(optimizer_or_trainer):
else:
raise TypeError("optimizer_or_trainer should be a Gluon Trainer or "
"an optimizer, instead is %s" %
type(optimizer_or_trainer))
+
+def convert_symbol(sym, target_dtype="float16", target_dtype_ops=None,
+ fp32_ops=None, conditional_fp32_ops=None,
+ excluded_sym_names=None, data_names=None):
+ """Given a symbol object representing a neural network of data type FP32 and target_dtype,
+ add cast layers according to the op lists (target_dtype_ops, fp32_ops,
+ conditional_fp32_ops) if provided, otherwise use the default
+ lists provided by the framework.
+
+ Parameters
+ ----------
+ sym : Symbol
+ FP32 neural network symbol
+ target_dtype : str or numpy, optional defaults to float16
+ currently only supports float16. The target dtype indicates to add cast layers
+ when possible so that lower precision computation can be leveraged.
+ target_dtype_ops : list of strs, optional
+ Override the list of operator names cast to the target_dtype.
+ If None, uses the framework's default list to be cast to target_dtype.
+ fp32_ops : list of strs, optional
+ Override the list of operator names cast to FP32.
+ If None, uses the framework's default list to be cast to FP32.
+ conditional_fp32_ops : list of (string, string, list of string), optional
Review comment:
I think a class may not be needed here. There is no state that needs to be
maintained and acted upon multiple times through the lifetime of a program.
Most users will call these conversion APIs once.
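The statelessness argument can be sketched with a toy example. This is a hypothetical illustration, not the actual amp.py implementation: the names `plan_op_dtypes`, `DEFAULT_TARGET_DTYPE_OPS`, and `DEFAULT_FP32_OPS` are made up here. Because the result depends only on the inputs (the symbol's op names plus the optional override lists), a plain function with default lists covers the one-shot use case with no instance state to carry between calls.

```python
# Hypothetical sketch of a stateless, one-shot conversion plan as a plain
# function. All names in this block are illustrative assumptions and are
# not part of mxnet.contrib.amp.

DEFAULT_TARGET_DTYPE_OPS = {"Convolution", "FullyConnected"}
DEFAULT_FP32_OPS = {"softmax"}

def plan_op_dtypes(op_names, target_dtype="float16",
                   target_dtype_ops=None, fp32_ops=None):
    """Map each op name to the dtype it should run in.

    Mirrors the override semantics described in the docstring above:
    when an override list is None, the framework default list is used.
    """
    target = set(target_dtype_ops if target_dtype_ops is not None
                 else DEFAULT_TARGET_DTYPE_OPS)
    fp32 = set(fp32_ops if fp32_ops is not None else DEFAULT_FP32_OPS)
    plan = {}
    for op in op_names:
        if op in target:
            plan[op] = target_dtype
        elif op in fp32:
            plan[op] = "float32"
        else:
            plan[op] = "unchanged"  # left for later dtype inference
    return plan

# Called once, as most users would; no state survives the call.
print(plan_op_dtypes(["Convolution", "softmax", "Activation"]))
```

Since nothing needs to persist after the call returns, wrapping this in a class would only add ceremony around what is effectively a pure function.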
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services