lfcnn.utils package
Submodules
lfcnn.utils.callback_utils module
Utilities to be used with LearningRateSchedulers and MomentumSchedulers. Requires the matplotlib library.
- lfcnn.utils.callback_utils.plot_scheduler(schedulers, max_epoch, log=False, **kwargs)[source]
Plot one or more learning rate or momentum schedulers for visual comparison. Uses the matplotlib library.
- Parameters:
schedulers – Single instance or list of scheduler instances. A scheduler instance can be either a LearningRateScheduler or a MomentumScheduler instance.
max_epoch – Maximum epoch to plot.
log – Whether to scale the y-axis logarithmically.
**kwargs – Passed to plt.plot.
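Conceptually, such a comparison evaluates each scheduler at every epoch up to max_epoch and plots the resulting curves. A minimal pure-Python sketch of that evaluation step, using plain functions as stand-ins for LearningRateScheduler/MomentumScheduler instances (the schedule functions and evaluate_schedulers helper below are illustrative assumptions, not lfcnn API):

```python
# Sketch: evaluate each schedule over the epoch range for visual comparison.
# The plotting itself (plt.plot per curve) is omitted; only the evaluation
# loop is shown. These schedule functions are hypothetical, not lfcnn API.

def exponential_decay(epoch, lr0=0.1, rate=0.5):
    """Illustrative schedule: learning rate halves every 10 epochs."""
    return lr0 * rate ** (epoch / 10)

def step_decay(epoch, lr0=0.1, drop=0.1, every=20):
    """Illustrative schedule: learning rate drops tenfold every 20 epochs."""
    return lr0 * drop ** (epoch // every)

def evaluate_schedulers(schedulers, max_epoch):
    """Return {name: [value per epoch]} for each scheduler."""
    epochs = range(max_epoch)
    return {name: [fn(e) for e in epochs] for name, fn in schedulers.items()}

curves = evaluate_schedulers(
    {"exponential": exponential_decay, "step": step_decay}, max_epoch=40)
print(round(curves["step"][0], 3), round(curves["step"][25], 3))  # 0.1 0.01
```

With the real plot_scheduler, each curve would then be drawn with plt.plot (optionally on a logarithmic y-axis when log=True).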
lfcnn.utils.tf_utils module
lfcnn tensorflow utils
- lfcnn.utils.tf_utils.disable_eager()[source]
Disables TF eager execution. By default in TF >= 2.0, eager execution is enabled.
- lfcnn.utils.tf_utils.mixed_precision_graph_rewrite(opt, loss_scale='dynamic')[source]
Use a graph rewrite to enable mixed precision training. Use with care. The Keras API set_mixed_precision_keras() is the preferred method for mixed precision training.
See also: https://www.tensorflow.org/api_docs/python/tf/train/experimental/enable_mixed_precision_graph_rewrite
- Parameters:
opt (OptimizerV2) – Keras optimizer instance.
loss_scale (str) – Loss scale method. Default: ‘dynamic’.
- Return type:
OptimizerV2
- Returns:
Optimizer with mixed precision graph rewrite enabled.
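The loss_scale='dynamic' setting refers to dynamic loss scaling: the loss is multiplied by a scale factor so small float16 gradients do not underflow, and the factor is lowered when gradients overflow and raised again after a stable run. A minimal pure-Python illustration of that mechanism (the DynamicLossScale class and its thresholds are illustrative assumptions, not lfcnn or TensorFlow code):

```python
import math

# Illustrative sketch of dynamic loss scaling, not the TF implementation:
# on overflow (inf/nan gradients) the scale is halved and the step skipped;
# after a run of stable steps the scale is doubled again.

class DynamicLossScale:
    def __init__(self, initial_scale=2.0 ** 15, growth_interval=2000):
        self.scale = initial_scale
        self.growth_interval = growth_interval
        self._good_steps = 0

    def update(self, grads):
        """Adjust the scale; return False if the step should be skipped."""
        overflow = any(math.isinf(g) or math.isnan(g) for g in grads)
        if overflow:
            self.scale /= 2          # back off after overflow
            self._good_steps = 0
            return False             # skip this step's weight update
        self._good_steps += 1
        if self._good_steps >= self.growth_interval:
            self.scale *= 2          # grow again after a stable run
            self._good_steps = 0
        return True

scaler = DynamicLossScale(initial_scale=1024.0, growth_interval=3)
scaler.update([float('inf')])        # overflow: scale halves to 512.0
for _ in range(3):
    scaler.update([0.1, -0.2])       # three stable steps: scale doubles
print(scaler.scale)                  # 1024.0
```

In TensorFlow itself this logic lives inside the mixed precision optimizer wrapper; the sketch only mirrors the adjust-on-overflow behaviour that the 'dynamic' setting selects.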
- lfcnn.utils.tf_utils.set_mixed_precision_keras(policy='mixed_float16', loss_scale='dynamic')[source]
Set to use the Keras mixed precision API. Simply call at the beginning of your script.
Module contents
LFCNN utilities.