Use the built-in rules provided by Amazon SageMaker Debugger to analyze tensors emitted while training your models. The Debugger built-in rules monitor common conditions that are critical to the success of a training job. You can call the built-in rules using the Amazon SageMaker Python SDK or the low-level SageMaker API operations. Depending on the deep learning framework of your choice, there are four scopes of validity for the built-in rules, as shown in the following list. https://docs.aws.amazon.com/sagemaker/latest/dg/debugger-built-in-rules.html
vanishing_gradient()
similar_across_runs()
weight_update_ratio()
all_zero()
exploding_tensor()
unchanged_tensor()
loss_not_decreasing()
check_input_images()
dead_relu()
confusion()
tree_depth()
class_imbalance()
overfit()
tensor_variance()
overtraining()
poor_weight_initialization()
saturated_activation()
nlp_sequence_ratio()
stalled_training_rule()
feature_importance_overweight()
create_xgboost_report()
List of built-in rules available in Amazon SageMaker Debugger.