LearningRateMonitor¶
- class lightning.pytorch.callbacks.LearningRateMonitor(logging_interval=None, log_momentum=False, log_weight_decay=False)[source]¶
Bases: Callback
Automatically monitors and logs the learning rate for learning rate schedulers during training.
- Parameters:
logging_interval¶ (Optional[Literal['step', 'epoch']]) – set to 'epoch' or 'step' to log the lr of all optimizers at the same interval, set to None to log at individual intervals according to the interval key of each scheduler. Defaults to None.
log_momentum¶ (bool) – option to also log the momentum values of the optimizer, if the optimizer has the momentum or betas attribute. Defaults to False.
log_weight_decay¶ (bool) – option to also log the weight decay values of the optimizer. Defaults to False.
- Raises:
MisconfigurationException – If logging_interval is none of "step", "epoch", or None.
Example:
>>> from lightning.pytorch import Trainer
>>> from lightning.pytorch.callbacks import LearningRateMonitor
>>> lr_monitor = LearningRateMonitor(logging_interval='step')
>>> trainer = Trainer(callbacks=[lr_monitor])
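The momentum and weight-decay options from the parameter list above can be enabled the same way; a minimal sketch using only the documented constructor arguments:
>>> lr_monitor = LearningRateMonitor(
...     logging_interval='epoch',
...     log_momentum=True,      # also log momentum/betas if the optimizer has them
...     log_weight_decay=True,  # also log each optimizer's weight decay
... )
>>> trainer = Trainer(callbacks=[lr_monitor])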
Logging names are automatically determined based on the optimizer class name. In the case of multiple optimizers of the same type, they will be named Adam, Adam-1 etc. (a sketch of this case follows the examples below). If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2 etc. To control naming, pass in a name keyword in the construction of the learning rate schedulers. A name keyword can also be used for parameter groups in the construction of the optimizer.
Example:
def configure_optimizers(self):
    optimizer = torch.optim.Adam(...)
    lr_scheduler = {
        'scheduler': torch.optim.lr_scheduler.LambdaLR(optimizer, ...),
        'name': 'my_logging_name'
    }
    return [optimizer], [lr_scheduler]
Example:
def configure_optimizers(self):
    optimizer = torch.optim.SGD(
        [{
            'params': [p for p in self.parameters()],
            'name': 'my_parameter_group_name'
        }],
        lr=0.1
    )
    lr_scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, ...)
    return [optimizer], [lr_scheduler]
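As mentioned above, two optimizers of the same class are distinguished automatically when no name is given. A minimal sketch (the generator and discriminator attributes are illustrative, not part of this API) whose learning rates the monitor would log under Adam and Adam-1:
def configure_optimizers(self):
    # Two optimizers of the same class, no explicit names: the monitor
    # logs their learning rates as Adam and Adam-1.
    opt_gen = torch.optim.Adam(self.generator.parameters(), lr=1e-3)
    opt_disc = torch.optim.Adam(self.discriminator.parameters(), lr=1e-4)
    return opt_gen, opt_disc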
- on_train_batch_start(trainer, *args, **kwargs)[source]¶
Called when the train batch begins.
- Return type:
None