Bases: LearningRateSchedule
Cyclic (triangular) learning rate schedule that linearly oscillates between min_lr and max_lr over each cycle of cycle_size steps
Constructor for CyclicLRSchedule
Args:
max_lr: maximum learning rate
cycle_size: steps per cycle
min_lr: minimum learning rate (default: max_lr / 10)
Source code in niceml/dlframeworks/keras/optimizers/schedules/cycliclrschedule.py
def __init__(self, max_lr: float, cycle_size: int, min_lr: Optional[float] = None):
    """
    Constructor for CyclicLRSchedule

    Args:
        max_lr: maximum learning rate
        cycle_size: steps per cycle
        min_lr: minimum learning rate (default: max_lr / 10)
    """
    super().__init__()
    self.max_lr = max_lr
    self.cycle_size = cycle_size
    self.min_lr = min_lr or max_lr / 10
Functions
__call__
Return the learning rate for a given step
Source code in niceml/dlframeworks/keras/optimizers/schedules/cycliclrschedule.py
def __call__(self, step):
    """Return the learning rate for a given step"""
    step = tf.cast(step, tf.float32)
    cycle = tf.floor(1 + step / self.cycle_size)
    x_value = tf.abs(step / (self.cycle_size / 2) - 2 * cycle + 1)
    learning_rate = self.min_lr + (self.max_lr - self.min_lr) * tf.maximum(
        0.0, (1 - x_value)
    )
    return learning_rate
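The triangular shape of this formula can be checked with a small pure-Python sketch of the same arithmetic (no TensorFlow needed); `cyclic_lr` is a hypothetical helper mirroring `__call__`, not part of niceml:

```python
import math

def cyclic_lr(step, max_lr, cycle_size, min_lr=None):
    """Pure-Python sketch of the triangular schedule computed in __call__."""
    min_lr = min_lr if min_lr is not None else max_lr / 10
    # cycle counts which triangle we are in (1-based).
    cycle = math.floor(1 + step / cycle_size)
    # x_value runs 1 -> 0 -> 1 within a cycle, so 1 - x_value peaks mid-cycle.
    x_value = abs(step / (cycle_size / 2) - 2 * cycle + 1)
    return min_lr + (max_lr - min_lr) * max(0.0, 1 - x_value)

# With max_lr=0.1 and cycle_size=100, the rate starts at min_lr (0.01),
# peaks at max_lr mid-cycle, and returns to min_lr at the cycle boundary.
print(cyclic_lr(0, 0.1, 100))    # 0.01
print(cyclic_lr(50, 0.1, 100))   # 0.1
print(cyclic_lr(100, 0.1, 100))  # 0.01
```

Note that the schedule starts at `min_lr`, not zero: at step 0, `x_value` is 1, so the scaling term `1 - x_value` vanishes.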
get_config
Return the config of the schedule
Source code in niceml/dlframeworks/keras/optimizers/schedules/cycliclrschedule.py
def get_config(self) -> dict:
    """Return the config of the schedule"""
    return {
        "max_lr": self.max_lr,
        "cycle_size": self.cycle_size,
        "min_lr": self.min_lr,
    }
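Putting the pieces above together, a self-contained sketch shows how the schedule plugs into a Keras optimizer; the real class lives in `niceml.dlframeworks.keras.optimizers.schedules.cycliclrschedule`, and the chosen `max_lr`/`cycle_size` values here are illustrative:

```python
import tensorflow as tf

class CyclicLRSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
    """Triangular cyclic schedule, assembled from the methods shown above."""

    def __init__(self, max_lr, cycle_size, min_lr=None):
        super().__init__()
        self.max_lr = max_lr
        self.cycle_size = cycle_size
        self.min_lr = min_lr or max_lr / 10

    def __call__(self, step):
        step = tf.cast(step, tf.float32)
        cycle = tf.floor(1 + step / self.cycle_size)
        x_value = tf.abs(step / (self.cycle_size / 2) - 2 * cycle + 1)
        return self.min_lr + (self.max_lr - self.min_lr) * tf.maximum(
            0.0, (1 - x_value)
        )

    def get_config(self):
        # get_config makes the schedule serializable with the optimizer/model.
        return {
            "max_lr": self.max_lr,
            "cycle_size": self.cycle_size,
            "min_lr": self.min_lr,
        }

# A LearningRateSchedule can be passed directly as an optimizer's learning rate;
# Keras then calls it with the current training step.
schedule = CyclicLRSchedule(max_lr=1e-3, cycle_size=1000)
optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)
```

Because `get_config` returns all three constructor arguments, a model compiled with this optimizer can be saved and reloaded with the schedule intact.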