cvpods.solver package

cvpods.solver.build_lr_scheduler(cfg, optimizer: torch.optim.optimizer.Optimizer, **kwargs) → torch.optim.lr_scheduler._LRScheduler[source]

Build an LR scheduler from config.

cvpods.solver.build_optimizer(cfg, model: torch.nn.modules.module.Module) → torch.optim.optimizer.Optimizer[source]

Build an optimizer with gradient clipping and a LARS wrapper from config.
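
A minimal sketch of wiring the two build functions together in a per-iteration training loop. It assumes a cvpods config object cfg has already been loaded from a project; the exact solver keys cfg must contain are framework-defined and not shown here:

    # Assumes `cfg` is a cvpods config object loaded from a project's
    # config.py; the solver fields it must define are framework-specific.
    import torch.nn as nn
    from cvpods.solver import build_optimizer, build_lr_scheduler

    model = nn.Linear(16, 4)                        # stand-in for a real model
    optimizer = build_optimizer(cfg, model)         # clip/LARS-wrapped optimizer
    scheduler = build_lr_scheduler(cfg, optimizer)  # LR schedule per cfg

    for _ in range(1000):       # training loop; loss/backward omitted
        optimizer.step()        # update parameters
        scheduler.step()        # advance the per-iteration LR schedule
        optimizer.zero_grad()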

class cvpods.solver.AdamBuilder[source]

Bases: cvpods.solver.optimizer_builder.OptimizerBuilder

static build(model, cfg)[source]
class cvpods.solver.AdamWBuilder[source]

Bases: cvpods.solver.optimizer_builder.OptimizerBuilder

static build(model, cfg)[source]
class cvpods.solver.OptimizerBuilder[source]

Bases: object

static build(model, cfg)[source]
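
The optimizer builders all share the same contract: a static build(model, cfg) that returns a torch.optim.Optimizer configured from cfg. A hypothetical sketch of adding a new optimizer via this pattern; the cfg.SOLVER.OPTIMIZER.BASE_LR key is an illustrative assumption, not the verified cvpods config schema:

    import torch
    from cvpods.solver import OptimizerBuilder

    class RMSpropBuilder(OptimizerBuilder):
        """Hypothetical builder following the static build(model, cfg) contract."""

        @staticmethod
        def build(model, cfg):
            return torch.optim.RMSprop(
                model.parameters(),
                lr=cfg.SOLVER.OPTIMIZER.BASE_LR,  # assumed config key
            )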
class cvpods.solver.SGDBuilder[source]

Bases: cvpods.solver.optimizer_builder.OptimizerBuilder

static build(model, cfg)[source]
class cvpods.solver.SGDGateLRBuilder[source]

Bases: cvpods.solver.optimizer_builder.OptimizerBuilder

SGD Gate LR optimizer builder, used for DynamicRouting in cvpods. This optimizer multiplies the lr used for the gating function.

static build(model, cfg)[source]
class cvpods.solver.BaseSchedulerBuilder[source]

Bases: object

static build(optimizer, cfg, **kwargs)[source]
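
Scheduler builders mirror the optimizer builders: each subclass implements a static build(optimizer, cfg, **kwargs) that returns a torch.optim.lr_scheduler._LRScheduler. A hypothetical sketch; the cfg.SOLVER.LR_SCHEDULER.GAMMA key is an assumption for illustration:

    import torch
    from cvpods.solver import BaseSchedulerBuilder

    class ExponentialLRBuilder(BaseSchedulerBuilder):
        """Hypothetical builder following the static build(optimizer, cfg) contract."""

        @staticmethod
        def build(optimizer, cfg, **kwargs):
            return torch.optim.lr_scheduler.ExponentialLR(
                optimizer,
                gamma=cfg.SOLVER.LR_SCHEDULER.GAMMA,  # assumed config key
            )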
class cvpods.solver.LambdaLRBuilder[source]

Bases: cvpods.solver.scheduler_builder.BaseSchedulerBuilder

static build(optimizer, cfg, **kwargs)[source]
class cvpods.solver.OneCycleLRBuilder[source]

Bases: cvpods.solver.scheduler_builder.BaseSchedulerBuilder

static build(optimizer, cfg, **kwargs)[source]
class cvpods.solver.PolyLRBuilder[source]

Bases: cvpods.solver.scheduler_builder.BaseSchedulerBuilder

static build(optimizer, cfg, **kwargs)[source]
class cvpods.solver.WarmupCosineLR(optimizer: torch.optim.optimizer.Optimizer, max_iters: int, warmup_factor: float = 0.001, warmup_iters: int = 1000, warmup_method: str = 'linear', last_epoch: int = -1, epoch_iters: int = -1)[source]

Bases: torch.optim.lr_scheduler._LRScheduler

__init__(optimizer: torch.optim.optimizer.Optimizer, max_iters: int, warmup_factor: float = 0.001, warmup_iters: int = 1000, warmup_method: str = 'linear', last_epoch: int = -1, epoch_iters: int = -1)[source]

Cosine LR with warmup

Parameters
  • optimizer (Optimizer) – Wrapped optimizer.

  • max_iters (int) – maximum number of iterations

  • warmup_factor (float) – factor applied to the base lr at the start of warmup

  • warmup_iters (int) – number of warmup iterations

  • warmup_method (str) – warmup method, one of ["constant", "linear", "burnin"]

  • last_epoch (int) – The index of the last epoch. Default: -1.

get_lr() → List[float][source]
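A runnable sketch using the constructor as documented above: linear warmup over the first 1,000 iterations, then cosine decay of the learning rate over max_iters (loss computation omitted for brevity):

    import torch
    from cvpods.solver import WarmupCosineLR

    model = torch.nn.Linear(16, 4)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    scheduler = WarmupCosineLR(
        optimizer,
        max_iters=90000,
        warmup_factor=0.001,   # lr starts near 0.001 * base_lr
        warmup_iters=1000,
        warmup_method="linear",
    )

    for _ in range(90000):
        optimizer.step()       # parameter update (forward/backward omitted)
        scheduler.step()       # per-iteration LR update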
class cvpods.solver.WarmupCosineLRBuilder[source]

Bases: cvpods.solver.scheduler_builder.BaseSchedulerBuilder

static build(optimizer, cfg, **kwargs)[source]
class cvpods.solver.WarmupMultiStepLR(optimizer: torch.optim.optimizer.Optimizer, milestones: List[int], gamma: float = 0.1, warmup_factor: float = 0.001, warmup_iters: int = 1000, warmup_method: str = 'linear', last_epoch: int = -1)[source]

Bases: torch.optim.lr_scheduler._LRScheduler

__init__(optimizer: torch.optim.optimizer.Optimizer, milestones: List[int], gamma: float = 0.1, warmup_factor: float = 0.001, warmup_iters: int = 1000, warmup_method: str = 'linear', last_epoch: int = -1)[source]

Multi Step LR with warmup

Parameters
  • optimizer (torch.optim.Optimizer) – Wrapped optimizer.

  • milestones (list[int]) – a list of increasing iteration indices at which the lr is decayed.

  • gamma (float) – multiplicative factor of lr decay applied at each milestone.

  • warmup_factor (float) – lr = warmup_factor * base_lr during warmup

  • warmup_iters (int) – number of warmup iterations

  • warmup_method (str) – warmup method, one of ["constant", "linear", "burnin"]

  • last_epoch (int) – The index of the last epoch. Default: -1.

get_lr() → List[float][source]
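A runnable sketch using the documented constructor: linear warmup for 1,000 iterations, then the lr is multiplied by gamma=0.1 at iterations 60,000 and 80,000 (the milestone values here are illustrative):

    import torch
    from cvpods.solver import WarmupMultiStepLR

    model = torch.nn.Linear(16, 4)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.02)
    scheduler = WarmupMultiStepLR(
        optimizer,
        milestones=[60000, 80000],  # illustrative decay points
        gamma=0.1,
        warmup_factor=0.001,
        warmup_iters=1000,
        warmup_method="linear",
    )

    for _ in range(90000):
        optimizer.step()       # parameter update (forward/backward omitted)
        scheduler.step()       # per-iteration LR update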
class cvpods.solver.WarmupMultiStepLRBuilder[source]

Bases: cvpods.solver.scheduler_builder.BaseSchedulerBuilder

static build(optimizer, cfg, **kwargs)[source]