DDPContextManager#

class DDPContextManager[source]#

Bases: AbstractContextManager

Context manager for initializing and destroying DDP process groups.

Note that this context manager does not start processes itself; it merely calls torch.distributed.init_process_group() on entry and torch.distributed.destroy_process_group() on exit, and sets the DDP-related fields on the Learner to appropriate values.

If a process group is already initialized, this context manager does nothing on either entry or exit.
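For example, a minimal usage sketch (the build_learner() factory, the learner.fit() call, and the spawn-based launch are illustrative assumptions; only DDPContextManager and its rank/world_size arguments come from this page):

    import torch.multiprocessing as mp

    def train_worker(rank: int, world_size: int) -> None:
        # build_learner() is a hypothetical factory; construct your Learner
        # however you normally would.
        learner = build_learner()

        # On entry the context manager calls torch.distributed.init_process_group()
        # and sets the Learner's DDP-related fields; on exit it calls
        # torch.distributed.destroy_process_group().
        with DDPContextManager(learner, rank=rank, world_size=world_size):
            learner.fit()  # hypothetical training call

    if __name__ == "__main__":
        # Depending on how init_process_group() is configured, rendezvous
        # environment variables (e.g. MASTER_ADDR / MASTER_PORT) may also be needed.
        world_size = 2
        mp.spawn(train_worker, args=(world_size,), nprocs=world_size)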

__init__(learner: Learner, rank: Optional[int] = None, world_size: Optional[int] = None) → None[source]#

Constructor.

Parameters
  • learner (Learner) – The Learner on which to set DDP-related fields.

  • rank (Optional[int]) – The process rank. If None, the value of Learner.ddp_rank is used. Defaults to None.

  • world_size (Optional[int]) – The world size. If None, the value of Learner.ddp_world_size is used. Defaults to None.

Raises

ValueError – If rank or world_size is not provided and is not set on the Learner.

Return type

None
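As a sketch of the fallback behavior described above (the learner object is assumed to exist; the attribute names ddp_rank and ddp_world_size come from the parameter descriptions):

    # rank and world_size omitted: the constructor falls back to
    # learner.ddp_rank and learner.ddp_world_size.
    try:
        ctx = DDPContextManager(learner)
    except ValueError:
        # Raised when rank/world_size are neither passed nor already set on the Learner.
        ctx = DDPContextManager(learner, rank=0, world_size=1)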

Methods

__init__(learner[, rank, world_size])

Constructor.
