class mmcls.models.utils.ShiftWindowMSA(embed_dims, num_heads, window_size, shift_size=0, qkv_bias=True, qk_scale=None, attn_drop=0, proj_drop=0, dropout_layer={'drop_prob': 0.0, 'type': 'DropPath'}, pad_small_map=False, input_resolution=None, auto_pad=None, window_msa=<class 'mmcls.models.utils.attention.WindowMSA'>, msa_cfg={}, init_cfg=None)[source]

Shift Window Multihead Self-Attention Module.

Parameters:

  • embed_dims (int) – Number of input channels.

  • num_heads (int) – Number of attention heads.

  • window_size (int) – The height and width of the window.

  • shift_size (int, optional) – The shift step of each window towards right-bottom. If zero, act as regular window-msa. Defaults to 0.

  • qkv_bias (bool, optional) – If True, add a learnable bias to q, k, v. Defaults to True.

  • qk_scale (float | None, optional) – Override default qk scale of head_dim ** -0.5 if set. Defaults to None.

  • attn_drop (float, optional) – Dropout ratio of attention weight. Defaults to 0.0.

  • proj_drop (float, optional) – Dropout ratio of output. Defaults to 0.0.

  • dropout_layer (dict, optional) – The dropout layer used before output. Defaults to dict(type='DropPath', drop_prob=0.).

  • pad_small_map (bool) – If True, pad a small feature map up to the window size, which is commonly used in detection and segmentation. If False, avoid shifting windows and shrink the window size to the size of the feature map, which is commonly used in classification. Defaults to False.

  • version (str, optional) – Version of implementation of Swin Transformers. Defaults to v1.

  • init_cfg (dict, optional) – The extra config for initialization. Defaults to None.
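The shift_size parameter moves each attention window toward the bottom-right, which Swin-style attention implements as a cyclic roll of the feature map before window partitioning (torch.roll in practice). A minimal list-based sketch of that roll, under the assumption of a square shift; the cyclic_shift helper below is illustrative, not part of the mmcls API:

```python
def cyclic_shift(grid, shift):
    """Roll a 2D grid up-left by `shift`, wrapping rows and columns
    around (the plain-Python analogue of torch.roll with a negative
    shift, as used before window partitioning)."""
    h = len(grid)
    w = len(grid[0])
    return [[grid[(i + shift) % h][(j + shift) % w] for j in range(w)]
            for i in range(h)]

# A 4x4 feature map labelled 0..15 in row-major order.
grid = [[r * 4 + c for c in range(4)] for r in range(4)]
shifted = cyclic_shift(grid, 1)
# The top-left token is now the one that sat at (1, 1), and the
# wrapped-around border tokens end up in the last row/column.
```

After attention, the same roll is applied in the opposite direction to restore the original layout; the attention mask hides the artificial adjacency created at the wrapped borders.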

forward(query, hw_shape)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.


Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance instead of calling this method directly, since the former takes care of running the registered hooks while the latter silently ignores them.
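forward takes a flattened token sequence (shape (B, H*W, C) in the real module) together with hw_shape=(H, W), and internally reshapes it into non-overlapping window_size x window_size windows before attention. A list-based sketch of that partition step, assuming the map divides evenly; the window_partition helper is hypothetical, written here only to illustrate the reshape:

```python
def window_partition(seq, hw_shape, window_size):
    """Split a flattened H*W token sequence (row-major) into
    non-overlapping window_size x window_size windows, mirroring the
    reshape that shifted-window MSA performs on `query`."""
    h, w = hw_shape
    assert h % window_size == 0 and w % window_size == 0
    windows = []
    for wi in range(0, h, window_size):          # window row origin
        for wj in range(0, w, window_size):      # window col origin
            win = [seq[(wi + i) * w + (wj + j)]
                   for i in range(window_size)
                   for j in range(window_size)]
            windows.append(win)
    return windows

tokens = list(range(16))                # a 4x4 map, flattened row-major
wins = window_partition(tokens, (4, 4), 2)
# Each inner list holds the 4 tokens of one 2x2 window; self-attention
# is then computed independently within each window.
```

With shift_size > 0 the module rolls the map first (and rolls it back afterwards), so the same partition covers window-straddling content on alternate blocks.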
