
Commit b2d42a2

fix spell error (#5554)

Authored by kaijieshi7 and datumbox
Co-authored-by: Vasilis Vryniotis <[email protected]>
Parent: 7039c2c

1 file changed: +6 −6 lines

torchvision/ops/misc.py

Lines changed: 6 additions & 6 deletions
@@ -119,14 +119,14 @@ def __init__(
 
 class Conv2dNormActivation(ConvNormActivation):
     """
-    Configurable block used for Convolution2d-Normalzation-Activation blocks.
+    Configurable block used for Convolution2d-Normalization-Activation blocks.
 
     Args:
         in_channels (int): Number of channels in the input image
-        out_channels (int): Number of channels produced by the Convolution-Normalzation-Activation block
+        out_channels (int): Number of channels produced by the Convolution-Normalization-Activation block
         kernel_size: (int, optional): Size of the convolving kernel. Default: 3
         stride (int, optional): Stride of the convolution. Default: 1
-        padding (int, tuple or str, optional): Padding added to all four sides of the input. Default: None, in wich case it will calculated as ``padding = (kernel_size - 1) // 2 * dilation``
+        padding (int, tuple or str, optional): Padding added to all four sides of the input. Default: None, in which case it will calculated as ``padding = (kernel_size - 1) // 2 * dilation``
         groups (int, optional): Number of blocked connections from input channels to output channels. Default: 1
         norm_layer (Callable[..., torch.nn.Module], optional): Norm layer that will be stacked on top of the convolution layer. If ``None`` this layer wont be used. Default: ``torch.nn.BatchNorm2d``
         activation_layer (Callable[..., torch.nn.Module], optinal): Activation function which will be stacked on top of the normalization layer (if not None), otherwise on top of the conv layer. If ``None`` this layer wont be used. Default: ``torch.nn.ReLU``
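The docstring above quotes the rule ``padding = (kernel_size - 1) // 2 * dilation`` used when ``padding=None``. As a minimal sketch in pure Python (no torch required; `default_padding` is a hypothetical helper name, not a torchvision function), this is all the default amounts to:

```python
def default_padding(kernel_size: int, dilation: int = 1) -> int:
    # Default padding used when padding=None: half of the dilated
    # kernel's reach on each side, per the docstring's formula.
    return (kernel_size - 1) // 2 * dilation

print(default_padding(3))              # kernel 3, dilation 1 -> 1
print(default_padding(3, dilation=2))  # kernel 3, dilation 2 -> 2
print(default_padding(5))              # kernel 5, dilation 1 -> 2
```

For odd kernel sizes this choice pads just enough to keep the spatial size unchanged at stride 1 ("same" padding).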
@@ -169,14 +169,14 @@ def __init__(
 
 class Conv3dNormActivation(ConvNormActivation):
     """
-    Configurable block used for Convolution3d-Normalzation-Activation blocks.
+    Configurable block used for Convolution3d-Normalization-Activation blocks.
 
     Args:
         in_channels (int): Number of channels in the input video.
-        out_channels (int): Number of channels produced by the Convolution-Normalzation-Activation block
+        out_channels (int): Number of channels produced by the Convolution-Normalization-Activation block
         kernel_size: (int, optional): Size of the convolving kernel. Default: 3
         stride (int, optional): Stride of the convolution. Default: 1
-        padding (int, tuple or str, optional): Padding added to all four sides of the input. Default: None, in wich case it will calculated as ``padding = (kernel_size - 1) // 2 * dilation``
+        padding (int, tuple or str, optional): Padding added to all four sides of the input. Default: None, in which case it will calculated as ``padding = (kernel_size - 1) // 2 * dilation``
         groups (int, optional): Number of blocked connections from input channels to output channels. Default: 1
         norm_layer (Callable[..., torch.nn.Module], optional): Norm layer that will be stacked on top of the convolution layer. If ``None`` this layer wont be used. Default: ``torch.nn.BatchNorm3d``
         activation_layer (Callable[..., torch.nn.Module], optinal): Activation function which will be stacked on top of the normalization layer (if not None), otherwise on top of the conv layer. If ``None`` this layer wont be used. Default: ``torch.nn.ReLU``
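Why does that default padding preserve spatial size? Plugging it into the standard per-dimension convolution output-size formula (as documented for PyTorch's ``nn.Conv2d``/``nn.Conv3d``) shows the effect; `conv_out_size` below is a hypothetical helper for illustration, not a torchvision API:

```python
def conv_out_size(n: int, k: int, stride: int = 1,
                  padding: int = 0, dilation: int = 1) -> int:
    # Standard convolution output-size formula:
    # floor((n + 2p - d*(k - 1) - 1) / stride) + 1
    return (n + 2 * padding - dilation * (k - 1) - 1) // stride + 1

# Default padding for kernel_size=3, dilation=2 -> (3 - 1) // 2 * 2 = 2
p = (3 - 1) // 2 * 2
print(conv_out_size(32, 3, padding=p, dilation=2))  # 32: size preserved
print(conv_out_size(32, 5, padding=(5 - 1) // 2))   # 32: size preserved
```

At stride 1 with an odd kernel, the default cancels the kernel's dilated reach exactly, so each spatial dimension is unchanged.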
