See `Gaussian Error Linear Units (GELUs) <https://arxiv.org/abs/1606.08415>`_
where the SiLU (Sigmoid Linear Unit) was originally coined, and see
`Sigmoid-Weighted Linear Units for Neural Network Function Approximation
in Reinforcement Learning <https://arxiv.org/abs/1702.03118>`_ and
`Swish: a Self-Gated Activation Function <https://arxiv.org/abs/1710.05941>`_
where the SiLU was experimented with later.

Shape:
    - Input: :math:`(*)`, where :math:`*` means any number of dimensions.
    - Output: :math:`(*)`, same shape as the input.

.. image:: ../scripts/activation_images/SiLU.png

Examples::

    >>> m = nn.SiLU()
    >>> input = torch.randn(2)
    >>> output = m(input)
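
SiLU is defined as :math:`\text{silu}(x) = x \cdot \sigma(x)`, where
:math:`\sigma(x)` is the logistic sigmoid. As a minimal sketch (not part of
the official examples), the module can be cross-checked against the functional
form and against the definition computed by hand; equivalence holds up to
floating-point rounding::

    >>> import torch
    >>> import torch.nn as nn
    >>> import torch.nn.functional as F
    >>> x = torch.randn(2)
    >>> # functional form matches the definition x * sigmoid(x)
    >>> torch.allclose(F.silu(x), x * torch.sigmoid(x))
    True
    >>> # module form agrees as well
    >>> torch.allclose(nn.SiLU()(x), x * torch.sigmoid(x))
    True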