1 write to LayerNorm
  Microsoft.ML.TorchSharp (1)
    Roberta\Modules\AttentionOutput.cs (1)
      Line 26: LayerNorm = torch.nn.LayerNorm(new long[] { hiddenSize });

2 references to LayerNorm
  Microsoft.ML.TorchSharp (2)
    Roberta\Modules\AttentionOutput.cs (2)
      Line 35: hiddenStates = LayerNorm.forward(hiddenStates + inputTensor);
      Line 47: LayerNorm.Dispose();
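The references above trace the field's lifecycle: the module is constructed over the hidden dimension (line 26), applied to the residual sum hiddenStates + inputTensor after attention (line 35), and disposed when the owning module is torn down (line 47). What the line-35 call computes can be sketched in pure Python (illustrative only, not the TorchSharp implementation; values and vector sizes are made up):

```python
import math

def layer_norm(vec, eps=1e-5):
    # Normalize one vector to zero mean and unit variance, as
    # torch.nn.LayerNorm does over its normalized_shape at
    # initialization (affine weight = 1, bias = 0).
    mean = sum(vec) / len(vec)
    var = sum((x - mean) ** 2 for x in vec) / len(vec)
    return [(x - mean) / math.sqrt(var + eps) for x in vec]

# Residual add, then normalize, mirroring line 35 above.
hidden_states = [1.0, 2.0, 3.0, 4.0]
input_tensor = [0.5, 0.5, 0.5, 0.5]
residual = [h + i for h, i in zip(hidden_states, input_tensor)]
out = layer_norm(residual)
print([round(v, 4) for v in out])  # [-1.3416, -0.4472, 0.4472, 1.3416]
```

Normalizing the sum (rather than hiddenStates alone) is the standard post-LayerNorm residual arrangement used in BERT/RoBERTa-style attention blocks. The explicit Dispose() on line 47 is needed because TorchSharp modules own native tensors outside the .NET garbage collector.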