1 write to LayerNorm
Microsoft.ML.TorchSharp (1)
Roberta\Modules\Output.cs (1)
23: LayerNorm = torch.nn.LayerNorm(new long[] { hiddenSize });
2 references to LayerNorm
Microsoft.ML.TorchSharp (2)
Roberta\Modules\Output.cs (2)
32: hiddenStates = LayerNorm.forward(hiddenStates + inputTensor);
43: LayerNorm.Dispose();
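The three references fit the usual lifecycle of a TorchSharp submodule: construct once, call in forward, and dispose with the owning module (TorchSharp modules wrap native libtorch handles, so they are IDisposable). Below is a minimal, self-contained sketch that strings the three listed call sites together outside their module; the hidden size of 768 and the random input tensors are illustrative assumptions, not values from Output.cs.

```csharp
using TorchSharp;
using static TorchSharp.torch;

// Assumed hidden size for illustration; the real value comes from the model config.
long hiddenSize = 768;

// Write (line 23): the LayerNorm submodule is constructed over the hidden dimension.
var LayerNorm = torch.nn.LayerNorm(new long[] { hiddenSize });

// Assumed stand-ins for the tensors flowing through the Output module.
Tensor hiddenStates = randn(2, 10, hiddenSize);
Tensor inputTensor = randn(2, 10, hiddenSize);

// Reference (line 32): the residual input is added back, then normalized.
hiddenStates = LayerNorm.forward(hiddenStates + inputTensor);

// Reference (line 43): the submodule is disposed when the owning module is released,
// freeing the native memory it holds.
LayerNorm.Dispose();
```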