1 write to output
Microsoft.ML.TorchSharp (1)
Roberta\Modules\Attention.cs (1)
23: output = new AttentionOutput(hiddenSize, layerNormEps, attentionDropoutRate, outputDropoutRate);

2 references to output
Microsoft.ML.TorchSharp (2)
Roberta\Modules\Attention.cs (2)
31: x = output.forward(x, hiddenStates);
42: output.Dispose();