1 write to attention
  Microsoft.ML.TorchSharp (1)
    Roberta\Modules\Layer.cs (1)
      24: attention = new Attention(numAttentionHeads, hiddenSize, layerNormEps, attentionDropoutRate, outputDropoutRate);
2 references to attention
  Microsoft.ML.TorchSharp (2)
    Roberta\Modules\Layer.cs (2)
      33: var attentionOutput = attention.forward(input, attentionMask);
      45: attention.Dispose();