1 write to attention
Microsoft.ML.TorchSharp (1)
Roberta\Modules\Layer.cs (1)
24: attention = new Attention(numAttentionHeads, hiddenSize, layerNormEps, attentionDropoutRate, outputDropoutRate);
2 references to attention
Microsoft.ML.TorchSharp (2)
Roberta\Modules\Layer.cs (2)
33: var attentionOutput = attention.forward(input, attentionMask);
45: attention.Dispose();
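
Taken together, the write at line 24 and the reads at lines 33 and 45 trace the lifecycle of the attention field inside Roberta\Modules\Layer.cs: constructed in the Layer constructor, invoked in forward, and released in Dispose. The sketch below illustrates that wiring under standard TorchSharp module conventions; it is not the actual ML.NET source. The Attention stub, the exact base-class choice, and the RegisterComponents call are assumptions, with only the three referenced statements taken from the listing above.

using TorchSharp;
using TorchSharp.Modules;
using static TorchSharp.torch;

// Minimal stand-in for the real RoBERTa Attention module (assumption:
// only the constructor arity and the forward(input, mask) shape matter here).
internal sealed class Attention : nn.Module<Tensor, Tensor, Tensor>
{
    private readonly Linear dense;

    public Attention(int numAttentionHeads, int hiddenSize, double layerNormEps,
                     double attentionDropoutRate, double outputDropoutRate)
        : base(nameof(Attention))
    {
        dense = nn.Linear(hiddenSize, hiddenSize);
        RegisterComponents();
    }

    // Placeholder body; the real module computes multi-head self-attention.
    public override Tensor forward(Tensor input, Tensor attentionMask)
        => dense.forward(input);
}

// Hypothetical Layer showing how the attention field is written once and read twice.
internal sealed class Layer : nn.Module<Tensor, Tensor, Tensor>
{
    private readonly Attention attention;

    public Layer(int numAttentionHeads, int hiddenSize, double layerNormEps,
                 double attentionDropoutRate, double outputDropoutRate)
        : base(nameof(Layer))
    {
        // The single write (line 24 in the listing above).
        attention = new Attention(numAttentionHeads, hiddenSize, layerNormEps,
                                  attentionDropoutRate, outputDropoutRate);
        RegisterComponents();   // registers fields as submodules for parameter tracking
    }

    public override Tensor forward(Tensor input, Tensor attentionMask)
    {
        // First read (line 33): run self-attention over the input.
        var attentionOutput = attention.forward(input, attentionMask);
        return attentionOutput;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            // Second read (line 45): deterministically release the
            // submodule's native tensors.
            attention.Dispose();
        }
        base.Dispose(disposing);
    }
}

Since RegisterComponents registers attention as a child module, disposing the Layer should also reach the submodule; an explicit attention.Dispose() like the one at line 45 makes that native-memory cleanup deterministic rather than relying on traversal order.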