1 write to self
  Microsoft.ML.TorchSharp (1)
    Roberta\Modules\Attention.cs (1)
      22: self = new AttentionSelf(numAttentionHeads, hiddenSize, layerNormEps, attentionDropoutRate);

2 references to self
  Microsoft.ML.TorchSharp (2)
    Roberta\Modules\Attention.cs (2)
      30: var x = self.forward(hiddenStates, attentionMask);
      41: self.Dispose();