1 write to NumAttentionHeads
Microsoft.ML.TorchSharp (1)
Roberta\Modules\AttentionSelf.cs (1)
30  NumAttentionHeads = numAttentionHeads;
3 references to NumAttentionHeads
Microsoft.ML.TorchSharp (3)
Roberta\Modules\AttentionSelf.cs (3)
32  if (NumAttentionHeads * AttentionHeadSize != hiddenSize)
69  var contextShape = DataUtils.Concat<long>(contextLayer.shape.AsSpan(0, contextLayer.shape.Length - 2), NumAttentionHeads * AttentionHeadSize);
80  var newShape = DataUtils.Concat<long>(x.shape.AsSpan(0, x.shape.Length - 1), NumAttentionHeads, AttentionHeadSize);
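For context, the sketch below illustrates how NumAttentionHeads drives the divisibility check and the head split/merge reshapes seen in these references. It is not the library's actual code; the class and method names are illustrative, and it operates on plain shape arrays rather than TorchSharp tensors.

```csharp
using System;
using System.Linq;

// Minimal sketch (hypothetical names) of the shape arithmetic around NumAttentionHeads.
class AttentionShapes
{
    public int NumAttentionHeads { get; }
    public int AttentionHeadSize { get; }

    public AttentionShapes(int numAttentionHeads, int hiddenSize)
    {
        NumAttentionHeads = numAttentionHeads;               // the single write (line 30)
        AttentionHeadSize = hiddenSize / numAttentionHeads;
        // Mirrors the guard at line 32: hiddenSize must split evenly across heads.
        if (NumAttentionHeads * AttentionHeadSize != hiddenSize)
            throw new ArgumentException("hiddenSize must be divisible by numAttentionHeads");
    }

    // Mirrors the reshape at line 80: [..., hiddenSize] -> [..., heads, headSize]
    public long[] SplitHeads(long[] shape) =>
        shape[..^1].Append((long)NumAttentionHeads).Append((long)AttentionHeadSize).ToArray();

    // Mirrors the reshape at line 69: [..., heads, headSize] -> [..., hiddenSize]
    public long[] MergeHeads(long[] shape) =>
        shape[..^2].Append((long)(NumAttentionHeads * AttentionHeadSize)).ToArray();
}
```

For example, with 12 heads and a hidden size of 768, SplitHeads turns a [batch, seq, 768] shape into [batch, seq, 12, 64], and MergeHeads reverses it after attention is applied.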