1 write to _config
Microsoft.ML.GenAI.Mistral (1)
MistralModel.cs (1)
line 29: this._config = config;
2 references to _config
Microsoft.ML.GenAI.Mistral (2)
MistralModel.cs (2)
line 102: if (this._config.AttnImplementation == "flash_attention_2")
line 109: attentionMask = AttentionMaskConverter.Create4DCausalAttentionMask(attentionMask, [batchSize, seqLength], inputsEmbeds.dtype, device, pastKeyValuesLength, slidingWindow: _config.SlidingWindow);