1 write to GroundTruth
Microsoft.Extensions.AI.Evaluation.NLP (1)
F1EvaluatorContext.cs (1)
47: GroundTruth = groundTruth;
4 references to GroundTruth
Microsoft.Extensions.AI.Evaluation.NLP (4)
F1Evaluator.cs (2)
24: /// supplied by <see cref="F1EvaluatorContext.GroundTruth"/>. The score is returned in a <see cref="NumericMetric"/>
74: string[] reference = SimpleWordTokenizer.WordTokenize(context.GroundTruth).ToArray();
F1EvaluatorContext.cs (2)
16: /// <see cref="GroundTruth" />. F1 is a metric used to evaluate the quality of machine-generated text. It is the ratio
32: /// the response supplied via <see cref="GroundTruth"/>. The metric will be reported as an F1 score.
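The references above show the evaluator tokenizing the ground truth with `SimpleWordTokenizer.WordTokenize` and reporting a token-level F1 score. As a conceptual sketch only (in Python, not the library's C# API), token-level F1 is typically computed from the multiset overlap between the response tokens and the ground-truth tokens; the whitespace tokenizer and lowercasing here are assumptions and will not match `SimpleWordTokenizer`'s exact rules.

```python
from collections import Counter

def f1_score(response: str, ground_truth: str) -> float:
    """Token-level F1 between a response and a ground-truth reference.

    Sketch of the general technique; the real evaluator's tokenization
    and normalization may differ.
    """
    # Naive whitespace tokenization stands in for SimpleWordTokenizer.
    resp_tokens = response.lower().split()
    ref_tokens = ground_truth.lower().split()
    # Multiset intersection: each shared token counts at most as many
    # times as it appears in both sequences.
    overlap = sum((Counter(resp_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(resp_tokens)  # shared tokens / response length
    recall = overlap / len(ref_tokens)      # shared tokens / reference length
    return 2 * precision * recall / (precision + recall)

# A response containing only a prefix of the reference has perfect
# precision but reduced recall, so F1 falls between the two.
print(f1_score("the cat sat", "the cat sat on the mat"))
```

An exact match yields 1.0, no token overlap yields 0.0, and partial overlap lands in between, which matches the "reported as an F1 score" behavior described in the doc comments above.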