DiskBasedReportingConfiguration.Create Method

Definition

Creates a ReportingConfiguration that persists ScenarioRunResults to disk and also uses the disk to cache AI responses.

public static Microsoft.Extensions.AI.Evaluation.Reporting.ReportingConfiguration Create(string storageRootPath, System.Collections.Generic.IEnumerable<Microsoft.Extensions.AI.Evaluation.IEvaluator> evaluators, Microsoft.Extensions.AI.Evaluation.ChatConfiguration? chatConfiguration = default, bool enableResponseCaching = true, TimeSpan? timeToLiveForCacheEntries = default, System.Collections.Generic.IEnumerable<string>? cachingKeys = default, string executionName = "Default", Func<Microsoft.Extensions.AI.Evaluation.EvaluationMetric,Microsoft.Extensions.AI.Evaluation.EvaluationMetricInterpretation?>? evaluationMetricInterpreter = default, System.Collections.Generic.IEnumerable<string>? tags = default);
static member Create : string * seq<Microsoft.Extensions.AI.Evaluation.IEvaluator> * Microsoft.Extensions.AI.Evaluation.ChatConfiguration * bool * Nullable<TimeSpan> * seq<string> * string * Func<Microsoft.Extensions.AI.Evaluation.EvaluationMetric, Microsoft.Extensions.AI.Evaluation.EvaluationMetricInterpretation> * seq<string> -> Microsoft.Extensions.AI.Evaluation.Reporting.ReportingConfiguration
Public Shared Function Create (storageRootPath As String, evaluators As IEnumerable(Of IEvaluator), Optional chatConfiguration As ChatConfiguration = Nothing, Optional enableResponseCaching As Boolean = true, Optional timeToLiveForCacheEntries As Nullable(Of TimeSpan) = Nothing, Optional cachingKeys As IEnumerable(Of String) = Nothing, Optional executionName As String = "Default", Optional evaluationMetricInterpreter As Func(Of EvaluationMetric, EvaluationMetricInterpretation) = Nothing, Optional tags As IEnumerable(Of String) = Nothing) As ReportingConfiguration
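
For example, a minimal configuration that stores results and cached AI responses under a local directory can be created as shown below. The directory path and the custom evaluator are hypothetical placeholders; substitute your own.

string storageRootPath = @"C:\eval-results"; // hypothetical directory for results and cached responses
IEvaluator[] evaluators = { new MyCustomEvaluator() }; // hypothetical evaluator implementing IEvaluator

ReportingConfiguration reportingConfiguration =
    DiskBasedReportingConfiguration.Create(storageRootPath, evaluators);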

Parameters

storageRootPath
String

The path to a directory on disk under which the ScenarioRunResults and all cached AI responses should be stored.

evaluators
IEnumerable<IEvaluator>

The set of IEvaluators that should be invoked to evaluate AI responses.

chatConfiguration
ChatConfiguration

A ChatConfiguration that specifies the IChatClient that is used by AI-based evaluators included in the returned ReportingConfiguration. Can be omitted if none of the included evaluators are AI-based.
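
For example, assuming you already have an IChatClient connected to the model that the AI-based evaluators should use, a ChatConfiguration can be created by wrapping that client. The GetChatClientForEvaluation helper below is a hypothetical stand-in for however you construct your IChatClient.

IChatClient chatClient = GetChatClientForEvaluation(); // hypothetical helper returning your IChatClient
ChatConfiguration chatConfiguration = new ChatConfiguration(chatClient);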

enableResponseCaching
Boolean

true to enable caching of AI responses; false otherwise.

timeToLiveForCacheEntries
Nullable<TimeSpan>

An optional TimeSpan that specifies the maximum amount of time that cached AI responses should survive in the cache before they are considered expired and evicted.

cachingKeys
IEnumerable<String>

An optional collection of unique strings that should be hashed when generating the cache keys for cached AI responses. See CachingKeys for more information about this concept.
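
For example, you might supply identifiers that influence the AI responses, such as the model and deployment names, so that changing any of them causes previously cached responses to be bypassed. The values below are hypothetical.

IEnumerable<string> cachingKeys = new[] { "gpt-4o-mini", "eastus-deployment-1" };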

executionName
String

The name of the current execution. See ExecutionName for more information about this concept. If omitted, the fixed default value "Default" is used.
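
For example, passing a timestamp-based name groups the results of each run of your evaluation suite under its own execution. The format below is illustrative, not required.

string executionName = $"{DateTime.Now:yyyyMMddTHHmmss}";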

evaluationMetricInterpreter
Func<EvaluationMetric,EvaluationMetricInterpretation>

An optional function that can be used to override EvaluationMetricInterpretations for EvaluationMetrics returned from evaluations that use the returned ReportingConfiguration. The supplied function can either return a new EvaluationMetricInterpretation for any EvaluationMetric that is supplied to it, or return null if the Interpretation should be left unchanged.
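
As a minimal sketch, the function below overrides the interpretation of a hypothetical numeric metric named "Coherence" and leaves all other metrics unchanged. The metric name and the failure threshold are assumptions chosen for illustration.

Func<EvaluationMetric, EvaluationMetricInterpretation?> interpreter = metric =>
{
    // Return null to keep the interpretation that the evaluator itself assigned.
    if (metric is not NumericMetric numeric || numeric.Name != "Coherence")
    {
        return null;
    }

    // Treat coherence scores below 4 as failing; interpret everything else as good.
    return numeric.Value is double value && value < 4
        ? new EvaluationMetricInterpretation(EvaluationRating.Unacceptable, failed: true, reason: "Coherence score was below 4.")
        : new EvaluationMetricInterpretation(EvaluationRating.Good);
};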

tags
IEnumerable<String>

An optional set of text tags applicable to all ScenarioRuns created using the returned ReportingConfiguration.

Returns

A ReportingConfiguration that persists ScenarioRunResults to disk and also uses the disk to cache AI responses.

Remarks

Note that when enableResponseCaching is set to true, the cache keys used for the cached responses are not guaranteed to be stable across releases of the library. In other words, when you update your code to reference a newer version of the library, responses that were cached by older versions of the library may no longer be used; instead, new responses will be fetched from the LLM and added to the cache for use in subsequent executions.
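
For example, the following sketch creates a configuration that enables response caching with a one-day time-to-live, explicit caching keys, a timestamp-based execution name, and a set of tags. The directory path, chat client helper, caching key values, and tags are illustrative placeholders, and CoherenceEvaluator stands in for whichever AI-based evaluators you use.

IChatClient chatClient = GetChatClientForEvaluation(); // hypothetical helper returning your IChatClient

ReportingConfiguration reportingConfiguration = DiskBasedReportingConfiguration.Create(
    storageRootPath: @"C:\eval-results",
    evaluators: new IEvaluator[] { new CoherenceEvaluator() },
    chatConfiguration: new ChatConfiguration(chatClient),
    enableResponseCaching: true,
    timeToLiveForCacheEntries: TimeSpan.FromDays(1),
    cachingKeys: new[] { "gpt-4o-mini", "eastus-deployment-1" },
    executionName: $"{DateTime.Now:yyyyMMddTHHmmss}",
    tags: new[] { "nightly", "regression" });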

Applies to