SexualEvaluator Class
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
An IEvaluator that utilizes the Azure AI Foundry Evaluation service to evaluate responses produced by an AI model for the presence of sexual content.
public ref class SexualEvaluator sealed : Microsoft::Extensions::AI::Evaluation::Safety::ContentHarmEvaluator
public sealed class SexualEvaluator : Microsoft.Extensions.AI.Evaluation.Safety.ContentHarmEvaluator
type SexualEvaluator = class
inherit ContentHarmEvaluator
Public NotInheritable Class SexualEvaluator
Inherits ContentHarmEvaluator
Inheritance: Object → ContentSafetyEvaluator → ContentHarmEvaluator → SexualEvaluator
Remarks
SexualEvaluator returns a NumericMetric with a value between 0 and 7, where 0 indicates an excellent score and 7 indicates a poor score.
Note that SexualEvaluator can detect harmful content in both text-based and image-based responses. Supported image formats include JPG/JPEG, PNG, and GIF. Other modalities, such as audio and video, are not currently supported.
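For illustration, a minimal sketch of using this evaluator might look like the following. The Azure resource identifiers are placeholders, and the `ContentSafetyServiceConfiguration` constructor shape and the sample messages are assumptions based on the preview Microsoft.Extensions.AI.Evaluation.Safety package; consult the package documentation for the exact API surface of the version you use.

```csharp
using Azure.Identity;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Evaluation;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Placeholder Azure resource identifiers; substitute your own values.
var serviceConfiguration = new ContentSafetyServiceConfiguration(
    credential: new DefaultAzureCredential(),
    subscriptionId: "<subscription-id>",
    resourceGroupName: "<resource-group>",
    projectName: "<project-name>");

IEvaluator evaluator = new SexualEvaluator();

// A hypothetical conversation to evaluate.
var messages = new List<ChatMessage>
{
    new(ChatRole.User, "Describe the painting in the attached image.")
};
var modelResponse = new ChatResponse(
    new ChatMessage(ChatRole.Assistant, "The painting depicts a quiet harbor at dusk."));

// EvaluateAsync sends the conversation to the Azure AI Foundry Evaluation service.
EvaluationResult result = await evaluator.EvaluateAsync(
    messages,
    modelResponse,
    serviceConfiguration.ToChatConfiguration());

// The returned metric value falls between 0 (excellent) and 7 (poor).
NumericMetric metric = result.Get<NumericMetric>(SexualEvaluator.SexualMetricName);
Console.WriteLine($"{metric.Name}: {metric.Value}");
```

Because the evaluation runs against a live Azure AI Foundry project, this sketch requires valid Azure credentials and an existing project to execute.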
Constructors
SexualEvaluator()
An IEvaluator that utilizes the Azure AI Foundry Evaluation service to evaluate responses produced by an AI model for the presence of sexual content.
Properties
EvaluationMetricNames
Gets the Names of the EvaluationMetrics produced by this IEvaluator. (Inherited from ContentSafetyEvaluator)
SexualMetricName
Gets the Name of the NumericMetric returned by SexualEvaluator.
Methods
EvaluateAsync(IEnumerable<ChatMessage>, ChatResponse, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)
Evaluates the supplied response and returns an EvaluationResult containing one or more EvaluationMetrics. (Inherited from ContentHarmEvaluator)
EvaluateContentSafetyAsync(IChatClient, IEnumerable<ChatMessage>, ChatResponse, IEnumerable<EvaluationContext>, String, Boolean, CancellationToken)
Evaluates the supplied response using the Azure AI Foundry Evaluation service and returns an EvaluationResult containing the resulting EvaluationMetrics.
FilterAdditionalContext(IEnumerable<EvaluationContext>)
Filters the EvaluationContexts supplied by the caller down to those that are relevant to this evaluation.