HarmCategory

class HarmCategory


Category for a given harm rating.
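These categories are typically used in two places: when configuring a model's safety settings and when reading the safety ratings attached to a generated response. The sketch below shows the first case under stated assumptions; SafetySetting, HarmBlockThreshold, their package names, and the constructor argument order are taken to be those of the Firebase AI SDK for Kotlin and are not defined on this page, so treat them as assumptions that may vary by SDK version.

```kotlin
// A minimal sketch, assuming the Firebase AI SDK for Kotlin.
// SafetySetting and HarmBlockThreshold are not documented on this page;
// adjust the package names and types to match your SDK version.
import com.google.firebase.ai.type.HarmBlockThreshold
import com.google.firebase.ai.type.HarmCategory
import com.google.firebase.ai.type.SafetySetting

// Block harassment and hate speech when the rated probability is medium or
// higher; categories without an explicit setting keep the SDK defaults.
val safetySettings: List<SafetySetting> = listOf(
    SafetySetting(HarmCategory.HARASSMENT, HarmBlockThreshold.MEDIUM_AND_ABOVE),
    SafetySetting(HarmCategory.HATE_SPEECH, HarmBlockThreshold.MEDIUM_AND_ABOVE),
)
```

Passing such a list to the model factory (for example through a safetySettings parameter) applies the thresholds to every request made with that model instance.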

Summary

Public companion properties

val CIVIC_INTEGRITY: HarmCategory
Represents the harm category for content that is classified as content that may be used to harm civic integrity.

val DANGEROUS_CONTENT: HarmCategory
Represents the harm category for content that is classified as dangerous content.

val HARASSMENT: HarmCategory
Represents the harm category for content that is classified as harassment.

val HATE_SPEECH: HarmCategory
Represents the harm category for content that is classified as hate speech.

val IMAGE_DANGEROUS_CONTENT: HarmCategory
Represents the harm category for image content that is classified as dangerous.

val IMAGE_HARASSMENT: HarmCategory
Represents the harm category for image content that is classified as harassment.

val IMAGE_HATE: HarmCategory
Represents the harm category for image content that is classified as hateful.

val IMAGE_SEXUALLY_EXPLICIT: HarmCategory
Represents the harm category for image content that is classified as sexually explicit.

val SEXUALLY_EXPLICIT: HarmCategory
Represents the harm category for content that is classified as sexually explicit content.

val UNKNOWN: HarmCategory
A new and not yet supported value.

Public properties

val ordinal: Int

Public companion properties

CIVIC_INTEGRITY

val CIVIC_INTEGRITY: HarmCategory

Represents the harm category for content that is classified as content that may be used to harm civic integrity.

DANGEROUS_CONTENT

val DANGEROUS_CONTENT: HarmCategory

Represents the harm category for content that is classified as dangerous content.

HARASSMENT

val HARASSMENT: HarmCategory

Represents the harm category for content that is classified as harassment.

HATE_SPEECH

val HATE_SPEECH: HarmCategory

Represents the harm category for content that is classified as hate speech.

IMAGE_DANGEROUS_CONTENT

val IMAGE_DANGEROUS_CONTENT: HarmCategory

Represents the harm category for image content that is classified as dangerous.

IMAGE_HARASSMENT

val IMAGE_HARASSMENT: HarmCategory

Represents the harm category for image content that is classified as harassment.

IMAGE_HATE

val IMAGE_HATE: HarmCategory

Represents the harm category for image content that is classified as hateful.

IMAGE_SEXUALLY_EXPLICIT

val IMAGE_SEXUALLY_EXPLICIT: HarmCategory

Represents the harm category for image content that is classified as sexually explicit.

SEXUALLY_EXPLICIT

val SEXUALLY_EXPLICIT: HarmCategory

Represents the harm category for content that is classified as sexually explicit content.

UNKNOWN

val UNKNOWN: HarmCategory

A new and not yet supported value.

Public properties

ordinal

val ordinal: Int
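Because HarmCategory exposes its values as companion properties with an ordinal rather than as Kotlin enum entries, a when expression over a category is not exhaustive and needs an else branch, and UNKNOWN is what the SDK reports when the backend returns a category this client version does not recognize. The sketch below shows one way to inspect ratings on a response; GenerateContentResponse, candidates, safetyRatings, category, and probability are assumptions about the surrounding SDK types and are not defined on this page.

```kotlin
// A minimal sketch, assuming the Firebase AI SDK for Kotlin. The response and
// rating property names used here are assumptions, not defined on this page.
import com.google.firebase.ai.type.GenerateContentResponse
import com.google.firebase.ai.type.HarmCategory

fun logSafetyRatings(response: GenerateContentResponse) {
    val ratings = response.candidates.firstOrNull()?.safetyRatings.orEmpty()
    for (rating in ratings) {
        when (rating.category) {
            // Image-specific categories.
            HarmCategory.IMAGE_HATE,
            HarmCategory.IMAGE_HARASSMENT,
            HarmCategory.IMAGE_SEXUALLY_EXPLICIT,
            HarmCategory.IMAGE_DANGEROUS_CONTENT ->
                println("Image rating ${rating.category.ordinal}: ${rating.probability}")

            // A value introduced on the backend after this client was built.
            HarmCategory.UNKNOWN ->
                println("Unrecognized harm category; consider updating the SDK.")

            // Text categories: harassment, hate speech, sexually explicit,
            // dangerous content, civic integrity.
            else ->
                println("Text rating ${rating.category.ordinal}: ${rating.probability}")
        }
    }
}
```

The ordinal property is the Int shown in the log lines above; it can be handy for compact logging, but comparing against the named companion properties is clearer when branching on a category.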