View source: R/comprehend_operations.R
comprehend_detect_toxic_content
Performs toxicity analysis on the list of text strings that you provide as input. The API response contains a results list that matches the size of the input list. For more information about toxicity detection, see Toxicity detection in the Amazon Comprehend Developer Guide.
See https://www.paws-r-sdk.com/docs/comprehend_detect_toxic_content/ for full documentation.
Usage

comprehend_detect_toxic_content(TextSegments, LanguageCode)
Arguments

TextSegments: [required] A list of up to 10 text strings. Each string has a maximum size of 1 KB, and the maximum size of the list is 10 KB.

LanguageCode: [required] The language of the input text. Currently, English is the only supported language.
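A minimal usage sketch, assuming the paws package is installed and valid AWS credentials are already configured (e.g. via environment variables or ~/.aws/credentials); the example text segments are illustrative only.

```r
library(paws)

# Create a Comprehend client (region/credentials come from the
# standard AWS configuration chain).
comprehend <- paws::comprehend()

# Analyze up to 10 text segments; each Text string may be at most
# 1 KB, and the whole list at most 10 KB.
result <- comprehend$detect_toxic_content(
  TextSegments = list(
    list(Text = "This is a friendly comment."),
    list(Text = "This is another comment.")
  ),
  LanguageCode = "en"  # English is currently the only supported language
)

# The response's ResultList matches the input list in size; each
# element carries per-label scores and an overall Toxicity score.
str(result$ResultList)
```

Because the call is made against the live Amazon Comprehend service, running it requires network access and an AWS account with Comprehend permissions.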