Description
Detect the rate of emotion at the sentence level. This method uses a simple
dictionary lookup to find emotion words and then computes the rate per
sentence. The emotion score ranges between 0 (no emotion used) and 1 (all
words used were emotional). Note that a single emotion phrase would count as
just one in the emotion_count column but would count as two words in the
word_count column.
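Put differently, the emotion score for a sentence is the emotion word count
divided by the sentence word count. A quick way to confirm that relationship
(a minimal sketch, assuming the package and its default NRC lexicon are
installed):

library(sentimentr)

## run on one sentence and compare the reported rate with the raw ratio
emo <- emotion(get_sentences("I hate it when you lie to me."))
all.equal(emo$emotion, emo$emotion_count / emo$word_count)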
Usage

emotion(
  text.var,
  emotion_dt = lexicon::hash_nrc_emotions,
  valence_shifters_dt = lexicon::hash_valence_shifters,
  drop.unused.emotions = FALSE,
  un.as.negation = TRUE,
  un.as.negation.warn = isTRUE(all.equal(valence_shifters_dt,
    lexicon::hash_nrc_emotions)),
  n.before = 5,
  n.after = 2,
  retention_regex = "[^[:alpha:];:,']",
  ...
)
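For illustration, a call that overrides a few of these defaults; the sentence
and settings are arbitrary, and the exact rows returned depend on the
installed NRC lexicon:

library(sentimentr)

## treat "unhappy" literally vs. as "not happy", and hide emotion types with no hits
emotion(get_sentences("I am unhappy"), drop.unused.emotions = TRUE)
emotion(get_sentences("I am unhappy"), un.as.negation = FALSE,
    drop.unused.emotions = TRUE)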
Arguments

text.var
The text variable. Can be a get_sentences object or a raw character vector,
though get_sentences is preferred as it avoids the repeated cost of doing
sentence boundary disambiguation every time emotion is run.

emotion_dt
A data.table with a token and emotion column (tokens grouped by emotion
category); lexicon::hash_nrc_emotions is used by default.
valence_shifters_dt
A data.table of valence shifters that can alter a polarized word's meaning,
with x and y as column names and an integer key for negators (1), amplifiers
[intensifiers] (2), de-amplifiers [downtoners] (3), and adversative
conjunctions (4). For this purpose only the negators are required/used.
drop.unused.emotions
logical. If TRUE, emotion types that receive no matches in the text are
dropped from the output rather than reported with zero counts.
un.as.negation
logical. If TRUE, emotion words prefixed with 'un-' (e.g., "unhappy") are
treated as negated forms of the base emotion word.
un.as.negation.warn
logical. If TRUE, and un.as.negation is also TRUE, a warning is given when
the 'un-' negation handling is applied.
n.before
The number of words to consider as negated before the emotion word. To
consider the entire beginning portion of a sentence use n.before = Inf.
n.after
The number of words to consider as negated after the emotion word. To
consider the entire ending portion of a sentence use n.after = Inf (see the
negation-window sketch after this argument list).
retention_regex
A regex of what characters to keep. All other characters will be removed.
Note that when this is used all text is put in lower case format. Only
adjust this parameter if you really understand how it is used. Note that
swapping the default for a broader character class may retain more
characters but will likely decrease speed.
...
ignored.
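To see the negation window controlled by n.before/n.after in action, the
sketch below compares the default window with a sentence-wide one; exact
counts depend on the installed lexicon version:

library(sentimentr)

txt <- get_sentences("I am not afraid of you")

## with the defaults, "not" falls inside the 5-word window before "afraid",
## so the hit should be reported as fear_negated rather than fear
emotion(txt)[emotion_count > 0]

## widening the window to the whole sentence behaves the same for this short
## sentence but can change results for longer ones
emotion(txt, n.before = Inf, n.after = Inf)[emotion_count > 0]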
Value

Returns a data.table of:

element_id - The id number of the original vector passed to emotion

sentence_id - The id number of the sentences within each element_id

word_count - Word count

emotion_type - Type designation from the emotion column of the emotion_dt table

emotion_count - Count of the number of emotion words of that emotion_type

emotion - A score of the percentage of emotion words of that emotion_type
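Because the result is long (one row per sentence per emotion_type), it is
often handy to reshape it to one column per emotion type. A sketch using
data.table::dcast with emotion as the value column (one choice among
several):

library(sentimentr)
library(data.table)

emo <- emotion(get_sentences("I hate it when you lie to me. It's so humiliating"))

## one row per sentence, one column per emotion type, cells filled with the rate
dcast(emo, element_id + sentence_id + word_count ~ emotion_type,
    value.var = "emotion")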
References

Plutchik, R. (1962). The emotions: Facts and theories, and a new model.
Random House studies in psychology. Random House.

Plutchik, R. (2001). The nature of emotions: Human emotions have deep
evolutionary roots, a fact that may explain their complexity and provide
tools for clinical practice. American Scientist, 89(4), 344-350.
See Also

Other emotion functions: emotion_by()
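emotion_by() aggregates these sentence-level rates; a minimal sketch of
calling it with its defaults (see that function's documentation for the
grouping interface):

library(sentimentr)

## averaged emotion rates grouped by the original text elements
emotion_by(get_sentences(c(
    "I am not afraid of you",
    "I hate it when you lie to me. It's so humiliating"
)))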
Examples

mytext <- c(
"I am not afraid of you",
NA,
"",
"I love it [not really]",
"I'm not angry with you",
"I hate it when you lie to me. It's so humiliating",
"I'm not happpy anymore. It's time to end it",
"She's a darn good friend to me",
"I went to the terrible store",
"There is hate and love in each of us",
"I'm no longer angry! I'm really experiencing peace but not true joy.",
paste("Out of the night that covers me, Black as the Pit from pole to",
"pole, I thank whatever gods may be For my unconquerable soul."
),
paste("In the fell clutch of circumstance I have not winced nor cried",
"aloud. Under the bludgeonings of chance My head is bloody, but unbowed."
),
paste("Beyond this place of wrath and tears Looms but the Horror of the",
"shade, And yet the menace of the years Finds, and shall find, me unafraid."
),
paste("It matters not how strait the gate, How charged with punishments",
"the scroll, I am the master of my fate: I am the captain of my soul."
)
)
## works on a character vector, but this is not the preferred method; the
## preferred method avoids the repeated cost of doing sentence boundary
## disambiguation every time `emotion` is run
emotion(mytext)
## preferred method avoiding paying the cost
split_text <- get_sentences(mytext)
(emo <- emotion(split_text))
emotion(split_text, drop.unused.emotions = TRUE)
## Not run:
plot(emo)
plot(emo, drop.unused.emotions = FALSE)
plot(emo, facet = FALSE)
plot(emo, facet = 'negated')
library(data.table)
fear <- emo[
emotion_type == 'fear', ][,
text := unlist(split_text)][]
fear[emotion > 0,]
brady <- get_sentences(crowdflower_deflategate)
brady_emotion <- emotion(brady)
brady_emotion
## End(Not run)
     element_id sentence_id word_count         emotion_type emotion_count    emotion
  1:          1           1          6                anger             0 0.00000000
  2:          1           1          6        anger_negated             0 0.00000000
  3:          1           1          6         anticipation             0 0.00000000
  4:          1           1          6 anticipation_negated             0 0.00000000
  5:          1           1          6              disgust             0 0.00000000
 ---
300:         15           1         27             surprise             0 0.00000000
301:         15           1         27                trust             0 0.00000000
302:         15           1         27        trust_negated             1 0.03703704
303:         15           1         27      sadness_negated             0 0.00000000
304:         15           1         27     surprise_negated             0 0.00000000
     element_id sentence_id word_count         emotion_type emotion_count    emotion
  1:          1           1          6                anger             0 0.00000000
  2:          1           1          6        anger_negated             0 0.00000000
  3:          1           1          6         anticipation             0 0.00000000
  4:          1           1          6 anticipation_negated             0 0.00000000
  5:          1           1          6              disgust             0 0.00000000
 ---
262:         15           1         27          joy_negated             0 0.00000000
263:         15           1         27              sadness             0 0.00000000
264:         15           1         27             surprise             0 0.00000000
265:         15           1         27                trust             0 0.00000000
266:         15           1         27        trust_negated             1 0.03703704
element_id sentence_id word_count emotion_type emotion_count emotion
1: 6 1 8 fear 1 0.12500000
2: 9 1 6 fear 1 0.16666667
3: 10 1 9 fear 1 0.11111111
4: 13 2 11 fear 1 0.09090909
5: 14 1 27 fear 3 0.11111111
text
1: I hate it when you lie to me.
2: I went to the terrible store
3: There is hate and love in each of us
4: Under the bludgeonings of chance My head is bloody, but unbowed.
5: Beyond this place of wrath and tears Looms but the Horror of the shade, And yet the menace of the years Finds, and shall find, me unafraid.
        element_id sentence_id word_count         emotion_type emotion_count emotion
     1:          1           1          7                anger             0       0
     2:          1           2          7        anger_negated             0       0
     3:          1           3          7         anticipation             0       0
     4:          2           1          7 anticipation_negated             0       0
     5:          2           2          7              disgust             0       0
    ---
349452:      11785           1          1      sadness_negated             0       0
349453:      11785           2          1             surprise             0       0
349454:      11786           1          1     surprise_negated             0       0
349455:      11786           2          1                trust             0       0
349456:      11786           3          1        trust_negated             0       0