affixProductivity: Affix productivity


Description

Affix productivity, gauged by the P* productivity measure, for 27 English affixes in 44 texts.
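As a point of orientation, the sketch below shows one way such a value could be computed, on the assumption that P* here is the hapax-conditioned degree of productivity discussed in Baayen (1994): the number of hapax legomena (word types occurring exactly once in the text) that contain the affix, divided by the total number of hapax legomena in the text. The token vector and the regular expression are purely illustrative.

## Illustrative sketch only: assumes P* = hapaxes containing the affix / all hapaxes.
## 'tokens' is a hypothetical character vector of word tokens from one text.
pstar = function(tokens, affix.regex) {
  freqs   = table(tokens)
  hapaxes = names(freqs)[freqs == 1]   # word types occurring exactly once
  sum(grepl(affix.regex, hapaxes)) / length(hapaxes)
}
## e.g. pstar(tokens, "ness$")   # P* for the suffix -ness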

Usage

data(affixProductivity)

Format

A data frame with 44 observations on the following 30 variables; a short inspection sketch follows the variable list.

semi: a numeric vector of P*-values
anti: a numeric vector of P*-values
ee: a numeric vector of P*-values
ism: a numeric vector of P*-values
ian: a numeric vector of P*-values
ful: a numeric vector of P*-values
y: a numeric vector of P*-values
ness: a numeric vector of P*-values
able: a numeric vector of P*-values
ly: a numeric vector of P*-values
unV: a numeric vector of P*-values
unA: a numeric vector of P*-values
ize: a numeric vector of P*-values
less: a numeric vector of P*-values
erA: a numeric vector of P*-values
erC: a numeric vector of P*-values
ity: a numeric vector of P*-values
super: a numeric vector of P*-values
est: a numeric vector of P*-values
ment: a numeric vector of P*-values
ify: a numeric vector of P*-values
re: a numeric vector of P*-values
ation: a numeric vector of P*-values
in.: a numeric vector of P*-values
ex: a numeric vector of P*-values
en: a numeric vector of P*-values
be: a numeric vector of P*-values

AuthorCodes: a factor with levels
BLu (King James Version: Luke-Acts)
BMo (Book of Mormon)
CAs (Aesop's fables, translation by Townsend)
CBo (Baum, The Marvelous Land of Oz)
CBp (Barrie, Peter Pan and Wendy)
CBw (Baum, The Wonderful Wizard of Oz)
CCa (Carroll, Alice's Adventures in Wonderland)
CCt (Carroll, Through the Looking Glass and What Alice Found There)
CGr (Grimm Fairy Tales, translations)
CKj (Kipling, The Jungle Book)
LAp (Austen, Pride and Prejudice)
LBp (Burroughs, A Princess of Mars)
LBw (Bronte, Wuthering Heights)
LCl (Conrad, Lord Jim)
LCn (Conrad, Nigger of the Narcissus)
LDb (Doyle, The Casebook of Sherlock Holmes)
LDc (Dickens, The Chimes: A Goblin Story)
LDC (Dickens, A Christmas Carol)
LDh (Doyle, The Hound of the Baskervilles)
LDv (Doyle, The Valley of Fear)
LJc (James, Confidence)
LJe (James, The Europeans)
LLc (London, The Call of the Wild)
LLs (London, The Sea Wolf)
LMa (Montgomery, Anne of Avonlea)
LMm (Melville, Moby Dick)
LMn (Morris, News from Nowhere)
LMp (Milton, Paradise Lost)
LOs (Orczy, The Scarlet Pimpernel)
LSd (Stoker, Dracula)
LSs (Chu, More than a Chance Meeting (Star Trek))
LTa (Trollope, Ayala's Angel)
LTe (Trollope, The Eustace Diamonds)
LTf (Trollope, Can You Forgive Her?)
LTy (Twain, A Connecticut Yankee in King Arthur's Court)
LWi (Wells, The Invisible Man)
LWt (Wells, The Time Machine)
LWw (Wells, The War of the Worlds)
OAf (The Federalist Papers)
OCh (texts sampled from Congress Hearings)
OCl (texts sampled from Clinton's election speeches)
ODo (Darwin, On the Origin of Species)
OGa (selected texts from the Government Accounting Office)
OJe (James, Essays in Radical Empiricism)

Registers: a factor with levels B (Biblical texts), C (Children's books), L (Literary texts), O (other)
Birth: a numeric vector for the author's year of birth (where available)
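The snippet below is a small inspection sketch (output omitted) showing how this column layout can be examined after loading the package; the first 27 columns hold the P* values and the last three hold the text metadata.

library(languageR)
data(affixProductivity)
str(affixProductivity)    # 44 obs. of 30 variables
head(affixProductivity[, c("ness", "able", "AuthorCodes", "Registers", "Birth")])
table(affixProductivity$Registers)   # number of texts per register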

Source

Most texts were obtained from the Gutenberg Project (http://www.gutenberg.org/wiki/Main_Page) and the Oxford Text Archive (http://ota.ahds.ac.uk/).

References

Baayen, R. H. (1994). Derivational Productivity and Text Typology. Journal of Quantitative Linguistics, 1, 16-34.

Examples

## Not run: 
library(languageR)
data(affixProductivity)

## principal components analysis of the 27 affix columns
## (the last three columns hold the text metadata)
affixes.pr = prcomp(affixProductivity[, 1:(ncol(affixProductivity) - 3)],
  center = TRUE, scale. = TRUE)

## scatterplot matrix of the first three principal components,
## with texts grouped by register
library(lattice)
trellis.device()
super.sym = trellis.par.get("superpose.symbol")
splom(data.frame(affixes.pr$x[, 1:3]),
  groups = affixProductivity$Registers,
  panel = panel.superpose,
  key = list(title  = "texts in productivity space",
    text   = list(c("Religious", "Children", "Literary", "Other")),
    points = list(pch = super.sym$pch[1:4], col = super.sym$col[1:4])))

## End(Not run)
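As a possible follow-up (not part of the original examples), the mean P* value per affix within each register can be tabulated from the same 27 affix columns; this is a sketch only.

## Not run: 
## mean P* per affix within each register (illustrative follow-up sketch)
data(affixProductivity)
affixCols = 1:(ncol(affixProductivity) - 3)    # the 27 affix columns
registerMeans = aggregate(affixProductivity[, affixCols],
  by = list(Register = affixProductivity$Registers), FUN = mean)
print(registerMeans, digits = 3)

## End(Not run)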
