condmixt.quant: Quantile computation for conditional mixtures.


Description

Quantile computation for conditional mixtures requires solving F(y) = p numerically, where F is the distribution function of the conditional mixture and p is a probability level.
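As a rough illustration of this idea (a minimal sketch using a plain log-normal CDF, not the package's internal code), a one-dimensional distribution function can be inverted numerically by root-finding over a bracketing interval [a, b]:

# Minimal sketch: invert a CDF at level p by root-finding on cdf(y) - p
# within the bracketing interval [a, b]; here the CDF is log-normal.
quant.numeric <- function(p, cdf, a, b)
    uniroot(function(y) cdf(y) - p, lower = a, upper = b)$root
quant.numeric(0.99, plnorm, a = 0.001, b = 100)  # close to qlnorm(0.99)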

Usage

condhparetomixt.quant(theta, h, m, x, p, a, b, trunc = TRUE)
condhparetomixt.dirac.quant(theta, h, m, x, p, a, b)
condhparetomixt.dirac.condquant(theta, h, m, x, p, a, b)
condgaussmixt.quant(theta, h, m, x, p, a, b, trunc = TRUE)
condgaussmixt.dirac.quant(theta, h, m, x, p, a, b)
condgaussmixt.dirac.condquant(theta, h, m, x, p, a, b)
condlognormixt.quant(theta, h, m, x, p, a, b)
condlognormixt.dirac.quant(theta, h, m, x, p, a, b)
condlognormixt.dirac.condquant(theta, h, m, x, p, a, b)
condbergamixt.quant(theta, h, x, p)

Arguments

theta

Vector of neural network parameters

h

Number of hidden units

m

Number of components

x

Matrix of explanatory (independent) variables of dimension d x n, where d is the number of variables and n is the number of examples (patterns)

p

Probability level(s) in [0,1] at which quantiles are computed; a vector of levels may be supplied.

a

Approximate lower bound on quantile value.

b

Approximate upper bound on quantile value.

trunc

Logical variable; if TRUE, the density is truncated below zero and re-weighted so that it integrates to one (see the sketch after this list).
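As an illustration of the truncation (a minimal sketch for a single Gaussian component, not the package's internal code), the density below zero is discarded and the remaining mass is rescaled by 1 - F(0):

# Minimal sketch: truncate a Gaussian density below zero and re-weight
# by 1 - F(0) so that the truncated density still integrates to one.
dnorm.trunc <- function(y, mean, sd)
    ifelse(y < 0, 0, dnorm(y, mean, sd) / (1 - pnorm(0, mean, sd)))
integrate(dnorm.trunc, 0, Inf, mean = 1, sd = 2)$value  # close to 1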

Details

condhparetomixt indicates a mixture with hybrid Pareto components, condgaussmixt a mixture with Gaussian components, condlognormixt a mixture with Log-Normal components, and condbergamixt a two-component Bernoulli-Gamma mixture. dirac indicates that a discrete Dirac component at zero is included in the mixture. condquant applies to mixtures with a Dirac component at zero: quantiles are computed given that the variable is strictly positive, that is, for the continuous part of the mixture only, P(Y <= y | Y > 0, X).
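To make the distinction concrete (a minimal sketch with a log-normal standing in for the continuous part, not the package's internal code): if the mixture puts mass pi0 on zero and has continuous distribution function Fc above zero, the conditional quantile solves Fc(q) = p, while, for p > pi0, the unconditional quantile solves pi0 + (1 - pi0) * Fc(q) = p.

# Minimal sketch: conditional vs. unconditional quantiles with a point
# mass pi0 at zero; plnorm/qlnorm stand in for the continuous part Fc.
pi0 <- 0.3
p <- 0.95
q.cond <- qlnorm(p)                       # solves Fc(q) = p  (condquant)
q.uncond <- qlnorm((p - pi0)/(1 - pi0))   # solves pi0 + (1-pi0)*Fc(q) = p
c(q.cond, q.uncond)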

Value

Computed quantiles are returned in a matrix whose rows correspond to the probability levels in p and whose columns correspond to the n examples.

Author(s)

Julie Carreau

References

Bishop, C. (1995), Neural Networks for Pattern Recognition, Oxford

Carreau, J. and Bengio, Y. (2009), A Hybrid Pareto Mixture for Conditional Asymmetric Fat-Tailed Distributions, IEEE Transactions on Neural Networks, 20

See Also

condmixt.train, condmixt.nll, condmixt.init

Examples

# generate training data
ntrain <- 200
xtrain <- runif(ntrain)
ytrain <- rfrechet(ntrain, loc = 3*xtrain+1, scale = 0.5*xtrain+0.001,
                   shape = xtrain+2)   # rfrechet/qfrechet come from the evd package
plot(xtrain, ytrain, pch = 22)  # plot training data
qgen <- qfrechet(0.99, loc = 3*xtrain+1, scale = 0.5*xtrain+0.001, shape = xtrain+2)
points(xtrain, qgen, pch = "*", col = "orange")  # true 0.99 quantile

# generate test data
ntest <- 200
xtest <- runif(ntest)
ytest <- rfrechet(ntest, loc = 3*xtest+1, scale = 0.5*xtest+0.001, shape = xtest+2)

h <- 2  # number of hidden units
m <- 4  # number of components

# train a mixture with hybrid Pareto components
thetaopt <- condhparetomixt.train(h, m, t(xtrain), ytrain, nstart = 2, iterlim = 100)

# compute the 0.99 conditional quantile on the test set and overlay it
qmod <- condhparetomixt.quant(thetaopt, h, m, t(xtest), 0.99, 0, 10, trunc = TRUE)
points(xtest, qmod, pch = "o", col = "blue")
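Continuing the example above, several probability levels can be requested at once; as described under Value, the result then has one row per level and one column per test example (this extra snippet assumes the objects created above):

# quantiles at two probability levels for every test example
qmat <- condhparetomixt.quant(thetaopt, h, m, t(xtest), c(0.9, 0.99), 0, 10)
dim(qmat)  # 2 x ntest: rows are probability levels, columns are examples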
