MoTBFs: Learning Hybrid Bayesian Networks using Mixtures of Truncated Basis Functions
Version 1.0

Learning, manipulation and evaluation of mixtures of truncated basis functions (MoTBFs), which include mixtures of polynomials (MOPs) and mixtures of truncated exponentials (MTEs). MoTBFs are a flexible framework for modelling hybrid Bayesian networks. The package provides functionality for learning univariate, multivariate and conditional densities, with the possibility of incorporating prior knowledge, as well as structural learning of hybrid Bayesian networks. It also includes a set of useful tools for plotting, printing and likelihood evaluation. The package makes use of S3 objects, with two new classes called 'motbf' and 'jointmotbf'.

Getting started
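
A minimal sketch of the basic workflow, assuming the univariate learning function univMoTBF() and its POTENTIAL_TYPE argument behave as described in the package reference manual; the simulated data and the chosen settings are illustrative only:

library(MoTBFs)

# Simulated continuous data (illustrative only)
set.seed(2)
X <- rnorm(500)

# Fit a univariate MoTBF density; "MOP" selects mixtures of polynomials,
# while "MTE" would select mixtures of truncated exponentials instead
fit <- univMoTBF(X, POTENTIAL_TYPE = "MOP")

# The result is an object of the S3 class 'motbf'
print(fit)
plot(fit, xlim = range(X))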

Package details

Author: Inmaculada Pérez-Bernabé, Antonio Salmerón
Date of publication: 2015-09-28 09:26:43
Maintainer: Inmaculada Pérez-Bernabé <[email protected]>
License: LGPL-3
Version: 1.0
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
install.packages("MoTBFs")
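
Once installed, load the package in the usual way:

library(MoTBFs)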
