MoTBFs: Learning Hybrid Bayesian Networks using Mixtures of Truncated
Basis Functions
Learning, manipulation and evaluation of mixtures of truncated basis functions
(MoTBFs), which include mixtures of polynomials (MOPs) and mixtures of truncated
exponentials (MTEs). MoTBFs are a flexible framework for modelling hybrid Bayesian
networks (I. Pérez-Bernabé, A. Salmerón, H. Langseth (2015) <doi:10.1007/978-3-319-20807-7_36>; H. Langseth, T.D. Nielsen, I. Pérez-Bernabé, A. Salmerón (2014) <doi:10.1016/j.ijar.2013.09.012>; I. Pérez-Bernabé, A. Fernández, R. Rumí, A. Salmerón (2016) <doi:10.1007/s10618-015-0429-7>). The package provides functionality for learning univariate, multivariate and
conditional densities, with the possibility of incorporating prior knowledge, as well as
structural learning of hybrid Bayesian networks. It also includes a set of useful tools for
plotting, printing and likelihood evaluation. The package makes use of S3 objects, with
two new classes called 'motbf' and 'jointmotbf'.
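A minimal sketch of a typical workflow is shown below. The function names univMoTBF(), LearningHC() and MoTBFs_Learning() follow the package manual, but the exact arguments and call sequence shown here are illustrative assumptions rather than a verified example; consult the reference manual for the authoritative signatures.

    # Minimal sketch (assumed API): fit a univariate MoTBF density and a small
    # hybrid Bayesian network from a data frame of continuous variables.
    library(MoTBFs)

    data(iris)
    X <- iris[, 1:4]                      # continuous columns only (illustrative data)

    # Univariate density for one variable using a mixture of polynomials ("MOP");
    # "MTE" would select mixtures of truncated exponentials instead.
    fit <- univMoTBF(X$Sepal.Length, POTENTIAL_TYPE = "MOP")
    print(fit)                            # 'motbf' S3 object
    plot(fit, xlim = range(X$Sepal.Length))

    # Structural learning (hill climbing via bnlearn), then MoTBF parameter
    # learning for the resulting hybrid Bayesian network.
    dag <- LearningHC(X)
    bn  <- MoTBFs_Learning(dag, data = X, numIntervals = 2, POTENTIAL_TYPE = "MOP")

The fitted objects are instances of the 'motbf' and 'jointmotbf' S3 classes mentioned above, so the package's print, plot and likelihood-evaluation utilities can be applied to them directly.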
Version: 1.4.1
Depends: R (≥ 3.2.0)
Imports: quadprog, lpSolve, bnlearn, methods, ggm, Matrix
Published: 2022-04-18
Author: Inmaculada Pérez-Bernabé, Antonio Salmerón, Thomas D. Nielsen, Ana D. Maldonado
Maintainer: Ana D. Maldonado <ana.d.maldonado at ual.es>
License: LGPL-3
NeedsCompilation: yes
CRAN checks: MoTBFs results
Linking: Please use the canonical form https://CRAN.R-project.org/package=MoTBFs to link to this page.