Then, use the ``matplotlib`` library to show the pmf and the cdf of a GenBeta:

.. image:: images/cdfGenBeta.png
    :width: 400

``Multinomial``
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: gemact.distributions.Multinomial
    :members:
    :inherited-members:

Parametrization
^^^^^^^^^^^^^^^^^^^^^^^^^^

The multinomial probability mass function is given by

.. math:: f(\mathbf{x}) = \frac{n!}{x_1! \cdots x_k!} \prod_{i=1}^k p_i^{x_i},

where :math:`\sum_{i=1}^k x_i = n`, :math:`x_i \in \mathbb{N}_0`, :math:`p_i \ge 0` and :math:`\sum_{i=1}^k p_i = 1`.

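As a sanity check on the mass function above, the following sketch evaluates it term by term with plain Python and compares the result against ``scipy.stats.multinomial``; it uses only ``scipy``, not ``gemact``, and mirrors the numbers of the usage example below:

```python
from math import factorial, prod

from scipy.stats import multinomial

def multinomial_pmf_formula(x, p):
    # f(x) = n! / (x_1! ... x_k!) * prod_i p_i^{x_i}, with n = sum_i x_i
    n = sum(x)
    coeff = factorial(n) / prod(factorial(xi) for xi in x)
    return coeff * prod(pi ** xi for pi, xi in zip(p, x))

x, p = [2, 3, 5], [0.2, 0.3, 0.5]
direct = multinomial_pmf_formula(x, p)
reference = multinomial.pmf(x, n=sum(x), p=p)

print(direct, reference)  # the two evaluations agree
```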
Example code on the usage of the Multinomial class::

    from gemact import distributions

    dist = distributions.Multinomial(n=10, p=[0.2, 0.3, 0.5])
    x = [2, 3, 5]

    print('PMF at x:', dist.pmf(x))
    print('Mean vector:', dist.mean())
    print('Random variates:\n', dist.rvs(size=3, random_state=42))

``Dirichlet_Multinomial``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: gemact.distributions.Dirichlet_Multinomial
    :members:
    :inherited-members:

Parametrization
^^^^^^^^^^^^^^^^^^^^^^^^^^

Let :math:`\alpha_0 = \sum_{i=1}^k \alpha_i`. The Dirichlet-multinomial mass function is

.. math:: f(\mathbf{x}) = \frac{n!}{x_1! \cdots x_k!} \frac{\Gamma(\alpha_0)}{\Gamma(n+\alpha_0)} \prod_{i=1}^k \frac{\Gamma(x_i+\alpha_i)}{\Gamma(\alpha_i)},

where :math:`x_i \in \mathbb{N}_0`, :math:`\sum_{i=1}^k x_i = n` and :math:`\alpha_i > 0` for all :math:`i`.

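This mass function is most stably evaluated on the log scale via ``scipy.special.gammaln``. The sketch below (plain ``numpy``/``scipy``, independent of the class above) implements the formula directly:

```python
import numpy as np
from scipy.special import gammaln

def dirichlet_multinomial_pmf(x, alpha):
    # log f(x) = log n! - sum_i log x_i! + log G(a0) - log G(n + a0)
    #            + sum_i [log G(x_i + a_i) - log G(a_i)]
    x = np.asarray(x, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    n, a0 = x.sum(), alpha.sum()
    logpmf = (
        gammaln(n + 1) - gammaln(x + 1).sum()
        + gammaln(a0) - gammaln(n + a0)
        + (gammaln(x + alpha) - gammaln(alpha)).sum()
    )
    return float(np.exp(logpmf))

print(dirichlet_multinomial_pmf([4, 5, 3], [2.0, 3.0, 4.0]))
```

A quick correctness check is that summing this function over all compositions of :math:`n` into :math:`k` non-negative parts returns one.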
Example code on the usage of the Dirichlet_Multinomial class::

    from gemact import distributions

    dist = distributions.Dirichlet_Multinomial(n=12, alpha=[2.0, 3.0, 4.0], seed=7)
    x = [4, 5, 3]

    print('PMF at x:', dist.pmf(x))
    print('Mean vector:', dist.mean())
    print('Random variates:\n', dist.rvs(size=2, random_state=21))

``NegMultinom``
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: gemact.distributions.NegMultinom
    :members:
    :inherited-members:

Parametrization
^^^^^^^^^^^^^^^^^^^^^^^^^^

Let :math:`p_0 = 1 - \sum_{i=1}^k p_i`. The negative multinomial mass function is

.. math:: f(\mathbf{x}) = \frac{\Gamma\!\left(x_0 + \sum_{i=1}^k x_i\right)}{\Gamma(x_0) \prod_{i=1}^k x_i!} p_0^{x_0} \prod_{i=1}^k p_i^{x_i},

where :math:`x_0 > 0`, :math:`x_i \in \mathbb{N}_0`, :math:`p_i \ge 0` and :math:`\sum_{i=1}^k p_i < 1`.

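As with the Dirichlet-multinomial, the formula is conveniently evaluated on the log scale. The sketch below (plain ``numpy``/``scipy``, independent of the class above) implements it; for :math:`k = 1` it reduces to the negative binomial distribution with :math:`x_0` successes and success probability :math:`p_0`:

```python
import numpy as np
from scipy.special import gammaln

def neg_multinomial_pmf(x, x0, p):
    # log f(x) = log G(x0 + sum_i x_i) - log G(x0) - sum_i log x_i!
    #            + x0 log p0 + sum_i x_i log p_i, with p0 = 1 - sum_i p_i
    x = np.asarray(x, dtype=float)
    p = np.asarray(p, dtype=float)
    p0 = 1.0 - p.sum()
    logpmf = (
        gammaln(x0 + x.sum()) - gammaln(x0) - gammaln(x + 1).sum()
        + x0 * np.log(p0) + (x * np.log(p)).sum()
    )
    return float(np.exp(logpmf))

print(neg_multinomial_pmf([3, 4], x0=5, p=[0.2, 0.3]))
```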
Example code on the usage of the NegMultinom class::

    from gemact import distributions
    import numpy as np

    dist = distributions.NegMultinom(x0=5, p=[0.2, 0.3])
    x = np.array([3, 4])

    print('PMF at x:', dist.pmf(x))
    print('Mean vector:', dist.mean())
    print('Covariance matrix:\n', dist.cov())

``MultivariateBinomial``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: gemact.distributions.MultivariateBinomial
    :members:
    :inherited-members:

Parametrization
^^^^^^^^^^^^^^^^^^^^^^^^^^

The multivariate binomial vector :math:`\mathbf{X} = (X_1, \ldots, X_k)` models the number of successes in
:math:`n` common Bernoulli trials with success probabilities :math:`p_i = \Pr(B_i = 1)` and pairwise joint success
probabilities :math:`p_{ij} = \Pr(B_i = 1, B_j = 1)` for each pair of coordinates :math:`i \ne j`.
The marginal distributions are binomial with

.. math:: X_i \sim \mathrm{Binomial}(n, p_i), \quad i = 1, \ldots, k,

while the dependence structure is governed by :math:`p_{ij}`.
The first two moments are

.. math:: \mathbb{E}[X_i] = n p_i, \qquad \operatorname{Cov}(X_i, X_j) = n \left(p_{ij} - p_i p_j\right).

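Assuming the convention that the diagonal of the joint-probability matrix holds the marginal probabilities :math:`p_i` (as in the usage example below), the moment formulas translate into two lines of ``numpy``. This is a sketch of the formulas themselves, not of gemact's internals:

```python
import numpy as np

n = 20
# Joint success probabilities: diagonal p_i, off-diagonal p_ij (assumed layout).
p_joint = np.array([[0.30, 0.12, 0.10],
                    [0.12, 0.40, 0.08],
                    [0.10, 0.08, 0.35]])

p = np.diag(p_joint)                  # marginal success probabilities p_i
mean = n * p                          # E[X_i] = n p_i
cov = n * (p_joint - np.outer(p, p))  # Cov(X_i, X_j) = n (p_ij - p_i p_j)

print(mean)
print(cov)
```

Note that the same expression gives the correct diagonal, since :math:`n(p_i - p_i^2) = n p_i (1 - p_i)` is the binomial variance.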
Example code on the usage of the MultivariateBinomial class::

    from gemact import distributions
    import numpy as np

    p_joint = np.array([[0.3, 0.12, 0.10],
                        [0.12, 0.4, 0.08],
                        [0.10, 0.08, 0.35]])

    dist = distributions.MultivariateBinomial(n=20, p_joint=p_joint)
    x = np.array([5, 6, 4])

    print('PMF at x:', dist.pmf(x))
    print('Mean vector:', dist.mean())
    print('Covariance matrix:\n', dist.cov())

``MultivariatePoisson``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: gemact.distributions.MultivariatePoisson
    :members:
    :inherited-members:

Parametrization
^^^^^^^^^^^^^^^^^^^^^^^^^^

Let :math:`\mathbf{X} = (X_1, \ldots, X_k)` be a multivariate Poisson vector with marginal means :math:`\mu_i`
and pairwise joint intensities :math:`\lambda_{ij}`. Each component can be represented as

.. math:: X_i = Y_i + \sum_{j \ne i} Y_{ij},

where :math:`Y_i \sim \mathrm{Poisson}\left(\mu_i - \sum_{j \ne i} \lambda_{ij}\right)` and
:math:`Y_{ij} = Y_{ji} \sim \mathrm{Poisson}(\lambda_{ij})` are independent. Consequently,

.. math:: \mathbb{E}[X_i] = \mu_i, \qquad \operatorname{Cov}(X_i, X_j) = \lambda_{ij} \quad (i \ne j).

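The common-shock representation above can be simulated directly. The sketch below (plain ``numpy``, independent of the class) builds :math:`X_i = Y_i + \sum_{j \ne i} Y_{ij}` and checks that the empirical moments approach :math:`\mu_i` and :math:`\lambda_{ij}`:

```python
import numpy as np

rng = np.random.default_rng(123)

mu = np.array([2.0, 3.0, 1.5])
lam = np.array([[0.0, 0.4, 0.2],   # pairwise shock intensities lambda_ij
                [0.4, 0.0, 0.3],
                [0.2, 0.3, 0.0]])

# Idiosyncratic rate of each component: mu_i minus its total shared intensity
# (must be non-negative for the construction to be valid).
rate_own = mu - lam.sum(axis=1)

size = 200_000
x = rng.poisson(rate_own, size=(size, 3)).astype(float)   # Y_i
for i in range(3):
    for j in range(i + 1, 3):
        shock = rng.poisson(lam[i, j], size=size)         # Y_ij = Y_ji
        x[:, i] += shock                                  # added to both
        x[:, j] += shock                                  # coupled components

print(x.mean(axis=0))           # approximately mu
print(np.cov(x, rowvar=False))  # off-diagonal entries approximately lam
```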
Example code on the usage of the MultivariatePoisson class::

    from gemact import distributions
    import numpy as np

    mu = np.array([2.0, 3.0, 1.5])
    mu_joint = np.array([[0.0, 0.4, 0.2],
                         [0.4, 0.0, 0.3],
                         [0.2, 0.3, 0.0]])

    dist = distributions.MultivariatePoisson(mu=mu, mu_joint=mu_joint)
    x = np.array([1, 2, 0])

    print('PMF at x:', dist.pmf(x))
    print('Mean vector:', dist.mean())
    print('Covariance matrix:\n', dist.cov())
    print('Random variates:\n', dist.rvs(size=5, random_state=123))

The ``copulas`` module
---------------------------
