Adaptive Metropolis and Degenerate Likelihoods #166

Closed
julienmalard opened this issue Oct 11, 2017 · 2 comments

@julienmalard

When fitting a model that wraps an external function, the external function is not rerun at each iteration if the variable list (l_var_paráms) includes a degenerate likelihood, such as a Uniform likelihood with upper == lower.

import pymc

i = 0

class ModBayes(object):
    def __init__(símismo, función, dic_argums, d_obs, lista_d_paráms, aprioris, lista_líms, id_calib,
                 función_llenar_coefs):

        símismo.id = id_calib

        # Parameter priors: a Normal plus a degenerate Uniform whose upper == lower.
        l_var_paráms = [pymc.Normal('n', mu=10, tau=1), pymc.Uniform('u', upper=1, lower=1)]

        # Without the following 4 lines, the model will not run correctly.
        for x in l_var_paráms.copy():
            if isinstance(x, pymc.Uniform):
                if x.parents['upper'] == x.parents['lower']:
                    l_var_paráms.remove(x)

        # Wrapper around the external function; the printed counter shows how often
        # the external model is actually re-evaluated.
        def fun(**kwargs):
            global i
            print(i)
            i += 1

            res = función(**kwargs)['Normal']
            return res

        # Deterministic node wrapping the external run; it should be recomputed
        # whenever any parameter in l_var_paráms changes.
        @pymc.deterministic(trace=True)
        def simul(_=l_var_paráms):
            return fun(**dic_argums)

        l_var_obs = []

        for tipo, obs in d_obs.items():
            if tipo == 'Normal':
                tau = 1 / simul['sigma'] ** 2
                var_obs = pymc.Normal('obs', mu=simul['mu'], tau=tau, value=obs, observed=True)
                l_var_obs.extend([var_obs, tau])

        símismo.MCMC = pymc.MCMC({simul, *l_var_paráms, *l_var_obs})

    def calib(símismo, rep, quema, extraer):

        # Assign a single AdaptiveMetropolis step method to all stochastics in the model.
        símismo.MCMC.use_step_method(pymc.AdaptiveMetropolis, símismo.MCMC.stochastics)

        símismo.MCMC.sample(iter=500, burn=0, thin=1, verbose=1)
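
The workaround above can also be pulled out into a small standalone helper. This is only a sketch (drop_degenerate_uniforms is a hypothetical name); the logic is just the four-line filter from the reproduction, applied before the MCMC object is built:

import pymc

def drop_degenerate_uniforms(stochastics):
    """Return the stochastics, minus any Uniform whose upper parent equals its lower parent."""
    kept = []
    for s in stochastics:
        if isinstance(s, pymc.Uniform) and s.parents['upper'] == s.parents['lower']:
            # Degenerate prior: a zero-width Uniform, the case this issue is about.
            continue
        kept.append(s)
    return kept

# Usage, given a list of PyMC2 stochastics such as l_var_paráms above:
# l_var_paráms = drop_degenerate_uniforms(l_var_paráms)
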
@fonnesbeck (Member)

PyMC2 is no longer being actively developed, though you are welcome to submit a pull request. I would strongly recommend looking at PyMC3, as the Hamiltonian MC samplers (particularly NUTS) are far more effective than adaptive Metropolis.
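
For reference, a minimal PyMC3 sketch of the kind of model where pm.sample() assigns NUTS automatically (toy data and hypothetical names only; this is not the external-model case discussed in this issue):

import numpy as np
import pymc3 as pm

# Toy data, for illustration only.
observed = np.random.normal(loc=10.0, scale=2.0, size=100)

with pm.Model():
    mu = pm.Normal('mu', mu=0.0, sd=10.0)
    sigma = pm.HalfNormal('sigma', sd=5.0)
    pm.Normal('y', mu=mu, sd=sigma, observed=observed)
    # With only continuous, differentiable variables, pm.sample() uses NUTS by default.
    trace = pm.sample(1000, tune=500)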

@julienmalard (Author)

Thank you!
I am mainly dealing with a relatively slow external model (1-2 seconds per evaluation, including reading its output; https://github.com/julienmalard/Tikon/). Am I correct in understanding that NUTS would not be a usable MC sampler in this case? (And if it is not, would you have any recommendations?) I have been seriously considering porting to PyMC3, but wanted to better understand the likely performance improvement before committing to it.

Many thanks,
Julien Malard
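
One common way to expose a slow black-box model to PyMC3 is theano's as_op decorator: the wrapped function can be used inside the model, but no gradient flows through it, so NUTS cannot be applied and a gradient-free step method such as Metropolis is specified instead. A rough sketch under those assumptions (run_external_model, external_op, theta and the data are hypothetical, not Tikon's actual interface):

import numpy as np
import theano.tensor as tt
import pymc3 as pm
from theano.compile.ops import as_op

def run_external_model(theta):
    # Hypothetical stand-in for the slow external simulation (1-2 s per call).
    return np.repeat(float(theta) * 2.0, 3)

@as_op(itypes=[tt.dscalar], otypes=[tt.dvector])
def external_op(theta):
    return run_external_model(theta)

observed = np.array([19.5, 20.3, 20.1])  # hypothetical observations

with pm.Model():
    theta = pm.Normal('theta', mu=10.0, sd=5.0)
    sim = external_op(theta)
    pm.Normal('obs', mu=sim, sd=1.0, observed=observed)
    # as_op provides no gradient, so NUTS is not applicable here;
    # fall back to a gradient-free step method.
    trace = pm.sample(500, step=pm.Metropolis())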
