# Measurement noise assessment¶

In this notebook, we illustrate how to use pyABC with different noise models. We use a simple ODE model of a conversion reaction and, for simplicity, consider a single free parameter:

[1]:

import numpy as np
import scipy as sp
import scipy.integrate  # ensure the integrate submodule is available as sp.integrate
import matplotlib.pyplot as plt
import pyabc
%matplotlib inline

init = np.array([1, 0])


def f(y, t0, theta1, theta2=np.exp(-2)):
    x1, x2 = y
    dx1 = -theta1 * x1 + theta2 * x2
    dx2 = theta1 * x1 - theta2 * x2
    return dx1, dx2


theta1_true = np.exp(-2.5)
theta_true = {'theta1': theta1_true}

theta_min, theta_max = 0.05, 0.15
prior = pyabc.Distribution(
    theta1=pyabc.RV("uniform", theta_min, theta_max - theta_min))

n_time = 10
measurement_times = np.linspace(0, 10, n_time)


We assume that the underlying dynamics of our observations are described by the following model:

[2]:

def model(pars):
    sol = sp.integrate.odeint(
        f, init, measurement_times,
        args=(pars["theta1"],))
    return {'X_2': sol[:, 1]}


true_trajectory = model(theta_true)
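As a quick sanity check (our addition, not part of the original notebook): since $x_1 + x_2 = 1$ is conserved, the conversion reaction has the closed-form solution $x_2(t) = \frac{\theta_1}{\theta_1+\theta_2}\left(1 - e^{-(\theta_1+\theta_2)t}\right)$, against which the numerical integration can be compared:

```python
import numpy as np
import scipy.integrate

# same setup as above, restated so the check is self-contained
init = np.array([1, 0])
theta1, theta2 = np.exp(-2.5), np.exp(-2)
times = np.linspace(0, 10, 10)


def f(y, t0, theta1, theta2=np.exp(-2)):
    x1, x2 = y
    return -theta1 * x1 + theta2 * x2, theta1 * x1 - theta2 * x2


sol = scipy.integrate.odeint(f, init, times, args=(theta1,))

# closed form: x1 + x2 = 1 implies dx2/dt = theta1 - (theta1 + theta2) * x2
x2_analytic = theta1 / (theta1 + theta2) * (1 - np.exp(-(theta1 + theta2) * times))

assert np.allclose(sol[:, 1], x2_analytic, atol=1e-6)
```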


However, we assume that our measurements are subject to additive Gaussian noise:

[3]:

sigma = 0.02


def model_noisy(pars):
    sim = model(pars)
    return {'X_2': sim['X_2'] + sigma * np.random.randn(n_time)}


measured_data = model_noisy(theta_true)

# plot data
plt.plot(measurement_times, true_trajectory['X_2'], color="C0",
         label='Simulation')
plt.scatter(measurement_times, measured_data['X_2'],
            color="C1", label='Data')
plt.xlabel('Time $t$')
plt.ylabel('Measurement $Y$')
plt.title('Conversion reaction: True parameters fit')
plt.legend()
plt.show()


## Ignoring noise¶

In the notebook “Ordinary Differential Equations: Conversion Reaction”, this model is used without accounting for noise, which is, strictly speaking, not correct. In that case, we get the following result:

[4]:

def distance(simulation, data):
    return np.absolute(data["X_2"] - simulation["X_2"]).sum()


abc = pyabc.ABCSMC(model, prior, distance)
abc.new(pyabc.create_sqlite_db_id(), measured_data)
history = abc.run(max_nr_populations=10)

INFO:Sampler:Parallelizing the sampling on 4 cores.
INFO:History:Start <ABCSMC(id=128, start_time=2020-06-15 19:23:37.333605, end_time=None)>
INFO:ABC:Calibration sample before t=0.
INFO:Epsilon:initial epsilon is 0.5624782572935602
INFO:ABC:t: 0, eps: 0.5624782572935602.
INFO:ABC:Acceptance rate: 100 / 229 = 4.3668e-01, ESS=1.0000e+02.
INFO:ABC:t: 1, eps: 0.3394877969449191.
INFO:ABC:Acceptance rate: 100 / 201 = 4.9751e-01, ESS=9.9892e+01.
INFO:ABC:t: 2, eps: 0.19851989584245494.
INFO:ABC:Acceptance rate: 100 / 210 = 4.7619e-01, ESS=9.9249e+01.
INFO:ABC:t: 3, eps: 0.15919232295862715.
INFO:ABC:Acceptance rate: 100 / 200 = 5.0000e-01, ESS=9.9827e+01.
INFO:ABC:t: 4, eps: 0.14438278698726847.
INFO:ABC:Acceptance rate: 100 / 215 = 4.6512e-01, ESS=9.9683e+01.
INFO:ABC:t: 5, eps: 0.1370462087460876.
INFO:ABC:Acceptance rate: 100 / 223 = 4.4843e-01, ESS=9.9604e+01.
INFO:ABC:t: 6, eps: 0.1346182762154195.
INFO:ABC:Acceptance rate: 100 / 198 = 5.0505e-01, ESS=9.8976e+01.
INFO:ABC:t: 7, eps: 0.13449453193467115.
INFO:ABC:Acceptance rate: 100 / 239 = 4.1841e-01, ESS=9.4218e+01.
INFO:ABC:t: 8, eps: 0.13444005696155228.
INFO:ABC:Acceptance rate: 100 / 209 = 4.7847e-01, ESS=9.5063e+01.
INFO:ABC:t: 9, eps: 0.13439720882423833.
INFO:ABC:Acceptance rate: 100 / 256 = 3.9062e-01, ESS=9.8735e+01.
INFO:History:Done <ABCSMC(id=128, start_time=2020-06-15 19:23:37.333605, end_time=2020-06-15 19:23:41.795215)>


As one can see in the plot below, this converges to a point estimate as $\varepsilon\rightarrow 0$ and does not correctly represent the posterior. In particular, this point estimate will in general not capture the correct parameter value (indicated by the grey line).

[5]:

_, ax = plt.subplots()
for t in range(history.max_t + 1):
    pyabc.visualization.plot_kde_1d_highlevel(
        history, x="theta1", t=t, refval=theta_true, refval_color='grey',
        xmin=theta_min, xmax=theta_max, numx=100, ax=ax, label=f"Iteration {t}")
ax.legend()
plt.show()
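The concentration effect can also be reproduced with a minimal plain-numpy rejection ABC sketch (our own illustration, independent of pyABC; it regenerates synthetic data with a fixed seed and uses the closed-form trajectory of the conversion reaction):

```python
import numpy as np

rng = np.random.default_rng(0)

theta2 = np.exp(-2)
times = np.linspace(0, 10, 10)


def x2(theta1):
    # closed-form trajectory of the conversion reaction (x1 + x2 = 1, x2(0) = 0)
    k = theta1 + theta2
    return theta1 / k * (1 - np.exp(-k * times))


# noisy synthetic data from the true parameter
theta1_true = np.exp(-2.5)
data = x2(theta1_true) + 0.02 * rng.normal(size=times.size)

# plain rejection ABC with the deterministic (noise-free) model
n = 20_000
thetas = rng.uniform(0.05, 0.15, size=n)
dists = np.array([np.abs(x2(th) - data).sum() for th in thetas])

# tighten epsilon: keep the best 50%, 10%, 1% of samples
for q in (0.5, 0.1, 0.01):
    eps = np.quantile(dists, q)
    accepted = thetas[dists <= eps]
    print(f"eps = {eps:.3f}: {accepted.size} accepted, std = {accepted.std():.4f}")
```

The accepted-sample spread shrinks as epsilon decreases, collapsing toward the distance minimizer rather than toward the true posterior width.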


## Add noise to the model output¶

To correctly account for noise, there are essentially two possibilities. First, we can use the noisified model output:

[6]:

abc = pyabc.ABCSMC(model_noisy, prior, distance)
abc.new(pyabc.create_sqlite_db_id(), measured_data)
history_noisy_output = abc.run(max_nr_populations=10)

INFO:Sampler:Parallelizing the sampling on 4 cores.
INFO:History:Start <ABCSMC(id=129, start_time=2020-06-15 19:23:42.824635, end_time=None)>
INFO:ABC:Calibration sample before t=0.
INFO:Epsilon:initial epsilon is 0.6026361265317445
INFO:ABC:t: 0, eps: 0.6026361265317445.
INFO:ABC:Acceptance rate: 100 / 201 = 4.9751e-01, ESS=1.0000e+02.
INFO:ABC:t: 1, eps: 0.40307937702819413.
INFO:ABC:Acceptance rate: 100 / 183 = 5.4645e-01, ESS=9.9927e+01.
INFO:ABC:t: 2, eps: 0.26804878544751437.
INFO:ABC:Acceptance rate: 100 / 254 = 3.9370e-01, ESS=9.8752e+01.
INFO:ABC:t: 3, eps: 0.23005068461777956.
INFO:ABC:Acceptance rate: 100 / 269 = 3.7175e-01, ESS=8.7397e+01.
INFO:ABC:t: 4, eps: 0.19748032366319818.
INFO:ABC:Acceptance rate: 100 / 535 = 1.8692e-01, ESS=9.5180e+01.
INFO:ABC:t: 5, eps: 0.18275242228026406.
INFO:ABC:Acceptance rate: 100 / 793 = 1.2610e-01, ESS=8.6146e+01.
INFO:ABC:t: 6, eps: 0.16605305697113107.
INFO:ABC:Acceptance rate: 100 / 1622 = 6.1652e-02, ESS=7.6403e+01.
INFO:ABC:t: 7, eps: 0.15351169146934754.
INFO:ABC:Acceptance rate: 100 / 3122 = 3.2031e-02, ESS=8.4115e+01.
INFO:ABC:t: 8, eps: 0.1418635171783479.
INFO:ABC:Acceptance rate: 100 / 5402 = 1.8512e-02, ESS=8.8656e+01.
INFO:ABC:t: 9, eps: 0.12830890205714054.
INFO:ABC:Acceptance rate: 100 / 12580 = 7.9491e-03, ESS=8.4644e+01.
INFO:History:Done <ABCSMC(id=129, start_time=2020-06-15 19:23:42.824635, end_time=2020-06-15 19:23:59.654520)>

[7]:

_, ax = plt.subplots()
for t in range(history_noisy_output.max_t + 1):
    pyabc.visualization.plot_kde_1d_highlevel(
        history_noisy_output, x="theta1", t=t,
        refval=theta_true, refval_color='grey',
        xmin=theta_min, xmax=theta_max, ax=ax, numx=200, label=f"Iteration {t}")
ax.legend()
plt.show()



This distribution is much broader and, as one can show, closer to the correct posterior. Note, however, the sharply decreasing acceptance rates in the log above: matching one noisy realization against another becomes increasingly expensive as $\varepsilon$ decreases.

## Modify the acceptance step¶

Second, we can alternatively keep the non-noisy model, but adjust the acceptance step:

[8]:

acceptor = pyabc.StochasticAcceptor()
kernel = pyabc.IndependentNormalKernel(var=sigma**2)
eps = pyabc.Temperature()

abc = pyabc.ABCSMC(model, prior, kernel, eps=eps, acceptor=acceptor)
abc.new(pyabc.create_sqlite_db_id(), measured_data)
history_acceptor = abc.run(max_nr_populations=10)

INFO:Sampler:Parallelizing the sampling on 4 cores.
INFO:History:Start <ABCSMC(id=130, start_time=2020-06-15 19:24:00.496237, end_time=None)>
INFO:ABC:Calibration sample before t=0.
INFO:ABC:t: 0, eps: 18.524729283158507.
INFO:ABC:Acceptance rate: 100 / 383 = 2.6110e-01, ESS=1.0000e+02.
INFO:ABC:t: 1, eps: 2.384675437876244.
INFO:ABC:Acceptance rate: 100 / 349 = 2.8653e-01, ESS=9.8885e+01.
INFO:ABC:t: 2, eps: 1.0.
INFO:ABC:Acceptance rate: 100 / 190 = 5.2632e-01, ESS=9.6103e+01.
INFO:ABC:Stopping: minimum epsilon.
INFO:History:Done <ABCSMC(id=130, start_time=2020-06-15 19:24:00.496237, end_time=2020-06-15 19:24:04.042482)>


We use a pyabc.StochasticAcceptor in place of the default pyabc.UniformAcceptor, in order to accept whenever

$$\frac{\pi(D|y,\theta)}{c}\geq u, \quad u \sim \mathcal{U}[0,1],$$

where $\pi(D|y,\theta)$ denotes the distribution of the noisy data $D$ given non-noisy model output $y$ and parameters $\theta$, and $c$ is a normalization constant ensuring that the ratio lies in $[0,1]$. Here, we use a pyabc.IndependentNormalKernel in place of a pyabc.Distance to capture the normal noise, $\pi(D|y,\theta)=\mathcal{N}(D\mid y,\sigma^2)$. Other noise models are possible as well, including Laplace and binomial noise. In place of the pyabc.Epsilon, we employ a pyabc.Temperature, which implements schemes to decrease a temperature $T\searrow 1$, such that in iteration $t$ we sample from

$$\pi(\theta,y|D) \propto \pi(D|y,\theta)^{1/T_t}\,p(y|\theta)\,\pi(\theta),$$

where $p(y|\theta)$ denotes the model output likelihood, and $\pi(\theta)$ the parameter prior.
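To build intuition for the tempering (our own illustration, not pyABC internals): raising a normal density to the power $1/T$ and renormalizing yields a normal density with $T$-fold variance, so large temperatures flatten the acceptance kernel, and $T = 1$ recovers the exact likelihood:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

mu, sigma, T = 0.0, 0.02, 4.0
x = np.linspace(-0.3, 0.3, 3001)

# temper the density and renormalize numerically
tempered = norm.pdf(x, mu, sigma) ** (1 / T)
tempered /= trapezoid(tempered, x)

# analytically, the tempered density is again normal, with T-fold variance
widened = norm.pdf(x, mu, np.sqrt(T) * sigma)

assert np.allclose(tempered, widened, atol=1e-6)
```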

Each of acceptor, kernel, and temperature offers various configuration options; however, the defaults have proven quite stable already.
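In spirit, the acceptance step can be sketched in a few lines (a simplified illustration at temperature $T = 1$ with hypothetical helper names, not pyABC's actual implementation, which additionally handles the normalization $c$ and the temperature schedule):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.02


def gauss_kernel(data, sim):
    # unnormalized Gaussian noise likelihood pi(D | y); its maximum over y
    # is 1 (attained at y = D), so the normalizer c equals 1 here
    return np.exp(-0.5 * np.sum((data - sim) ** 2) / sigma**2)


def stochastic_accept(data, sim):
    # accept with probability pi(D | y) / c
    return rng.uniform() <= gauss_kernel(data, sim)


data = np.zeros(5)
assert stochastic_accept(data, data)  # perfect fit is always accepted
# a fit 10 sigma off at every point is accepted with probability exp(-250)
assert not stochastic_accept(data, np.full(5, 10 * sigma))
```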

[9]:

_, ax = plt.subplots()
for t in range(history_acceptor.max_t + 1):
    pyabc.visualization.plot_kde_1d_highlevel(
        history_acceptor, x="theta1", t=t,
        refval=theta_true, refval_color='grey',
        xmin=theta_min, xmax=theta_max, ax=ax, numx=200, label=f"Iteration {t}")
ax.legend()
plt.show()


We see that we obtain essentially the same posterior distribution as with the noisy model output (a bit more peaked, showing that the approximation error of the noisy-output approach was not yet negligible), however at a much lower computational cost, as the plot below shows:

[10]:

histories = [history_noisy_output, history_acceptor]
labels = ["noisy model", "stochastic acceptor"]

pyabc.visualization.plot_sample_numbers(histories, labels)
plt.show()


Thus, the stochastic acceptor is the recommended method in practice. For further details, consult the API documentation.

## Estimate noise parameters¶

Our formulation of the modified acceptance step allows the noise model to be parameter-dependent (as, in theory, does the noisy-model-output approach). Thus, one can estimate noise parameters, such as the standard deviation of Gaussian noise, on the fly. A parameter-dependent noise model is specified by passing a function to the kernel, which takes the parameters and returns an array of variances corresponding to the data. This is currently implemented for pyabc.IndependentNormalKernel, pyabc.IndependentLaplaceKernel, and pyabc.BinomialKernel.

Parameters are often estimated on a logarithmic scale when fold changes are of interest. We demonstrate this here with the standard deviation of a normal noise kernel:

[11]:

theta_true_var = {'theta1': theta1_true, 'std': np.log10(sigma)}

prior = pyabc.Distribution(
    theta1=pyabc.RV("uniform", theta_min, theta_max - theta_min),
    std=pyabc.RV("uniform", -2.5, 2))


def var(p):
    # variance on the linear scale, computed from the log10 standard deviation
    return 10**(2 * p['std']) * np.ones(n_time)


acceptor = pyabc.StochasticAcceptor()
kernel = pyabc.IndependentNormalKernel(var=var)
eps = pyabc.Temperature()

abc = pyabc.ABCSMC(
    model, prior, kernel, eps=eps, acceptor=acceptor, population_size=100)
abc.new(pyabc.create_sqlite_db_id(), measured_data)
history_acceptor_var = abc.run(max_nr_populations=10)

INFO:Sampler:Parallelizing the sampling on 4 cores.
INFO:History:Start <ABCSMC(id=131, start_time=2020-06-15 19:24:04.966413, end_time=None)>
INFO:ABC:Calibration sample before t=0.
INFO:ABC:t: 0, eps: 18.853333200943837.
INFO:ABC:Acceptance rate: 100 / 315 = 3.1746e-01, ESS=9.9992e+01.
INFO:ABC:t: 1, eps: 11.24653846776521.
INFO:ABC:Acceptance rate: 100 / 313 = 3.1949e-01, ESS=8.5619e+01.
INFO:ABC:t: 2, eps: 7.119315454596579.
INFO:ABC:Acceptance rate: 100 / 323 = 3.0960e-01, ESS=7.6864e+01.
INFO:ABC:t: 3, eps: 4.827923867251667.
INFO:ABC:Acceptance rate: 100 / 423 = 2.3641e-01, ESS=8.3261e+01.
INFO:ABC:t: 4, eps: 3.7136446447402682.
INFO:ABC:Acceptance rate: 100 / 367 = 2.7248e-01, ESS=6.6476e+01.
INFO:ABC:t: 5, eps: 2.8565397729146036.
INFO:ABC:Acceptance rate: 100 / 347 = 2.8818e-01, ESS=6.6422e+01.
INFO:ABC:t: 6, eps: 2.1972537102600755.
INFO:ABC:Acceptance rate: 100 / 377 = 2.6525e-01, ESS=7.4152e+01.
INFO:ABC:t: 7, eps: 1.6901301053216593.
INFO:ABC:Acceptance rate: 100 / 367 = 2.7248e-01, ESS=6.8683e+01.
INFO:ABC:t: 8, eps: 1.1783208490227317.
INFO:ABC:Acceptance rate: 100 / 389 = 2.5707e-01, ESS=6.4552e+01.
INFO:ABC:t: 9, eps: 1.0.
INFO:ABC:Acceptance rate: 100 / 266 = 3.7594e-01, ESS=7.6172e+01.
INFO:ABC:Stopping: minimum epsilon.
INFO:History:Done <ABCSMC(id=131, start_time=2020-06-15 19:24:04.966413, end_time=2020-06-15 19:24:15.057312)>

[12]:

fig, ax = plt.subplots(1, 2)
for t in range(history_acceptor_var.max_t + 1):
    pyabc.visualization.plot_kde_1d_highlevel(
        history_acceptor_var, x="theta1", t=t,
        refval=theta_true_var, refval_color='grey',
        xmin=theta_min, xmax=theta_max, ax=ax[0], numx=200,
        label=f"Iteration {t}")
    pyabc.visualization.plot_kde_1d_highlevel(
        history_acceptor_var, x="std", t=t,
        refval=theta_true_var, refval_color='grey',
        xmin=-2.5, xmax=-0.5, ax=ax[1], numx=200, label=f"Iteration {t}")
ax[1].set_xlabel("log10(std)")
ax[1].set_ylabel(None)
ax[1].legend()
fig.set_size_inches((8, 4))
fig.tight_layout()
plt.show()


We see that we are able to estimate both parameters quite reasonably (the exact details of course depending on the data and model). For the present model, one could still derive the analytical posterior distribution, which we omit here for computational reasons. For an example, see this notebook.
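For the simpler single-parameter setting, such an exact posterior can be evaluated on a grid in a few lines (our own illustration under the Gaussian noise model above; it uses a fixed seed and the closed-form trajectory, regenerating synthetic data rather than reusing measured_data):

```python
import numpy as np
from scipy.integrate import trapezoid

# conversion-reaction trajectory in closed form (x1 + x2 = 1, x2(0) = 0)
theta2 = np.exp(-2)
times = np.linspace(0, 10, 10)


def x2(theta1):
    k = theta1 + theta2
    return theta1 / k * (1 - np.exp(-k * times))


sigma = 0.02
rng = np.random.default_rng(2)
theta1_true = np.exp(-2.5)
data = x2(theta1_true) + sigma * rng.normal(size=times.size)

# unnormalized posterior = Gaussian likelihood * uniform prior on [0.05, 0.15]
grid = np.linspace(0.05, 0.15, 500)
loglik = np.array([
    -0.5 * np.sum((data - x2(th)) ** 2) / sigma**2 for th in grid])
post = np.exp(loglik - loglik.max())
post /= trapezoid(post, grid)

# the normalized density integrates to one; its mode lies near theta1_true
assert abs(trapezoid(post, grid) - 1) < 1e-10
```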