SyntheticDifferenceInDifferencesWeightFitter#
- class causalpy.pymc_models.SyntheticDifferenceInDifferencesWeightFitter[source]#
Bayesian weight fitter for Synthetic Difference-in-Differences.
Encodes both the unit-weight module and the time-weight module in a single PyMC model. Unit weights balance control units against treated units in the pre-treatment period; time weights balance pre-treatment periods against post-treatment periods for control units. Both use the softmax-over-Normal-logits parameterization with a pinned reference level.
The treatment effect is not estimated inside this model. It is computed analytically from the weight posteriors via the double-difference formula in the experiment class.
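The double-difference step can be illustrated with plain numpy. This is a minimal sketch with toy data and hypothetical point estimates of the weights (all array names are illustrative, not part of the package API): the treated unit's post-minus-weighted-pre change is compared against the same change for the weighted control units.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 control units, 6 pre-treatment and 3 post-treatment periods.
Y_co_pre = rng.normal(size=(4, 6))   # controls, pre-treatment
Y_co_post = rng.normal(size=(4, 3))  # controls, post-treatment
Y_tr_pre = rng.normal(size=6)        # treated unit, pre-treatment
Y_tr_post = rng.normal(size=3)       # treated unit, post-treatment

# Hypothetical weight estimates (e.g. posterior means); each lies on a simplex.
omega = np.full(4, 0.25)  # unit weights over controls
lam = np.full(6, 1 / 6)   # time weights over pre-treatment periods

# Double-difference: (treated post mean - time-weighted treated pre)
#                  - (unit-weighted control post - unit- and time-weighted control pre)
tau = (Y_tr_post.mean() - lam @ Y_tr_pre) - (
    omega @ Y_co_post.mean(axis=1) - omega @ (Y_co_pre @ lam)
)
```

In the cut-posterior setup, applying this formula to each posterior draw of the weights yields a full posterior for the effect without any treatment-effect likelihood entering the model.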
Defines the PyMC model:
\[\begin{split}\boldsymbol{\omega} &= \mathrm{softmax}(0, \tilde{\omega}_2, \ldots, \tilde{\omega}_{N_\text{co}}) \\ \bar{Y}_{\text{tr},t} &\sim \mathrm{Normal}(\omega_0 + \boldsymbol{\omega}^\top \mathbf{Y}_{\text{co},t},\; \sigma_\omega) \\ \boldsymbol{\lambda} &= \mathrm{softmax}(0, \tilde{\lambda}_2, \ldots, \tilde{\lambda}_{T_\text{pre}}) \\ \bar{Y}_{i,\text{post}} &\sim \mathrm{Normal}(\lambda_0 + \boldsymbol{\lambda}^\top \mathbf{Y}_{i,\text{pre}},\; \sigma_\lambda)\end{split}\]

Notes
This model implements the cut-posterior formulation of Bayesian SDiD. Modules 1 and 2 share no parameters and are conditionally independent given the data. Running them in a single MCMC call is a convenience; the important property is that no treatment-effect likelihood feeds back into the weight posteriors.
The prior scales on the logits play the role of the regularization parameter in the frequentist SDiD:
- omega_raw: default sigma=1.0 (zeta_omega=1.0). Moderate regularization, allowing weights between SC-sparse and DiD-uniform.
- lam_raw: default sigma=100.0 (zeta_lambda=0.01). Essentially flat, letting time weights concentrate on the most informative pre-treatment periods.
Methods
SyntheticDifferenceInDifferencesWeightFitter.add_coord(name)
    Register a dimension coordinate with the model.
SyntheticDifferenceInDifferencesWeightFitter.add_coords(...)
    Vectorized version of Model.add_coord.
SyntheticDifferenceInDifferencesWeightFitter.add_named_variable(var)
    Add a random graph variable to the named variables of the model.
SyntheticDifferenceInDifferencesWeightFitter.build_model(X, ...)
    Build the PyMC model with both unit-weight and time-weight modules.
SyntheticDifferenceInDifferencesWeightFitter.calculate_cumulative_impact(impact)
SyntheticDifferenceInDifferencesWeightFitter.calculate_impact(...)
    Calculate the causal impact as the difference between observed and predicted values.
SyntheticDifferenceInDifferencesWeightFitter.check_start_vals(...)
    Check that the logp is defined and finite at the starting point.
SyntheticDifferenceInDifferencesWeightFitter.compile_d2logp([...])
    Compiled log probability density hessian function.
SyntheticDifferenceInDifferencesWeightFitter.compile_dlogp([...])
    Compiled log probability density gradient function.
SyntheticDifferenceInDifferencesWeightFitter.compile_fn(outs, *)
SyntheticDifferenceInDifferencesWeightFitter.compile_logp([...])
    Compiled log probability density function.
SyntheticDifferenceInDifferencesWeightFitter.copy()
    Clone the model.
SyntheticDifferenceInDifferencesWeightFitter.create_value_var(...)
    Create a TensorVariable that will be used as the random variable's "value" in log-likelihood graphs.
SyntheticDifferenceInDifferencesWeightFitter.d2logp(...)
    Hessian of the model's log-probability w.r.t. input parameters.
SyntheticDifferenceInDifferencesWeightFitter.debug(...)
    Debug model function at point.
SyntheticDifferenceInDifferencesWeightFitter.dlogp(...)
    Gradient of the model's log-probability w.r.t. input parameters.
SyntheticDifferenceInDifferencesWeightFitter.eval_rv_shapes()
    Evaluate shapes of untransformed AND transformed free variables.
SyntheticDifferenceInDifferencesWeightFitter.fit(...)
    Draw samples from posterior, prior predictive, and posterior predictive distributions.
SyntheticDifferenceInDifferencesWeightFitter.get_context([...])
SyntheticDifferenceInDifferencesWeightFitter.initial_point([...])
    Compute the initial point of the model.
SyntheticDifferenceInDifferencesWeightFitter.logp(...)
    Elemwise log-probability of the model.
SyntheticDifferenceInDifferencesWeightFitter.logp_dlogp_function([...])
    Compile a PyTensor function that computes logp and gradient.
SyntheticDifferenceInDifferencesWeightFitter.make_obs_var(...)
    Create a TensorVariable for an observed random variable.
SyntheticDifferenceInDifferencesWeightFitter.name_for(name)
    Check if name has prefix and adds if needed.
SyntheticDifferenceInDifferencesWeightFitter.name_of(name)
    Check if name has prefix and deletes if needed.
SyntheticDifferenceInDifferencesWeightFitter.point_logps([...])
    Compute the log probability of point for all random variables in the model.
SyntheticDifferenceInDifferencesWeightFitter.predict(X)
    Predict data given input data X.
SyntheticDifferenceInDifferencesWeightFitter.print_coefficients(labels)
    Print the model coefficients with their labels.
SyntheticDifferenceInDifferencesWeightFitter.priors_from_data(X, y)
    Set default priors for unit and time weight modules.
SyntheticDifferenceInDifferencesWeightFitter.profile(outs, *)
    Compile and profile a PyTensor function which returns outs and takes values of model vars as a dict as an argument.
SyntheticDifferenceInDifferencesWeightFitter.register_data_var(data)
    Register a data variable with the model.
SyntheticDifferenceInDifferencesWeightFitter.register_rv(...)
    Register an (un)observed random variable with the model.
SyntheticDifferenceInDifferencesWeightFitter.replace_rvs_by_values(...)
    Clone and replace random variables in graphs with their value variables.
SyntheticDifferenceInDifferencesWeightFitter.score(X, y)
    Score the Bayesian \(R^2\) given inputs X and outputs y.
SyntheticDifferenceInDifferencesWeightFitter.set_data(...)
    Change the values of a data variable in the model.
SyntheticDifferenceInDifferencesWeightFitter.set_dim(...)
    Update a mutable dimension.
SyntheticDifferenceInDifferencesWeightFitter.set_initval(...)
    Set an initial value (strategy) for a random variable.
SyntheticDifferenceInDifferencesWeightFitter.shape_from_dims(dims)
SyntheticDifferenceInDifferencesWeightFitter.to_graphviz(...)
    Produce a graphviz Digraph from a PyMC model.
Attributes
basic_RVs
    List of random variables the model is defined in terms of.
continuous_value_vars
    All the continuous value variables in the model.
coords
    Coordinate values for model dimensions.
datalogp
    PyTensor scalar of log-probability of the observed variables and potential terms.
default_priors
dim_lengths
    The symbolic lengths of dimensions in the model.
discrete_value_vars
    All the discrete value variables in the model.
isroot
observedlogp
    PyTensor scalar of log-probability of the observed variables.
parent
potentiallogp
    PyTensor scalar of log-probability of the Potential terms.
prefix
root
unobserved_RVs
    List of all random variables, including deterministic ones.
unobserved_value_vars
    List of all random variables (including untransformed projections), as well as deterministics used as inputs and outputs of the model's log-likelihood graph.
value_vars
    List of unobserved random variables used as inputs to the model's log-likelihood (which excludes deterministics).
varlogp
    PyTensor scalar of log-probability of the unobserved random variables (excluding deterministic).
varlogp_nojac
    PyTensor scalar of log-probability of the unobserved random variables (excluding deterministic) without jacobian term.
- __init__(sample_kwargs=None, priors=None)#
- Parameters:
  - sample_kwargs (dict, optional): keyword arguments forwarded to pymc.sample (e.g. draws, tune, random_seed).
  - priors (dict, optional): prior specifications that override the model defaults.
- Return type:
  None
- classmethod __new__(*args, **kwargs)#