statsmodels.emplike.aft_el.AFTResults.test_beta

AFTResults.test_beta(b0_vals, param_nums, ftol=1e-05, maxiter=30, print_weights=1)

Returns the profile log likelihood for the regression parameters 'param_nums' at the values 'b0_vals'.

Parameters:

b0_vals: list

The values of the parameters to be tested

param_nums: list

Which parameters to test

maxiter: int, optional

How many iterations to use in the EM algorithm. Default is 30

ftol: float, optional

The function tolerance for the EM optimization. Default is 10**-5

print_weights: bool, optional

If True, returns the weights that maximize the profile log likelihood. Default is True
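
As a minimal usage sketch, the optional arguments can also be passed explicitly. This assumes res is an AFTResults instance obtained from a fitted emplikeAFT model, as in the Examples below; the values simply mirror the defaults shown in the signature above.

>>> # Sketch only: `res` is assumed to come from emplikeAFT(...).fit()
>>> res.test_beta([0], [0], ftol=1e-05, maxiter=30, print_weights=1)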

Returns:

test_results: tuple

The log-likelihood and p-value of the test.
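
A sketch of how the returned tuple might be consumed (again assuming res is an AFTResults instance as in the Examples below; the 0.05 threshold is an arbitrary significance level chosen for illustration):

>>> llr, pval = res.test_beta([0], [0])  # sketch: `res` as in the Examples below
>>> # llr is the log-likelihood statistic of the test, pval is its p-value
>>> significant = pval < 0.05            # 0.05 is an arbitrary cutoff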

Notes

The function will warn if the EM algorithm reaches maxiter. However, when optimizing over nuisance parameters, it is possible to reach the maximum number of inner iterations for a specific value of the nuisance parameters while the results of the function are still valid. This usually occurs when the optimization over the nuisance parameters selects parameter values that yield a log-likelihood ratio close to infinity.
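
If that warning does appear and the result looks questionable, one possible workaround (a sketch, not a prescription; the value 100 below is purely illustrative) is to re-run the test with a larger iteration budget:

>>> # Sketch: allow the EM algorithm more inner iterations before it warns
>>> res.test_beta([0], [0], maxiter=100)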

Examples

>>> import statsmodels.api as sm
>>> import numpy as np

>>> # Test that the parameter is 0 in a one-regressor, no-intercept model
>>> data = sm.datasets.heart.load()
>>> y = np.log10(data.endog)
>>> x = data.exog
>>> cens = data.censors
>>> model = sm.emplike.emplikeAFT(y, x, cens)
>>> res = model.fit()
>>> res.test_beta([0], [0])
(1.4657739632606308, 0.22601365256959183)

>>> # Test that the slope is 0 in a model with an intercept
>>> data = sm.datasets.heart.load()
>>> y = np.log10(data.endog)
>>> x = data.exog
>>> cens = data.censors
>>> model = sm.emplike.emplikeAFT(y, sm.add_constant(x), cens)
>>> res = model.fit()
>>> res.test_beta([0], [1])
(4.623487775078047, 0.031537049752572731)