BayesTestR

Author

Prof. Eric A. Suess

library(rstanarm)
Loading required package: Rcpp
This is rstanarm version 2.32.2
- See https://mc-stan.org/rstanarm/articles/priors for changes to default priors!
- Default priors may change, so it's safest to specify priors, even if equivalent to the defaults.
- For execution on a local, multicore CPU with excess RAM we recommend calling
  options(mc.cores = parallel::detectCores())
library(bayestestR)
library(insight)

1 Linear Regression Example

model <- lm(Sepal.Length ~ Petal.Length, data=iris)
summary(model)

Call:
lm(formula = Sepal.Length ~ Petal.Length, data = iris)

Residuals:
     Min       1Q   Median       3Q      Max 
-1.24675 -0.29657 -0.01515  0.27676  1.00269 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept)   4.30660    0.07839   54.94   <2e-16 ***
Petal.Length  0.40892    0.01889   21.65   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.4071 on 148 degrees of freedom
Multiple R-squared:   0.76, Adjusted R-squared:  0.7583 
F-statistic: 468.6 on 1 and 148 DF,  p-value: < 2.2e-16
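For later comparison with the Bayesian credible intervals reported below, the frequentist 95% confidence intervals can be pulled from the same fit; a quick sketch using base R's confint():

confint(model)  # 95% confidence intervals for the lm() coefficients
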
library(ggplot2)  # Load the package

# The ggplot() function takes the data as its first argument, followed by the
# aesthetic mappings, such as the variables on the x and y axes.
ggplot(iris, aes(x=Petal.Length, y=Sepal.Length)) +
  geom_point() +  # This adds the points
  geom_smooth(method="lm") # This adds a regression line
`geom_smooth()` using formula = 'y ~ x'

model <- stan_glm(Sepal.Length ~ Petal.Length, data=iris)

SAMPLING FOR MODEL 'continuous' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 2.1e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.026 seconds (Warm-up)
Chain 1:                0.035 seconds (Sampling)
Chain 1:                0.061 seconds (Total)
Chain 1: 

(Chains 2–4 produce similar sampling output.)
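
The per-chain progress log above is printed by default. If it is not needed, it can be silenced by passing refresh = 0 to stan_glm(); a minimal sketch that refits the same model:

# Same model, with the sampler progress output suppressed
model <- stan_glm(Sepal.Length ~ Petal.Length, data = iris, refresh = 0)
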
describe_posterior(model)
Summary of Posterior Distribution

Parameter    | Median |       95% CI |   pd |          ROPE | % in ROPE |  Rhat |  ESS
--------------------------------------------------------------------------------------
(Intercept)  |   4.31 | [4.15, 4.46] | 100% | [-0.08, 0.08] |        0% | 1.000 | 4311
Petal.Length |   0.41 | [0.37, 0.45] | 100% | [-0.08, 0.08] |        0% | 1.000 | 4256
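Before summarising further, a quick posterior predictive check can be run with rstanarm's pp_check(); a minimal sketch using the default density overlay:

# Compare densities of replicated outcomes to the observed Sepal.Length
pp_check(model)
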
posteriors <- insight::get_parameters(model)

head(posteriors)  # Show the first 6 rows
  (Intercept) Petal.Length
1    4.237796    0.4151414
2    4.240779    0.4209020
3    4.305548    0.4183448
4    4.348451    0.4061654
5    4.342654    0.4086804
6    4.355828    0.3988900
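With the default settings (4 chains, 1,000 post-warmup iterations each), the extracted data frame should hold 4,000 draws per parameter; a quick check:

nrow(posteriors)  # expected to be 4000 with the rstanarm defaults
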
ggplot(posteriors, aes(x = Petal.Length)) +
  geom_density(fill = "orange")

mean(posteriors$Petal.Length)
[1] 0.4085576
median(posteriors$Petal.Length)
[1] 0.4079891
map_estimate(posteriors$Petal.Length)
MAP Estimate

Parameter | MAP_Estimate
------------------------
x         |         0.40
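The three point estimates can also be requested in one call; a minimal sketch using bayestestR's point_estimate():

# Median, mean, and MAP in a single table
point_estimate(posteriors$Petal.Length, centrality = "all")
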
ggplot(posteriors, aes(x = Petal.Length)) +
  geom_density(fill = "orange") +
  # The mean in blue
  geom_vline(xintercept=mean(posteriors$Petal.Length), color="blue") +
  # The median in red
  geom_vline(xintercept=median(posteriors$Petal.Length), color="red")
  # The MAP in purple
  # geom_vline(xintercept=map_estimate(posteriors$Petal.Length), color="purple")
hdi(posteriors$Petal.Length, ci=0.89)
89% HDI: [0.38, 0.44]
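For comparison with the HDI, the equal-tailed interval at the same level can be computed with eti(); a minimal sketch:

eti(posteriors$Petal.Length, ci = 0.89)  # equal-tailed 89% credible interval
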
describe_posterior(model, test = c("p_direction", "rope", "bayesfactor"))
Warning: Bayes factors might not be precise.
  For precise Bayes factors, sampling at least 40,000 posterior samples is
  recommended.
Summary of Posterior Distribution

Parameter    | Median |       95% CI |   pd |          ROPE | % in ROPE
-----------------------------------------------------------------------
(Intercept)  |   4.31 | [4.15, 4.46] | 100% | [-0.08, 0.08] |        0%
Petal.Length |   0.41 | [0.37, 0.45] | 100% | [-0.08, 0.08] |        0%

Parameter    |       BF |  Rhat |  ESS
--------------------------------------
(Intercept)  | 2.06e+92 | 1.000 | 4311
Petal.Length | 1.65e+19 | 1.000 | 4256
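The probability of direction and the ROPE coverage reported above can also be computed directly from the extracted draws; a minimal sketch using p_direction(), rope(), and rope_range():

# Probability of direction for the slope
p_direction(posteriors$Petal.Length)

# Share of the posterior inside the model's default ROPE
rope(posteriors$Petal.Length, range = rope_range(model))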

2 Correlations

result <- cor.test(iris$Sepal.Width, iris$Sepal.Length)
result

    Pearson's product-moment correlation

data:  iris$Sepal.Width and iris$Sepal.Length
t = -1.4403, df = 148, p-value = 0.1519
alternative hypothesis: true correlation is not equal to 0
95 percent confidence interval:
 -0.27269325  0.04351158
sample estimates:
       cor 
-0.1175698 
library(BayesFactor)
Loading required package: coda
Loading required package: Matrix
************
Welcome to BayesFactor 0.9.12-4.7. If you have questions, please contact Richard Morey (richarddmorey@gmail.com).

Type BFManual() to open the manual.
************
result <- correlationBF(iris$Sepal.Width, iris$Sepal.Length)
describe_posterior(result)
Summary of Posterior Distribution

Parameter | Median |        95% CI |     pd |          ROPE | % in ROPE |    BF |         Prior
-----------------------------------------------------------------------------------------------
rho       |  -0.11 | [-0.27, 0.05] | 91.55% | [-0.05, 0.05] |    20.95% | 0.509 | Beta (3 +- 3)
bayesfactor(result)
Bayes Factors for Model Comparison

    Model         BF
[2] (rho != 0) 0.509

* Against Denominator: [1] (rho = 0)
*   Bayes Factor Type: JZS (BayesFactor)
library(see)

plot(bayesfactor(result)) +
  scale_fill_pizza()

3 T-tests

library(dplyr)

Attaching package: 'dplyr'
The following objects are masked from 'package:stats':

    filter, lag
The following objects are masked from 'package:base':

    intersect, setdiff, setequal, union
library(ggplot2)

# Select only two relevant species
data <- iris %>% 
  filter(Species != "setosa") %>% 
  droplevels()

# Visualise distributions and observations
data %>% 
  ggplot(aes(x = Species, y = Sepal.Width, fill = Species)) +
  geom_violindot(fill_dots = "black", size_dots = 1) +
  scale_fill_material() +
  theme_modern()
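
A quick numeric summary of the two groups may help when reading the plot; a small sketch with dplyr:

data %>% 
  group_by(Species) %>% 
  summarise(mean_width = mean(Sepal.Width), sd_width = sd(Sepal.Width))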

result <- BayesFactor::ttestBF(formula = Sepal.Width ~ Species, data = data)
describe_posterior(result)
Summary of Posterior Distribution

Parameter  | Median |         95% CI |     pd |          ROPE | % in ROPE
-------------------------------------------------------------------------
Difference |  -0.19 | [-0.31, -0.06] | 99.78% | [-0.03, 0.03] |        0%

Parameter  |    BF |              Prior
---------------------------------------
Difference | 17.72 | Cauchy (0 +- 0.71)
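As in the correlation example, the frequentist test and a plot of the Bayes factor can be added for comparison; a minimal sketch:

# Frequentist two-sample t-test on the same data
t.test(Sepal.Width ~ Species, data = data)

# Visualise the Bayes factor with see, as in the correlation example
plot(bayesfactor(result)) +
  scale_fill_pizza()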