
Commit 18d03d4

docs
1 parent e7b865f commit 18d03d4

File tree

5 files changed: +148 / -117 lines changed


R/estimate_means.R

Lines changed: 40 additions & 33 deletions
@@ -69,7 +69,12 @@
 #' (_Chatton and Rohrer 2024_).
 #'
 #' You can set a default option for the `estimate` argument via `options()`,
-#' e.g. `options(modelbased_estimate = "average")`
+#' e.g. `options(modelbased_estimate = "average")`. When you set `estimate` to
+#' `"average"`, the average is calculated based only on data points that
+#' actually exist. This is particularly important for two or more focal
+#' predictors, because no *complete* grid of all theoretical combinations of
+#' predictor values is generated. Consequently, the output may not include all
+#' combinations of the focal predictors.
 #' @param backend Whether to use `"marginaleffects"` (default) or `"emmeans"` as
 #' a backend. Results are usually very similar. The major difference will be
 #' found for mixed models, where `backend = "marginaleffects"` will also average
@@ -146,40 +151,42 @@
146151
#' [emmeans::emtrends()] or this [website](https://marginaleffects.com/)) is
147152
#' recommended to understand the idea behind these types of procedures.
148153
#'
149-
#' - Model-based **predictions** is the basis for all that follows. Indeed,
150-
#' the first thing to understand is how models can be used to make predictions
151-
#' (see [estimate_link()]). This corresponds to the predicted response (or
152-
#' "outcome variable") given specific predictor values of the predictors (i.e.,
153-
#' given a specific data configuration). This is why the concept of [`reference
154-
#' grid()`][insight::get_datagrid()] is so important for direct predictions.
155-
#'
156-
#' - **Marginal "means"**, obtained via [estimate_means()], are an extension
157-
#' of such predictions, allowing to "average" (collapse) some of the predictors,
158-
#' to obtain the average response value at a specific predictors configuration.
159-
#' This is typically used when some of the predictors of interest are factors.
160-
#' Indeed, the parameters of the model will usually give you the intercept value
161-
#' and then the "effect" of each factor level (how different it is from the
162-
#' intercept). Marginal means can be used to directly give you the mean value of
163-
#' the response variable at all the levels of a factor. Moreover, it can also be
164-
#' used to control, or average over predictors, which is useful in the case of
165-
#' multiple predictors with or without interactions.
166-
#'
167-
#' - **Marginal contrasts**, obtained via [estimate_contrasts()], are
168-
#' themselves at extension of marginal means, in that they allow to investigate
169-
#' the difference (i.e., the contrast) between the marginal means. This is,
170-
#' again, often used to get all pairwise differences between all levels of a
171-
#' factor. It works also for continuous predictors, for instance one could also
172-
#' be interested in whether the difference at two extremes of a continuous
173-
#' predictor is significant.
154+
#' - Model-based **predictions** is the basis for all that follows. Indeed, the
155+
#' first thing to understand is how models can be used to make predictions
156+
#' (see [estimate_link()]). This corresponds to the predicted response (or
157+
#' "outcome variable") given specific predictor values of the predictors
158+
#' (i.e., given a specific data configuration). This is why the concept of
159+
#' [`reference grid()`][insight::get_datagrid()] is so important for direct
160+
#' predictions.
161+
#'
162+
#' - **Marginal "means"**, obtained via [estimate_means()], are an extension of
163+
#' such predictions, allowing to "average" (collapse) some of the predictors,
164+
#' to obtain the average response value at a specific predictors
165+
#' configuration. This is typically used when some of the predictors of
166+
#' interest are factors. Indeed, the parameters of the model will usually give
167+
#' you the intercept value and then the "effect" of each factor level (how
168+
#' different it is from the intercept). Marginal means can be used to directly
169+
#' give you the mean value of the response variable at all the levels of a
170+
#' factor. Moreover, it can also be used to control, or average over
171+
#' predictors, which is useful in the case of multiple predictors with or
172+
#' without interactions.
173+
#'
174+
#' - **Marginal contrasts**, obtained via [estimate_contrasts()], are themselves
175+
#' at extension of marginal means, in that they allow to investigate the
176+
#' difference (i.e., the contrast) between the marginal means. This is, again,
177+
#' often used to get all pairwise differences between all levels of a factor.
178+
#' It works also for continuous predictors, for instance one could also be
179+
#' interested in whether the difference at two extremes of a continuous
180+
#' predictor is significant.
174181
#'
175182
#' - Finally, **marginal effects**, obtained via [estimate_slopes()], are
176-
#' different in that their focus is not values on the response variable, but the
177-
#' model's parameters. The idea is to assess the effect of a predictor at a
178-
#' specific configuration of the other predictors. This is relevant in the case
179-
#' of interactions or non-linear relationships, when the effect of a predictor
180-
#' variable changes depending on the other predictors. Moreover, these effects
181-
#' can also be "averaged" over other predictors, to get for instance the
182-
#' "general trend" of a predictor over different factor levels.
183+
#' different in that their focus is not values on the response variable, but
184+
#' the model's parameters. The idea is to assess the effect of a predictor at
185+
#' a specific configuration of the other predictors. This is relevant in the
186+
#' case of interactions or non-linear relationships, when the effect of a
187+
#' predictor variable changes depending on the other predictors. Moreover,
188+
#' these effects can also be "averaged" over other predictors, to get for
189+
#' instance the "general trend" of a predictor over different factor levels.
183190
#'
184191
#' **Example:** Let's imagine the following model `lm(y ~ condition * x)` where
185192
#' `condition` is a factor with 3 levels A, B and C and `x` a continuous
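The four concepts described in this hunk map onto four functions. A hedged sketch using the `lm(y ~ condition * x)` example from the documentation; the simulated data below are an assumption for illustration only, and the exact output columns depend on the installed {modelbased} version.

```r
# Sketch of the four estimation flavors, assuming {modelbased} is installed;
# the simulated data are purely illustrative.
library(modelbased)

set.seed(123)
d <- data.frame(
  condition = rep(c("A", "B", "C"), each = 30),
  x = rnorm(90)
)
d$y <- ifelse(d$condition == "B", 0.5, 0) + 0.3 * d$x + rnorm(90)

model <- lm(y ~ condition * x, data = d)

# Model-based predictions on a reference grid
estimate_link(model)

# Marginal means: average predicted y at each level of `condition`
estimate_means(model, by = "condition")

# Marginal contrasts: pairwise differences between those means
estimate_contrasts(model, by = "condition")

# Marginal effects: the slope of `x` within each level of `condition`
estimate_slopes(model, trend = "x", by = "condition")
```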

man/estimate_contrasts.Rd

Lines changed: 36 additions & 28 deletions

man/estimate_means.Rd

Lines changed: 36 additions & 28 deletions

0 commit comments
