Statistics LibreTexts

About 9 results
  • https://stats.libretexts.org/Courses/Kansas_State_University/EDCEP_917%3A_Experimental_Design_(Yang)/06%3A_Multiple_Linear_Regression/6.04%3A_Effect_Size
    Because SS Total = SS Reg + SS Error, we see that the effect size is the percent of the variance, or deviation in \(y\) from its mean value, that is explained by the equation when taken as a whole. \(R^2\) will vary between 0 and 1, with 0 indicating that none of the variation in \(y\) was explained by the equation and a value of 1 indicating that 100% of the variation in \(y\) was explained by the equation.
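    A worked form of this relationship, written out from the snippet's own sums of squares (the decomposition is the standard OLS identity, not quoted verbatim from the page):
    \[ R^2 \;=\; \frac{SS_{Reg}}{SS_{Total}} \;=\; 1 - \frac{SS_{Error}}{SS_{Total}} \]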
  • https://stats.libretexts.org/Courses/Fresno_City_College/Introduction_to_Business_Statistics_-_OER_-_Spring_2023/13%3A_Linear_Regression_and_Correlation/13.05%3A_The_Regression_Equation
    Using the OLS method we can now find the estimate of the error variance, which is the variance of the squared errors, \(e^2\). This is sometimes called the standard error of the estimate. (Grammatically this is probably best said as the estimate of the error's variance.) The formula for the estimate of the error variance is:
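    The snippet truncates before the formula itself; a standard form, sketched here with \(k\) denoting the number of estimated parameters (intercept plus slopes) rather than the page's exact notation, is
    \[ s_e^2 \;=\; \frac{\sum e_i^2}{n-k} \;=\; \frac{\sum (y_i - \hat{y}_i)^2}{n-k}, \]
    where \(n\) is the sample size.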
  • https://stats.libretexts.org/Courses/Kansas_State_University/EDCEP_917%3A_Experimental_Design_(Yang)/06%3A_Multiple_Linear_Regression/6.05%3A_Multicollinearity
    The goal of the regression technique is to tease out the independent impacts of each of a set of independent variables on some hypothesized dependent variable. In this case it is impossible to determine the variable that is the true cause of the effect on \(Y\). Furthermore, multicollinearity often results in failing to reject the null hypothesis that the \(X\) variable has no impact on \(Y\) when in fact \(X\) does have a statistically significant impact on \(Y\).
  • https://stats.libretexts.org/Bookshelves/Applied_Statistics/Business_Statistics_(OpenStax)/13%3A_Linear_Regression_and_Correlation/13.04%3A_The_Regression_Equation
    This page covers regression analysis, focusing on the estimation of variable dependence through linear models. It details the ordinary least squares (OLS) method, including residuals and the significance of error variance for hypothesis testing. Key concepts like multicollinearity, which complicates the isolation of independent variable effects, are discussed, along with the importance of the multiple correlation coefficient \(R^2\).
  • https://stats.libretexts.org/Bookshelves/Applied_Statistics/Basic_Statistics_Using_R_for_Crime_Analysis_(Choi)/01%3A_Chapters/1.10%3A_Linear_Regression
    This page provides an introduction to regression analysis, highlighting its relationship with correlation analysis. Regression is a statistical method used to understand and predict the relationship between dependent and independent variables. The chapter focuses on simple and multiple linear regression, explained through an inmate survey study assessing the impact of low self-control and age on risky lifestyles.
  • https://stats.libretexts.org/Courses/Kansas_State_University/EDCEP_917%3A_Experimental_Design_(Yang)/06%3A_Multiple_Linear_Regression/6.02%3A_Multiple_Regression_Equation
    The \(\hat{\mathrm{y}}\) is read "\(\bf y\) hat" and is the estimated value of \(\bf y\). (In Figure 13.8 \(\hat{C}\) represents the estimated value of consumption because it is on the estimated line.) It is the value of \(y\) obtained using the regression line. \(\hat{\mathrm{y}}\) is not generally equal to \(y\) from the data.
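    For reference, the fitted line that produces \(\hat{\mathrm{y}}\) in the multiple regression case has the standard form (notation assumed here, with \(k\) independent variables):
    \[ \hat{y} \;=\; b_0 + b_1 x_1 + b_2 x_2 + \cdots + b_k x_k \]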
  • https://stats.libretexts.org/Courses/Kansas_State_University/EDCEP_917%3A_Experimental_Design_(Yang)/06%3A_Multiple_Linear_Regression/6.03%3A_Regression_Coefficients
    If the error term is normally distributed and the variance of the estimates of the equation parameters, \(b_0\) and \(b_1\), is determined by the variance of the error term, it follows that the parameter estimates are also normally distributed. In the test statistic, \(b_1\) is the estimated value of the slope of the regression line, \(\beta_1\) is the hypothesized value of the slope of the regression line, which under the null hypothesis is zero, and \(S_{b_1}\) is the standard deviation of the estimate of \(b_1\).
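    The test statistic referred to is the usual t-ratio for a slope coefficient; the snippet omits the formula itself, so a standard form is sketched here:
    \[ t_c \;=\; \frac{b_1 - \beta_1}{S_{b_1}} \;=\; \frac{b_1 - 0}{S_{b_1}} \]
    It is compared against a Student's t distribution with the residual degrees of freedom.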
  • https://stats.libretexts.org/Courses/Fresno_City_College/Book%3A_Business_Statistics_Customized_(OpenStax)/13%3A_Linear_Regression_and_Correlation/13.05%3A_The_Regression_Equation
    Using the OLS method we can now find the estimate of the error variance, which is the variance of the squared errors, \(e^2\). This is sometimes called the standard error of the estimate. (Grammatically this is probably best said as the estimate of the error's variance.) The formula for the estimate of the error variance is:
  • https://stats.libretexts.org/Courses/Saint_Mary's_College_Notre_Dame/BFE_1201_Statistical_Methods_for_Finance_(Kuter)/08%3A_Linear_Regression_and_Correlation/8.05%3A_The_Regression_Equation
    Using the OLS method we can now find the estimate of the error variance, which is the variance of the squared errors, \(e^2\). This is sometimes called the standard error of the estimate. (Grammatically this is probably best said as the estimate of the error's variance.) The formula for the estimate of the error variance is:
