Statistics LibreTexts — Search
About 2 results
  • https://stats.libretexts.org/Courses/Cerritos_College/Introduction_to_Statistics_with_R/19%3A_Bayesian_Statistics/19.01%3A_Probabilistic_Reasoning_by_Rational_Agents
    To say the same thing using fancy statistical jargon, what I’ve done here is divide the joint probability of the hypothesis and the data P(d,h) by the marginal probability of the data P(d), and this is what gives us the posterior probability of the hypothesis given that we know the data have been observed.
  • https://stats.libretexts.org/Workbench/Learning_Statistics_with_SPSS_-_A_Tutorial_for_Psychology_Students_and_Other_Beginners/14%3A_Bayesian_Statistics/14.01%3A_Probabilistic_Reasoning_by_Rational_Agents
    To say the same thing using fancy statistical jargon, what I’ve done here is divide the joint probability of the hypothesis and the data P(d,h) by the marginal probability of the data P(d), and this is what gives us the posterior probability of the hypothesis given that we know the data have been observed.
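
The rule quoted in both results — dividing the joint probability P(d,h) by the marginal probability P(d) to obtain the posterior P(h|d) — can be sketched numerically. The probabilities below are hypothetical, chosen only to illustrate the arithmetic; they do not come from the source text:

```python
# Hypothetical joint probabilities P(d, h) for two competing hypotheses,
# given that the data d have been observed. Illustrative numbers only.
joint = {"h1": 0.12, "h2": 0.08}

# Marginal probability of the data: P(d) = sum over all h of P(d, h).
marginal = sum(joint.values())

# Posterior probability of each hypothesis: P(h | d) = P(d, h) / P(d).
posterior = {h: p / marginal for h, p in joint.items()}

print({h: round(p, 3) for h, p in posterior.items()})
```

Dividing by the marginal renormalizes the joint probabilities so the posteriors sum to one across all hypotheses.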