Basic Probability Concepts

Probability is a field of mathematics that deals with the likelihood of different outcomes. It is used in a wide range of disciplines, from statistics and computer science to economics and physics.

Experiment, Sample Space, and Event

  • An experiment is a procedure that yields one of a given set of possible outcomes.

  • The sample space of an experiment is the set of all possible outcomes.

  • An event is a subset of the sample space.
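These three definitions can be made concrete with a small sketch (the two-coin experiment here is just an illustrative choice):

```python
from itertools import product

# Experiment: flip two coins. The sample space is the set of all outcomes.
sample_space = set(product("HT", repeat=2))
# {('H','H'), ('H','T'), ('T','H'), ('T','T')}

# An event is any subset of the sample space, e.g. "at least one head".
at_least_one_head = {outcome for outcome in sample_space if "H" in outcome}

assert at_least_one_head.issubset(sample_space)
```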

Probability

The probability of an event is a number between 0 and 1 that expresses the likelihood of the event. The closer the probability is to 1, the more likely the event; the closer it is to 0, the less likely the event. The probabilities of all the outcomes in a sample space add up to 1.
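For a finite sample space whose outcomes are equally likely, the probability of an event is the size of the event divided by the size of the sample space. A minimal sketch, using a fair die as the assumed experiment:

```python
from fractions import Fraction

# One roll of a fair six-sided die: six equally likely outcomes.
sample_space = {1, 2, 3, 4, 5, 6}
event = {2, 4, 6}  # "roll an even number"

# P(E) = |E| / |sample space| for equally likely outcomes.
p = Fraction(len(event), len(sample_space))
print(p)  # 1/2

# The probabilities of all outcomes sum to 1.
assert sum(Fraction(1, len(sample_space)) for _ in sample_space) == 1
```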

Basic Probability Rules

  • Complementary Events: The probability of the complement of an event (the event not happening) is 1 minus the probability of the event.

  • Addition Rule: The probability of the union of two events (either event or both happening) is the sum of their individual probabilities minus the probability of their intersection (both happening).

  • Multiplication Rule: The probability of the intersection of two independent events (both happening) is the product of their individual probabilities.
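All three rules can be verified by direct enumeration. A sketch using two fair dice (chosen because an event about the first die and an event about the second die are independent):

```python
from fractions import Fraction
from itertools import product

dice = set(product(range(1, 7), repeat=2))  # all 36 outcomes of two dice

def p(event):
    """Probability under equally likely outcomes."""
    return Fraction(len(event), len(dice))

A = {o for o in dice if o[0] % 2 == 0}  # first die is even
B = {o for o in dice if o[1] >= 5}      # second die is 5 or 6

# Complementary events: P(not A) = 1 - P(A)
assert p(dice - A) == 1 - p(A)

# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
assert p(A | B) == p(A) + p(B) - p(A & B)

# Multiplication rule (A and B independent): P(A and B) = P(A) * P(B)
assert p(A & B) == p(A) * p(B)
```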

Conditional Probability

Conditional probability is the probability of an event given that another event has occurred. It is denoted P(A|B), read "the probability of A given B", and is defined as P(A|B) = P(A and B) / P(B), provided P(B) > 0.
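The definition can be checked by enumeration. A sketch with a fair die (the specific events are just an illustrative choice):

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}  # one roll of a fair die

def p(event):
    return Fraction(len(event), len(sample_space))

B = {2, 4, 6}  # "the roll is even"
A = {2}        # "the roll is a 2"

# P(A | B) = P(A and B) / P(B): knowing the roll is even
# leaves three equally likely outcomes, one of which is a 2.
p_A_given_B = p(A & B) / p(B)
print(p_A_given_B)  # 1/3
```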

Bayes' Theorem

Bayes' Theorem provides a way to update probabilities based on new information: P(A|B) = P(B|A) P(A) / P(B). It is especially useful when the conditional probability you want, P(A|B), is hard to measure directly but the reverse conditional probability, P(B|A), is known.
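A classic application is interpreting a screening test. The numbers below are hypothetical, chosen only to illustrate the calculation:

```python
from fractions import Fraction

# Hypothetical screening-test parameters (illustrative only).
p_disease = Fraction(1, 100)        # prior: P(D)
p_pos_given_d = Fraction(95, 100)   # sensitivity: P(+ | D)
p_pos_given_not = Fraction(5, 100)  # false-positive rate: P(+ | not D)

# Total probability of a positive test:
# P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
p_pos = p_pos_given_d * p_disease + p_pos_given_not * (1 - p_disease)

# Bayes' Theorem: P(D | +) = P(+ | D) P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_disease / p_pos
print(p_d_given_pos)  # 19/118, about 0.16
```

Even with a fairly accurate test, the posterior probability of disease given a positive result is only about 16%, because the disease is rare in the prior.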

Random Variables and Distributions

A random variable is a variable whose value is subject to variations due to chance. A probability distribution assigns a probability to each possible value of the random variable.

Two important types of distributions are the binomial distribution (the number of successes in a fixed number of independent Bernoulli trials) and the normal distribution (which, by the central limit theorem, approximates the sum of a large number of independent and identically distributed random variables).
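The binomial probability mass function follows directly from counting: P(X = k) = C(n, k) p^k (1 - p)^(n - k). A minimal sketch:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): k successes in n Bernoulli trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 5 heads in 10 fair coin flips: 252/1024.
print(binomial_pmf(5, 10, 0.5))  # 0.24609375

# A probability distribution assigns a probability to every value,
# so the pmf sums to 1 over k = 0..n.
assert abs(sum(binomial_pmf(k, 10, 0.5) for k in range(11)) - 1) < 1e-12
```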

Expectation and Variance

The expected value (or mean) of a random variable is the long-run average value of repetitions of the experiment it represents.

The variance of a random variable is a measure of how much the values of the random variable vary around the expected value. The standard deviation is the square root of the variance.
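Both quantities can be computed directly from a distribution. A sketch for a fair die, where E[X] = sum of x * P(X = x) and Var(X) = E[(X - E[X])^2]:

```python
from fractions import Fraction
from math import sqrt

# Fair die: each value 1..6 has probability 1/6.
values = range(1, 7)
prob = Fraction(1, 6)

# Expected value: E[X] = sum of x * P(X = x)
mean = sum(x * prob for x in values)
print(mean)  # 7/2

# Variance: Var(X) = E[(X - E[X])^2]
variance = sum((x - mean) ** 2 * prob for x in values)
print(variance)  # 35/12

# Standard deviation: square root of the variance.
std_dev = sqrt(variance)
```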