1.0 DEFINITIONS
Probability is a quantitative measure of the likelihood of occurrence of chance events.
The concept of pure chance is not absolutely true because all events are pre-determined. Humans use chance or probability
estimates because of their limited knowledge. What appears random or due to chance to humans has an underlying deterministic
order known only to the Creator. The consistency of probabilities and predictions is based on underlying physical laws that
govern the universe.
Probability is commonly defined as the relative
frequency of an event over repeated trials under the same conditions. Each possible outcome is called a sample point. The set
of all possible outcomes is called the probability space, S. If the probability space consists of a finite number of equally
likely events, the probability of event A is defined as: Pr (A) = n (A) / n (S), where n(A) = the number of events of
type A and n(S) = the total number in the probability space. Special mathematical techniques, called arrangements, permutations
and combinations, enable us to calculate the probability space theoretically without having to carry out the trials.
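As an illustration of this counting definition, the following Python sketch (using an assumed two-dice example that is not part of the text) enumerates a small probability space, computes Pr (A) = n (A) / n (S), and shows how combinations count selections without listing them.

    # A minimal sketch, assuming a two-dice example, of Pr(A) = n(A) / n(S)
    # for a finite probability space of equally likely outcomes.
    from itertools import product
    from math import comb

    # Probability space S: all ordered outcomes of rolling two fair dice.
    S = list(product(range(1, 7), repeat=2))      # n(S) = 36 sample points

    # Event A: the two dice sum to 7.
    A = [outcome for outcome in S if sum(outcome) == 7]
    print("Pr(A) =", len(A) / len(S))             # 6 / 36 ~ 0.167

    # Combinations count without listing every arrangement, e.g. the number
    # of ways to choose 2 items out of 5 is C(5, 2) = 10.
    print("C(5, 2) =", comb(5, 2))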
2.0 CLASSIFICATION OF PROBABILITY
Probability can be subjective (based on personal feelings or intuition) or objective (based
on real data or experience). Objective probability can be measured or computed. Prior probability
is knowable or calculable without experimentation. Posterior probability is calculable from results of experimentation.
Bayesian probability combines prior probability (objective, subjective, or
a belief) with new data (from experimentation) to reach a conclusion called posterior probability. Bayesian probability is
a good representation of how conclusions are made from empirical observation in real life. Conditional probability is employed when there is partial information or when we want to make probability computations
easier by assuming conditionality. In conditional probability, the probability of an event depends on the occurrence of a previous event.
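The sketch below, written in Python with assumed illustrative figures that are not taken from the text, shows how a prior probability is combined with new data to yield a posterior probability in the Bayesian manner.

    # A minimal sketch of Bayesian updating: posterior = prior x likelihood / evidence.
    # The prevalence, sensitivity, and false-positive rate are assumed values.
    def posterior(prior, sensitivity, false_positive_rate):
        """Pr(disease | positive test) from a prior and test characteristics."""
        p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
        return sensitivity * prior / p_positive

    # Assumed prior prevalence 10%, sensitivity 90%, false-positive rate 5%.
    print(posterior(prior=0.10, sensitivity=0.90, false_positive_rate=0.05))
    # ~ 0.67: the new data (a positive test) revises the 0.10 prior upward.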
3.0 TYPES OF EVENTS
On the scale of exclusion, events are classified as mutually exclusive or non-mutually
exclusive. Mutually exclusive events are those that cannot occur together, such as being dead and being alive. Not all mutually
exclusive events are equally likely. On the scale of independence, events are classified as independent or dependent. Under
independence, the occurrence of one event is not affected by occurrence or non-occurrence of another. Independent events can
occur at the same instant or subsequently. Some independent events are equally likely while others are not. On the scale of
exhaustion, two events A and B are said to be exhaustive if between them they occupy the entire probability space, i.e. A u B =
S and Pr (A u B) = 1.
Confusion often arises between mutually exclusive and independent events. Mutually exclusive
events cannot both occur at the same time, i.e. Pr (A n B) = 0. Mutually exclusive events cannot be independent of one another
because the occurrence of one will prevent the other one from occurring. Independent events can both occur at the same time
but the occurrence of one is not affected by the occurrence of the other i.e. Pr (A n B) = Pr (A) x Pr (B).
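The contrast can be checked numerically. The Python sketch below uses an assumed two-dice example (not from the text) to show that mutually exclusive events have Pr (A n B) = 0, while independent events satisfy Pr (A n B) = Pr (A) x Pr (B).

    # A minimal sketch, assuming two fair dice, contrasting mutually exclusive
    # and independent events.
    from itertools import product

    S = list(product(range(1, 7), repeat=2))        # 36 equally likely outcomes
    def pr(event):
        return sum(1 for s in S if event(s)) / len(S)

    # Mutually exclusive: "sum is 2" and "sum is 12" cannot occur together.
    A = lambda s: sum(s) == 2
    B = lambda s: sum(s) == 12
    print(pr(lambda s: A(s) and B(s)))              # Pr(A n B) = 0.0

    # Independent: "first die even" and "second die shows 6" can occur together,
    # and their joint probability equals the product of their probabilities.
    C = lambda s: s[0] % 2 == 0
    D = lambda s: s[1] == 6
    print(pr(lambda s: C(s) and D(s)), pr(C) * pr(D))   # both 1/12 ~ 0.083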
4.0 LAWS OF PROBABILITY AND MATHEMATICAL PROPERTIES
The total probability space is equal to 1.0, stated mathematically as Pr (S) = 1.0, and certainty has a probability of 1.
If the probability of occurrence of an event is p, the probability of its non-occurrence is q = 1 – p, so that p + q = 1.
This complement rule can also be stated as Pr (Ā) = 1 – Pr (A) or as Pr (A) + Pr (Ā) = 1. If the sample space has equally
likely outcomes, then Pr (A) = n (A) / n (S). The additive law, also called the ‘OR’ rule, refers to the occurrence
of either or both of two events and is stated as Pr (A u B) = Pr (A) + Pr (B) - Pr (A n B), where Pr (A n B) = 0 for mutually
exclusive events. The multiplicative law for independent events, also called the ‘AND’ rule, refers to the joint occurrence
of the events and is stated as Pr (A n B) = Pr (A) x Pr (B). The range of probability is 0.0 to 1.0 and cannot be negative. Pr = 0.0
means the event is impossible. Pr = 1.0 means the event is absolutely certain. The odds of an event can be defined as {Pr
(A)} / {1 – Pr (A)}.
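A minimal Python sketch of these laws, using assumed illustrative probabilities rather than figures from the text, is given below.

    # Assumed values: Pr(A) = 0.30 and Pr(B) = 0.20, with A and B independent.
    p_A, p_B = 0.30, 0.20
    p_A_and_B = p_A * p_B                  # 'AND' rule for independent events: 0.06

    print(1 - p_A)                         # complement rule, Pr(not A) = 0.70
    print(p_A + p_B - p_A_and_B)           # 'OR' rule, Pr(A u B) = 0.44
    print(p_A / (1 - p_A))                 # odds of A ~ 0.43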
5.0 USES OF PROBABILITY
Probability is used in classical statistical inference, Bayesian statistical inference, clinical decision-making, queuing
theory, and probability trees.
Computation exercises
- If the probability of success in a cardiac operation is 40%, the probability
of success in 5 consecutive such operations on 5 different patients is (a) 0.4 + 0.4 + 0.4 + 0.4 + 0.4 = 2.0 (b) 0.4 x 5 (c) 0.4 x 0.4 x 0.4 x 0.4 x 0.4
- Explain how you can compute the probability of measles in a child with both
fever and a rash when Pr (fever) and Pr (rash) are known from the database.
- If 2 independent events have probability 1/4 each, the probability of their
joint occurrence is (a) 1/4 x 1/4 (b) 1/4 + 1/4
- Compute the probability that in a family of 4, the 4th child is
a boy
- In a medical class of 15, 8 are male and 10 are from the Brunei-Muara district. Compute
the probability that a student selected at random will be (a) a male (b) from the Brunei-Muara district (c) both a male and
from the Brunei-Muara district
Practical assignments
- Toss 1 coin 20 times and compute Pr (heads) and Pr (tails). Draw a graph of Pr (heads) vs. # tosses.
- Toss 2 coins at a time. The successful event is when both coins show heads. Complete the following table
showing the probability of success for different numbers of trials, and also draw a graph (a simulation sketch follows the table).
# Trials | # Total outcomes | # Successes | Probability
1        |                  |             |
2        |                  |             |
3        |                  |             |
4        |                  |             |
5        |                  |             |
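The following Python sketch simulates the two assignments; the random seed and the number of two-coin trials (100) are assumed choices, and the estimated relative frequencies will vary from run to run.

    # A minimal simulation sketch for the practical assignments above.
    import random

    random.seed(1)                               # assumed seed, for repeatability

    # Assignment 1: toss one coin 20 times; print the running Pr(heads).
    heads = 0
    for toss in range(1, 21):
        heads += random.choice([0, 1])           # 1 = heads, 0 = tails
        print(toss, round(heads / toss, 2))      # relative frequency so far

    # Assignment 2: toss two coins; success = both heads (theoretical Pr = 0.25).
    trials = 100                                 # assumed number of trials
    successes = 0
    for _ in range(trials):
        coin1, coin2 = random.choice([0, 1]), random.choice([0, 1])
        if coin1 == 1 and coin2 == 1:
            successes += 1
    print("estimated Pr(two heads) =", successes / trials)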
Key words and terms: Computer-Assisted
Diagnosis, Decision Support Techniques; Event, complementary; Event, independent; Event, mutually exclusive; Gambler's fallacy;
Law, ‘and’ law; Law, ‘or’ law; Law, addition law; Law, multiplicative law; Probability space; Probability
trees; a priori probability; Anterior probability; Bayesian probability; Probability, classical probability; conditional probability;
empirical probability; frequentist probability; laws of probability; objective probability; posterior probability; subjective
probability; theoretical probability