Given a sigma algebra $\mathcal{S}$ over some set $X$ generated by some set $S \subseteq \mathcal{P}(X)$, why is it sufficient to check the probability conditions on $S$?

ureji1c8r1

Answered question

2022-05-13

Given a sigma algebra $\mathcal{S}$ over some set $X$ which is generated by some set $S \subseteq \mathcal{P}(X)$, and a probability function $P : \mathcal{S} \to \mathbb{R}$, why is it sufficient to check the conditions $P(A) \ge 0$, $P(X) = 1$, and $P\!\left(\bigcup_n A_n\right) = \sum_n P(A_n)$ for all $A, A_n \in S$, where $(A_n)$ is a disjoint sequence, in order to determine that $P$ is a valid probability? Intuitively it seems obvious that if the conditions hold on a generating set then they should hold for all elements of $\mathcal{S}$, but how do we rigorously prove this fact?
For example, suppose $X = \mathbb{R}$, $\mathcal{S} = \mathcal{B}$ (the Borel sigma algebra), and $S = \{ I : I \text{ is an interval in } \mathbb{R} \}$. It is easy to check that $P(I) = \frac{1}{\sqrt{2\pi}} \int_I e^{-x^2/2}\,dx$ satisfies the requisite conditions on intervals. How does it follow that the conditions are then met for $P$ on all of $\mathcal{B}$? Is it because of Carathéodory's extension theorem? If so, can anyone refer me to a proof, preferably in the context of probability measures?
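As a quick sanity check that the conditions do hold on intervals (a sketch, using the standard Gaussian integral $\int_{\mathbb{R}} e^{-x^2/2}\,dx = \sqrt{2\pi}$): nonnegativity is immediate since the integrand is positive, normalization follows from
\[
P(\mathbb{R}) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-x^2/2}\,dx = \frac{\sqrt{2\pi}}{\sqrt{2\pi}} = 1,
\]
and if $(I_n)$ is a disjoint sequence of intervals whose union $I = \bigcup_n I_n$ is again an interval, then countable additivity of the Lebesgue integral over disjoint sets gives
\[
P(I) = \frac{1}{\sqrt{2\pi}} \int_{\bigcup_n I_n} e^{-x^2/2}\,dx
     = \sum_n \frac{1}{\sqrt{2\pi}} \int_{I_n} e^{-x^2/2}\,dx
     = \sum_n P(I_n).
\]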

Answer & Explanation

pulpasqsltl

Beginner · 2022-05-14

In the example you gave, you can just use $P(A) = \frac{1}{\sqrt{2\pi}} \int_A e^{-x^2/2}\,dx$ for $A \in \mathcal{B}(\mathbb{R})$.
Maybe you are interested in the general situation in which a "measure" defined on a smaller class $S$ of sets can be extended to a measure on the sigma algebra generated by $S$. For that, see Carathéodory's extension theorem. Proofs are given in Klenke's probability theory book and Folland's real analysis book.
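To make the pointer a little more concrete, here is a sketch of the two standard results usually invoked here (see Klenke or Folland for the precise hypotheses and full proofs). Carathéodory's extension theorem: if $\mu_0$ is a pre-measure on an algebra (or semiring) $\mathcal{A} \subseteq \mathcal{P}(X)$, meaning $\mu_0(\emptyset) = 0$ and
\[
\mu_0\!\left(\bigcup_n A_n\right) = \sum_n \mu_0(A_n)
\]
whenever the $A_n \in \mathcal{A}$ are disjoint with $\bigcup_n A_n \in \mathcal{A}$, then $\mu_0$ extends to a measure on $\sigma(\mathcal{A})$, and the extension is unique when $\mu_0$ is $\sigma$-finite. The intervals form a semiring, so a semiring version of the theorem (as stated in Klenke) covers your example. Uniqueness on a generating $\pi$-system: if two probability measures on $\sigma(S)$ agree on a $\pi$-system $S$ (a class closed under finite intersections, as the intervals are), then they agree on all of $\sigma(S)$, by Dynkin's $\pi$-$\lambda$ theorem. This is what justifies checking the conditions only on the generating class.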
