## On a niblet of wisdom

So I was doing something rather dull at work: I had a handful of expense distributions (say, 25% of an expense goes to this category and 75% to that one), and I had to aggregate them somehow, so I averaged each category out, only to find that the resulting averaged buckets summed to one! In other words, the finite average of discrete distributions (over finite, comparable categories) is a distribution itself. I thought this was a little surprising, so I'll elucidate the proof here.

**Claim:** Take $n$ discrete distributions $p_1, \dots, p_n$ over $m$ comparable categories. They are distributions because $\sum_{j=1}^{m} p_i(j) = 1$ for each $i$. Then $\sum_{j=1}^{m} \frac{1}{n} \sum_{i=1}^{n} p_i(j) = 1$. In other words, the (finite) average of each of the categories yields a new (probability) distribution $q$, with $q(j) = \frac{1}{n} \sum_{i=1}^{n} p_i(j)$.

**Proof:** We know $\sum_{j=1}^{m} p_i(j) = 1$ for each $i = 1, \dots, n$. Thus, the $n$ such distributions must sum to $n$: $\sum_{i=1}^{n} \sum_{j=1}^{m} p_i(j) = n$. Absolute convergence of the sums (they are, after all, finite sums) allows us to switch the order of summation, and $\sum_{j=1}^{m} \sum_{i=1}^{n} p_i(j) = n$. Finally, a division by $n$ yields the desired result: $\sum_{j=1}^{m} \frac{1}{n} \sum_{i=1}^{n} p_i(j) = 1$. $\blacksquare$
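The discrete claim is easy to sanity-check numerically. A minimal sketch (the helper name `random_distribution` is my own, not anything from the post): generate a few random distributions over the same categories, average bucket-by-bucket, and confirm the result still sums to one.

```python
import random

def random_distribution(m, rng):
    """Generate a random discrete distribution over m categories."""
    weights = [rng.random() for _ in range(m)]
    total = sum(weights)
    return [w / total for w in weights]  # normalize so the buckets sum to 1

rng = random.Random(0)
n, m = 5, 4  # five distributions over four categories
dists = [random_distribution(m, rng) for _ in range(n)]

# Average each category across the n distributions.
avg = [sum(p[j] for p in dists) / n for j in range(m)]

# The averaged buckets again sum to one (up to float rounding).
print(sum(avg))
```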

This can be extrapolated to a finite number of continuous distributions (averaging point-wise across finitely many distributions).

**Claim:** Take $n$ continuous probability distributions $f_1, \dots, f_n$, so that $\int_{-\infty}^{\infty} f_i(x)\,dx = 1$, $i = 1, \dots, n$. Then $\int_{-\infty}^{\infty} \frac{1}{n} \sum_{i=1}^{n} f_i(x)\,dx = 1$.

**Proof:** Again, the $n$ distributions sum to $n$, and we have $\sum_{i=1}^{n} \int_{-\infty}^{\infty} f_i(x)\,dx = n$. The linearity of the integral operator allows us to exchange the sum within the argument, thus $\int_{-\infty}^{\infty} \sum_{i=1}^{n} f_i(x)\,dx = n$. Finally, dividing by $n$ yields the desired result: $\int_{-\infty}^{\infty} \frac{1}{n} \sum_{i=1}^{n} f_i(x)\,dx = 1$. $\blacksquare$
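The continuous version can be checked the same way. A sketch under my own choices (two normal densities and a hand-rolled trapezoid rule over a wide interval, since the tails beyond it are negligible): the point-wise average of the densities should integrate to roughly one.

```python
import math

def normal_pdf(mu, sigma):
    """Return the density of a Normal(mu, sigma) as a callable."""
    return lambda x: (
        math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
        / (sigma * math.sqrt(2 * math.pi))
    )

pdfs = [normal_pdf(0.0, 1.0), normal_pdf(2.0, 0.5)]
n = len(pdfs)

def g(x):
    """Point-wise average of the n densities."""
    return sum(f(x) for f in pdfs) / n

def trapezoid(f, a, b, steps=100_000):
    """Trapezoid-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    s = 0.5 * (f(a) + f(b)) + sum(f(a + k * h) for k in range(1, steps))
    return s * h

# The averaged density integrates to ~1 (tails outside [-20, 20] are tiny).
print(trapezoid(g, -20.0, 20.0))
```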

One has to wonder whether one could also average infinitely many distributions, say if there were point-wise convergent sequences of them. This is an interesting thought in my mind at present.
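For what it's worth, a countable version of the argument seems to go through if the plain average is replaced by a weighted mixture. A sketch, assuming nonnegative weights $w_i$ with $\sum_i w_i = 1$ (the weights and the appeal to Tonelli's theorem are my own additions, not part of the post):

```latex
% Countable mixture of distributions p_1, p_2, \dots over categories j:
q(j) = \sum_{i=1}^{\infty} w_i \, p_i(j), \qquad w_i \ge 0, \quad \sum_{i=1}^{\infty} w_i = 1.
% Then, since every term is nonnegative, Tonelli's theorem lets us swap the sums:
\sum_{j} q(j)
  = \sum_{j} \sum_{i=1}^{\infty} w_i \, p_i(j)
  = \sum_{i=1}^{\infty} w_i \sum_{j} p_i(j)
  = \sum_{i=1}^{\infty} w_i
  = 1.
```

The finite average above is just the special case $w_i = 1/n$.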