# All entries for January 2012

## January 12, 2012

### Obviousness – norm implies metric

We have a norm, which satisfies positivity, linearity (absolute homogeneity) and the triangle inequality. We have a metric, which satisfies positivity, symmetry and the triangle inequality. We wish to prove that every norm induces a metric. "Obviously", the only property we need to prove is... the triangle inequality.

At the time, I couldn't see how it was obvious that linearity implied symmetry. Now, I think it's about as obvious as the triangle inequality was - there's a step to take, even though it's a simple one.

Linearity requires that ||ax|| = |a| ||x|| for any scalar a. Symmetry requires that d(x,y) = d(y,x).

To prove it, we let d(x,y) = ||x-y|| = |-1| ||x-y|| = ||(-1)(x-y)|| = ||y-x|| = d(y,x).

## January 05, 2012

### [citation needed]; the difficulty of finding things

Typing “tower property” into Google, I find that the first result is the ever-ubiquitous Wikipedia (whose mastery of SEO means it turns up, with the occasional irrelevant article, on whatever subject you could care to name), in this case the article on Expected Value. Searching for “tower property” itself returns the article on the “law of total expectation”, which is apparently the one of its myriad names that Wikipedia has decided is most common. Looking at the other results on Google, even after adding a helpful “statistics”, I find that “tower property” doesn’t return anything else relevant. In fact, the only other place I can find it called the “tower property” is in my notes :)

For nameless results, I find my best bet is simply to type in the statement of the result itself. For example, that E[XY] = E[Y E[X|Y]] is proven at the end of this pdf document, which is likely lecture notes. If the result involves a lot of roots or powers, this approach is somewhat less applicable.

As yet, I’ve not been able to find anything on what my notes refer to as “Fisher’s theorem”. It’s a theorem named after a famous mathematician who had many theorems named after him (some jointly with others), so we’re already off to a bad start in trying to find it. The theorem reads:

Let X_1, …, X_n be independent N(μ, σ²) random variables. Define X̄ = (1/n) ∑ X_i and S² = (1/(n−1)) ∑ (X_i − X̄)². Then:

* X̄ ~ N(μ, σ²/n)

* X̄ and S² are independent.

* (n−1)S²/σ² ~ χ² with n−1 degrees of freedom

* √n (X̄ − μ)/S ~ t with n−1 degrees of freedom

It looks like it has something to do with the sample mean and variance, but I’m only taking the first module on this topic, so I can’t say what its use is.

### P4: It's All in the Presentation

Tutor was Bev Walshe.

Retrospective:

I now know more about giving presentations – while the vast majority of this was learnt /in the workshop/, I can at least say I’ve learnt to look out for things to use.

I am still not calm most of the time – the only way I deal with it for now is taking a brief pause, and when that fails to calm my racing heart I get more nervous – OH NO WHY ISN’T IT WORKING – so for now my strategy is to try to avoid it in the first place, or just ignore it.

I haven’t had a chance (or haven’t taken a chance) to present to a large group of people. My favourite format, as it stands, is one or two people asking questions as we go along – this lets me keep up the interaction and remain fairly casual.