George Rebane
These pages see a lot of debate between commenters of widely varying ideological stripes. One problem often encountered by readers of a post's comment thread is that one debater suddenly offers a comment that somehow does not follow what went on before – it seems to have taken a turn and headed off into left field. If the other debater attempts to follow or point out the sudden detour, the debate essentially disintegrates, leaving readers with no clear understanding of where at least one of the contenders (usually the dear departing one) stands on the original proposition.
I am posting this little piece as one that I and others can refer (and link) to in the future when we want to highlight the basis for a discussion thread's going awry. I will do this by defining the notions of semantic orthogonality and the independence of likelihoods (or probabilities) of propositions being true/correct.
Two topics are semantically orthogonal when neither has any logical bearing on the truth of the other. A more subtle example might be debating the purpose of adding a particular rule to a game, and then switching the thread to whether a specific player has historically hewed to or violated other rules of the game. The two topics are orthogonal, and without bringing to light some heretofore unknown connection, the player's historical behavior should have no bearing on the current discussion.
Next, to understand probabilistic independence we need to start with a little background.
‘What are the chances that a proposition is true?' The truth value of a proposition (‘It will rain tomorrow.') is usually understood as a likelihood (odds) or a probability (ranging from 0 to 1, or 0% to 100%). The probability of a proposition is compactly expressed as P(X), where X is the proposition or its label. Say you want to discover and express the chances of rain tomorrow, which happens to be October 12th. So X = ‘It will rain on October 12th'. If that is all you know, then you will look up the historical weather record for your location and find out how frequently it has rained on October 12th in recent decades. If that statistic is 3 times during the last 20 years, then you'd probably accept that P(X) = 3/20 = 0.15 or 15%.
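For concreteness, here is a minimal Python sketch of that relative-frequency estimate (the counts are just the illustrative numbers above):

# Base-rate estimate of P(X) from historical frequency.
# The counts are the illustrative numbers from the text.
rainy_oct12 = 3         # years in which it rained on October 12th
years_on_record = 20    # years of weather history examined

p_rain = rainy_oct12 / years_on_record
print(f"P(X) = {p_rain:.2f} ({p_rain:.0%})")    # P(X) = 0.15 (15%)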
But now you look outside at sunset and see the sky getting very cloudy. Would you still accept the 15% number? Now you're interested in incorporating an additional piece of information, the cloudy sky, into your estimate. In shorthand, what you want to find is P(X|Y), which reads ‘the probability of X being true GIVEN that Y is true'. In this case the proposition Y = ‘It is cloudy on the evening of October 11th'.
To find this conditional probability, the probability of X conditioned on Y being true, you'd need to do some more digging in the data. Suppose you discover that in October around here it has rained 64 times on days following 315 cloudy October evenings. Then you would compute that P(X|Y) = 64/315 = 0.203 = 20.3%, and conclude that cloudy evenings do have a bearing on whether it rains the following October day. In other words, those two propositions, X and Y, are somehow dependent on each other – a cloudy evening bears on the chance of rain the next day. And, of course, the strength of that dependence is first and foremost indicated by the probability jump, the difference P(X|Y) – P(X). [For the technical reader, we recognize that the reliability of this difference is also a function of the data sample sizes.]
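Continuing the sketch in Python, the conditional estimate and the probability jump follow the same pattern; 64 and 315 are again the illustrative counts above:

# Conditional estimate P(X|Y) from the cloudy-evening counts in the text.
rain_after_cloudy = 64    # rainy October days that followed a cloudy evening
cloudy_evenings = 315     # cloudy October evenings on record

p_rain_given_cloudy = rain_after_cloudy / cloudy_evenings
p_rain = 3 / 20           # the unconditional estimate from before

print(f"P(X|Y) = {p_rain_given_cloudy:.3f}")                      # 0.203
print(f"probability jump = {p_rain_given_cloudy - p_rain:.3f}")   # 0.053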
With the above understanding we can now introduce the notion of the probabilistic independence of two propositions. Two propositions A and B are deemed to be independent if P(A) = P(A|B). For example, if A = ‘It will be fair tomorrow.' and B = ‘John will have pie for dessert tonight.', then most people will readily see why that quantitative definition holds: the weather doesn't depend on John's choice of dessert. [Again, the technical reader will recognize the equality as being the Bayesian's definition of independence, as opposed to the frequentist's definition, which is P(A,B) = P(A)P(B).]
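To see both forms of the test side by side, here is a small Python sketch with hypothetical joint counts for A (‘fair tomorrow') and B (‘John has pie tonight'); the numbers are invented purely for illustration and chosen so the arithmetic comes out even:

# Independence check with hypothetical (invented) joint counts of days.
counts = {                       # (A, B) -> number of observed days
    (True, True): 120, (True, False): 280,
    (False, True): 30, (False, False): 70,
}
total = sum(counts.values())

p_a = sum(n for (a, _), n in counts.items() if a) / total    # P(A)
p_b = sum(n for (_, b), n in counts.items() if b) / total    # P(B)
p_a_and_b = counts[(True, True)] / total                     # P(A,B)
p_a_given_b = p_a_and_b / p_b                                # P(A|B)

print(f"P(A) = {p_a:.2f},  P(A|B) = {p_a_given_b:.2f}")          # 0.80 and 0.80
print(f"P(A)P(B) = {p_a * p_b:.2f},  P(A,B) = {p_a_and_b:.2f}")  # 0.24 and 0.24

With these counts P(A|B) equals P(A), and equivalently P(A,B) equals P(A)P(B); any real dependence would show up as a gap between the two sides of either equality.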
For people who find these notions difficult to swallow, it is very hard to discuss the kinds of social, economic, political, and scientific topics found in these pages. Nevertheless, I believe that a good-faith effort and a reread of the above will give almost all RR readers the necessary understanding of semantic orthogonality and probabilistic independence; at least enough of an understanding to be alerted when some commenter makes that sudden leap into left field.
Check out today's (6/3) Dilbert.
Posted by: RL Crabb | 03 June 2011 at 06:59 PM
Good point Bob, I've already pointed to Russ Steele's NCMW, which features the Dilbert cartoon here:
http://ncwatch.typepad.com/media/2011/06/this-reminds-me-of-some-of-the-discussion-on-this-blog.html
Posted by: George Rebane | 03 June 2011 at 07:06 PM
Very good George. It brings back fond memories of my school days. Great stuff if you use it. For me, I have simplified my life to "ya know, I think it is going to rain tomorrow" and leave it at that.
As it relates to your comment thread, very appropriate. Liberals love to violate these rules. It helps them quickly move away from answering the question, or simply answer a question with a question of their own moving perpendicular to the original question.
Posted by: JohnS | 21 June 2011 at 11:30 AM