Saturday, April 23, 2011

Site Move!

I've finally taken the next step and bought some internet real estate. Please direct your attention to:

http://www.carmanline.com

for all your future blogging needs. All existing posts and comments have been replicated there.

Thank you!

Subjective Objectivity

It’s the Easter weekend! This means that the work week finished one day early and many of us felt a lot of familiar ‘Friday-y’ feelings on Thursday. In my case, I have a particularly snazzy pair of pants that I only wear on Fridays. You see, they’re dry-clean only, and I don’t want them getting dirty too fast, so I allow myself to wear them only on Fridays, to celebrate the end of the work week.

On Thursday morning, I found myself naturally reaching for my Friday pants. I began to think about the nature of days. Today wasn’t Friday, it was Thursday. Yet it was also the end of the work week, which made it substantially similar to a Friday. Further, I began to wonder what the days of the week really meant and this spiralled off into thinking about subjectivity and objectivity.

In moments like this, I’m reminded of my classes on post-modernism. Jean Baudrillard might argue that not only did last Thursday feel like Friday, last Thursday was in fact Friday. After all, what does Friday really mean? For most of us, the major thing that Friday means is that it’s the last day of the work week. Therefore, this week, Thursday was Friday not merely in our minds but in fact. How can this be?

Objective subjectivity
We think of things that are definite in our lives as objective. Friday is an objective thing: everyone agrees that it’s Friday. Yet cosmologically speaking, there is nothing objective about our days of the week. There is no fact of nature or anything external to society on which the days are based. Even having seven days is arbitrary; it just means that we can (almost) fit four weeks into a lunar cycle. Yet three weeks of ten days would be even closer.
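For the curious, the arithmetic bears this out, taking the average synodic month (one full cycle of lunar phases) as roughly 29.53 days:

```latex
% Four seven-day weeks versus three ten-day weeks against the lunar cycle:
|4 \times 7 - 29.53| = |28 - 29.53| = 1.53 \text{ days}
|3 \times 10 - 29.53| = |30 - 29.53| = 0.47 \text{ days}
```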

Even the year is arbitrary. It makes some sense in that it’s the time it takes for things to get warm, cold and warm again. Yet this means little to anyone living at the equator, and the point at which our year begins could be anywhere on the calendar. Indeed, in other cultures, New Year falls at entirely different points of the Earth’s transit around the sun.

On the larger scale, there’s the disagreement over when the Millennium celebrations should have been. Most of us celebrated on New Year’s Eve 1999. As many rightly pointed out, mathematically speaking, the Millennium should have been celebrated a year later, as there was no year zero. My argument then and now was this: what is more important? Either we celebrate 2000 years since an arbitrarily chosen date that is meaningless to most people in the world and, even for those to whom it does have meaning, corresponds not to any actual historical event but to early estimations of the birth of someone who may or may not have been divine... or we celebrate the fact that all four digits of the year are changing at once. Can we really argue that either is more important, or less arbitrary?

So strictly speaking, there is absolutely nothing about our calendar, except possibly the length of the year, which is objective; yet we see it as fundamentally objective. Why? Because it’s shared by wider society. The European calendar is fundamental to organising our lives. It may be arbitrary, but it’s useful – if we say ‘I’ll see you on Monday’, everyone who uses the same calendar (which is most of the people we’re likely to be talking to) knows what we mean. That power, that utility, is what makes it seem objective.

Morality as a calendar
I suspect that the same holds true for other ‘objective’ elements. Take morality: is murder wrong because it is inherently, objectively wrong; or is it wrong because society decides it is wrong? Certainly, every human society since the dawn of history has had rules against murder (though many have varied in their definition of murder, or in the exceptions that rendered killing acceptable). Does that mean it’s objective, or simply that rules against murder are useful for holding societies together? After all, if murder is acceptable, contract law breaks down. How can I get you to build a house for me if you can’t trust that I won’t kill you at the end instead of paying you? A society that condones killing in all cases is no society at all.

As with most things... it isn’t that simple. Hard cases may make bad law, as lawyers remind us, but they’re also very useful for ironing out logical flaws. Take the classic case of the Hindu practice of Sati. This was the geographically widespread but infrequent practice of wives burning themselves alive after their husband’s death in order to remit his sins and ensure his passage into paradise. Defenders of the practice pointed out that it was voluntary; yet as in today’s controversy over the burqa, opponents say that this ‘choice’ was often illusory, governed as it was by societal values and by the rejection that awaited a woman who chose ‘poorly’.

A purely subjective view of morality cannot possibly condemn this, and supporters of objective morality hold up cases like this as an example of the failure of alternative systems of morality. The British colonials solved this via the time-honoured method of declaring their morals superior by virtue of possessing more guns than anyone else involved in the debate.

Is there a solution to this? YouTube user SisyphusRedeemed (whose channel I would heartily recommend) points out that in the philosophical literature, there is a lot of muddying of the waters between the very ideas of ‘objective’ and ‘subjective’, particularly in the area of morality. I have not read this literature and do not intend to offer my insights as either original or conclusive, merely as interesting and personal. My attempt to add to the debate is this:

I reject the notion that a morality based on societal norms is not useful. As we can see from the calendar, societal norms lead to a certain level of objectivity in what is otherwise subjective. The subjectivity of the calendar does not lead to anarchy and disagreements over what day it is (unless you’re dealing in real time with people in other time zones). There is no reason to believe that the lack of a purely objective basis for morality would lead to people doing whatever they want.

Not to avoid the question
Nevertheless, the problem of Sati remains. Can we criticise the British for imposing their morals on another culture, or should we say they did the right thing? As a student of anthropology, the issue of dealing with and judging other cultures is a familiar one. Early in the 20th Century, anthropology as a discipline turned away from judgment into complete cultural relativism. These days, there’s a trend away from that towards a limited relativism: that is, withholding judgment of a culture until one understands what it means to those inside it. We stay away from uninformed, kneejerk responses and judge from a position of wisdom instead of cultural and moral superiority.

I hope that, in doing so, I have provided a pathway by which a morality that is not completely objective (yet also not entirely subjective) can offer real, practical answers to the usual objection. Perhaps I will revise my thoughts once I delve further into the literature and immerse myself in the debate. For now, this suffices for my purposes.

Thursday, April 21, 2011

What do we really know?

The topic of knowledge is one often relegated to the more esoteric branches of philosophy. It does, however, become of profound practical importance when weighing up different fields of knowledge. Take the obvious conflict between science and certain types of religion (namely, literalist/fundamentalist churches that reject science when it disagrees with their texts). A common ploy by apologists is to say that science can’t be trusted because it always changes; even scientists admit they don’t really know that what they say is true! The believer, on the other hand, knows what the truth is, because it’s written in their book.

This problem doesn’t only affect religious discussion, but also discourses in popular science. When Phil Jones of the University of East Anglia’s Climatic Research Unit (CRU) said that the warming trend was positive but not statistically significant at the 95% confidence level when looking only at the data from the last 15 years, this was widely reported by the Daily Mail as ‘CRU head says no warming since 1995’. Not only was this profoundly dishonest journalism in that it twisted his words, it betrayed a poor knowledge of what scientific certainty means.
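To illustrate what that kind of statement actually means, here is a minimal sketch of my own – not the CRU’s analysis; the data are synthetic and the trend and noise figures are assumptions – showing how a genuine warming trend over a short window can fail a 95% significance test:

```python
# Hedged illustration: a real positive trend over a short, noisy window
# often fails a 95% significance test. Synthetic data; figures assumed.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(42)
years = np.arange(1995, 2010)                    # a 15-year window
true_trend = 0.012                               # assumed trend, C per year
noise = rng.normal(0.0, 0.1, size=years.size)    # assumed interannual noise
anomalies = true_trend * (years - years[0]) + noise

fit = linregress(years, anomalies)
print(f"fitted trend: {fit.slope:+.4f} C/yr, p-value: {fit.pvalue:.3f}")
# Depending on the noise draw, the p-value frequently exceeds 0.05: the trend
# is 'not significant at the 95% level' even though the warming is really
# there. That is a statement about statistical power, not about no warming.
```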

Science, after all, doesn’t really claim to ‘know’ anything at all. What does this mean? What do we really know?


Armchair philosophy time
Like many discussions, the question of what we ‘know’ depends very heavily on the definition of terms. In this case, the most important one is the word ‘know’. What do we really mean by this? The question isn’t quite so simple as it seems; we use the word to cover quite a wide range of potential certainty values, right up to 100% certainty.

Even then, as human beings we are very poor at dealing with concepts like 100% certainty. We can feel 100% sure about something even if we recognise that there are plausible ways that we could be wrong. Our thought processes (and importantly, our use of language) project an inaudible and invisible ‘reasonably’ into many sentences; we are ‘reasonably’ sure, ‘reasonably’ certain.

My claim here is that if we define ‘knowledge’ as things we are 100% certain about, then there is no way at all to claim that we ‘know’ anything. This is where certain philosophical thought experiments and science-fiction concepts come from. How can we be sure that we aren’t wired into the Matrix? We can’t. Every single piece of data we collect on the natural world might simply be computer-generated, with no way of telling the difference.


Self-Identity
Surely we can be sure of ourselves. Cogito Ergo Sum – I think, therefore I am. Even asking the question of our own existence means that we exist. To this end, I’d like to ask you to imagine a far future science-fiction world. In this world, there are computers capable of generating extraordinarily complex patterns and transmitting them into the minds of citizens. These citizens experience something tremendous: all of a sudden they have memories of other lives projected into their heads. These are not the memories of real people; they are generated by the computers. The process lasts only seconds, but the memories stay around for some time.

These memories feel very real to the person who experiences them, even though they are false. They are profoundly affecting, but not ‘real’ in any conceivable sense. Crucially, the character present in the memories, like any truly memorable cinematic hero, believes that he or she genuinely inhabits the world that they perceive. They are not separate to it, they are part of it. They are not aware that they are merely created by a computer and have no real existence.

Now imagine that you are in fact not a person, but one of these memories. You exist in a single moment of time; your entire existence is simply your collection of memories. Is there any way to argue that someone like this is, in fact, a person? If the construct existed in time and could learn new things, then perhaps. But being a pattern simply implanted into someone’s head, memories already formed... such a ‘person’ is more analogous to a cinematic character than a true human being.

How could you tell the difference? You may protest that we’re present in the now, that we perceive time, so we are forming new memories. True; but think back to five minutes ago. Then, you were convinced you were in the present, and now that’s just a memory. What if all you have is memories? You would not be able to tell the difference, because you exist in a single point of time.

I think this thought experiment does have issues and could certainly do with some work; nevertheless, I think that there is a path we can see towards being unsure of the nature of our being. While cogito ergo sum proves that we exist... it does not prove what we are. Fundamentally, it remains incapable of proving that we are people and not simple procedurally-generated computer images.


Uncertainty
It’s my position that there is no fact beyond our mere existence that we can be 100% sure of. We exist in a world not of certainty, but of doubt. However, our minds have created methods – psychological, social and linguistic – of dealing with doubt. Hence, we talk about (and even think about) certainty as being total when it quite clearly is not. We don’t even have language that can properly communicate the difference between ‘certain enough to satisfy any reasonable doubt’ and ‘beyond any possible doubt, no matter how unlikely’. The latter is so alien to us that we have never created a word for it.

Science is one of the few disciplines (along with certain branches of philosophy) that attempts this task. Using proper scientific language, we can never really claim to ‘know’ something. After all, what does an experiment really prove? Even discounting dishonesty, all that is really proven is that a particular experiment was conducted and a particular result occurred. If a hundred experiments all turned out the same way, the evidence is certainly growing... but it can by definition never reach 100%.
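One hedged way to make this precise, in a Bayesian framing of my own choosing (and assuming the experiments are independent given the hypothesis): each confirming result multiplies the odds in favour of a hypothesis, but for any finite number of experiments the probability stays short of 1.

```latex
% Posterior odds for hypothesis H after n confirming experiments E_1..E_n
% (my framing, not the author's; assumes the E_i are independent given H):
\frac{P(H \mid E_1,\dots,E_n)}{P(\lnot H \mid E_1,\dots,E_n)}
  = \frac{P(H)}{P(\lnot H)} \prod_{i=1}^{n} \frac{P(E_i \mid H)}{P(E_i \mid \lnot H)}
% With P(H) < 1 and finite likelihood ratios, the posterior probability
% approaches 1 as n grows but never reaches it.
```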

This is where the evangelical preachers such as William Lane Craig come in. They understand that science is never 100% certain and take that as a weakness. Look at all the arguments about evolution; that just shows they aren’t really sure. Christopher Monckton uses similar arguments against climate change: just look at the debates! Any consensus is a myth.

It’s my suggestion that what they’re doing here (deliberately or not) is jumping between different definitions of ‘certainty’, ‘knowledge’ and similar words. They use the scientific sense (which is precise and means complete lack of any possible alternative, no matter how bizarre) when dismissing scientific findings they don’t like, and the colloquial sense (which has necessary levels of uncertainty built in) when discussing the things that they do like. Monckton can dismiss claims for global warming because they aren’t scientifically certain, but accept claims against it when they are only colloquially ‘certain’... even if they’re less likely to be true than the ones he dismisses.

(Particularly disingenuous is the habit of apologists such as Craig of taking any admission by scientists that there might be a god as a major concession. The corollary of never being 100% certain that something is true is of course that you can never be 100% certain that anything is not true. Of course science cannot prove the non-existence of God; a truly omnipotent deity could make experiments return whatever results it desired. Science doesn’t squeeze out religion; it squeezes out the need for religion.)


Belief
Regardless of what we ‘know’, we can always say that we ‘believe’. This is another word that has a variety of meanings, having long ago been co-opted by religion. Again, unscrupulous argumentation will take advantage of that by shifting between meanings and contexts. Craig, for instance, will happily argue that science is a religion because scientists ‘believe’ in a rational universe, which he holds to be equivalent to his own ‘belief’ in God.

We all have a set of beliefs – that is, things we hold to be true simply because it’s easier than doubting them all the time. Among these beliefs is the notion that we truly exist; that we are real, thinking human beings inhabiting a physical space that closely resembles our perception of our surroundings. It’s just easier that way – after all, it’s what appears to be real, and we don’t gain anything by behaving as if this is not the case.

Science is, of course, founded on a belief that experimentation and human reason are capable of explaining the natural world. This is not a belief pulled from the ether; it is one backed up with a great deal of evidence and an impressive track record. To compare this to religious faith (another word with multiple meanings) is false equivalence of the first order.

All in all, saying we ‘believe’ something is, while correct, fraught with perils of miscommunication.


Knowledge
So what can we really say we know? It depends on the sense in which we use the word. If we’re using technical language in a scientific paper, we need to treat the word warily, using it as sparingly as possible. In general conversation, conversely, we can say we ‘know’ anything that we’re reasonably sure of. Depending on the person in question, that might be a low bar or a high one – it all depends how good their critical thinking is. Fundamentally, it’s an emotional feeling, one of confidence, that we gain by virtue of the fact that the object of our ‘knowledge’ agrees with (coheres with) our existing set of beliefs.

I feel quite confident saying that I ‘know’ that the universe can exist without God, that the world is warming, that I am truly a person who exists in time, that all animal species evolved from a common ancestor, or that when I drop a pebble it will fall to the earth and not simply float there. Strictly speaking I don’t ‘know’ those things; I’m reasonably confident and I believe them. Such language is important when speaking technically, and when I’m trying to explain a complex point I’ll usually try to stick to the technical terms.

I have nothing but contempt, however, for those who take advantage of the precision of technical language to undermine an argument; the William Lane Craigs of this world who will point to the uncertainty of science and say that it ‘isn’t really sure of anything after all’. This offends me particularly because uncertainty is not a weakness of science at all; it is science’s greatest strength. Socrates claimed that ‘wisest is he who knows that he knows nothing’ – hidden in that paradox is the idea that someone who believes they know everything will never learn anything. Only those who admit that they don’t know will stand a chance of ever knowing anything at all.

So in conclusion, I suppose I feel sorry for Craig as much as I’m annoyed by him.

Monday, March 28, 2011

Unbelief

Some Christians say that lack of belief in God is punished by Hell because He sent His Word to earth and it is clear; any concrete proof like modern-day miracles would undermine free will – it would force us to worship Him instead of allowing us to come to Him on our own. I object to this on several fronts. For one, the Apostle Thomas doubted Jesus’ resurrection until he had seen it for himself. While he was rebuked, it was only gently; yet in the same book, we learn that those who do not believe, despite having seen no miracles, will be punished.

No, my main problem is that we have lots of different competing Words written down that still survive to this day, to say nothing of those that have fallen by the wayside. Even if we do believe in the Christian God, there are a multitude of different takes on that particular belief. Do we believe in salvation by works or by faith, or both? Do we keep Sunday holy and do no work, or do we just go to church then? Do we pray in solitude, or in groups? Etc etc etc.

How many of these are correct? They can't all be, because they're contradictory. The world was created by God or sprang forth from Brahman, or something else; they can't all be right without them ALL being wrong in significant ways. So if the majority of religious Words are wrong, then why can't they all be wrong? If they're all flawed, how much of any is correct?

So even if Christianity is right, and their religion (the one they were presumably born into) happens, against all odds, to be correct... God sent down His Word, yet failed to make it stand out significantly among the many existing in the world. Logically, rationally, we have no more reason to believe in Yahweh or the divinity of Christ than we do in Vishnu, Allah or the Buddha. You find the Christian scriptures particularly inspiring; I don't. I see nothing in them that could not have been written by ordinary, non-divinely-inspired mortals of the time they were written. Some of the teachings I like, some I find wretched and abominable... and even those I like are not unique to those writings. The Golden Rule, for instance, existed in the form it takes in Christianity for hundreds of years prior to the arrival of Christ.

So if I die, and I ascend, and I find Yahweh standing there with Christ at his right side and they ask me why I didn't believe, I will simply say: "Lord, I can see now that I was mistaken. But I am a rational, thinking man, as You made me. I never felt Your presence in any way that could not have been explained without invoking a force that I could neither see nor hear nor touch. You hid Yourself away so effectively that I had no logical reason to believe that You existed. However, despite not believing in any eternal reward, I acted as a good person. I loved my neighbour, I forgave my enemies, I turned the other cheek. I did this not in hope of eternal life but simply because it was good to do so. If You choose to send me to Hell, then that is Your decision, but I would feel that it would be an injustice. I am as You made me, and I feel You would have been worse served by a believer who did all of the above merely to gain reward than You were in life by me."

And any God who wants to say no to that is no God I want anything to do with.

Saturday, February 12, 2011

Religious Intolerance

Why do we, as a society, put up with discrimination as long as it comes from religion? It seems that bad behaviour gets a free pass as long as it’s in line with ancient traditions and given a veneer of respectability by being tied to ritual.

There’s a perfect example in the paper today, dealing with New South Wales’ anti-discrimination laws, which provide an exception for religious schools. These schools are permitted to expel homosexual students. They can also fire or block the advancement of homosexual teachers, or those who are divorced or become single parents.

Commenting on the law, the NSW Attorney-General said that a balance must be struck “between protecting individuals from unlawful discrimination while allowing people to practise their own beliefs.”

Sounds good. Now, how far does this go? What religious beliefs wouldn’t be tolerated? Would we let religious schools discriminate on the basis of race? What about gender? If not, why not? On what basis would we permit discrimination on the grounds of sexuality (or marital status) but not race and gender? It would please me greatly if the Attorney-General of New South Wales would clarify this point.

Aside from making a joke of sexuality as a protected category, this clause throws the permissibility of religious intolerance into sharp relief. I think it has something to do with the notion that churches, as many religious advocates will tell you, are where flawed humans learn about morals. This gives any church activity a sheen of morality and a sense of decency. Thus, they don’t attract as much criticism: if a church takes an action out of its sense of morality, then it’s permissible. It raises attention only when the church goes against its moral sense, or performs actions that are well outside of the community’s morality.

The greatest example of this in recent memory is the Catholic Church’s treatment of abuse cases. Not only was the Church hypocritical in its handling of the cases, it offended community sensibilities. The community found itself in a position of being morally superior to the supposed source of morality and, seeing the feet of clay, reacted harshly.

If this is the case, it says one sad but obvious thing about the NSW law: discriminating against homosexuals is not far enough outside of community sensibilities that it raises ire. At least, it isn’t when churches do it. Apparently, enough people look at the situation and shrug... or even approve. After all, how is ‘churches discriminate against homosexuals’ new? Sadly, it isn’t; while gay-friendly churches do exist, the best homosexuals can expect from most of them is to be ignored. It is a sad moment, however, when such discrimination on the public purse is met with such apathy... or even outright defence of the action. Remember that private and religious schools receive partial government funding in Australia, funding that appears to come with few strings.

In the United States, a country dominated by religious rhetoric, not even the Boy Scouts were exempt from anti-discrimination laws when they received government funding. If churches are to be free to follow their conscience, then there’s no reason that taxpayers have to fund it. If they want to make a real moral choice, then they can choose between discrimination and government funding.

I think that, in truth, most of society has shed the idea that churches are more moral than other institutions. While the general sense lingers on after the concept has fled, there have been enough scandals that the aura of sanctity has faded from all churches, not just the Catholic. I believe we're ready to move against this sort of thing, and it's an important fight. Religion is the last great excuse for prejudice, and homosexuals (and single mothers) are among the last ‘legitimate’ targets for it. The more this is talked about, the harder it'll be for them to keep it up.

Monday, January 24, 2011

"Great Men" and Science

Those who’ve read me before know that I’m a big supporter of the scientific method, the scientific community and the results of science. That’s not unqualified support, of course... the success of any construct composed of people depends, in the end, on the people involved.

The key for me is that science should not (and generally does not) designate Great Men (apologies for the gendered language, but I feel using an outdated term makes my point better). Let me illustrate why this is important. One common tactic Creationists use to assault Evolution (as if disproving Evolution could in any way validate Creationism) is to attack Darwin himself. This is usually done by attacking the man, pointing out that he was religious, or claiming that he recanted on his deathbed.

The fact that he was religious is no secret. Many evolutionary biologists are religious (by some counts, most of them) and Darwin himself saw no incompatibility between his theory and religion itself... only between his theory and a literal interpretation of Genesis. The personal attacks do not reflect on his theory, while the story of his deathbed recantation has been repeatedly debunked.

However, the counterargument is unnecessary. The disconnect is this: that even if his religious life was a factor in his research, even if he were a terrible person, even if he did recant... it would not affect his theory. In religion, people believe what powerful people say because they are Great Men. Jesus is trusted because he’s the Son of God, Mohammed because he is Allah’s prophet, Paul because he was supposedly revealing divine wisdom. Their teachings are taken wholesale and treated as truth because of their source. Apologists are fond of arguing that their texts are more reliable than science because textbooks change while scripture is eternal.

But this is not how Science works. Darwin, Einstein, Hawking all had to prove their theories before they were accepted. Evolution through natural selection was in fact not fully accepted until well into the 20th century, when it was merged with its principal competitor – Mendelian genetics – in the modern synthesis, a merger later cemented by the discovery and analysis of DNA. In science, we do not believe people because they are Great. We believe they are Great because their discoveries were tested, proven and withstood the test of time. Had Darwin recanted on his deathbed, it would not have changed the fact that his discovery was right; truth is independent of the researcher.

At least, that’s how it’s supposed to work; an ideal case. Unfortunately, it often isn’t so simple in the real world, where professional reputations can intimidate others out of testing hypotheses. This seems to be particularly the case in medical research, where evidence is emerging of peer review being used not to improve knowledge but to quash competition. Medical knowledge has, for some reason, always proven resistant to scientific inquiry and reverent of the Great Men. The Greek physician Galen made many advances in his own day and perhaps deserved his reputation; but it was only in the 17th century that his theory of uni-directional blood flow was overturned. Up until the 20th century, medical schools often preferred to teach Galen directly rather than later, more reliable texts, keeping Galen part of medicine’s present instead of its history. To medicine, too, the unchanging nature of the old texts had value.

I feel it’s important to point out the failings of science, just as it is to point out its successes, and to add my voice to the concerns about the Cult of the Great Man prevailing over Evidence. Good science is skeptical; good science challenges what we know and recognises that our knowledge is imperfect. It searches for perfection, knowing it will never reach it. There must always be researchers who challenge the status quo, because a strong theory will of necessity stand up to any scientific scrutiny. If it cannot, it must be replaced. Great Men are no substitute for evidence and scientific rigour.

Thursday, December 2, 2010

The case for civil rights

If there’s one mistake that gay marriage advocates have made, it’s to separate the issue from the larger narrative and debate it independently. Perhaps they’ve simply allowed the opponents to do this and gone along with it. Regardless of the reason, I propose we stop talking about ‘gay marriage’ as a separate issue and instead site it squarely within the larger scope of civil rights where it belongs.

One of the civil rights that has been held as fundamental by every generation in recorded history is marriage. In modern times, this has come to be read as the right to marry the person of your choosing. Historically, marriage has been between men and women. Times, however, change.

Christopher Pearson (quoted by Ben-Peter Terpstra) notes that many proponents of this civil right fail to give very good reasons why they support it. The reason for this is simple: we live in a liberal democracy. In liberal democracies, actions are legal unless there is a compelling reason to make them illegal, not the other way around. The default position is permissiveness; there must be compelling reasons to bar civil rights.

No more argument need be mounted in favour, only counter-arguments dismissed. The principle for this one area of civil rights is the same as for any other in the field: that there is a right (in this case, the right to marry the person of one’s choosing) that is permitted to the majority but denied to a minority, and that this is a harm that must be addressed. Do we allow the majority to decide what is right for the minority? Democracy, as the saying often attributed to Benjamin Franklin goes, "must be more than two wolves and a sheep deciding what to have for dinner."

The American founders referred to this as ‘tyranny of the majority’; the great danger inherent in democracy. The rhetoric that speaks out most strongly against such tyranny is this: that the civil rights that are in most need of defence are those that are unpopular. Those possessed by minorities. Popular views will always be heard; unpopular views require support. The rights of majorities will be defended at the ballot box every time; the rights of minorities will be crushed unless the majority sees its own tyranny for what it is. Therefore, a ‘right’ is possessed only if it is defended for all. Not just the popular, not just the majority, but by all. If it is not possessed by all, then it is not a right.

While we do not have Constitutional rights in the same vein as the United States, we do talk about rights extensively. There is a common belief that none are enshrined in law; this is false. We have Common Law and customary rights; these differ from Constitutional rights in that they can be overridden by parliament. We cannot mount a Constitutional challenge in favour of gay marriage as they have in several American States.

This does not, however, mean that these rights are absent. The law offends these rights, and our remedy lies not in the courts but in fora such as these. To speak our mind, to persuade, cajole and exhort.

We do this because we understand a simple truth: that everyone has civil rights; or no-one does. All we have in Australia are those things that we are currently permitted to do by society. They are not rights; just things that not enough people object to right now.

Of course the opposition will rise against us. They mount arguments that we’ve all heard before: that marriage is between a man and a woman, that marriage is fundamental to our society. In fact, we have heard those arguments before: decades ago, when interracial marriage was being discussed. To people of that generation, these same arguments were trotted out almost verbatim. If we were to re-write Ben-Peter’s words and substitute ethnicity for sexual preference, almost none of the content would need to change. “Not all black people want to marry whites! Why should we change the definition of marriage to satisfy a small minority?”

It is pointed out that some homosexuals have no wish to get married, or don’t support the proposition. So what? I have no doubt there were non-whites who were opposed to letting any of their own people marry outside their ethnicity, too.

In Australia, we never banned such marriages in law. We didn’t have to; societal pressure was enough. We did, however, neglect to count our Aboriginal population in the census, barring them from taking part in the democratic process. When the referendum to correct that came to a vote in 1967, every single State approved it by over 80%. We came together as a nation and declared that this violation of civil rights, even though it did not affect the majority, was too much to bear.

This may not seem as fundamental as the disenfranchisement of an ethnicity, and to many reading this, it won’t be. Those who seek the right to marry the person of their choosing, however, may disagree. We all have rights, or no-one does. Mount your arguments against gay marriage, therefore; but leave out the claims that you ‘just don’t believe it’s right’. Whether you would like it, or even approve of it, is irrelevant. Show harm, or you have no basis on which to deny a civil right to anyone.

This article was published on Online Opinion.

Thursday, November 11, 2010

Autotune the whatever

I don’t like autotune. This much I’ve known for a long time. I agree with all the conventional arguments: that at higher settings it makes singers sound robotic, and that even at lower settings it makes them sound just a little fake.

However, I’d object to it even if it could be done perfectly. And indeed, it can be done with minimal interference – we just don’t notice when it is, unless we have a very good ear. Some vocalists have made it part of their style. I can sometimes even like that. My main objection to it is the very reason it was invented: that it smooths out flaws in a singer’s voice. That is what I feel is its main threat. As autotune and similar technologies are attacked for how fake they make singers sound, they’re developed to become better and better – to do precisely what was intended: make singers sound perfect.

Personally, I love listening to a great voice. I also like watching great runners, brilliant actors, or a well-executed dance number, regardless of the genre. I watch team sports not because I barrack for a team (I detest sports culture), but because watching teams act in unison is magical. I like being witness to the human ability to do great things; just like steroids ruin sport for me, auto-tune ruins my enjoyment of song.

Perhaps I should compare this to painting. Anyone who’s been to a gallery and seen oil paintings up close has seen the brush-work on the canvas. This can be subtle, achieved with fine brushes, or it can be extreme, applied with a knife rather than a brush, creating a textured surface. At a distance, the ridges and brushstrokes blur together, making a more unified picture; up close, the details stand out. Compare that to seeing only a print of the painting; a whole dimension is missing.

An auto-tune for painting would be designed to make people paint like prints. The richness and messiness of the brush-work would be lost; paintings produced this way would have no texture. They would be beautiful, but distant, lacking character. Perfect.

Human beings are not perfect. Human beings are flawed by definition; the creations of humans are no exception. It is that very flawed nature that makes our artwork real. The mistakes in an hour-long orchestral recording that remind us that the performers really did do that all in one take. The cracking of a voice at the end of a hard live concert, the last rally as the final number of the night is belted out.

Auto-tune threatens that. It presents itself as a democratising tool, making anyone capable of singing in tune. I’m not so concerned about autotune making things ‘too easy’; I worry about it making things too perfect. Beautiful voices, like Freddie Mercury’s or Dame Joan Sutherland’s, are celebrated because they are rare... but also because they’re human. Real human beings produced those sounds. I care about a perfect autotuned voice about as much as I do a steroid-assisted world record. Or perhaps as much as a print compared to the original art. It’s an achievement, and might even be pretty... but it’s not what I want to hear. Give me a flawed, real human voice, as imperfect as it comes, over that same voice smoothed to artificial perfection.

Tuesday, November 2, 2010

A Game as Art

You wake up, no memory of your past, no idea where you are now. Your body is covered in scars of a long life, but you can’t remember where any of them were earned. Before you have a chance to adjust to your new surroundings, you are greeted by a floating skull, who proceeds to talk to you...

Over the next twenty or so hours of gameplay, Planescape: Torment continues the sense of wonder, foreboding, mystery, threat and revelation present in the opening minutes. Along the way you meet colourful characters, explore a world utterly unlike our own, engage in the obligatory battles... but what truly sets the game apart is its exploration of identity.

Your character, you see, is immortal. His body is, anyway. Each time he dies he regenerates; the cost is the loss of his memory. All his actions, his experiences, everyone he’s ever met fade from his mind and he’s left a blank slate. The game asks: while the body is immortal, is the individual in fact immortal, or just as mortal as anyone else? When the memories are gone, is that individual in fact dead and gone, despite the fact that their body is still walking around?

During the course of the game, you meet the psychological remnants of several of your past selves; each turned out so very different from the others. One was scheming, and left enough clues to your identity to try to trick later selves into restoring his mind. Another was driven mad by the revelation and destroyed many of those same clues, seeing the past and any potential future selves as ‘thieves’ who would steal his body from him. One felt terribly guilty at the atrocities of some of his past selves and killed himself after obscuring many of the clues, to spare future incarnations the pain of his discovery.

All of these individuals came from the same body, the same neurochemistry, the same physical form as the others. The monsters and angels, the madmen and saints, they were all the same physical person... but their identities were so very different. A question is echoed throughout the game: “What can change the nature of a man?”

No single answer to the question is ever supplied by the authors. Rather, it is left to the player to determine it. Different characters have different ideas; some say that nothing can, others that anything can – one answer is that only the man can change himself. (My apologies for the exclusive language.)

Art is a matter of debate; no matter what the creation, there will be someone who denies that it is art. The one definition that I really like (and which tries to be at least a little objective) is that art is whatever illuminates the human condition. Here we have a computer game that is about that aspect of humanity central to our selves: identity. I put Planescape: Torment forward as art. A Game As Art. If one game is art, then games can be art; let the floodgates open.

Wednesday, October 27, 2010

Commodification of Green

A couple of years ago, I bought an electric kettle. This kettle advertised itself as ‘eco-friendly technology’. I was buying a lot of things for a new house at the time, so I thought I may as well get it – eco-friendly is good! It must be more efficient or something. When I got home, I discovered that its secret was that it had a water reservoir from which you could feed water into the boiling chamber, ‘so you only boil what you need’. This, instead of simply filling it only to the level you need. For that, you get a kettle with two chambers, moving parts (including a water pump capable of moving water from a nearly-empty chamber into a nearly-full one) and a lot more plastic than a regular kettle. The carbon cost of all that would outweigh any energy savings gained by being too lazy to carry the kettle to and from the sink a little more each day.

This line of thought crystallised more recently when I had one of those home energy efficiency check-ups. One of the items on the advice list was to ‘sell your desktops and buy laptops.’ I do agree that laptops are more energy-efficient. On the other hand, how would this save carbon? The laptop’s lower energy usage would be outweighed by the energy cost involved in creating it, let alone the environmental cost of certain components (namely those involving gold – whose mining is still environmentally toxic – and the battery). The true answer should have been to use the computers less, turn on power-saving settings, and when it came time to replace them, to then buy laptops instead of new desktops.

All around us we see the industry that has grown out of Green. Unfortunately, for everyone actually doing good work, there are others who exist only to peddle the image of Green; the warm happy feeling of ‘doing good’ rather than the hard slog of actually helping. It stands to reason; the climate problems facing us are massive and the difficulty of dealing with them is high – it’s very confronting, which is part of what drives climate scepticism. On the individual level, what is our best response? For everything we think of doing, there’s some cost attached. It’s just easier to go out and buy something, get some product that solves our problem. It’s what our consumer culture has taught us, after all.

We know that what we have to do is to simply use less energy by changing our lifestyles to be more efficient. To use the car less, to turn lights and computers off; but changing lifestyle is a hard thing to do, as everyone who’s ever been on a diet knows. It’s so much easier to get a lap band or take diet pills, or to complain that ‘those doctors don’t know anything about weight’. We’re daunted by the prospect, so we reach out to consume.

And so we are sold hybrid-electric cars, even though the battery manufacture is toxic and none of the parts are recycled. The old line is that the most cost-effective car you can possibly drive is the one you currently have. These days, you can adapt that line to carbon efficiency – the carbon cost of manufacturing a new car, especially a hybrid, is not something that will quickly be made up by its increased fuel efficiency. The electric component is not without carbon cost, either; but in that case, it’s out of sight, out of mind. If it’s not spewing out the tailpipe, the drivers think they’re helping. If you were going to replace your car anyway, that’s one thing; but those celebrities who buy a new model of hybrid every year are being fashionable, not environmentally conscious.

We are sold carbon offsets, even though many of them are ineffective (and in some cases, counterproductive; planting trees in latitudes too far north can change white tundra into dark forest, thus trapping more heat and making the situation worse). This practice has been compared to medieval indulgences, those ‘permission slips for sinning’ handed out by the Catholic Church of years past in exchange for donations. They give people cover to continue living as they are. Green has been divorced from the environment; it has become simply another commodity we can buy at the store.

Ultimately, this is a global issue and must be addressed on a global level. As individuals, we have very little power to change the climate; but we can do our own little part. Even if we don’t manage to halt global warming, at least we’ll be spending less on energy ourselves, seeing the sun more from walking instead of driving, and enjoying cleaner air in our cities, if those around us follow suit. Just don’t make the mistake of thinking this is a problem you can just throw money at to make it go away.

This article was published on Online Opinion

Thursday, September 30, 2010

Batman

Frank Miller’s best-known work, Batman: The Dark Knight Returns, was written at a turning point in the comics industry. It was 1986, the start of what has now been called the ‘Iron Age’, a period in which the stories became darker, more violent and (sometimes) more skewed towards a mature audience. Deconstruction of characters was common; the better of these took characters apart to explore how they worked and then put them back together again, while the more ham-fisted just tore characters apart and left them wounded.

The Dark Knight Returns revitalised interest in Batman. It brought us many of the modern Batman tropes, most importantly the link between Batman and his villains (positing the idea that without Batman, the villains wouldn’t exist), Batman as sociopath acting out against legitimate targets (not directly supported by the narrative, but this can be read in) and The Joker becoming a mass-murderer (with successive writers each trying to top the atrocities of the previous). All of these ideas have been used badly in successive years, but we can’t blame that on the original, which has become a classic.

A classic which I re-read recently in order to re-examine the ideas it raised. I’ve long felt that it was a piece of flawless quality which has been tarnished by its poor imitators. I have since revised my opinion, first of him, now of it. My attitude towards Miller has soured over the years; I found 300 to be not only objectionable, but boring. The follow-up to Dark Knight Returns, titled Dark Knight Strikes Again, I found virtually unreadable drivel from start to finish. Upon re-reading the original, among the genuine insights, I found poor writing, woeful stereotyping and precisely the sort of juvenile, ego-charged power trip that has marked Miller’s later works. I still found it to be a good read, though certainly not flawless. Part of this is due to his characterisation of Batman: in particular, the pleasure the man takes in inflicting harm on criminals. While he refuses to kill, he is not averse to injury; in fact, he takes actions that result in injury to his opponents even when other options are available, and expresses no remorse when criminals die, even in painful ways.

All in all, what struck me most is the same thing that occurred to me while reflecting on Saving Private Ryan. That, too, was a piece that deconstructed the war movie. It showed the gore and violence that affected and twisted soldiers, and was purportedly about finding humanity in the midst of war. What it utterly failed to do was extend the same courtesy to the enemy. The Nazis remained the same faceless, unknowable, implacable enemy that they have been in virtually every other World War II movie ever made. The one Nazi who gets anything like a persona is a prisoner they allow to live... and who returns to kill Tom Hanks’ character and is later executed by the previously peaceful translator. For a movie purporting to be about keeping one’s humanity amidst atrocity, it certainly went out of its way to justify atrocity.

In The Dark Knight Returns, the criminals are interchangeable. Most of them wear goggles or other facial adornments that render them impossible to tell apart other than by colour scheme (the one criminal who wears anything different is, ironically, a neo-Nazi). While Batman and his personal nemeses are deconstructed, the ‘criminal’ is presented as a faceless enemy to be despised and crushed. They have no redeeming features, no individual personalities. They are caricatures of humanity. They are not to be understood, not to be empathised with. They are to be hit until they stop moving. There is no problem in Batman’s crusade that cannot be solved by hitting it hard enough, and the only thing that stops him from cleaning up Gotham is the Government, in this case represented by Superman (who is little more than the President’s stooge).

This is the attitude I find disturbing. It’s what lowers Dark Knight Returns from a serious deconstruction to a power fantasy. ‘What if we just beat up criminals?’ it asks, with the answer being: ‘It would be AWESOME.’ If this attitude were isolated to this work, it wouldn’t be so bad. Unfortunately, the criminal element is similarly portrayed in other works (including much of modern Batman) and, sadly, in real life. People in reality call for the execution of violent offenders, justifying their statements by saying: ‘They chose to be criminals, screw ‘em.’

While power fantasies are relatively harmless in fiction, in real life they are damaging. Crime and poverty are societal ills, and they cannot simply be dealt with by hitting people. Increasing sentences tends not to reduce crime; it just makes us feel better as we take a proxy revenge against anyone who’s hurt us in the past. We empathise with Miller’s Batman because there is a part of the human psyche that enjoys being a bully. We cast our enemies as faceless evil because that justifies our actions, turning them from bullying into some form of noble crusade against the criminal infidel.

Simple answers are attractive. We naturally prefer simplicity in our explanations. But Occam’s Razor tells us which explanations are preferred, not which are correct. Sometimes the simple answer is right. Sometimes the complex answer is right. There are indeed some problems violence can solve. In this case, I’m afraid of the people who think that hitting people solves everything.

And the comic? Well, I still think it’s pretty good. It has a glaring flaw, but Miller hadn’t quite descended into the madness that characterised his later works. A lot of it is there – the violent, hyper-masculine protagonist; the only two female characters being a tough lesbian (who hates Batman) and the former Catwoman, who runs an escort agency; the equating of optimism with naivete – all of which were magnified in his future pieces. As it is, it really is one of the best Batman stories. I just wish that, while he was deconstructing Batman’s relationship with his named villains, he had put some effort into Batman’s relationship with the criminal element.

(Yes, I'm a comic nerd. Amid the posts on politics, expect some on comics, books, movies and TV from time to time.)

Wednesday, September 29, 2010

Faith vs Trust

There is a particularly insidious attack on science, one which attempts to undermine the whole basis of rational thought. This is one that has appeared in many avenues, including normally science-supporting, left-wing arenas. It is this: that science, like religion, requires faith. That to believe in the big bang, or evolution, requires you to hold that belief with faith, just like believing in the God of the Bible.

I want to state this very clearly: this is false. I fear that all too often, this idea is adopted either by those who have religious faith and wish to defend it from rational inquiry, or by those who wish not to offend the religious by subjecting their faith to such scrutiny, and who feel that a statement of ‘Hey, we all require faith’ eases relations.

Truth, we may say, is more important than sophistry.

First, a definition of the term I will use from here on: ‘scientific inquiry’. By this I mean observation and analysis of phenomena and the search for rational explanations which can illuminate our knowledge of the world and also be used to make predictions. I include not only the traditional sciences, but also the ‘soft’ or social sciences. This form of inquiry requires evidence. If there is no evidence for a claim, then that claim cannot be made with any form of authority. ‘I do not know’ is not an excuse to fill in the gaps with something for which there is no evidence. In short, simply because we do not know for sure where the universe came from, there is no rational basis for assuming it was created by a deity (especially a particular deity). Claims such as ‘God made it so’, ‘water has a memory’ or ‘it must have been magic’ do not help us to understand the world around us. Statements such as ‘disease is caused by germs’, on the other hand, allow us to investigate these things called ‘germs’, find out why and how they create disease, and how we can prevent that. It leads us towards further knowledge. Claims of supernatural intervention prevent further investigation.

So when Stephen Hawking claims the universe’s origins can be explained via natural means, he isn’t simply making things up. He isn’t saying that, for certain, the universe began this way. He is speaking out of knowledge – knowledge that telescopes can see a very long way away; that relativity and the speed of light mean that seeing a long way away is the same as seeing far back in time; that the indications are that at one point in time, the universe was compressed into a single point. Could he be wrong? Of course; any self-respecting scientist must admit that. But he is not exercising faith. Rather, he exercises his reason.

Now, the situation is different for non-scientists. We read books (or opinion blogs) and accept the conclusions in them. How is this different to reading the bible, or a book on acupuncture, and accepting those claims? The difference is that science has the best track record of success in terms of analysis and prediction. Alternative medicine has yet to produce significant results from the vast majority of treatments, homeopathy remains nothing more than water, dowsing has failed every test assigned to it. The history of religion is filled with failure of prophecy and a view of the physical world that has proven to be incorrect in every case that has been able to be analysed. Scientific inquiry has brought us effective medicines, weather monitoring, communication. It increases our understanding of the world on a daily basis.

When comparing science with other philosophies or beliefs in their ability to illuminate the natural world, we need look no further than this track record and ask whether science is worthy of trust. The answer is clear – it is. When evolutionary theory predicts that particular fossils should be found in a particular area, there they are. When analysis of those fossils suggests a particular lineage, DNA analysis backs that up. When the theory of relativity predicts time dilation and an increase in mass as objects increase in speed, experimentation proves this (in fact, GPS satellites would be inaccurate without an understanding of relativity). Religion, of course, is not denied by science; there may well be a God out there... though certain aspects of various faiths are explicitly denied by scientific inquiry. This may be the true root of dissatisfaction with science in some areas of the community; not only religion but also homeopathy, chiropractic, astrology and the coal industry.
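The GPS aside can be made concrete with standard textbook figures: orbital speed makes the satellite clocks run slow by about 7 microseconds per day (special relativity), while sitting higher in Earth’s gravity well makes them run fast by about 45 (general relativity).

```latex
% Net clock drift and the ranging error it would cause if uncorrected:
\Delta t \approx 45 - 7 = 38\ \mu\text{s per day}
c\,\Delta t \approx (3 \times 10^{8}\ \text{m/s}) \times (38 \times 10^{-6}\ \text{s})
            \approx 11\ \text{km of position error per day}
```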

So when I say that I accept evolution, the big bang, relativity or climate science, I don’t do so through faith. I do so because science has the record to back it up. It has a power to bring understanding to the natural world that is possessed by no other philosophy or institution. I trust the scientific community when it makes bold pronouncements, because it does not do so if the claims are not justified. If a book tells me that the world was created in seven days, or that by drinking this pure water I will cure my insomnia, I ask: ‘where is the evidence?’ Science does not require faith, it requires understanding. To those who lack that understanding, the trust many of us have in science looks indistinguishable from faith. This does not change the essential character of our relationship with science: that it is rational and justified, requiring no leap of faith or logic to accept and trust its conclusions.

Wednesday, September 22, 2010

Anti-science

Clive Hamilton, the Greens candidate for the Higgins by-election last year, has concluded a series of essays at The Drum concerning the anti-science nature of climate change denialism (that term being used for those who reject climate science on an ideological basis – rejecting all evidence for and embracing all evidence against – as opposed to those who are genuinely skeptical).

I could go further into the issue of climate change; instead, I will link to his articles, the website Skeptical Science, and the video channels of Potholer54 and Greenman 3610, which lay out the issues for and against in excellent detail. In short, don’t believe the backlash: the issue is settled science; the scientific community has spoken. It’s up to the non-scientific community, and particularly the political community, to decide what to do about it; simply denying the science is not a reasonable option.

I will instead broaden the discussion to include all the brands of anti-science out there. This trend is not limited to the climate; it appears in any area where science tells people that what they believe isn’t true. While certainly not the first example, the most famous is, of course, Galileo being forced to recant his findings in front of what was effectively an inquisitorial court. His reputed utterance ‘E pur si muove’ proved more durable than his public recantation, and science won out in the end: the old Church model of the heavens as the perfect heavenly abode was finished. It is important to note that in the centuries prior, the Church had been the bastion of learning and research. Roger Bacon, one of the most-cited figures in discussions of the origin of the scientific method, was a Franciscan. The Church supported research, copied books, and extended as well as preserved knowledge. It was only once science began to show that certain Church teachings were wrong that the power of Holy Mother Church was turned against the scientists.

So it is today with evolution. While the modern Catholic Church has no problem with evolution (as is true of most Christian churches around the world), there exists a surprisingly large fundamentalist, literalist backlash against the theory, both Christian and non-Christian. While there is nothing stopping anyone from both accepting evolution and believing in a liberal interpretation of various holy texts, evolution does explicitly contradict strictly literal readings. We end up with public figures making fools of themselves in their ignorance, showing their misunderstanding of radiometric dating, the fossil record, thermodynamics and even the meaning of the word ‘theory’. They oppose evolution because they must; because to eliminate the cognitive dissonance they feel when their beliefs oppose reality, they choose to deny reality instead of adjusting their beliefs.

So it is also with homeopathy, along with much of the rest of alternative medicine. I focus on homeopathy specifically because it is so utterly anti-science, so opposed to scientific inquiry and rationality, that it beggars belief. More on homeopathy can be found in this article and this video. The synopsis is that homeopathic remedies are simply water, sold in enormous volumes. Yet its defenders insist that the remedies work, based on anecdotal evidence and a lack of understanding of the placebo effect. When science tells them that the remedies don’t work, and that the benefits they feel have other explanations, they reject science. While homeopathy itself is rather harmless, it can lead the sick to avoid competent medical care, or even reject it outright – a danger posed not only by homeopathy but, more importantly, by AIDS denialism and the anti-vaccine movement. Those two are deadly serious concerns, fuelled by general anti-science sentiment.

One simply cannot attack science on one of these issues without weakening it overall. The arguments used against science in each of the above areas (and others) are precisely the same: misrepresenting the state of the research, seeing debate as evidence of uncertainty rather than as science’s greatest asset, dismissing science for not having all the answers, claims of conspiracy and suppression... To use any of these arguments against climate science is to feed the arguments of those who oppose science on evolution. By opposing science on homeopathy, you give power to those who deny the link between HIV and AIDS.

If science is to have any place in the modern world, then we as thinking human beings must accept its findings regardless of whether they are convenient to our beliefs or ideologies. This is not to say that science is always right; but it is the institution with the greatest record of accuracy in the history of human inquiry. This is because it has no central organising theme or aim other than the search for truth. Nor does it ever claim to have found The Truth; science never rests. Even a ‘settled question’ (even climate change) remains open for debate, and will be overturned should new evidence contradict the dominant theories.

Most importantly, it is made up of people who advance their careers not by mindlessly repeating what the luminaries of the field say, but by proving those luminaries wrong. If there is an angle from which to attack a popular theory, it will be attacked. If theories fail, they fail regardless of how popular they are or how well-regarded the scientists behind them are: in science, theories are given prestige because they are right, not because they say something comforting.

We should follow that same ethos.

Euthanasia and choice

An example of what I was talking about in my last column has cropped up in the national media; very kind of them. The euthanasia debate has centred around rights – the right to life versus the right to choose to end it. The pro-euthanasia side, mostly on the Left, has been sounding very much like the Right on this one, talking about the individual right to make one’s own decisions. The Right, meanwhile, has been... not so much sounding like the Left, but certainly putting societal interests ahead of individual ones. It does so for different reasons, though: in the case of the Right, it is fear of what the change will bring and a traditional concern for the sanctity of human life.

In a perfect illustration of why the ‘the Left cares about society, the Right cares about the individual’ meme is wrong, we see the Right putting forward slippery slope arguments (on The Drum (20/9), Tom Switzer used that term no fewer than three times in fifteen minutes) in which euthanasia leads inevitably to the downfall of society. (Make no mistake, slippery slope arguments have their place; but that place is when outcomes are indicated by evidence such as historical trends, not simply ‘if this happens, this other thing might happen!’)

Further, the arguments of the more-or-less Left show that the Left is indeed concerned about individuals; it’s just that the rights it is concerned about differ from those of the Right. Nor is it purely about government intervention: in this case, the government is already intervening, and the Greens bill is trying to get it to stop. It’s a complete inversion of the conflict set up by the meme, extreme enough to be the exception that disproves, rather than the exception that proves.

This is because it is not an abandonment of principles by either party. The Right, which in Australia is actually genuinely conservative to some degree, is opposed to change, while the Left seeks a more just society, in which we do not force people to go on living who do not want to, at least in particular circumstances (a cancer patient in the final stages of the disease is a very different case from a teenager who is convinced that her life is over). These are the true colours of each side of politics; the ‘individual vs society’ meme is a cover for social conservatism and the tyranny of property, not the actual motivating factor.

Wednesday, September 15, 2010

Freedom vs Capacity

A companion meme to ‘The Left wants to expand government’, and one more commonly seen in Australia, is: ‘The Left cares about the State and Society over the Individual, the Right defends individual rights.’

Again, at a simplistic and superficial level, this is appealing. The Left does traditionally put more emphasis on social programs, welfare and laws that protect minorities from prejudice. This includes laws that infringe on property rights, such as not allowing shop owners to bar individuals on the basis of race, or telling tour companies they have to allow women (or men).

Not exactly the way I’d put it, of course. As is often the case, the simple explanation is superficially pleasing but not the whole story. Here’s my version: the Right is, traditionally, concerned with increasing freedom – freedom defined as the theoretical ability to act in any way one chooses; the lifting of formal or legal barriers to action and choice. The Left, meanwhile, prefers to act to increase capacity: the actual ability of an individual to make the choices he or she is theoretically free to make.

This is a narrow distinction, but an important one. I feel it’s one not given sufficient attention in mainstream political philosophy, though sociologists have written extensively about it. The basic premise is that even though we might be theoretically free to follow a particular path, there are usually obstacles in our way that prevent us from doing so. These could be personal (natural ability, emotional issues, drive and motivation), situational (access to resources, location, time and place, socio-economic status) or cultural (most notably prejudice). In short, an Aboriginal man from Redfern is less likely to ascend to the highest levels of politics or business than a white male Christian born into a wealthy family.

In an ideal world, both would be entirely free to achieve their every ambition; on this, Left and Right agree. The difference between the wings of politics is rather what they see as barriers, and which barriers it is appropriate for government to deal with. For the Right, the government must concentrate on economic issues; for the Left, the government has a more proactive role in shaping the cultural and social agenda.

And so we see that neither Left nor Right is more concerned about the individual than the other. Neither can really claim to be ‘The Party of Individual Rights’. What differs, rather, is which rights they see as most important: property rights or social equity. The former lifts formal or legal barriers to advancement, while the latter removes practical ones.

In the future, I will expand on what I perceive these barriers to be, and what the responses to them are.

Monday, September 13, 2010

Media Burning

There is a clear difference between ‘old media’ (traditional print and televised news services) and ‘new media’ (blogs, YouTube channels, social media). Our thinking as a culture is still very much stuck in the ‘old media’ mindset – that things don’t happen unless the cameras are turned on. If we examine the timeline of the recent story about the planned burning of the Qur'an by Pastor Terry Jones, this point is clear. According to The Guardian, traditional media outlets only took up the story after it had already started rolling.

First, I would like to modify our understanding of ‘old’ vs ‘new’ media. There have always been sources of information other than formal print and televised journalism. In the 1980s, when journalist John Pilger wanted to talk about things that he couldn’t broadcast on television, he turned to alternative media. In this day and age, he might have started a blog, but in his case he wrote books: books about events that were insufficiently covered by traditional media, books about the media industry itself. In a different context and time, when political organisations wanted to talk about the suffering of the black population of South Africa, they used traditional media, but they also organised protests, wrote pamphlets to be distributed at events, held talks at universities.

And so the argument that ‘new media’ is novel is itself a false narrative. Rather, the already-existing informal media dynamic has changed. The resources required to create a video or informative pamphlet have shrunk dramatically, now that anyone with a webcam can record a video and upload it to YouTube, or have a site such as Blogspot host their words for free. As a result, the volume of informal media has increased. This is not new, just different and bigger than before. It’s far easier for a single individual or group to get mass attention, especially when the message is inflammatory.

It’s interesting that the people who fed the story about the Qur'an burning were not proponents of the event, but people opposing it. The Facebook comments were overwhelmingly negative; even the first semi-mainstream (where?) report was negative. The outrage sparked a buzz, and that buzz increased in volume until the traditional mainstream media outlets simply could not ignore it.

So while much criticism has been aimed at Big Media for pushing this story, I think it is mis-aimed. Big Media took the story up only when they had no other choice. My advice is aimed instead at the people who stormed in outrage: Don’t Feed The Trolls. The power the individual holds in the current media environment puts an onus on the individual to be at least a little responsible.

Wednesday, September 8, 2010

It's all about context

Stephanie Rice is in trouble for using A Bad Word. She’s been dumped by her sponsor and is under fire for being homophobic as a result... but a teammate, who is gay, defends her by saying she isn’t. How can these be reconciled?

It’s easy if you understand one thing: it’s all about context. Offensiveness always is. Absolutely anything, no matter how innocuous normally, can be offensive in the wrong context... and absolutely anything offensive can be passable in the right context.

To expand on that: I use the word ‘faggot’ with some frequency. Other words, too. The context in which I use them is one where the people who hear me know that I mean them as parody rather than in earnest. When I call my friends ‘faggots’, they know that I am making fun of the sort of people who use the term seriously, not of the people who are usually its targets.

However, if I used the same word in the same way in public, the context would be different. The people around me might know what I meant... but many would not. Not only would some of those who heard me take offence, but they should; because some of those who hear and do not understand what I mean would be emboldened by hearing me. They would not be offended, because they like using that word themselves. The more people they hear using it, the more likely they are to have their views reinforced and not only use the word, but act in a way consistent with its mainstream intent.

Too often, our discussions of racism, sexism and homophobia focus on the words we use instead of the ideas we express. This is why ‘Dr’ Laura Schlessinger was attacked for using ‘nigger’ 11 times, when the far more offensive statement she made was to tell her black female caller (who was complaining about her husband’s friends using that same word) that if she was so easily offended, she shouldn’t have married outside her race. She got away clean from that truly repulsive statement because most of the onlookers (though fortunately, not all) were too distracted by ‘The N-Word’. This gave her licence to say that she was attacked simply for using a word... when she was actually attacked for being horribly racist. She probably wouldn’t have learned anyway, but this would have been a good moment to talk about the attitudes in society that lead people towards racism... instead it was just about the Word.

I say: words, by themselves, are not offensive. The ideas they conjure and the contexts in which they are uttered make them offensive. Words to put down women, homosexuals and minorities are offensive not simply because they are designed to put people down but because they reinforce cultural memes of inferiority. They are a means of propagating not only prejudice, but prejudice that results in unequal access to society for those targeted by it.

For these reasons, I find nothing truly offensive about terms like ‘cracker’. The word might betray some crassness in the speaker, and I might decide I don’t like that person on some principle or another, but it lacks the sting that ‘nigger’, ‘kike’, ‘boong’ or ‘faggot’ carry, because there is no societal prejudice leading towards unequal access to society for white men like me. Or rather, there is... but the inequality is in my favour. So the least I can do is be a big man and suck up the occasional insult.

So I can understand where Stephanie Rice is coming from. I’m willing to believe her teammate and accept that she was tone-deaf rather than prejudiced. I think she should refrain from using certain words in public, but also that we should talk about why offence is caused, not simply use ‘That Is Offensive!’ as a politically-correct fire blanket.

Tuesday, September 7, 2010

Country Independents

This afternoon, we’re promised, we will find out where the three Country Independents will jump. Personally, my money’s on Labor. Only two of them have to go to Labor to guarantee another Gillard government, but they have been talking about wanting to move as a bloc, so it is possible that they’ll all lump in with the Coalition, giving it the 76 seats it needs.

Thus far, Labor has been more responsive to the trio’s demands on costings and reform; the Liberals’ $11 billion budget hole didn’t help matters at all. Katter naturally aligns a lot better with the Coalition (in fact, he’s far to the right of them, being a Sir Joh man himself), but he has admitted that during its 12 years in power, the Coalition didn’t do squat for the country. I’ll point out that in the last Queensland state election, the Liberal Party was routed – it ended up with so few seats that the Queensland Nationals actually dissolved the Coalition in that state. I wouldn’t be surprised if Katter held the federal Coalition in similar regard. The federal National Party certainly doesn’t seem to hold any sway over policy.

Meanwhile, seeing where Oakeshott and Windsor will jump is rather like reading tea leaves. I said before that my money’s on Labor; that comes from Labor’s responsiveness and from what the two have been saying (mostly Oakeshott; Windsor has been a bit quieter). It’s just the impression I get, really.

If both of those go to Labor, you can bet Katter will follow. He’s been setting this up with his comments about the Coalition failing the bush. You can also bet that he won’t want to be the useless 74th seat in a Coalition minority. Much better to get in with the side in power. Labor, meanwhile, might be better off without him, but they can’t exactly say no to him without annoying the other two, since they have wanted to move as a group. I do worry about someone so stone-age, and about what he’ll do to the party’s political stances.

Fortunately, all three have been immune to the barrage of ‘Beware the approaching Left-wing dooooom!’ that came out over the weekend. A chorus of current and former Liberal Party voices echoed the same claim: that the next Labor government would be ‘the most Left-wing in history’, thanks to the alliance with the Greens. At no point did they enlighten us as to exactly why this would be a bad thing. Yet again, it’s the importation of an American meme: “Barack Obama is the most liberal member of Congress,” voters were told, despite the fact that his voting record put him at centre-left compared to other members of Congress, and actually slightly to the right of the population at large. Again, in both cases, nothing was said about exactly what would change, or why a left-wing government would necessarily be so terrible. Just vague warnings of ‘Doooom!’ that might have come from any bad fantasy-novel oracle.

So we’ll see what happens this afternoon. Cross fingers, everyone.