Monday, 30 October 2017
I once told someone that Chinese people are more receptive to ideas like communism because the language contains no pronouns, and Chinese culture is therefore less amenable to ideas of individual liberty. This is, of course, an egregiously stupid idea, which I came up with in order to shut down a conversation I've been sick to death of having since about 1987. Firstly, there is more than one Chinese language; secondly, a lack of pronouns is neither here nor there - individual and group identities can be indicated at least as effectively through conjugation and so on; and thirdly, there is no such thing as 'Chinese culture' in the sense of a single, uniform set of ethnic practices across all people in the national construct now known as China. In spite of all these inherent, gaping invalidities, my interlocutor found this proposition to be immediately and profoundly convincing.
It basically comes down to the inherent vulnerability of 'common sense thinking' to specious argument. The whole essence of what we commonly (ha!) call common sense is one of reduction and closure. Nuance and complexity are deliberately stripped away with the ultimate goal of arriving at a simple, determinate conclusion. It's fundamentally geared towards disposal rather than contemplation – a bid to force the round peg of understanding into the square hole of definite and immutable fact. Where we most frequently see this pattern of thought is in polemic, usually of the conservative variety, but reasonably frequently across all bands of the socio-political spectrum.
I've never been able to understand our perennial love affair with common sense. As a framework for analysis, understanding, or even just simple cognition, it's appallingly unreliable. The very basis of the method requires the thinker to isolate and subjectify - to operate in a solipsistic and essentially idiosyncratic framework, blithely selecting and rejecting elements of the subject at will in accordance with deliberately subjective, usually emotionally driven, criteria. There is an actual requirement for the creation of false equivalencies, reliance on biases (such as frequency bias), and deliberate or inadvertent disregard of known cognitive glitches, in order to create the oversimplified, selectively supported narrativisation and personalisation of reality which common sense thinking almost always produces.
This is especially apparent when we examine its operation in conspiracy theories. Pretty well every conspiracy theory, be it Flat Earth, Ancient Aliens, or 9/11 Truther, to name just a few, has as its central platform an appeal to common sense. How could they have made those pyramids without alien help? They're so big – it's just common sense. Buildings don't collapse like that, so the whole thing must have been an elaborate hoax – common sense. The world looks flat from where I'm standing – you guessed it: common sense again.
And this isn't limited to the lunatic fringe of cognitive dysfunction. Mainstream ideas backed by common sense have included all of the ugliest aspects of racial theory (people who look different must be innately different and therefore rankable by race), sexism (these uneducated women are uneducated, therefore educating them would be a waste), and xenophobia (the Iranian revolutionaries are crazy, and must therefore be representative of all Iranians). Common sense thinking is arguably responsible for the manifold survivals of prejudice, junk science, and the blatant lies and misrepresentations we widely accept as political truth.
Of course, common sense is very useful in some regards. For the kind of thinking required to decide not to run naked into the middle of moving traffic, common sense patterns of thought are admirably well suited. But when it comes to the navigation of complex, non-binary situations, it is rarely, if ever, appropriate, adequate, or even remotely valid, by simple virtue of its extreme reductionism. And this is important because we, as citizens, have powers and responsibilities to fulfil in a world which is unquestionably complex and non-binary.
All of which makes it very difficult to see a future any less confused and stupid than the present without a major re-examination of the insane assumption that animate bags of meat and water designed for social aggregation on the basis of emotional bonding are capable of valid rational thought by default.
Tuesday, 10 October 2017
There have been a couple of incidents where Sonia Kruger's opinions have landed her in trouble - her public support for a Muslim ban in Australia, and, more recently, her bizarre cheerleading for the federal government's push for the states to make all licence photos available to a national facial recognition database. One can't help but feel for her, made a focal point for disputes 'bred of an airy word' as she gurns confusedly down the camera while sound engineers struggle frantically to mask the clank and grind of her brain attempting to navigate complexity.
See? Even I'm doing it - it's just too easy. She's blonde, and female, she speaks in a certain tone of voice I'm hardwired to associate with stupidity, and her default expression is one of slightly anxious confusion. For those who consider themselves politically sophisticated, she may as well just be a gigantic bullseye. But just like everyone else, it is incumbent upon me to police my initial, knee-jerk reactions. It is very important, if I'm to retain what credentials I have as an intellectual, to understand where she's coming from and what, in fact, she actually is.
I'm vague on what it is she actually does, but I am aware that she appears on breakfast television of some sort, which must mean that she is a very popular personality. And by extension, breakfast television must also be popular. This must mean that a significant portion of the electorate is fully engaged by inane chatter, footage of happy people being happy, and political analysis delivered by the same people who sell vacuum cleaners and mops over the phone. So Sonia must be representative of a large portion of the population. The inescapable conclusion is that there is a significant group who actually care about Ashton Kutcher's opinions on Christmas, who are avid followers of the Kardashians, and who operate at a level of engagement so low that comments like, "I like it. I do. Bring it on. Big Brother, bring it on," constitute political thought.
This being the case, Kruger must be considered as a champion of the people. Or at least, that section of the people who just can't be bothered thinking about this crap. A section which I am inclined to think is an actual majority. I'm pretty sure this is the section of the population being referred to when right wingnuts refer to 'the silent majority' - the confused, reactionary, but fundamentally decent bulk of lumpenproletariat, rocketed by wealth and geography into the middle class apparently against their will. This is, in actual fact, a voice we do not hear often enough. It is this voice which elected Trump in the US, Pauline Hanson in Queensland, and which quietly seethes as that minority capable of thinking in multisyllables dominates the debate whilst calling them idiots.
I personally think that deriding or shouting down this voice is a bad idea. As much as it might annoy me, the idea that the opinions of the befuddled are valid in and of themselves by virtue of the fact that they exist seems fundamental to the idea of democracy. Which means it's very important to engage - to explain, slowly and carefully and in words of two syllables or less, why they might want to think again.
While this is significantly less fun than pointing out that thinking like Sonia Kruger's would be embarrassing in an early primary classroom, it's probably the high road forward.
Monday, 9 October 2017
It's a truism that interpretations of history (including pre-history) are a sort of weathervane for the contemporary concerns of the historians in question. Just take a look at the weight given to climate change and complex-systems-based explanations in current thinking on issues such as the Bronze Age collapse, the collapse of the Roman Empire, and the origins of the world wars. What's also interesting, though, is what these discourses reveal about contemporary methodologies of thought.
Archaeology and ancient history have become unlikely pioneers in the area of multi-disciplinary studies. Unlikely because these are traditionally such conservative fields, but easily comprehensible in hindsight given the nature of the undertaking. This means that there is an admirably collegiate culture, especially in archaeology, characterised by strong openness to discussion of finds and findings from pretty well any specialist in any field. A good example of this is the recent discussion of the Gobekli monoliths as an ancient astronomical observatory.
It all started, as most archaeological controversies start, with a stupid and poorly researched news article. A whole series of articles, more or less factually incorrect, ran with a paper put together by engineers from Edinburgh University which purported to have used statistical data analysis to match symbols on the gigantic T-Pillars to astrological signs. Apart from variously mis-describing these engineers as 'archaeologists', or vaguely referring to them as 'scientists', every one of these articles described the findings of the paper as if they were incontrovertible fact. As it happens, the team at Gobekli Tepe run an excellent and very informative blog, possibly because they're sick of all the ancient alien morons taking all the oxygen. What can also be found on their blog is the discussion they had with the authors of the report.
Some of this is a little bit abstruse, so I'll provide a quick summary here for context. Basically, the engineers decided (for reasons which are unclear) that a particular scorpion symbol was a sign for the Zodiac constellation Scorpio. Using various data analytics tools, they then cross-compared a selection of symbols on various pillars with the calculated positions of constellations in 10000 BCE. Finding that they were able to associate a number of animal symbols with current astrological designations, they then wrote their paper claiming that the site must be an observatory and, further to this, decided that one of the images must represent the still somewhat dubious Younger Dryas impact. So far, so depressingly standard for the use of data in academia. But what's really revealing here is the nature and content of the discussion which ended up happening between the archaeologists and the engineers.
The team at Gobekli Tepe, to whom I am admittedly partial, reacted fairly scornfully to this paper. They pointed out various flaws in the methodology and selection of evidence, and raised a huge question mark over the scorpion/Scorpio thing, making the very valid point that Zodiac signs as we know them aren't really attested prior to approximately 2000 BCE, and that those signs, which form the basis for current Zodiac iconography, are from a very great distance away from the site. Basically, they contended that the distance in time and space rendered the very first and most basic assumption invalid, or at least highly dubious. They then went on to point out that the selection of pillars seemed random, was not in any way comprehensive, and had a distinct look of 'cherry picking' about it. And on top of all this, they raised the point that the monument had been altered and reconfigured over many generations, rendering a single purpose unlikely. This done, they pointed the engineers to their own theories about the monuments being indicative of emerging social complexity for consideration.
The response from the engineers was revealing. They airily dismissed the social complexity theory in a single truncated sentence which labelled it as 'opinion'. They then proceeded to blame the archaeologists' slow publication rate (it's actually really fast) for the incompleteness of their data. No coherent defence was made of the base assumption beyond 'scorpions have always stood for Scorpio', which is ludicrous, and they also embarked on a long, inexpert, and rather sterile discourse on the survival and transmission of stories. All of this was capped off with the bland assertion that “… given the statistical basis [of their] interpretation, any interpretation inconsistent with [theirs] is very likely to be incorrect.”
And there's the kicker. It doesn't matter that the base assumptions for their data analysis are basically pants. The fact that archaeologists and ancient historians spend their lives studying ancient iconography and mythology is utterly insignificant. The fundamental flaws in their evidence selection are irrelevant. All that matters is that their analysis is 'statistical', which must mean it accesses the highest possible level of truth because 'science'. This is unbelievably moronic and, unfortunately, symptomatic of a lot of thinking today.
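This faith in anything labelled 'statistical' is worth pausing on, because spurious matches are trivially easy to manufacture when you choose your own comparisons. Here's a toy sketch in Python - the symbol and constellation names and positions are entirely invented for illustration, and have nothing to do with the actual Gobekli Tepe data - showing that random positions on a circle, compared freely under a generous tolerance, reliably produce 'matches':

```python
import random

random.seed(1)

# Purely invented data: 12 "pillar symbols" and 12 "constellations",
# each assigned a random position on a 360-degree circle.
symbols = {f"symbol_{i}": random.uniform(0, 360) for i in range(12)}
constellations = {f"constellation_{j}": random.uniform(0, 360) for j in range(12)}

def angular_distance(a, b):
    """Shortest distance between two angles on a circle, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

# If we are free to pick which symbols to compare (cherry-picking) and
# accept any pair within a loose 15-degree tolerance, "matches" turn up
# in pure noise: each of the 144 possible pairs has a 1-in-12 chance of
# matching by accident, so around a dozen are expected.
matches = [
    (s, c)
    for s, s_pos in symbols.items()
    for c, c_pos in constellations.items()
    if angular_distance(s_pos, c_pos) < 15
]

print(f"{len(matches)} 'significant' matches found in random data")
```

The point isn't that the engineers did exactly this, but that without a defensible base assumption and a principled selection of evidence, a 'statistical' result of this shape is indistinguishable from noise.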
We can see evidence of this malaise shot through every aspect of our lives. From elaborate psychometric testing to bizarre, data-driven theories of law enforcement, the absolute bane of poorly interpreted statistics (and numbers in general) in politics, and the bizarre and occasionally insane conclusions of data models in genetics, linguistics, and urban planning - data would appear to be the new god. Don't get me wrong - statistical analysis and sufficient data to do it with are vital components of any scientific or theoretical inquiry, but the fundamental component of all of this is humans. Data doesn't think. And if we try to make it do our thinking for us instead of using it as it should be used - to validate or check human thinking - we risk becoming as stupid as the machines we make and use. And that, if you actually think about the last time you asked a computer a question, is pretty damn stupid.
Monday, 2 October 2017
Overnight, I watched an appalling, horrifying thing unfold in one of my favourite cities, Las Vegas. There's no real need to rehash the details here, as what little that is known, heavily salted with speculation and deplorable sensationalism, is already ubiquitous. What's also unfortunately ubiquitous is the immediate politicisation of the event.
I suppose this isn't any individual's fault. Mass shootings in the US are all too frequent, and the basically pre-programmed response of influencers and opinion makers is to turn them into a discussion on gun control. I use the term 'pre-programmed' advisedly in that it's a bone deep reflex, rationalised on the grounds that the cause trumps considerations of decency, appropriateness, and restraint - the argument is that the lack of gun control is the root of the problem, and that the imperative to advocate is necessarily greater than any other.
This may or may not be the case. I personally agree with the limitation of access to firearms, but that's neither here nor there in this discussion. Because what I'm mostly aware of is the life-changing horror of being involved in a shooting in any way. The gut-wrenching terror of knowing friends or loved ones might have been senselessly taken in an incident one can neither parse, influence, nor affect. The sight, real or imagined (both equally abhorrent) of faces known and cared for, down in the dust and bloodied with random or targeted violence. The human aspect, basically. The one which, in most cases, has been dealt with in curt expressions of vague sympathy before the immediate commencement of political drumming.
I know the tributes and vigils are coming. In the next few days, there will be moments of silence, candlelit gatherings, and declarations of solidarity in the face of pain. But what I wonder is if it's worth thinking about the fact that these have become emphatically secondary responses. That the order of reaction is now outrage, political advocacy and argument, and then grief. I wonder if it's worth thinking about what that says about the nature of our humanity in this new historical epoch of the information age.
Because I think that if we do think about it, we'll see that it doesn't say anything very good about us at all.