Featured

A Digression: Universal Design

This is connected with my first blog on the Technical University and the businessification (pronounced business-ify-cation…couldn’t resist it) of education. I read three ‘corporate’ documents yesterday, all glossy pages, colourful graphs and photographs of happy smiley students, or ‘learners’ as the businesspeak that permeates the things insists on calling them. Strange, that: our children’s primary school calls folks ‘students’. That’s my first point: if whoever produced these things was serious, there wouldn’t be any ‘classification’ or ‘boxing’ in their terminology: a simple reference to ‘people’ would seem the obvious choice. This is the first irony in documents that claim to be ‘informed’ by equality, diversity and inclusion. The second is in the “How to do EDI” produced by an Irish university, which starts with a quote from John Henry Newman. Really? Is that the best they could come up with?

However, these pale in face of the central irony: all three suggest that Irish universities are (a) bastions of right-wing, middle-class privilege and, (b), (a) is shored up by academics who have no interest in their ‘learners’ and cannot be trusted to recognise their own ‘failings’. These are of a piece with the direction of third level over the past number of years: codes of conduct for staff; producing “learning outcomes” as part of a bureaucratic paper trail; bureaucratic control of the language that can be used in writing said ‘outcomes’; ‘new’ examination procedures (that turn folks into mere percentages); “profitability audits”; the imposition of a creeping monolithic “command structure” which dictates what academics should do and how they should do it.

All of these share one common denominator, of which EDI and Universal Design (henceforth ‘UD’) is simply the latest manifestation: mistrust. They begin from the premise that unless academics are told what to do, and held to account for not doing it, then they won’t do it. A further advantage of these impositions is that successive governments can say “Look, we’re making these people work. They’re under our control.”, an extension of their false representation of teaching in general (I would include teachers in primary and secondary schools in the category of those who have fallen victim to bureaucratic managerialism) as all holidays and short working days. Easy targets, as any vocational profession is, be it nursing, social care or teaching. The “general public” have been taught to resent these professions: when nurses strike, they are represented by the media as putting lives at risk for money; ditto social care workers. However, at least these professions have both positive and negative representations. Academics are seen in entirely negative terms, charlatans taking taxpayers’ money then doing as little as possible.

This is the starting point of these recent ‘developments’ in third level: mistrust and, therefore, control. This situation is exacerbated by the kind of people who take management roles in universities (and the HEA). What self-respecting academic moves into management? Unfortunately, they tend to be those who cannot do the job for which they were employed. They lack intellectual ability and vocation. Thus, we have tiers of people in management who see their role as simply complying with government dictates (again, what self-respecting university follows such dictates? A university exists as a public good, not a government vassal). In the current ‘climate’, we now have these folks playing at being businesses, unable to recognise that education is not a business, and any attempt to treat it as such should be resisted as antithetical to the “common good”.

This being said, these latest ploys to control academics (adding to the erosion of academic freedom) are rather ‘clever’: dress up control as equality, diversity and inclusion (to which surely no one can object) then introduce this as the central component of “universal design”, something to be used in all Irish universities…and the sting in the tail, that puts this beyond discussion? By using UD, universities will be complying with the law. This final point, which appears in the introduction to one of the documents, gives the game away. These documents are not about ‘learners’, they are to protect universities from being sued. The control aspect is just “added value” for the managerial class.

As I read through these three documents, the same thought occurred to me again and again: We already do this, and have been for years. We might not phrase it in the same wilfully obtuse businesspeak, but it’s just a part of our role. For example, I get to know the folks in my lectures and tutorials; I’m aware of backgrounds (in every sense); I’m aware of problems, both academic and personal, that people face; I’m aware of what I teach and why I teach what I teach; I’m aware of the shortcomings of what I teach; I know that my own ability to teach X is limited by my knowledge of X (which I then take steps to remedy); I’ve seen the ‘cohort’ (a favourite word of managerialism) change over the years; I’ve changed my assessments to ‘fit’ the person; I’ve recognised neurodiversity…and on and on. I’m not suggesting that my practice is ‘special’, rather the opposite – my practice is standard, ‘normal’, call it what you will (An aside: a recent research project found that lecturers tended to be more left-wing in their outlook). All UD does is enable management to accuse me of “failure to obey” if someone brings a case against them. In short, as with other professions’ code of conduct etc., it enables management to “hang you out to dry” in the event of any incident.

Of course, what it also enables is the usual corporate whitewash: “Look at us! We have policies on X, Y and Z. We’re moral.” It is indicative of standard business practice: once we have a policy on X, we can ignore it until we need it. There is no sincerity behind UD…what looms behind it are the insidious spectres of: uniform curricula across universities; the ‘normalisation’ of capitalism as ‘natural’; blind obedience to “the needs” of business; management control of the academic (in that management will, no doubt, ‘assess’ curricula in some algorithmic way). In regard to the latter, the academic becomes a mere wage labourer – a practice already put in place by managements as they employ people on temporary and zero hours contracts (dangling the carrot of “something better” in front of people), paying them only for what they call “front-facing hours” (teaching hours), conveniently ‘forgetting’ preparation time, marking, research hours etc.

In this we can see what all of these policies and strategies are designed to do: create a climate of fear. In colloquial terms, “Do everything we ask of you without complaint or things might go badly for you…”. I was recently privy to a discussion in which “staff apathy” was raised, yet the unanswered question here was: Who created this apathy? Who does apathy benefit? The use of the term suggested it was interchangeable with ‘laziness’. Not the kind of basic error one expects to encounter in a university setting. Here, I’d link back to my previous post: apathy was represented as being an individual trait, rather than a condition created by the environment in which people work. This situates UD in a meta-context: failure to take account of the ‘charter’ is an individual failing – in much the same way that ‘mindfulness’ conceals the reintroduction of the 19th-century notion that mental health issues are something the individual is responsible for, nothing to do with the meta- and micro-environments in which they are forced to live and work.

All of this takes place under the guise of one’s being part of a ‘community’. However, the traditional definition of a community is one in which all members are equal, have an equal share of power, can make meaningful contributions to decisions, and can speak out without fear of retribution. The ‘community’ being referred to here is that of business – a monolithic, “top down”, command structure, that masquerades as an institution committed to equality, diversity and inclusion. As usual, this intersects with the notion of “the individual” that business (and capitalism in general) mobilises: you can be an individual if, and only if, you behave in ways X, Y and Z and obey the rules as laid out.

Sometimes I despair, but then I remember all the amazing people I’ve had the good fortune to meet over the years, and am still meeting.


Art as History

If Heidegger is right, that Art creates society rather than society creating Art, then what would this be like?

Obviously, artefacts that are ‘favoured’ (approved of?) are, on a micro-level, passed down or passed around. Until recently, one’s taste in popular music was a shorthand for telling others who you were: you referenced bands as a way of indicating your emotional and moral dispositions. Allegiance to a particular band or pop/rock star indicated your attitude to the world. This kind of behaviour originated in the 1950s, when the rise of popular music – Elvis, Buddy Holly, Gene Vincent – was coupled with the availability of cheap transistor radios, record players and a younger generation who were not going to war. The concept of “the teenager” was born: an intermediate group, between children and adults, apparently rebellious – mods & rockers, hippies, punks – before “settling down” to the banalities and tedium of adult life. In passing, it’s worth mentioning here the Marlon Brando film, The Wild One, and that classic, depoliticising exchange, when the shopkeeper asks what he’s rebelling against…to which the Brando character replies “Whatdaya got?”. This is crucial: it transforms rebellion because of dissatisfaction with “the way things are” into a simple, hormonal side-effect before the teenager “grows up” and accepts that “this is just the way things are”.

From this we can infer that, again on that micro-level, the liking for the band(s) of our youth that we “pass down” to, say, our children is based on nostalgia. Our memories transform our youth into halcyon days before our dreams and aspirations, often seen to be represented by and in our musical choices, were reeled in by responsibilities. When I play the Rolling Stones, the Clash or the Sex Pistols to our children I am trying to represent a past version of myself, one with a very different set of priorities to those I have now. It’s worth noting here that another factor has come into play in the past twenty or thirty years: the cult of youth. Becoming ‘old’ is no longer acceptable, it is to be feared (from a psychoanalytic perspective, as a precursor to death). Thus, what could be described as my ‘clinging’ to these bands, insisting that they are still relevant, is a way of asserting that I am still relevant, that I am still ‘youthful’. There are, of course, other factors involved here: how one dresses; the rise in cosmetic surgery (to maintain one’s “youthful looks”); house decor; who one associates with. In short, the creation of a public self – in much the same way as the concept of self appears post-Copernicus during the Renaissance – to dominate the world around one. I shall leave the development of this self, through Wordsworth’s Preface to the Lyrical Ballads and Wilde’s idea of the mask, to one side for the moment. Suffice to say, that social media merely refines this public self: the smiling photographs; the ‘achievements’; the disclosing of “personal struggles”; the insistence on ‘happiness’. This is “self as advertisement”, a self conscious (or perhaps ‘unconscious’ is more accurate) of the need to have the events of one’s life validated by others. To be unseen is to cease to exist. I’m not suggesting that this desire to be seen is new, it isn’t: what is writing if not the desire to be seen by others? To communicate beyond death? 
Why build this, establish that foundation, carve your initials on a tree trunk? All of these are motivated by being seen (which, one might say, is a metric).

This leads us back to my original question though. Does Art create society? Why is it the case that certain artefacts are selected as being more valuable than others? Why are they seen as encapsulating certain values that should be perpetuated ad infinitum? There is also a related question, equally important: “Who decides?”

What makes a work by Bach or Mozart valuable? Why have works by these composers been passed down to generations? Do we find their work appealing through some intrinsic sense or because we’ve been told they are ‘good’ (call it what you will)? Put another way, how have Bach and Mozart contributed to our contemporary culture? Which values do they encapsulate that we continue to value? For sure, in the canon of what we can call “classical music”, we can detect their influence in contemporary compositions – we can trace a ‘line’ of influence from their present to our present – but is there something else? The “something else” is difficult to discuss: I can say, by way of example, that the first time I heard Bach’s Cello Suites, I was struck by the thought that there was (is) something ‘right’ about them, that even though I had never heard them before I had an “inner sense” that I had. A stupid thing to say, but the right kind of thing to illustrate what I’m getting at. I could say much the same about Mozart and Mahler…and the Rolling Stones and the Clash. With the latter, I know it was because they articulated a political anger and frustration with the way things were (Thatcher’s government; unemployment; poverty and so forth). Not so easy with Bach. Was it because of my upbringing? The person who first played them for me? The timing, in terms of experience and perception?

Easier with film. I was struck (“…like I was shot by a diamond bullet” as Kurtz says in Apocalypse Now) by Godard’s films because here was someone who didn’t treat me as an idiot, who demanded as much effort from me in interpreting the meaning of his work as he exercised in making it. His films have a complexity that standard pattern narratives do not possess. His work is film as philosophical treatise, artefacts one can (must) return to again and again. There is no closure and “on to the next one”. The camera stops filming the people onscreen, but their concerns, the issues they wrestle with, continue in the spectator/reader. Again though, I can argue that his work is political: he deals with how persons are formed by ideology; how this causes contradictions in their lives; how one might confront the dilemmas of one’s contemporary present. In these ‘themes’ his work is timeless – anchored in his present but relevant to one’s own (unlike, say, the poetry of John Dryden which requires us to “read around” the history of the time to make sense of his meaning). Whatever historical background is required is built into his films.

I’m not going to run through examples of each art form. Suffice to say, when we read the novels of, say, Eliza Haywood, Fanny Burney or Jane Austen, the issues stand out to us, ask us to compare our contemporary present with the time of writing (and, it has to be said, reach depressing conclusions).

I cite these novelists, Godard (I’d also include Resnais and Antonioni) and popular music (the Rolling Stones were, before they became their own tribute band, overtly political) because they rather contradict Heidegger’s idea. If Art creates society, then why haven’t we learnt lessons from these and moved on – progressed? Why aren’t these artefacts of merely historical interest, representations of “the way things were”?

Part of the reason, it seems to me, lies in the formalisation of Art through the school and then the university. At a young age, we are taught that “X is good”, “Y is a master/genius”, thus, composers, authors, painters are held up to us as figures we should aspire to. Their work is something we should study in order to be seen as ‘serious’ or ‘proper’. The timeline plays a crucial role in this: we can trace Bach through to Stockhausen; Defoe to Amis; Rembrandt to Rothko. Our ‘education’ can be gauged by adherence to metrics – which are seen as being ‘neutral’ when, in fact, they are anything but. You will probably have heard of all the males I referred to a sentence ago, but what if we replaced them with women?

From this perspective, usually called ‘tradition’ (a word which carries with it all kinds of other phrases whose meanings are assumed, such as “common sense”, “human nature”, “that’s just the way things are” and “any decent person…”), what we do is perpetuate historical sexism, racism, homophobia etc. etc., all apparently in the name of ‘education’. An insistence on ‘tradition’ is actually an insistence on stagnation, on the maintenance of traditional power structures. It is worth remarking here that social media – a product of the internet which, initially, promised a “new world” of community and co-operation – works as ‘enabler’ for this perpetuation; the limitations and stereotypes of our culture(s) are reproduced and reinforced. They merely become easier to engage in.

However, to return to metrics, which originate in feudal society, develop in proto-capitalist society and become sacred in the capitalist society of the twentieth and twenty-first centuries. Metrics demand that we measure ‘output’. In feudal society, the poet is measured by his (and it is ‘his’) knowledge of the designated classics in Latin and Greek. This metric continues into the proto-capitalist societies of the Renaissance and the early modern period (look at Oxford and Cambridge. N.B. I went to Aberdeen University, only narrowly avoiding the requirement that all first year students study Latin and Greek), but the “great leap forward” comes in the twentieth century, when universities establish timelines, therefore, curricula, for subjects, thus, enshrining metrics in the fabric of “third level education”. We should also note that who is included in these timelines is a reflex of patriarchy. Few, if any, women merit inclusion – glaringly obvious in histories of the novel, histories of painting and philosophy per se.

This is important because these timelines instantiate the interests of the ruling class. As Marx argues, they allow the control of ideas. The metric becomes an effective mechanism for the control, and limiting, of thinking itself. Art as metric dooms society after society to simply reproduce the prejudices of the past – we might not forget history, but we do repeat it endlessly.

I have deliberately kept film out of the discussion of metrics because, unlike the other arts, it has arrived late to education (even in the 80s, when I was a student, ‘film’ was folded into English Depts as a “special subject”…not quite respectable). The European directors who moved film from simple ‘entertainment’ to artefacts that challenged and critiqued had no formal training. They experimented with techniques, defied pattern narrative, inhabited a liminal space between the other arts. The other distinguishing feature of film is that it came from popular culture, the fairground shadow show, a marvel that required the spectator to simply marvel. It also ‘evolved’ on two continents, America and Europe: one chose pattern narrative (in itself a metric), the other saw a liberation of form and thought. Of course, what we have now come to see is that “Arthouse cinema” is measured against “Mainstream cinema”, categorised as ‘elitist’, a vehicle for “left-wing idealism” or, in the current vernacular, “woke ideas”.


How is Cinema?

OR Cinema, Perception and ‘Reality’…

What do we see when we go to the cinema? What can we see when we go to the cinema? What distinguishes this from ‘reality’?

Firstly, we see a re-presentation of the real; this is a standard response, yet it assumes that what we see as the ‘real’ has points of comparison to a ‘real’. Using the latter in its singular sense posits a ‘real’ that exists independently of the human person, that objectively exists on its own terms, in its own time and space. Yet this is, when we consider the statement, an unsupportable assumption. My perception of a/the ‘real’ is mine and mine alone. I cannot share it with others as they cannot share theirs with me. So my ‘real’ is perceptually unique, a creation of my experiences, my spatio-temporal moment, a ‘real’ that is both present and past simultaneously (or that is always past if we agree with Russell). In the cinema, what I perceive onscreen is a version of a real that is, for the duration of the film, the real, but only to me. What I see is determined not only by the form of film and its techniques, but by the experiences I contain, my remembrances, the socio-historical circumstances of my present/past, the socio-cultural circumstances of my present/past, my present/past emotions, the feel of the cinema seat, the smells of the auditorium…in short a version of a real that exists only at that fleeting moment.

“The real”? A phrase we think we understand because we’ve said, and heard, it so often. Can the eye see the real? Isn’t the field of vision always the field of some-one, a field that transforms the connected into objects, severed from their original connectedness? My vision is a film, transforming the objects of perception into a narrative, unified in me (not by me). When I pass my gaze over ‘things’, their story becomes an instance of my own. I exist not in my mind but in the objects that form the content of my consciousness; I assimilate and am assimilated simultaneously. The objects of my consciousness, identified by my perception of them, have an artificial connectedness, an imposed Being, by which I convince myself that I exist – they are not just the objects of my existence, they ARE that existence. A fragmentary whole, that exists in my memory…yet that memory is inexact, selective (for reasons which I cannot know – particularly if one accepts Freud) and forgetful. I forget, or am unaware of, far more than I think that I have perceived.

Following from this, how much of a film do I actually perceive? I can watch a film repeatedly but still fail to see it all. One can say the same in regard to reading a novel or poem, seeing a play, examining a painting. The standard/usual explanation here is that perception is limited by knowledge and experience (this is not just the case for Art; a nurse, for example, might initially recognise X after qualifying, but in ten years’ time, recognises X, Y and Z). When I read Eliot at 14, or see Godard’s Passion, I see and read something that is, arguably, completely different to what I return to at 24 with more knowledge and a different experience. This is hardly a startling revelation; as our knowledge develops and our experience grows (?), our desire for complexity expands. The simplicity of fairy tales, nursery rhymes and Disney films becomes of only nostalgic value (although, or because, we recognise their insidious messages). As we become older – for we have to link knowledge, experience and time – we reassess, examine both our experience and intellect for the effects and affects these items had. We try to identify, test and, in some cases, rectify the ideological ‘messages’ of our received culture: why is X considered to be “better than” Y? In what sense is A a ‘superior’ work to B? How has it come about that I have taken these cultural values as mine?

Simultaneously, we try to identify artefacts that express our selves, our arrived-at values, that protest the injustices that we see too. In the case of our selves, our choices change over time, yet we also keep returning to the same artefacts, those which, in effect, change as we change, whose meaning(s) alter as we alter – we map our developing selves onto such artefacts, identifying fluctuations, new or alternative meanings, and vestiges of previous selves. This is also true of the content of our knowledge and our experience, although this raises the question of the possibility of separating what we might refer to as “the elements” of our perception – and whether we can refer to “our perception”, which seems to imply an intentionality on our part which may be misplaced; the elements of my perception, past or present, can appear unbidden, kindled by everyday encounters: a spoken or overheard word; a fragment of music; a glance; a photograph; a meeting with another. In a minor key, what Freud called the return of the repressed (in comparison to the larger ‘repressions’ he discusses), or more accurately, the return of the ‘forgotten’ or “unremarked at the time”.

Our perception is formed by what our culture values, by familial contacts, by our peer group. Initially, obviously, we are unaware of this ‘formation’ (formulation?), “ideological currents” meet in us unopposed; we are, for a number of years anyway, unthinking recipients assaulted by the various systems into which we are born: we ‘learn’ the history of our nation (in a biased, uncritical form) as part of our ‘education’; we ‘learn’ what our culture values and, therefore, what we as persons should value; we ‘learn’ how to interact with others. An integral part of this ‘learning’ is the ideology which, put simply, dictates our perception and, by underhand means (the unspoken claim that “this is just the way things are” or “this is the way any decent person thinks”), our thinking.

However…this can change (N.B. ‘can’ not ‘does’ – privilege breeds complacency, unquestioned acceptance of an ascendant position in society). We could look here at Max Stirner’s “pendulum theory” of self-creation. On one side we have ‘society’, on the other ‘self’. Initially, the pendulum swings from one side to the other equally yet, as we become older (when we become, say, teenagers), the pendulum begins to swing more to what we designate as ‘self’ – not in an egocentric sense, but in the sense that we begin to question, to interrogate, the assumptions of the society into which we are born, socio-politically and socio-culturally. What we had previously accepted becomes unsatisfactory, simplistic and, most importantly, unjust. We begin to analyse the actual using our ever-developing ideas of abstract concepts – justice; fairness; morality. Such concepts are drawn from our previous, unthinking lives, but changed (utterly) by our experience(s) and our developing intellects. This, in turn, alters our perception.

In terms of Art, we seek out artefacts which engage with the complexity of Being (in-the-world), that refract our dissatisfactions, echo our desire to protest against injustice (as Eisenstein says, all Art is borne of conflict). Artefacts that refuse to accept that “this is just the way things are”, that challenge and debunk (ruling class) ideology. Whether this is in the films of Resnais, Godard and Greenaway, the novels of Austen, Dickens and Amis, the theatre of Brecht, Pinter and Churchill, the painting of Rothko, Pollock and Emin, or the poetry of Marlowe, Plath and Carson. I’m using these as examples of those who fired my imagination, who encouraged me to think differently…and still do. They, to me, are examples of artists to whom one can return again and again, whose work shifts as my perception changes and whose work shifts my perception. Obviously, when my perception changes, so do my experiences, both past and present. As Heidegger argues, we are involved in a constant process of becoming, a continual fluidity, rather than a series of static points.

When we engage with Art, we are forced to reflect, to refine and to rethink – anything else is simple laziness. Cinema is not ‘entertainment’, but a way of ‘doing’ philosophy, of encountering the world outwith our selves but, simultaneously, encountering our selves in that world as ideological constructs.

In cinema, we meet others who are similar to us; insofar as we are able to extrapolate that any other person can be similar to us – we involve ourselves in a constant series of everyday assumptions that “because X does/says Y, and I do/say Y, then X is similar to me” or “I am similar to them”. The latter would seem to be a ‘better’ way of thinking because it gives the lie to the idea that “I am (in some sense or other) special/unique”. Cinema shatters this illusion by (re)presenting those who appear to think, feel, be as confused as we ourselves are (N.B. Is ‘confused’ an ideological term here? Confusion suits who?). It provides intellectual and emotional ‘markers’ in a world that is increasingly self-obsessed, engulfed by an ideological individualism which, politically, denies the connectedness between human persons in virtue of their being human persons, a world in which ‘relationships’ are becoming transactional and/or contractual.

Even the poorest cinema (badly shot; badly lit; badly scripted; reliance on special effects etc.) gives the active spectator pause for thought. For example, the Marvel franchise post-9/11 can be read as articulating the desire for heroes, patriotism and American exceptionalism. For some though, the comic book superheroes of previous years are given a ‘reality’ in these films: they move from being fictional to aspirational, an extreme example of the original problem. This leads back to the original question: how are we to separate what we perceive in film from our everyday perception? Are there ‘clues’ or ‘markers’, or have these become assimilated into our media-saturated environment? Is there any difference between our contemporary present and the oral stories of Odysseus or Beowulf? The novels of the 18th and 19th century?

The obvious ‘answer’ is that these – films, tales, novels – are fictional, whereas we live in ‘reality’. However, our ability to define reality is informed/defined by the fictional. Our concepts, our ideas, of what reality is like are built by comparison with the (apparently) fictional. Yet these fictions are seen as the representation of potential “real life” situations that involve actual emotions: when we watch/hear/read we are, as the expression has it, “emotion testing”: we are discovering which emotion is which; acceptable ways of expressing emotions; if others have similar emotions (which they apparently do – I discover myself in the being of the other); how emotion can be created and manipulated.

This is a central process: we liken ourselves to “the other” through fictions.

What we need to ask is does cinema (and, by extension, other forms of screen media) play a unique role here? The fictional embedded in a ‘real’ landscape…


The Value(s) of the Critic

In Waiting for Godot, when Vladimir and Estragon have the insult competition, the ultimate insult is…critic. Yes, it’s a joke, but at the same time it isn’t. There’s a long tradition of seeing the role of critic as…what?…Contemptible? As someone who is unable to create, so they take on the lesser role of critic, but are driven by a kind of hidden envy of the “real artist”. Obviously, that’s in relation to Art. What we also see is the same kind of thing happening in regard to society: those who criticise the way society functions are accused of being negative, of only illustrating the problems without suggesting solutions or, if they do suggest these, of being idealists, out of touch with ‘reality’.

I’ve never really understood this dislike of critics; they seem to have a far more difficult ‘job’ than makars, in that they need to be informed in regard to the whole area under discussion. When a critic offers an analysis, it is based in history, the evolution of the genre in question and the societal dynamic as it exists throughout time. For example, there’s very little point in writing a critique of X’s realist novel without a knowledge of how the genre emerged, how it developed, the kinds of directions it’s taken.

One of the main problems here is the mistake of thinking an opinion is criticism. It’s part of that idea that everyone is “entitled to their opinion” which somehow morphs into “everyone’s opinion is valid”. The answer to both of those is ‘No’, unless we give a qualified response. Of course, everyone does have an opinion, but the important question is “How valid is that opinion?” (Which runs both questions into one answer.) Having an opinion is simply being alive, in that judgement is a fundamental component of being human. Judgements are, I would argue, inescapable, from the trivial, “What should I wear today?”, to the crucial, “Which party shall I vote for?” (N.B. Not “Who shall I vote for?”). One can argue that judgement is part of thought, must be part of thought. When we make judgements, we are taking a political position – as we grow older, one might say “grow into our own consciousness”, we realise this, hence the change in our judgements over time. These changes are a result of thought and experience; what I mean by ‘experience’ here are the ways in which our knowledge of particular areas grows and develops, thus, informing our thought. Therefore, the more one knows about a specific area, for example, film, the more one should be able to argue coherently for a particular analysis – the more one should be able to construct valid arguments. This is all very Humean: in Of the Standard of Taste, Hume argues precisely this, that one must be immersed in an area, have knowledge of it, and then demonstrate that knowledge/experience in one’s critique. There’s also an unspoken idea here: that one must be sufficiently self-aware to realise when one’s knowledge is inadequate, yet this, in our digital, immediate age, seems to have been forgotten. Look, for example, at the moral panic we see generated by the gaming industry (I’m phrasing this deliberately – moral panic is also good marketing).
Every so often a game appears that causes outrage, articles are written about the destruction of civilisation by game X, Y or Z, because these games are brainwashing “young people”. The problem here is the (lack of) knowledge of those expounding these ‘sentiments’ (as Hume would call them). If we look at these articles, they lack historical context: Plato warns that writing will destroy society because no one will remember anything anymore (substitute ‘googling’ for writing); he also warns against Art because it represents things and human persons as they are not; we have the puritans banning plays for the same reason; when the novel first appears in English, dire predictions are made that readers will prefer the fictional existences they read about over material reality; film is seen, initially, as a grave threat to society…followed by television (it is still the case that certain sections of society bemoan the influence of television), followed by video games (Macron, very recently, blamed the riots in France on “young people” gaming). In each of these occurrences, the warnings are issued by those who favour ‘tradition’ or the ‘conventional’. On the whole, they are the same argument, mobilised by right-wing commentators, sharing a common denominator: fear of the power of Art. While we might not go so far as to suggest a direct connection between Art and societal change, what we can see is a correlation between Art and changing the tenor of society, creating a desire for change. Right-wing commentators recognise that Art causes thought, causes consideration of issues, that, in the long run, leads to change. Perhaps I’m wrong; perhaps those who create the moral panic around games recognise the historical links – a historical ‘sameness’.
However, what we also need to recognise is the way in which right-wing commentators claim to be representing “common sense”, or “the natural”: that is, they deny that they are writing from an ideological position (that’s something only the left do apparently). They are ‘neutral’, ‘objective’ – sitting outside history. Yet there is no political neutrality in claiming to be outside history.

What we might call “proper critics” state their political position. They do not posit themselves as an ‘everyperson’ figure. They also recognise the limits of their knowledge, and the factors that have a bearing on their analysis. For example, although I might critique games, I have to recognise that my knowledge is general and that my (old) age has a bearing on my thought. Interestingly enough, Hume makes the claim that the young are less able to critique because they lack experience – this is in keeping with the assumed definition of rationality at the time. However, this no longer holds in our society where specific kinds of Art are marketed to specific audiences. These kinds of Art require critics drawn from what I suppose we have to call their “target market”, rather than those who will, almost automatically, find these forms wanting (which, one can argue, is a manifestation of fear). Look at the scorn poured on reality tv or soap operas; I’m not suggesting that we examine these forms in isolation but there appears to be a tendency to see these as inferior, to fall victim to nostalgia.

These forms can, however, be used to illustrate another point in regard to ‘opinion’. There appears to be a move towards the idea of ‘like’ and ‘dislike’ as ultimate critical terms. If I dislike something then I can dismiss it – I need give it no more thought. Yet what does liking or disliking establish? Nothing. What’s important is the ‘because’, otherwise we have nothing with which to debate. This is the fundamental point: likes and dislikes are irrelevant. Everything that goes to make up the “cultural web” of society is of critical interest. Likes and dislikes are pointless, meaningless oppositions, that fragment social life, that encourage the separation of subjects – economics from sociology from literature (itself a separation from Art) from philosophy – and the compartmentalisation of life itself. In this compartmentalised world, I can abhor child labour but fail to see the connection with my iPhone, sympathise with strikers but moan about the effects on me. (N.B. Look at the way strikes are covered on tv news: the item leads with the effect, not the cause.) The real critic sees that, as Derrida puts it, there is nothing beyond the text – and that ‘text’ is how we live as a human person with other human persons, the very thing that motivates Art.

This is why Art is being segregated and marginalised – confined to what is called ‘entertainment’ which we can define as “escapism which takes our minds off the detail of our everyday lives”. In the capitalist society that we inhabit, Art is fast becoming confined to the same kind of ‘exercise’ as weekend drinking sessions – a chance to obliterate consciousness, to forget the circumstances of our material existence.

Featured

The Valuation of Art

In our contemporary present, the value of Art has become simple valuation. Whereas we once would value the contribution that Art makes to human society, a quality we might say that is ‘measured’ in the amount of thought it causes (which cannot be measured because…well, even though I’ve typed the phrase “the amount of thought”, I have no idea what this might mean because how could we isolate and attribute this ‘amount’ from the web in which thought exists), nowadays there is a price, a metric of monetary value. There is also the value of Art as a societal token, indicating your social position (class position) to others. Art serves marketing purposes, the advertising industry – it beautifies products, is part of the ceaseless consumerism in which we live. Artefacts are “broken off”, dragged out of context, and reattached to cars, credit cards, motor oil etc. etc. This, I think, tends to be the main way of valuing Art: how does it contribute to the sales of other goods?

The bizarre prices that certain paintings command merely indicate that they have become ‘chips’ in the “possession game”: the person who pays millions for a Van Gogh or a Warhol (see the Robert Hughes documentary, The Shock of the New, and the conversation he has with a New York stockbroker who owns the largest number of Warhols) doesn’t do so because they are compelled by appreciation of the work. It could be any item that others consider valuable, allows them to indulge in conspicuous consumption and is “an investment”. For them, the work is of little, if any, importance. What we’re seeing is the disconnection between artistic value and monetary value. The former cannot be calculated in the latter’s terms. This also raises questions in regard to reproduction, but in a different way to Benjamin’s discussion of this in The Work of Art in the Age of Mechanical Reproduction. In that essay, Benjamin argues that reproductions of the original erode the aura of the work (before going on to posit that works are now created for reproduction). But isn’t the aura of a work a fleeting instant, vanishing with the final brush stroke or note? The work is then “in the world”, reproduced or not. What of plays, films, music? These have always been produced for reproduction; where is “the original” in these cases? The manuscript, the master reel? Their purpose is to engage the reader, to interact, to cause thought – whether that be with “the original” or with a copy is of no importance. To insist on the importance of the original is to turn that original into an object of supreme value, to value the object above and beyond its capacity to cause thought, reflection or meditation. The work may be exploited as, say, in the case of the Mona Lisa by the reproduction of the image on tea towels, cups, scarves etc. BUT in what sense is this a ‘bad’ thing?
This exploitation might be for monetary gain yet its effect is to bring the work into contact with those who feel that Art is “not for them” – in the same way, one might argue, as the use of classical music to sell cars. To some, it will be just another picture on a tea towel, or the backing track for a car. However, to others it might catch their imagination, draw them into a world from which they have previously felt excluded. And in that, Art is the ‘victor’.

The most obvious example here is, I suppose, films of books. How many people have been brought to Jane Austen by Clueless, or by the apparently ‘straight’ adaptations? Look at the fuss surrounding the film of All Quiet on the Western Front: how many were intrigued enough by this to read the novel? Then, from these initial points, how many started journeys to other works? Art may seem to be being exploited, but how often does this crude exploitation open new vistas to its audience?

As I’ve said, can we say that there is something intrinsically ‘wrong’ with the work of art produced for reproduction? On a very simple level, even works not produced for reproduction, when they are reproduced, give us access and, in both cases, surely this is the point? An artefact is created to be engaged with, to cause thought, with as wide an audience as possible. Mechanical reproduction – or, in our day, technological reproduction – is an asset, an opportunity to bring the artefact to the widest possible audience and, thus, to have the widest possible influence. The question of ‘originals’ and ‘reproductions’ arises only when the artefact is inserted into “the markets”, with its attendant considerations of copyright, ownership, publishers etc. To think of Art in this way is to begin in defeat, by unconsciously accepting the rules of “the markets”. We saw this during the early days of the internet and the idea of self-publishing, then with musical groups who released their albums directly online. Not that either of these came to anything: the markets adapted very quickly, and the existent rules proved too embedded to be overthrown.

There is another interesting aspect to the monetarisation of Art. Some years ago, The Museum of Modern Art in Edinburgh paid €20,000 for a work that consisted of a canvas that had been slashed by a carpet knife. This caused outrage; the usual “money for nothing” and “my four year old could have done that” hysteria was deployed…and, possibly, in monetary terms there was some point. However, in terms of artistic value, this hysteria missed the point. I could spend paragraphs here discussing the artistic value, but suffice to say, the work itself was a commentary on divergent values. How many, once their initial outrage had dissipated, began to consider what the artefact was ‘saying’? How many recognised the validity of its commentary on the irrelevance of monetary value in Art – that slash leading us behind the canvas, leading us to the dark space of hanging in a gallery and, thus, into the question of how a work of art becomes a work of art by its placement in an environment where it will be perceived in a specific way that is dictated by that environment? What of the canvas itself? That space usually filled with colour and representation? Here the commentary seems to revolve around the importance of the reader in making meaning and thought, and the ways in which we project, through our imaginative powers, into environments that invite us to do so. Can we apply a metric of monetary worth here? No. Is it artistically valuable? Yes.

What this work also raises is the question of artistic intention. What did the makar intend? Do we need to know? Is their intention important? The answer to all of these is “Who cares?” Once a work is, in some sense or other, complete (better word than ‘finished’) and enters the world, it is open to interpretations. The makar’s intention only matters if we accept that they retain control of meaning, that they own the work and that the work is about them; we are, in other words, back to the idea of the individual as being the basic unit of society, that this work goes to reveal something about their psychological make-up, what I’d call “critique as psychological jigsaw puzzle” – the point of such criticism is to see each work as allowing us to reassemble a portrait of the artist, the more intense our research is – where did they live, what was happening to them when they produced X – the more accurate our reconstruction will be. This is as pointless as I hope I’ve made it sound. If a work is only ‘about’ the artist then it’s of no interest to us. It offers us no thought, no reflection on the life of human persons. It is limited by its attachment to a specific individual.

This still leaves the “my four year old could have done that” question. Unarguably they could have, and here there’s a contradiction with my previous paragraph. If a four year old produced the work, as opposed to, say, someone of thirty, then we can argue that the intentionality is different. For the sake of argument, our thirty year old has been to Art College, has lived in a garret (which, interestingly enough, comes from the French for “watch tower” or “sentry box”), and endured poverty. Our four year old has done none of these things. However, if we move away from the individual to the work itself, do the experiences and circumstances of the individual alter the meaning of the artefact? The temptation is to say yes. The thirty year old’s work possesses an intentionality that examines Art and the Art world. The four year old was amusing themselves on a wet afternoon. Now take another step: we do not know the ages of the artists, in fact, we know nothing about them other than their names. The focus is now on the works themselves and the critics interpreting these works. When further information is revealed, the critic may feel rather foolish, but their analysis has been produced as a result of concentrating on the work itself, without any of the attached ‘noise’. Does this new information make the analysis invalid?

Featured

The Values of Value

In one sense, a ‘definition’ of the concept of value has now begun to emerge, on the whole by suggesting what value is not. I’ve also begun to suggest the kind of grounds we need to establish value. I want to limit this though, in regard to the specific case (I’ll come back to the abstract concept later) of Art.

As I suggested in the previous entry, to approach the idea of value, we need to engage in comparison. Art enables us to do this, whether it is comparison with our contemporary present, or with previous historical ‘formulations’ (?) of society. When we examine a painting, watch a film, see a play, listen to music, read a novel, each of these conjures for us a picture of the society that produced it – both in terms of “What was it like?” and “What could it have been like?”. This is partly because Art is ‘motivated’ by discontent, by conflict, with what exists, and partly because Art is progressive – that is to say, it is not suggesting ways in which society can be perfect (although in some instances it might be), but suggests ways in which society can ‘improve’; by ‘improve’ here, I mean move towards a fairer, more just, state. Art does this in a variety of ways but, on the whole, by examining the human person, their relation to their community, and their relation to themselves. In these relations, the spectator/reader is invited to compare themselves, to ask questions of themselves, to empathise and, ultimately, to judge. The emotional reactions that these fictional constructs invoke in us are, despite the fictional scenario, real emotions. We are moved by these scenarios – made sad, angry, melancholy, happy – in identical ways to those we would experience in our lives. This is hardly surprising: what we find in Art are representations of experienced emotion and/or meditations on how one might experience certain emotions in particular situations. There is also, of course, an instructional element: in situation X, the ‘proper’ emotion is Y; this is how you should control emotion Z. What runs in the background of all of these instructional elements is “within your community”. From this idea comes cultural difference or, as some would argue (but not me), emotions characteristic of a particular ‘nation’.
What we also find in regard to these two notions is the idea of change – that emotions are not ‘fixed’, they may, over time, alter. Art lets us see these changes, lets us compare past and contemporary reactions, even anticipate those of the future. In this latter ‘ability’, we can also see another ‘benefit’ of Art: hope.

Art provides hope in a specific way: the hope that Art generates is ‘structured’ in such a way that it indicates how change may take place. It is not, as hope so often is, of the “one day, this might change” variety but, in its conscious (or unconscious) analysis of society, highlights the injustices of X and the pathways to changing X – moreover, even if you don’t agree with the pathway to change, the onus is on the spectator/reader to devise an alternative.

This illustrates another (tempting to say the other) quality of Art: it generates thought, it causes thinking. One cannot stand before/watch/read a work of Art without its causing thought, engaging with the issues and/or human relations raised. Some works deliberately set out to do this; others – which tend to be classified as “low culture” (TV programmes; comics; games; Marvel films) – generate thought by their assumptions and by the arrangement of their worlds, in terms of social and political organisations and structures (class stratification; aristocracy; totalitarianism; cults of personality). In what might be described as a “covert manner”, many popular forms of ‘entertainment’ surreptitiously employ the Brechtian formula, inviting the spectator to compare the fictional world with their own – thus, to take a step back and compare the two worlds. In this way, the screen, page, music transcends its own frame, venturing out into the ‘real’ world, causing us to critique that world – sometimes in ways we’re unaware of.

In this we might say, the world of business and profit contains the seeds of its own destruction. Take, for example, the franchise cinema industry – Marvel, DC etc. These films are made for profit yet, at one and the same time, cause their “target audiences” to compare and contrast these fictional worlds with their own. In many cases, the organisations and tyrants that rule these fictional universes invite direct comparison with the lives that their viewers live outwith the filmic world. Admittedly, all the usual Aristotelian formulaic elements are there: identification with a central character to guide the spectator through the film, thus obscuring the partiality of the perspective; the high character brought low; repeated moments of catharsis. Yet, ‘behind’ these elements we can see that the plot lines are grappling with contemporary issues and desires: notions of patriotism; the meaning of nationalism; honour; responsibility and obligation; protection of the vulnerable; exploitation; concepts of good and evil. What we can also see in these films is the desire for an invincible hero, one who can defeat all comers.

This is merely a simple example to illustrate the levels on which popular culture operates; even instantly forgettable pop music (bearing in mind that what I’m calling “instantly forgettable” isn’t aimed at me) serves a purpose, in that, we can see in it a desire to escape from the realities of everyday life. We must also acknowledge that a great deal of the attraction of popular culture involves vicarious living: living through these characters and the situations represented is, to an extent “wish fulfilment” – I’d include the adoration of ‘celebrities’ in this, as well as the fascination with social media (although here the situation can be reversed – as in “I’m glad that isn’t happening to me”, “A cheap holiday in other people’s misery” as Johnny Rotten put it), the worship of sports people and so forth.

Yet all of these connections that people have with popular culture involve thought; they involve the spectator being active, making decisions, drawing on past knowledge (of events and themselves) and speculating on future events. This elitist notion of ‘high’ culture and ‘low’ culture is simply that (elitist), usually cited by those who have little knowledge of the artefact(s) which they are dismissing. These two broad categories are designed to exclude (mainly on the basis of class) the majority of the population, hence the term “mass culture”, with its implication of simple consumption and, to put it bluntly, the suggestion that most people are stupid. As Nietzsche says, “The Lordly class take possession of a thing by naming it”…We even see, in the 1930s, left-wing critics, such as Benjamin and Adorno, trying to control the kinds of Art that should be made available to “the masses”; they can’t be trusted, therefore, must be nudged in the ‘right’ direction. So here we have a kind of double elitism.

Our insistence on the reader (in future, I’ll use this term to refer to anyone who stands in a particular relation to an artefact of any kind – play, poem, painting, film, novel etc.) possessing a certain degree of articulacy also acts to exclude people. I’m guilty of this: I demand that someone who claims to ‘like’ something can then explain why they like it, why they prefer this to that, and do this in the received language (the discourse) of criticism – with its inbuilt values and its aspiration to the middle class. By learning this discourse though, we can disrupt it – introduce Art that, at first sight, does not ‘belong’. The more often we do this, the less rigid the discourse is, the more fractured it becomes, because unlike discourses that apparently have clearly defined parameters, the discourse of Art must, due to its own ‘guiding’ concepts, be fluid, be able to change and flow in different directions simultaneously.

What the discourse of Art does is to hold up for examination the values of all other discourses; Art causes us to identify and examine what these values are, how they work and, ultimately, how they relate to the good for human persons. A definition of “the good”? A quality in which everyone is cared for, is treated as an equal subject whose needs, desires and wants are recognised (and striven towards). This, it seems to me, is how we measure our being-in-the-world. It is, one can argue, unachievable but nonetheless should be the single, motivating factor of human existence. This is what we encounter in Art and, to be considered Art, this is what the object/artefact/thing should do – cause reflection on the self and the facets that go to make up that self which, in turn, spread into the world of material reality.

Does this mean that to be a good artefact the artefact must be good? In Kant and Hume, we see the claim being made that to be good Art, the Art must be morally good. However, does this follow? If a work revolts us, then we know that it revolts us – it causes a revulsion in us. What it also causes us to do is to examine the ‘revulsion’ itself, to ask if this reaction is justified. It establishes a conflict between what we have been told to feel (by our upbringing, our peer group, the media) and how we, ourselves, feel. Sometimes, we maintain our revulsion because of fear or the desire to belong, yet we have still come to the realisation that this is the case – so we have learnt something meaningful about ourselves and our society.

These kinds of ‘revelations’ (call them that for the moment) niggle away at us, cast doubt upon our own authenticity, our own ability to live as we’d like, true to our selves. Our acts here are conscious, deliberate: your reaction to a work is yours and yours alone. We may choose not to share it, to lie about it, but it is still there.

Featured

The Discourse(s) of Value 2

Must any discussion of value necessarily be moral? What I mean by that is can we discuss value without introducing ideas of good and bad, “better than”, “worse in comparison to…” OR should we go with the postmodern notion of simply saying “X is different to Y”?

In regard to the latter though, is simply claiming that “X is different to Y” just another way of expressing judgement? For example, having made the initial statement, how does one proceed? When I begin to explain how X is “different to Y”, don’t I have to use pre-postmodern terminology? Or would my doing so simply be a ‘hangover’ from traditional language use? However, what am I doing when I explain difference? Describing a range of technical aspects coupled to spatiotemporal co-ordinates? For example, Bruckner’s 7th symphony was written at time t, in space s. It has the following features, a, b and c. Now, whilst we can do this, and the result will be ‘informative’, is this kind of analysis too ‘dry’, lacking an emotional ‘edge’? A symphony is different to a pop song, but merely describing the two different forms is precisely that – a description. I’d hesitate to call it analysis. To qualify as analysis we mobilise our abilities to compare, to judge – a socio-philosophical account. We also need to include our reasons for listening to the symphony or pop song – how it affects us, what it tells us, how it communicates, the insights into the time of its production or into “the human condition”. We are, for want of a more apt term, constructing an argument – what is more, we are constructing what we hope is a persuasive argument. By so doing, whether we realise it or not, we are aiming to produce a sense of unity, an idea that you too might enjoy this, might see these elements for yourself and recognise these as binding.

By prioritising postmodern ‘difference’, we are moved towards the concept of the individual: the force of the artefact is lost, its value (if, in this ‘scheme’, it can be said to have value) lies in what it means to me, and only me. It is shorn of its power to unify and to comment on its ‘surroundings’. There is no metanarrative of which it is part. Another problem here: in talking only of difference, it seems to follow that everything is as valuable as everything else…from which it follows that everything is valueless. When I make the claim that, say, X is more valuable than Y, I cannot leave the conversation there, I must continue, I must explain why. On the other hand, if I say that X is different to Y, I can abandon the conversation and move onto the next item. The idea of a conversation with others, that “runs in the background”, has disappeared.

I think we can also see a connection here with the two different ‘truths’ that have emerged over the past number of years, (a) Rational Truth (RT) and, (b), Emotional Truth (ET). When I cite RT, I’m referring to a truth that exists independently of me, a communal truth that is assessed and discussed in reference to shared ‘standards’ – of comparison, of judgement. When I cite ET, I am referring to truths that I desire to be true, that I can hold as true on the grounds of that desire and nothing else. I suppose we could also call this “individual truth” – for example, if I desire that X = A + B then it does. This kind of thinking (if, indeed, we can call it ‘thinking’) means that I can construct a world that refers only to my desires. If undesirable truths attempt to intrude on this world, I can reject these as ‘false’, even when faced with evidence to the contrary.

To give a contemporary example, if I desire to claim that everyone has equal opportunity in the world we inhabit, then I can. In fact, I can assert that your view of the world and mine are simply ‘different’ because I reject the idea of metanarrative. You might think that there is (a metanarrative) but I don’t – it’s a simple difference of opinion, and all opinions are equal.

It seems to me that, in such a (postmodern) world, any concept of ‘progress’ grinds to a halt – history stops. We have a world that consists of individuals (and the notion of self-interest that goes with this) who believe what they like…

However (and there had to be one), this non-metanarrative world is organised by, and camouflages, the metanarrative of business and profit. Put bluntly, postmodernity is an attempt by capitalism to write itself out of history while, at one and the same time, controlling it.

The end of metanarrative, and of value, then becomes the (supposed) triumph of a meta-metanarrative and a single value – the latter is no longer recognised as one value among many because, in this model, it is the only value and, therefore, becomes something else…something like “just the way things are”, ‘natural’.

In terms of ‘traditional’ political thinking, democracy becomes redundant; the state, which exists to provide services that benefit the community, ceases to exist. Rights cease to exist. We are returned to what Hobbes and Rousseau call “the state of nature” – individuals battling with one another in an unending cycle of “the survival of the fittest”.

What we see here, in political terms, is populism revealed for what it really is: fascism. There is, however, one significant difference from the fascism of the 1930s. Modern fascism does not require camps, fear and secret policemen to achieve its ends. It is replacing the repressive state apparatus by taking over the educational apparatus – hard repression is replaced by soft repression as business methods and profit are represented as universal and natural. As education becomes ‘training’, the means of protest, the ability to consider and compare, is eradicated. The multiplicity of discourses becomes one and only one.

Featured

The Value of Power

Overall, the term ‘power’ seems, to me, to be a negative one – we talk about having power over someone or something, people taking power, abuse of power, political power and so forth. This is probably because, in capitalist society, power is linked to competition: power gives us an advantage over others, it is a means to an end in the sense that it links to profit and personal gain.

There’s a certain irony when we’re told that politicians, of all colours, want to “give power back to the people” because what is actually meant is that this power can only be exercised in a strictly delineated way – within the rules that have been laid out for us. Therefore, we can, I think, see the exercise of power as similar to the exercise of choice: I apparently have free will, and can express that free will by making choices. That is, however, a secondary concern. The primary concern is who decides what those choices are or, put another way, who defines “free will”?

As Badiou argues in The Communist Hypothesis, ‘freedom’ in a capitalist society is freedom to own and freedom to exploit. My freedom is involved in a constant struggle with your freedom; to be an individual, and to express my freedom, I must defeat your freedom – my freedom to be free always comes at the expense of your freedom to be free and vice versa. The choices I have within this system are mere facsimiles of choice because they have already been dictated by the system – in exercising my freedom I am, knowingly or not, serving the system, maintaining it. If we interrogate each of our choices, from the trivial to the serious, they have been decided on already by the limited and narrow range from which we are allowed to select. Within capitalism, this web of freedoms and choices is inextricably bound to the corresponding web of oppressions and repressions so, as I’ve stated above, my expression of my freedom comes at the price of the repression of your freedom. For example, as unions have ‘won’ concessions from employers in the West, those employers have sought countries where unions are not as strong/virtually non-existent (often due to government prohibition), where they can exploit cheap labour and weak health & safety laws, continuing to make a profit for their shareholders by offering us cheap products. Another irony here is that the workers who produce our goods will never be able to buy them because of low wages – in a similar way to, say, the craftsmen who built mansions for the wealthy in the eighteenth and nineteenth centuries.

This idea of freedom of choice always strikes me as being akin to Art and interactivity – video games that let you choose what to do next; plays that take votes on which direction the narrative will go in; online ‘novels’ that let you choose how to proceed (I’m not too keen on calling them online ‘novels’, particularly when there seems to be a notion that making an online book should mean “trying to reproduce the material novel”…which obviously it shouldn’t; all that space for creativity and it’s “Look! You can flick through the pages like a real book”). All of these have the ‘choices’ inbuilt before you start: oh, you can do X, Y or Z, but only because the makar has given you those choices (so they’re not so much choices as options). It’s much the same as language and our sociopolitical system: we are born into these, so our ‘choices’ are pre-programmed – as are the (apparent) ‘values’ by which they operate. I’m not suggesting that because these systems (all subsumed in the over-arching system, capitalism) pre-exist us we can’t recognise them for what they are and alter/change them (or even sweep them away altogether). In fact, I’d go so far as to say that the “first step” (apart from being an album by The Faces) is to recognise these structures. All one has to do then is work out how to overcome them.

We could go back to Plato and Aristotle, and identify them as the founding philosophers of what we call “Western Civilisation”. If we think of our conceptual scheme, and everything that flows from it, as a calculus, it is possible to argue that all we are doing (even 2000+ years later) is enumerating the propositions of this calculus. However, I’d also argue that recognising this is the first step in engaging with it and thinking ways out. I’ve phrased that deliberately, “thinking ways out”, to draw attention to it. As Marx wrote, not only do the owners of the means of production own the means of production, they also control the flow of concepts and what those concepts consist of and in. The fundamental basis of capitalist ideology is to convince the majority, the proletariat, that the interests of the minority, the ruling class, are their interests too (false consciousness). We can see this at the moment in the wrangling over inflation: apparently, if we all get wage increases that are in line with inflation, Armageddon will be the result. Somehow, this isn’t the case for companies making ever-increasing profits. Rent control is ‘bad’. Having utilities like water, electricity and gas in public ownership is ‘bad’. “The Market” is the solution to all our problems (as was God in centuries past – and look at the similarity in language): it will provide, it will find the “right level”. Inflation, we are told, is the cause of austerity, not the blatant greed that lies at the core of capitalism, and the subsequent disregard for the millions of lives destroyed by, and lost to, it. Go back to that idea of the web: what it allows is for the imposers of austerity, who oddly enough never suffer its effects, to claim an entirely bogus distinction between ‘direct’ and ‘indirect’ impacts.
These are all directly connected: austerity requires cuts in jobs (always for those at the bottom of the heap, and to maintain profits) and in public services, which means that those who become jobless, who become homeless, and who are then afflicted by mental health problems (as a result of the values imposed on them by capitalist ideology) are left with no supports. People are plunged into abject misery, which turns groups against one another, turns people against one another, turns partners against one another, turns parents against children. People turn to ‘crime’ (although it’s arguable how we define this), turn to drugs (of which alcohol is at the top of the list), turn to self-harm, turn to suicide. There is no ‘indirect’ cause here. All of these are directly connected by this web of ideas. It is, quite simply, a case of thinking it through – of identifying the system and its effects. Having done this, to still take/approve (vote for) actions that will result in condemning those in misery to yet more misery is a deliberate act. A choice for which you can be held responsible. A value (that you see your own well-being and profit as more important than other human persons’ welfare) that you display publicly. An aside here: if politicians do not recognise that what they are doing to people is wrong, then why do they deny doing whatever it is, why try to spin it?

One might argue that the greatest ‘value’ in the capitalist armoury is apathy. The power to convince people that they are powerless, that whatever they might do or say will be ignored or sidelined. Apathy is the most dangerous weapon the neoliberals/anarcho-capitalists wield. It cuts down opposition before it becomes opposition, but it also has an aspect that is more insidious: it causes people to feel that they have failed, failed themselves and failed others and, thus, accept the blame for the state of society while recognising their own powerlessness.

However, in the past two decades or so, a far more sinister weapon in the fight for apathy has emerged: mindfulness.

The Value of Consistency

Or we could swing this round, “The Consistency of Value”. I’ve already discussed the idea of consistency as a “power play”, a mechanism that forces the disempowered into preordained channels, yielding preordained results.

We can also discuss the idea of value in relation to what we are told is the human desire/need for security and safety (a film is said to fulfil this need by giving the spectator a beginning, a middle and an end – something they can never attain in “real life”). However, can we think of something we want, or need, as a value? The word has an attached “moral aspect” in that when we talk about a value, our disposition is “I think that this is worthwhile, you should think so too” or “This value distinguishes us from X”. So once again, we have a competitive notion attempting to slide in unnoticed. Values become part of “culture wars”: we are a democratic society with an elected leader, they are a fascist state, led by a dictator. Values facilitate separation and tribal identification (take flags, for example, a kind of shorthand for “this is what we believe in”, or “this indicates my membership of this group”), therefore, one might say that they facilitate aggression between groups, whether this be county, national or international.

There is also the question of whether the values that we suppose that we ‘have’ (in some sense or other) are chosen by us. One can argue that, in many cases, we simply ‘adopt’ values, without giving thought to what they mean and their implications. Nor do we interrogate the values of our respective nation-states.

On a personal, microlevel, we use what we class as our values to make statements about ourselves: our beliefs, our moral worth (based on the values we espouse). I’m not going to be diverted by this, but we should be aware of the distinction between the values one professes and the actions one engages in. This wasn’t a problem for the ancients because they drew no distinction between mind and body. One was judged on one’s actions in the world – the values you possessed were ‘distilled’ from empirical data. Only with the advent of Descartes, and the separation between mind and body, does thought begin to take priority. This becomes decidedly more important with the advent of mass media. What, for example, are we to make of someone who, on seeing TV coverage of a famine, professes sorrow at the sight, and anger at the inaction in alleviating it?

Anyway, personal values. Where do we get these from? Are they static? I think the answer to the second question is “Obviously not”. To argue that values, once formulated, remain static is to deny the influence of others, and of mass media. There are a few things to be said about the ways in which media influence value-formation, but I’ll come back to that.

Where do personal values begin? Well, usually at home, then in school, then? (it used to be Church, but those days are gone). So, initially, our values are not ours; they come from interaction with parents and grandparents, from the narratives we’ve read as infants, from the schools we attend. So our idea of value-formation is embedded in our consciousness from an early age. This is done by comparison, but not by us, by our parents. We’re told that X is good and that Y is bad…from this we learn to generalise, to create categories which are, at first, little more than guesses: if X is good then Z is good too. So we learn by association. Of course, what this imposed system deals in is binary oppositions, so the framework for reproduction is laid.

One of the headlines today (15/07/23) is “Sunak puts cap on ‘low-value’ degrees”. He means degrees that don’t result in students getting professional jobs that pay well. So, that would be A&H, then. Of course, the headline was never going to read “Degrees that indicate motivation by something other than profit to be capped” or “Greed degrees to be given the go-ahead”. The Tories are using every trick they know to enforce the idea that everyone is an individual motivated simply by money; every so often something like this breaks cover – a move so despicable it takes your breath away.

But it does illustrate the point I was making in the paragraph prior to the one above. Binary opposition, profit and individualism: ideological factors that the right combine to shore up their argument that capitalist society is ‘natural’.

What does ‘natural’ mean?

Valuing Imaginative Thinking

At this point, I should be doing the usual “philosophy by numbers” game; that is, I should be investigating how IT relates to epistemology, to rationality, to morals etc. However, it doesn’t take much to see the obvious flaw in this: a way of perceiving is initially developed BUT it must then be reconciled with pre-existing (philosophical) categories which, in effect, render it useless. Any insights/power it may have had are lost as the makar of this way of perception attempts to link it into (traditional) ways of formulating rationality, knowledge and morality. These “established categories” (would ‘establishment’ be a more accurate choice of word here?) act as a kind of final line of defence; indeed, we’ve seen this over the past decade in regard to ‘truth’ and “fake news”. What emerged was a distinction between “rational truth” and “emotional truth”, in that what most of us would call ‘truth’, a category which exists independently of the human person’s desires or wants (maths is the most obvious example), began to be displaced by “emotional truth”, the idea that X is true because I want it to be true, I desire it to be true, regardless of there being no evidence or ‘logic’ (which is a whole other argument in itself) underlying my belief. Quite simply, “I want X to be true, therefore, X is true”. To put it in traditional, rational terms, I have no evidence or proof to support my claim to truth other than my desire that X be true. This is decidedly not what most of us have been trained to do when investigating the truth of a proposition – See? Even the language I’m using to describe this is “pre-ordained”. The suggestion is that the grounds for making a rational case for truth must exist independently of me as a human person. They must not include a human element. Thus, when calculating the worth of something and its relation to the defining characteristic of profit, we can make (or so we’re told) no allowance for the human (cost).
For example, if Arts degrees do not turn a profit, they must be scrapped, despite my questioning whether this is the right decision or arguing that we cannot apply the same criteria to Arts degrees as we apply to business degrees. According to rationality, to be fair we must apply the same criteria of judgement across the board; this is the ultimate sleight of hand. It makes no rational sense to argue that the same criteria be used for different kinds of degree…because they are different kinds of degree – they set out to do different kinds of thing, to achieve different ends. If it were the case that Arts degrees had colonised the concept of rationality, then business degrees would start to disappear.

My point here, to put it bluntly, is that the “one size fits all” approach does not work, is characterised by irrationality…is logically contradictory which, in turn, makes a mockery of democracy.

Capitalist democracy is a sham. The shell of this capitalist democracy will be preserved, provided it does not get in the way of profit. Should this occur, the mask falls and capitalist brutalism returns. Take the current Writers Strike in the USA; a memo has been leaked which states, unashamedly, that there will be no offering of terms until writers begin to lose their homes, their partners, their children, and are unable to pay their bills. Once this has taken place, the employers can enter ‘negotiations’ from a position of strength. All pretence of reasonable, democratic practice is sacrificed on the altar of profit. The conclusion to be reached is not new: Capitalism and Morality are incompatible. Life in the former is a vicious struggle against others, using any tactic available. In the latter it’s about talking to other people, about concern for their welfare, about respect.

Of course, philosophy is about clarity. In pursuit of this, there is a tendency to examine the history of philosophy, looking for connections with, and between, past writers. Take the role that reason plays in aesthetics. Both Kant and Hume cite reason as a means to escape from the emotional (hardly a surprise, given that Enlightenment philosophers use reason as a kind of “magic bullet” when pursuing knowledge). The emotional is regarded as a kind of ‘pollutant’, skewing “proper judgement”. This attitude has fed into our society, whereby emotion is now generally regarded as a weakness (a reason to dismiss others’ arguments), a kind of betrayal of self. Look at the ways in which this format is built into both patriarchal and colonial thought: in the former, reason is the preserve of the male, a hierarchical distinction which holds that male control of reason is the reason for male superiority. In the latter, this pattern is repeated; the colonised are at the mercy of their emotions, thus, they are feminised, seen as a group over whom control must be exercised for their own good.

In a similar, but more convoluted way, we can identify this use of reason in aesthetics. Firstly, there tends to be an assumption as to what reason is – a kind of negative definition too. We identify what reason isn’t. Secondly, reason is class-based – defined as a quality possessed in virtue of one’s position (which dispenses with the need to give reasons as to why one is in control of reason).

For example, Kant and Hume both make an appeal to reason, by trying to identify what it isn’t, and by defining emotion. In Hume’s case, we also see the introduction of the persistence-over-time argument. For example, if X is still recognised as an artefact in 100 years’ time then it qualifies. However, what Hume does not offer is an explanation as to why this might be the case – what ideological purpose does X serve?

In Kant’s case, what underlies his system, whether of morals or of aesthetics, is the idea of a benevolent god. So his system of reason is built on a supremely unreasonable premise: the existence of a character that is supernatural.

How does this connect to the main subject of this blog? Well, I’m trying to show how discourses operate by controlling the terms of the debate, by creating ideas of success and failure on the grounds that their system is objective…when it’s not. Every system contains, within itself, the criteria of success and failure, attempting to pass these off as objective when they have, in fact, been generated by this system. Analysis will reveal that ‘objectivity’ is itself a product of the respective system that claims to possess it.

The Value of Imaginative Thinking II

(ii) Do we (should we) value Imaginative Thinking? (IT from hereon)

Do we need to be able to define something in order to judge its value? If IT varies from person to person, must we, therefore, say it cannot be valued (have a value placed upon it) because there is no identifiable “common denominator”? In other words, do we need some kind of metric in order to assess IT?

That isn’t a rhetorical question (nor a kind of “Aquinas trick” – ask the question then answer it brilliantly). If we insist on a metric/metrics, then this would contradict IT. However, one kind of ‘measurement’ does occur to me: how does this IT contribute to human empathy? Does it encourage or exploit? I’m deliberately avoiding phrases like “human happiness” or “making the world a better place” here because each of those invites judgement and definition – we simply end up with a long discussion regarding what ‘happiness’ or ‘better’ mean. What I would argue, though, is that IT necessarily includes thinking of the human person as a subject rather than an object – there is no inherent idea of the human person as someone to be used in IT.

Obviously, what I’m arguing here is that IT is a crucial ‘component’ of artistic creation (in any Art). This may indicate that for X to be defined as Art, then it must contribute to/expand/explain what it means to be a human subject – it must increase the understanding of the spectator, expand their empathy for others, identify injustice, ‘move’ them closer to engaging their own IT (which is not to suggest that the spectator merely deploys a copy of someone else’s IT – my IT is my own, it cannot be someone else’s. My IT may possess similarities to that of others, but it cannot be one and the same).

The human person’s IT reaches out to the world, identifying injustice and exploitation from their unique perspective as someone born into that world. It identifies the ideological, seeing unity in uniqueness. A crucial aspect of IT here is the way in which we think of ‘uniqueness’: not as something which separates me from others, makes them “fair game” for exploitation, but as that which marks others as human persons simultaneously like and unlike me and, therefore, ‘worthy’(?) of my care and respect. My IT is guided by this in engaging with the world around me, of which I am part. I am also able to engage with myself as a “foreign subject” in this world, to see myself as (an)other.

Just to try to rephrase this: ‘difference’ in this figuration becomes a reason for unity, for connectedness to others, NOT as we’re encouraged to see it through capitalist ideology, which is as a reason for competition and fear, therefore, as the motivation for aggression. This takes us ‘back’ to one of the unifying ‘properties’ of Art, whereby when we encounter a work we think “I too have felt like this/have this perception”, reactions we then share with others in the act of criticism.

Thus, the value of IT is that it creates bonds with others, enabling us to see our humanity reflected (and refracted) in Art. These reactions can be both similar and different at one and the same time, thus, we are illustrating to both ourselves and others that we are simultaneously similar and unique, without this realisation becoming the basis for competition and aggression.

How does this occur? Art exposes capitalist ideology, identifying the fractures and deliberate tensions that capitalism creates by insisting on profit as the motivation for human action, and trying to pass this off as ‘natural’. Art engages our ability for IT in a positive, communal way, while encouraging us to develop this ability.

Nor does IT, I think, produce the fractures that Art as a function of capitalism does. Regarding Art as a product, capitalism introduces “levels of division”, in that the consumption of Art becomes a symbol of status – an indicator of one’s level of wealth, one’s superiority to others. Hence, the (artificial) division between ‘high’ and ‘low’ culture, the tying of this division to class which, in turn, gives rise to the notion that “this work isn’t for me”. In this system, Art becomes a means of exclusion, of promoting elitism and competition.

The Value of Imaginative Thinking

This is where it gets more difficult; if this were a book, I’d have to work everything out beforehand, but it isn’t, it’s a blog, so the characteristics are different. Anyway, I’m not getting b(l)ogged down in that.

Two questions: (i) What is the value of Imaginative Thinking? (henceforth, IT) and, (ii), Do we (should we?) value Imaginative Thinking?

(i) In discussing IT, is it the case that one starts to attribute certain qualities or attributes to IT in order to establish value? This would seem to defeat the purpose; it would suggest that, unless this particular pattern and trajectory is revealed/followed, then what X is engaged in is not IT. Two things are striking here: firstly, this would follow the usual “negative definition” notion, in that we start by identifying what IT isn’t in order to have an idea of what it is. Secondly, what this kind of argument does is essentially say “Unless your IT does/includes X, Y and Z, then it is not IT”, so what I’m doing is defining what IT is…which contradicts itself. IT is necessarily a different kind of thing for each human person. There are no necessary and sufficient conditions that one can point to in order to say “Yes, this is IT”. What I’m trying to do here is suggest that IT is inclusive rather than exclusive. Put another way, IT bears a direct relation to the unique personality of the human person. This, however, seems to give too wide – too inclusive – a field. It leaves us with nothing to discuss: each human person can define their thinking as IT without fear of contradiction. Therefore, we need to go back a step to ideas of the human: are there essential characteristics of being human? It is here that we can make progress.

Compare two general notions of moral thinking. On the one hand, we have Western Philosophy, based on Plato and Aristotle. The “starting point” here seems to be that our moral behaviour must be guided/instructed by rules (in order for it to count as “moral behaviour”). The assumption is made that the human person is ‘naturally’ selfish, self-centred, with no sense of connection to community, therefore, must be forced into developing such a sense. This, generally, facilitates a rather cynical perspective on what it means to be human.

On the other hand, we have the Chinese view, what I’d call the Confucian view (although my understanding of this may be wrong – it is self-taught). This is the idea that human persons are good, have a sense of their connectedness with community, and that their moral behaviour is guided by their desire to find opportunities to express this good. This is to say that moral behaviour is that behaviour which provides a sense of good, defined as connection with our community.

Now, whilst I realise this is probably a simplistic reading, it does illustrate the fundamental difference between the two: the Western approach posits the idea of rules to ensure “good behaviour”, whereas the Chinese does not. Put another way, the latter does not start from a position of dis/mistrust (which, interestingly, is something built into business practice).

Where does this attitude come from? Why is it so entrenched in Western society that people argue that it is ‘natural’, part of “human nature”? Well, we might begin by looking at the structure of our narratives; myths, legends and folktales, for example. These revolve around the idea of a single, central character who is, in some sense, opposed by others. Narratives often begin with an act of deception or betrayal. Narratives are designed as warnings, particularly for women – of the sort “Do as you’re told or things will go badly for you”. This is, of course, when young men are not being warned of the deceptive ‘nature’ of women. In the standard format, we accompany the ‘hero’ on their journey, to their moment of triumph against others. Conflicts arise for a variety of reasons: spite; vendettas; revenge; economic gain; perceived slurs. From myth, legend and folktale, we learn the structure of narrative, a structure we then build into the stories we tell others, and into the stories we tell ourselves about ourselves. Thus, from an early age conflict with others is an expectation, a foregone conclusion of living as part of a community.

Capitalism formalises this into an ideology – admittedly with a few additions. However, this is fundamentally the “business model” on which capitalism trades. Unity and co-operation are not ‘natural’, the individual must strive against these to ‘succeed’.

What’s fascinating is that we are prepared to dismiss myths, legends and folktales as products of superstition, devised to explain events – quite often natural phenomena – that occurred before science was sufficiently developed. Anthropologists will link ideas and events in previous ‘civilisations’ in order to explain this or that myth or legend. They examine the structures and sociopolitical dynamics that gave rise to X, Y or Z.

This is what Machiavelli does in regard to religion and its ideological position in supporting aristocratic rule (an idea which is later refined by Marx). He (correctly) identifies the notion of a ‘God’ as being absurd, but an extremely useful sleight of hand in maintaining rulers’ power: There is an ‘entity’, unseen and lacking any physical evidence of existence, that can know all you think, and see all your actions, that will then sit in judgement on you (the ways of this judgement, and its punishments, being remarkably human in their execution).

What is puzzling is why this kind of analysis is not applied to the role of narrative in capitalist ideology: the veneration of the individual, the positing of competition as being ‘natural’, conflict with others as being part of “human nature”. Is it because this indoctrination begins at such an early age that we cannot see it for what it is?

Decisions on Value

How do we define ‘imagination’? Or, more importantly, should we even try to define imagination? A definition seems to defeat the purpose; it attempts to tie down something (A ‘quality’? An ‘attribute’?) which seems to be part of the definition of what it is to be a human person. Karl Popper, the philosopher of science, claimed that what distinguishes human persons from other animals is the ability to tell stories that feature themselves as characters – in other words, the ability to use our imagination. The standard ‘reading’ of imagination connects it to creativity, in the sense that we use our imaginations to create fictional narratives – in music, film, literature – that enable us to place others (characters) and ourselves in situations which do not – in some cases cannot – exist.

When we look at the way imagination is regarded, and treated, in our educational system, it quickly becomes clear that imagination is seen as a threat to what we can call the established order. In schools, art, music, ‘creative’ writing are embraced in primary, seen as allowing children to develop their ‘personalities’. However, this changes abruptly when the transition to secondary is made. The “imaginative subjects”, so desirable in primary school, are ‘ghettoised’, seen as ‘soft’; in traditional patriarchal terms, they’re seen as subjects that ‘girls’ take. Boys, on the other hand, are encouraged to focus on the sciences, maths and computing. Imagination becomes a threat to the “serious business” of employment and profit. Imagination is gradually categorised as something ‘childish’, a self-indulgence.

With the advent of the TUs and their focus on employment and graduates who are “ready for work”, plus the consequent pressure on the more traditional universities to follow suit, areas which emphasise the development of imagination are consigned to the periphery and, eventually, the scrapyard. In an environment led by profit and “what business wants”, imagination has no part to play – unless we allow its definition to become “new and more sophisticated ways of exploiting others”. Which is, essentially, the only use that business has for imagination.

So, we might ask, what is business trying to exclude in relegating imagination to the sidelines? Thought. Imagination necessarily involves thought as a primary factor. Imagination involves a consideration of what it means to be a human person, and the positions which human persons find themselves in. What we might call “imaginative thought” focuses on what constitutes ‘good’ for both self and others. It examines existing ways of being and evaluates these, identifying injustices and, on occasion, suggesting solutions – how can situation X be made better (= fairer, more just)? Imaginative thought refuses to accept that this is “just the way things are” or that ‘tradition’ should be followed because it’s tradition.

Artefacts become meditations on Being. Imaginative thought consists of both conscious and unconscious ‘influences’. The work of the academic or the critic is to ‘discover’ these influences, to identify where the artefact has come from, what kinds of analyses it offers, and where it suggests that we might be ‘going’. The imaginative thought of the academic combines with that of the makar, as the imaginative thought of the philosopher interacts with the apparent ‘realities’ of being-in-the-world, to produce something greater than the sum of its parts.

It is this ‘something’ that creates apprehension, fear, in those who see their ‘function’ as “servicing the system”, hence, their attempts to minimise the opportunities students have to study A&H. They attempt to achieve this by introducing “business metrics”, and that simplistic notion of profit, into education. By continuing the marginalisation of A&H that begins in secondary school, they hope to “preserve the system”.

What terrifies the servants of the system is the unpredictability and critical acumen of imaginative thought. The values on which imaginative thought operates are those of the human person, what it means to live in the world, not those of the balance sheet.

The Value of Art

There’s an idea kicking about at the moment that some call “culture wars”. This seems to be connected to what right-wing ‘commentators’ describe as ‘woke’, although it’s rather difficult to nail down a definition. It appears to amount to being thoughtful in how you treat others, being conscious of racism, sexism, homophobia, transphobia and misogyny, and of how derogatory terms have been “built into” language, so much so that people use these terms without thinking. This would appear to be what upsets right-wing commentators the most: that we might think about our attitudes and revise them. In fact, one might say that their main objection is to thought itself.

Likewise, when tabloids scream about “culture wars” and accuse folks of being ‘unpatriotic’, what they are really taking issue with is people thinking about history, recognising, for example, that the British Empire was built on murder, massacre and corruption, and is, therefore, not something to celebrate. Neither should our streets be named after slave traders, nor should statues of them remain on public display.

This is anathema to the right. Their fervent wish is that we plod along, doing X in way Z because “we’ve always done it like that. It’s tradition.” From these ‘traditions’ they construct an entirely nostalgic picture of a “Golden Age” which, as with all nostalgic constructs, harks back to something that never existed and is entirely illusory. All we need do here is recall Thatcher’s bemoaning of the loss of “Victorian values” – no trade unions, rickets, children being sent up chimneys, slum housing, folks dying because they couldn’t afford medical care. These facts were conveniently passed over in her account.

What is also worth remarking on is the lack (complete absence) of right-wing festivals, whether of literature, film, painting etc. There appears to be little aptitude for, or engagement with, Art. This is hardly surprising; a film, say, or a novel that simply records what already exists, trying in some way to ‘celebrate’ this, would generate little or no interest. Even when the odd attempt is made, such artefacts are immediately subject to critique by “leftie bleeding hearts” or, as the Tories are now fond of calling them, the ‘wokeiratti’.

We also see continuing criticism, particularly in the United States (although I have come across this attitude myself, referring specifically to me), of the ways in which “leftie academics” try to poison the minds of undergraduates with their Marxist/Socialist doctrines, their belief in social communities and (so it seems to some in the USA) their atheism. Parents complain that their offspring go off to college and return ‘changed’ (or ‘possessed’, as one particularly amusing comment from a parent expressed it).

The question, then, is why should this be the case? Well, in the case of right-wing artefacts, there are always elements which diminish our ability to class X as a “work of art” without our having reservations. Take Riefenstahl’s Triumph of the Will, for example. Without doubt, there are some fine shots, and it’s fascinating as a film…BUT it celebrates Adolf Hitler and the Nazi party. We can say much the same of Wagner: The Ring Cycle is an astounding, ground-breaking work yet, no matter how much we allow ourselves to be swept up in these operas, the fact remains that they contain antisemitic themes. Works such as these attempt to persuade us into adopting their positions, asserting the superiority of one group over another.

In the right’s attitude to academics, we can perhaps identify the defining factor. Academics, who spend their lives thinking (it’s what they do for a living), move towards the left. When topics such as justice, equality, fairness and community, together with many others, form part of your working day, then a shift to the left is more than likely, because these concepts are vital to a just and democratic society. Such a society prioritises the human person, regardless of nationality, class and all the other ephemera that the right use to apparently ‘justify’ their outlandish claims that competition is ‘natural’, that some people are simply ‘lazy’ and ‘feckless’, that nepotism and “old boy” networks do not exist.

What Art (and the Humanities) do is to expose the ideological basis of these claims: the attempt that the right make to ‘argue’ that their narrow self-interests are universal, that they have everyone’s “best interests” at heart. Their concept of freedom is based solely on private property and exploitation – this, as Badiou argues, seems to be the guarantor of all other freedoms.

In short, the right do not like thought because thought, and the competing rationalities we find within it, contradicts their simplistic view of the world. It exposes the inherent violence of capitalism, its cruelty and its corruption.

This is why the universities, especially the TUs, must be brought into line, must be run as businesses. Only by doing this can the capitalist class (and those who serve it) assert that there is no alternative. The battle appeared to be over once the Berlin Wall fell, but universities as centres for the discussion of ideas remained. The expected capitulation did not happen, despite the best efforts of postmodernism…and the resistance was, and is, led by Art.

Art, as I’ve said before, is produced by conflict, by dissatisfaction with the way things are. It presents us with alternatives, with different perspectives; it is, from one point of view, philosophy in action – praxis.

What we are experiencing with the advent of TUs in Ireland is the return of Mr. Gradgrind from Dickens’ Hard Times.

Featured

Value and Value

As “business practices” have infiltrated every aspect of life, accelerated by the advent of cheap digital technology, the definition of ‘value’ has become more corrupted, more univocal. Whereas we once realised that there are a multiplicity of objects/relations/dispositions that are valuable, as business-speak has colonised our language (therefore, our thought) one Archimedean point has become dominant: profit (another term that used to be defined in terms of its context). Monetary profit has come to be regarded as the only signifier of value, with a knock-on effect when one talks of ‘success’. In another sense, profit has also become inextricably bound up with “the individual”, in that the concept is now linked to there being an advantage for the individual in any, and all, social relationships – personal or professional.

There is also a certain irony in writing this on the day when the newspapers are carrying the story of Tony Blair (the man who began the Labour Party’s move away from socialism, and the destruction of the left within the party) announcing that the NHS needs to make “greater use” of private healthcare. This is the man who, during his premiership, appointed an ex-CEO of Tesco as head of the NHS because, apparently, selling groceries is a good preparation for running an organisation dedicated to providing free healthcare. I could go on and on (and on) about Blair, but I don’t need to: this single act makes my point. Put another way, Blair introduced “business practices” into healthcare…and look at the NHS now…

Likewise education. What we’ve seen over the past few decades is a shift from education (non-profit, for the benefit of society) to training and skills (for profit, for the benefit of business, therefore, the individual). Education was run by those to whom administrative bureaucracy and personal gain were alien, hindrances to be minimised and endured. Now, some years later, “the business of academia” is run by failed academics (for who would choose to administrate when they could teach?) who saw career and revenge opportunities in the extension of bureaucracy to Kafkaesque levels. Layer upon layer of administrative control has been introduced in the name of ‘transparency’, yet this ‘transparency’ does precisely the opposite, obscuring and denigrating the process of education, turning it into a system operating on the traditional business binary opposition of profit and loss. The system has become the ultimate ‘goal’ of the system, its maintenance the primary object, which is to say that the value of the system is dictated by the system itself. The idea that the system is there to serve values other than those dictated by the system itself is seen as puerile thinking. And what are those ‘values’? That education be run (i) as a business, that is, for-profit, (ii) that those enmeshed in the system work for the system and, (iii), that the only question to answer is “What does business want?” – the apparent guarantor of (i) and (ii).

There is one other significant factor at work here: this system has been devised, and is run, by failed academics playing at being “business people”; their focus is on (a) what others want (‘others’ here referring to politicians and “business people”) and, (b), their own self-interest (defined as being pliable and obedient to the commands of (a)). In much the same way as in other fields, mainstream media journalism for example, these folks only occupy their ‘position’ because they will “play the game” according to rules established by others. They have surrendered their capacity for independent thought, contenting themselves with the nostrum that “this is just the way things are”…and large salaries. In short, they have abandoned (‘betrayed’?) the values of education: enquiry; challenge to what exists; concepts of “the good”; democracy.

What, then, are we left with? A system which refuses to recognise alternatives to neoliberalism and anarcho-capitalism, a system based on “the individual” (which it defines), a system that is continually moving further to the right. The idea that we are a community of human persons continues to be undermined by nationalism – what else are anti-immigration policies based on? The immigrant is, apparently, ‘foreign’, does not share ‘our’ values…values manufactured to be exclusive, to be ‘ours’: as in “our jobs”, “our housing”, “our services”.

In junking the Arts, universities initiate, perpetuate and hone the myths of separation, of competition (as ‘natural’), of individualism and, most importantly, of the singular perspective (rather a contradiction, given that it attempts to claim there is only one). Students are not invited to question; learners are instructed, and they learn. Whereas going to university was, in the past, about a range of experiences – fields of study, social experiences (hence, empathy), developing political views (based on questions of what ‘good’ is for both self and others) – it has now become about training for a job and controlling factors that would prevent this. As a consequence, strong student unions are out, as are demonstrations. Studying for a degree in an area that one is interested in is out (seen as being childish), replaced by studying for the one that offers the highest ‘return’ (for four years of ‘investment’). Add to this folks having to live at home (with parents) because of costs, and having to take a variety of part-time jobs to support themselves (grants are inadequate), and we can see that university is no longer about experimentation and experience; it’s about preparation. Preparation to live in a controlled, consumer society. Values such as empathy, responsibility and obligation have been lost, replaced by separation, self-interest and the contract which, much like a computer operating system, constantly runs “in the background”.

Quite obviously, the Arts in this society (I don’t subscribe to the idea that, somehow, universities are not part of “the real world”) are, to say the least, undesirable. When we engage with an artefact we are emotion-testing, developing empathy with others, asking ourselves “What would it be like to be in situation X?” or “How would I react to/cope with X?”. The artefact causes us to refine who we are, to question our ‘selves’. It also asks “the Big Questions”: “What is the purpose of living?” and/or “What is my position in society?” and/or “What does being ‘alive’ mean?”. In the Arts, there are no definites but always challenges and questions.


Featured

Valuing Academics

The value of universities has become increasingly dominated by metrics – staff publications, learner completion rates, graduate earnings etc. As metrics become more important (despite the notion of “measuring education” being entirely bogus), those wanting to attend universities, and their parents, have been encouraged to base their ‘choice’ on these metrics. Degrees are chosen on the basis of ‘employability’ and “earning power”, and universities market themselves around these. Thus, we are seeing an ideological shift in the concept of the university; the idea that one might study for a degree based on interest – whether personal or community-based – is gradually being written out and off. The very idea of doing this is now being represented as puerile, as an example of not wanting to “grow up”. We can see this being reinforced through government policies: for example, in the UK, there is a movement to defund what are seen as “soft degree programmes”, which essentially means Arts & Humanities programmes. As major cuts are initiated by the reduction in government funding – the operation of “market forces” in the academy when departments and degrees must be ‘profitable’ – Arts & Humanities programmes are the first to be cut, the lecturing staff being ‘redeployed’ or let go.

Of course, we already see this kind of disparaging attitude towards the Arts & Humanities in schools: art, music, civics and history are all considered ‘easy’ options (in traditional, patriarchal terms, they’re considered to be ‘feminine’ subjects). Sociology, philosophy and psychology have no recognised value, therefore, no status whatsoever (unlike in France, for example). As the nostrum that schools should concentrate on skills for life and employment has taken over, the demise of the Arts & Humanities in universities has become a self-fulfilling prophecy, convincing parents and pupils alike that economics is the central factor in ‘guiding’ their apparent ‘choices’ of subject. Metrics drive school curricula, yet we don’t appear to ask the fundamental questions: Who dictates the metrics? What is their ideological basis? Nor do we ask the most obvious question of all: is it because attempting to apply metrics to the Arts & Humanities exposes any and all metrics as the vacuous, biased, right-wing political instruments that they are?

Schools have become increasingly focused on testing – even primary schools – so that those subjects in which ‘competence’ can be easily assessed, and which have a direct link to the prevalent ideology, are favoured and promoted as being ‘valuable’. Subjects which resist simplistic testing, which require thought and argument, are marginalised. This seems to be the crux: subjects which encourage critical, independent thinking are seen as less ‘valuable’. One is tempted to say simply “those subjects which encourage thought.” Someone who can think for themselves is, in our current system, automatically ‘undesirable’: they might question the fairness, justice and equality of said system. They might also develop a conscience and morality that takes human community as its starting point.

In Ireland, the majority of schools, both primary and secondary, are still run by the Catholic Church, an institution that has been proved corrupt on countless occasions. Religious instruction is still part of the curriculum, a rather bizarre notion if we take the purpose of education to be developing the capacity for independent thought, thus, the ability to move away from mythological narrative and its magical stories. This is especially peculiar when we examine the central religious idea of enduring the privations of this life to obtain “rewards in the next”. Of course, both Machiavelli and Marx identify religion (per se) for what it is: an effective method of state control, a pre-emptive strike in terms of blind obedience. All we see in the transition from secondary to third level education is a shift in terminology: ‘God’ is replaced by ‘Market’. Other than this, there is little, if any, difference in concept or idea between religion and capitalism. Like God, “the Market” moves in mysterious ways, apparently beyond human control.

Hence the importance of marginalising the Arts & Humanities: critical thought is undesirable, to say the least. The ability to think critically represents a threat: compare, for example, thinking communally versus thinking individually. In the first, the object of thought is justice and fairness for all, the focus on how this can be achieved. In the second, the object is the self, and only the self – others are peripheral, mere means to one’s own ends. Back to Thatcher and her “there is no such thing as society.”

We can also identify the ways in which social media contributes to this ideological construction of the “unconnected individual” (which might, at first sight, appear to be contradictory). “Social media” is a term that, without interrogation, appears to suggest a connection with others, with the community (and a myriad of special interest communities). It could be seen as a forum for activism…but examine the term in detail and the inherent contradictions are obvious. Engaging with social media is ‘about’ competition: for followers, likes, reactions (regardless of whether these are good or bad). To use these media is to make oneself a product, to formulate a “marketing strategy” regarding self, to become attuned to the reactions of others, changing oneself based on these reactions, craving the approval of others for personal ‘authentication’. Political activism becomes, quite literally, a box-ticking exercise. Algorithms will present you with like-minded others who, as with any other product, you can consume. You, and they, are absent presences. Overall, social media is about competition with others, perpetual growth (the holy grail of capitalist economics) and the validation of your existence by others (who remain other, only useful insofar as they serve your purpose).

I think that here we can see a direct link to what Stiegler calls the temporal object; he is referring to artefacts, but I’d argue this can be extended to human persons. In social media, the self and the other simultaneously appear/disappear. Stiegler uses the example of someone watching a film. Whilst watching, this person adopts the time of the temporal object (the film) in question. As he says, “you are in the screen.” (N.B. Ironically, he explains this concept in volume 1 of Symbolic Misery.) When we are ‘in’ social media we are, simultaneously, self and other – a self that is confirmed by the otherness of others, but a self that also craves identification with such otherness. In short, social media allows us (and I do mean allows) to satiate our need for security while asserting our individuality. This is achieved by positing other human persons as temporal objects (the temptation here is to change this term to temporal bodies).

As business ‘practices’ colonise our schools and universities, this sense of being an individual is reinforced, becomes more ‘refined’ – in that ideas of community, or sincere connections with others in virtue of their humanity, become ever more peripheral. The metanarrative of capitalism conceals itself by propagating the myth that there are no credible alternatives.

Featured

Academic Value(s)

Over the past few years, the ‘job’ of the academic has changed: it used to be, primarily, concerned with talking to students, discussing ideas, formulating concepts, trying to go beyond (surpass?) what existed. However, as neoliberalism/anarcho-capitalism gradually made inroads – something we can date back to Thatcher, her resentment of academia and worship of “the market” (nor should we forget that FG/FF are neoliberals, pushing the same kind of individualistic, self-interested, market devotion) – Irish education began its lurch to the right, kick-started by the 2008 recession. Education, as with all other public services and servants, paid the price for the reckless endangerment committed by bankers. Politicians who, by and large, appear to know nothing about anything except self-promotion, seized the opportunity to make academics pay for their intelligence (the academics, not the politicians) and what they (the politicians, not the academics) perceived as “ivory tower” lifestyles. Business practices were ‘frontloaded’, a need for centrally-controlled “quality assurance” was manufactured. A new management ‘system’ for education was created, based on mistrust and distrust. ‘Accountability’ would be guaranteed by “learning outcomes”, together with ever-proliferating streams of paperwork. Academia would shift from teaching to being assessed/seen to be teaching. At one and the same time, Administration would be elevated to primary position, expanding exponentially with each passing year. Business practices would tame the academy, make it “toe the line”, force it into “meeting the needs of industry”. In short, turn education into training and “skills acquisition”, stripping out thought, replacing it with drone-like obedience to the whims of the market.

Students have become ‘learners’ – adjusting their ‘allowance’ of individuality to what the capitalist system permits, identifying conformity as ‘choice’. Their status as ‘human’ is diminished by the replacement of the designation of ‘student’ with that of ‘learner’: a ‘learner’ eventually finishes (in some sense or other) ‘learning’ by the arbitrary imposition of a cut-off point; by completing and passing this module, thus, meeting the “learning outcomes”. Collect the set of modules and your learning is done. You too are now permitted to enter the ‘adult’ world of ‘work’, amassing the usual markers of adulthood: mortgage; children; car; ambition; consumerism…the panoply of ‘adult’ indicators.

As the student is diminished (one can remain a student of history, of philosophy, of literature, forever) so is the academic. Indeed, the academic cannot be trusted to write “learning outcomes” without being trained to do so: they must use certain words (appropriate to particular ‘levels’ of learning) in specific ways…for which they require training (or, as it is laughably described, “continuous professional development”). Uniformity and obedience have become the primary requirements of academic positions, ideologically integrated to appear to be ‘choice’. In addition, standards must be ‘benchmarked’ – merely another way of insisting on uniformity. “The look” must always be directed towards “the other(s)” because, apparently, only by doing this can competitiveness be assured, and the market given its rightful place – as the guarantor of freedom.

Yet what does such ‘freedom’ consist in? Freedom to serve the system; freedom to obey; freedom to ‘choose’ the discourse of business; freedom to believe that you are a free individual…the freedom of self-deception.

In all of this, academic freedom has been lost, has become a simulacrum of freedom (as have all other ‘freedoms’). It is only a matter of time before Ireland too falls to the metrics of ‘output’: publish or be damned. As we see in America and the UK, it is not the content of publications that counts, but the very fact of their publication. Quantity over quality – the university becomes little different to a factory, churning out product; it is of no importance what it is, what it says or what it does, the simple fact of its existence is validation enough. Thus, the academic becomes just another labourer, alienated from their work and themselves, whose ‘real’ life exists elsewhere.

And where does that ‘life’ exist? In consumerism – one’s value and values are displayed in what one possesses.

Featured

A Digression on Values

Which it isn’t really – just a way of restating the central purpose here: identifying the privatisation of education and, in consequence, the marginalisation of the Arts, the ultimate purpose being the removal of critiques of capitalism and, more importantly, of critical thinking per se. The aim is to push the Arts out to the periphery, confining them to: the “heritage industry”; artefacts becoming mere “investment opportunities”; Arts degrees becoming the preserve of the rich.

We may not have actual fees in Ireland yet, but who can afford to study without significant input from their own, extracurricular, labour or from their relatives? Thus their ‘choices’ become circumscribed by debt, in both senses.

In addition to this, in the TU, creativity must be harnessed to “what business wants”, the space for experimentation, for simple joy in the act of making (in old Scots, the word for poet was ‘makar’)/constructing/designing, constricted by the demands of “the market”. The creative (critical) act is limited by the need for employment, by apparent educators warning of the ‘skills’ business wants and the importance of marrying these with what already exists. The Nietzschean call, that “we must challenge the views of our forebears, not because they’re wrong, but because they exist”, counts for nothing. The idea of the university as a site of challenge, debate and critique is being replaced by that of perpetuation (of what already exists). Put another way, that we should “know our place”. We have already been told that the TU “certainly won’t be teaching philosophy”, something widely quoted in newspapers. This, in itself, gives the game away: philosophy is about the perpetual why, the perpetual challenge to justify/prove that what exists, X, is in some sense or other “better than” what might exist, Y. Philosophy is the positing of the theoretical against the existent: “Don’t look at it like that, look at it like this”. We might sloganeer here: All Art is philosophy, All philosophy Art.

Thus, philosophy represents a permanent threat to business, particularly if one is talking about moral philosophy. What business suggests is that everything can be run as a business, run for profit – whether that be healthcare, television stations or education. Business ‘thinking’ (call it that for the moment), however, is especially dangerous to those professions previously seen as ‘vocational’; business thinking cannot comprehend the idea of engaging in an activity, any activity, for anything but profit. Business turns vocational professions (for example, education) into career opportunities for the mediocre. In education, management engages in managerialism – an endless stream of petty bureaucracy ‘justified’ by an appeal to ‘transparency’, ‘consistency’ and ‘accountability’. Yet, if we interrogate these three terms, they all embody one quality: power. The power of the bureaucrat to interfere with, and constrain, the academic. To obscure this connection, the bureaucrat argues that they are merely fulfilling the demands of “the system”, the suggestion being that they would love to behave otherwise, but the system requires them to do X, Y and Z. “Computer says No.”

This ‘system’ destroys collegiality. The academic becomes just another wage labourer, increasingly alienated from their workplace and themselves. The satisfaction that they once derived from their human interactions with others is lost to systemically-controlled meetings. Codes of conduct and of practice replace authentic human exchanges (for both lecturers and students).

What we see is education being supplanted by training, by “skills acquisition” and by a system that demands rule-following at the expense of thought. Gradually, education is being absorbed into the “business system”, whereby the aim is to maintain the system, not to encourage independent thought. Regardless of discipline, students imbibe this kind of social interaction as part of their being-in-the-world; they are taught to be individuals in this specific sense, while, at one and the same time, being deceived into thinking that this is their choice. However, what they cannot do is choose to be an ‘individual’ in any way other than the narrowly-defined sense of ‘individual’ with which they are presented. This reinforces the major contradiction of capitalism: “the individual” is presented as the basic unit of society, and a central part of the definition of this ‘individuality’ is that one’s choices are one’s own, made freely…yet being “an individual” within capitalist society is very clearly delineated. To stray from the rules will result in penalisation (either literally or metaphorically).

Let us return for a moment to the idea of the human person and work. Marx argues that the highest expression of one’s humanity is the work one does. Labour of whatever sort or kind is the highest expression of your humanity, your self, an integral part of being human. Capitalism, on the other hand, imposes a kind of ‘dualism’: a work self and a real (authentic?) self. The work self labours to obtain money to facilitate the real self, which exists outwith work. In regard to work, the slogan is “Never mind the quality, feel the wealth”. This dualism pits individual against individual, as Descartes pits mind against body, and contributes to that proto-capitalist, Christian notion of the “next world” making up for the privations of this one.

The market, of course, offers you the opportunity – with enough wealth – to create a pre-lapsarian world on earth. Expressly, though, at the expense of others; their ‘success’ is a direct threat to yours…but don’t question why this should be the case. Simply accept it as “the way things are”.

Featured

The Value of Discontent

Art, I’d argue, is initially born of discontent, of dissatisfaction with the way the world is – from Beethoven to Proust, Godard, computer games and hip hop. As Eisenstein put it, Art is produced by conflict, by the dynamic that exists between the maker and their socio-political moment. We can go back to Sophocles to see a playwright who is disputing the wisdom of the Gods, asking questions which only spectators can ‘answer’ – or consider – for themselves. One might argue that this is how an artefact transcends its time: the questions it poses remain pertinent to successive generations and epochs…Sophocles’ questioning of freewill, Goldsmith’s critique of technology, Proust’s consideration of perception over time, Wagner’s representation of myth and its relevance, Pollock’s attempt to make sense of a post-nuclear world.

Art also exposes the myth of a “ruling rationality”, that is, the idea that there is a single, univocal rationality to which everyone aspires and subscribes. In Art we see alternatives; it’s tempting to say that we only see alternatives – which is possible if we exclude mainstream film from the ‘category’. Yet even mainstream film is useful, in that it enables us to see the ideological assumptions on which this type of film rests: capitalism is the ‘best’ system; poverty is a useful motivator; some people are inherently ‘evil’; some people are ‘lazy’; men and women are inherently ‘different’. Within the mainstream film world, such assumptions become self-fulfilling prophecies, perpetuating themselves. ‘Adulthood’ is defined as accepting that “this is the way things are”; positing other ways of being, other rationalities, becomes adolescent fantasy or a hormonal phase – a temporary flash of rebellion before accepting “adult values”.

‘Floating’ in the background, I think, of any discussion of Art and value is Heidegger’s contention that it is Art that produces society. If this is the case, then it raises the inevitable question of why, if Art occupies what we might call a “revolutionary space”, we don’t live in some kind of utopia achieved by a kind of artistic “trickle down” effect. Put another way, if Art can really make one a ‘better’ person, then why do we live as we do, in a world riven by injustice, inequality and individualism?

The easy answer is that philosophical gambit, “Define ‘better’”. OK…a world that is more empathetic, caring, community-based. (Never ask a question that you don’t already know the answer to! Maxwell Fyfe and a host of others over the centuries.) What artists cannot control is the interpretation of their work, how it will be used to political ends (take Shakespeare for example, although one can argue that he wrote for deliberate political ends). Nor can they control the perception (of them as a group) which society has of them. If we look back into history, we can see a progressive marginalisation of Art and artists: in the sixteenth century, Henry VIII, in England, begins the creation of the modern British state. To this end, he establishes the position of Lord Chamberlain, a forerunner of the modern censor, to whom all works must be submitted before performance or publication. We can go further back philosophically, to Plato’s The Republic; Art and artists are banned because they represent things, people, emotions “as they are not”. Jump to Aristotle and we’re presented with the formula that persists to this day: give the spectator an individual character to identify with. This character can then act as a ‘guide’ to perception of events and situations encountered while, at one and the same time, appearing to be an “everyperson” – paradigmatic of “reasonable response”. Move forward to Kant, a philosopher who, as the embodiment of the Age of Enlightenment, attempts to formulate a set of rational responses to Art, all the while dealing in a univocal concept of rationality – his most ‘interesting’ idea is that emotion clouds aesthetic judgement, actually makes that judgement incorrect (in some sense or other). Yet Kant’s ideas rely (in the same way as Descartes’ cogito ‘proof’) on the existence of (a) God.

In Hume’s Of the Standard of Taste, we again see an assumption of a univocal rationality while, at the same time, he introduces two notions that are still popular: (i) that the young are swayed by their emotions, thus invalidating their judgement of Art and, (ii), that to ‘know’ the value of Art, time has to pass; we have to see if X endures over time…

Now, this line of argument might seem rather odd, but what I’m doing is leading into a discussion of Heidegger’s assertion that it isn’t society that makes Art, it’s Art that makes society. In short, why do we value the works we do and, if Art has power (to change minds/to improve society), why do we have the society that we have in 2023, a society characterised by unfairness, inequality, racism, sexism etc.?

To begin to discuss this question, one has to look at the ways in which Art and Artists are regarded in society – their ideological positioning. We can also ask who, or which institutions, decide on what is considered Art…and, more importantly for my purposes, how Art is ‘used’ and perceived in education.

As recently as the early nineteenth century, Shelley wrote that “poets are the unacknowledged legislators of the world”…but he’d reckoned without the relentless march of industrial capitalism as the century wore on. What’s fascinating about the nineteenth century (and I’ll stick to the UK) is that we have this (apparently) flourishing age of industrialism and imperialism (or put another way, exploitation of people “at home and abroad”), but this is not celebrated by artists – read Dickens, Eliot or Gaskell, look at the paintings that depict the industrialised landscape as akin to hell. What we see is critique (similar trajectories are seen in France and Germany too), a questioning of the effects of technology and colonialism on people and their communities, on Being itself. Ideologically, this leads to the gradual marginalisation of the artist: the myth of the starving artist in the garret, who doesn’t inhabit the ‘real’ world, who cannot accept ideas of ‘progress’ (even though such ideas are morally repugnant). This marginalisation was, and remains, highly successful, now augmented by the profit motive and, theoretically, by postmodernism – a theory that claims metanarratives are obsolete, redundant, whilst it is itself part of the metanarrative of capitalism…sleight of hand at its most effective.

Once, apparently, metanarrative becomes obsolete, alternative rationalities become ‘equal’, everyone’s opinion becomes ‘equal’, one decides on worth for oneself.

Yet, as Marx argues, not only do the ruling class control the means of production, they also control the flow of ideas, and those ideas themselves – what is covered in the media, what is published, what is made…what is valued and what is marginalised. In short, it is relatively easy to argue that ‘history’ is controlled by the ruling class and that, therefore, Art is controlled as part of history. Those artefacts that ‘contribute’ to stability and security, to the status quo, are preserved over time, their “cultural position” being redefined as and when required, but fundamentally doing the same ‘job’: maintaining the ideological position of the ruling class.

Nietzsche’s The Genealogy of Morals works for Art too. Artefacts and the ideas they represent are anointed by “cultural controllers”. These artefacts are then passed down to following generations as a “base line” of (political) artistic standards.

End of Part 1…


The University and Anarcho-Capitalism

Or rather the Technological University…What has become increasingly apparent is that the TU is formulated to be a subsidiary of the anarcho-capitalist dream, a trojan horse designed to undermine, and ultimately destroy, ideas of collegiality, community and democracy. The motivation, the function, of the TU is profit, hence the layers of administration to control academia, the imposition of monolithic “command structures”, the pseudo-‘business’ job titles – all designed to displace and replace competing concepts of rationality with one: univocal, over-arching, apparently impervious to challenge. Part of the design is to colonise the notion of ‘progress’, as business-speak has colonised language per se in the past two decades: terms such as ‘creativity’ and ‘innovation’ are now taken to mean the discovery of new ways to exploit others, and ‘transparency’ is simply a method of concealing power, information and responsibility. ‘Progress’ in the shiny new world of the TU is concerned with ‘value’, yet ‘value’ itself can only be calculated in terms of financial profit – the value of the Arts, of community, of democracy is subsumed in this colonisation.

The TU becomes a frontline in the anarcho-capitalist fantasy of “the zone” – an area where relationships are purely financial, there are no guarantees of continuing employment, no commitment to anything other than “what business wants”…and certainly no commitment to knowledge for its own sake or for the ‘improvement’ of the human condition; the human condition is seen as one of a perpetual competition between individuals, the contract defines what is now called social capital. In short, as Margaret Thatcher claimed, “There is no such thing as society; there are only collections of individuals.”

The “traditional university” (as we might call it) is/was a site for the advancement of knowledge, for the analysis of perspectives on human society and how these might be advanced to increase the sum of human happiness. In terms of the Arts, what are the various branches, other than perspectives, ways of seeing? What does a novel, poem, play, painting or film do, but posit an alternative to what exists? In the brave new world of the TU, this multiplicity is rejected (in a similar way to the marginalisation of the Artist in the nineteenth century, that apparently great age of “industrial capitalism”). Students, or ‘learners’ as we are now told we must call them, must be instructed in the ‘principles’ of business as the foundation of all things. In Heideggerian terms, calculative thinking becomes the foundation of the TU; meditative thinking is consigned to the dustbin of history. ‘Progress’ and “financial success” merge, becoming one and the same, which in turn connects to the nostrum of “the individual” as the basic unit of society (although this nostrum is incompatible with the idea of society). Egocentric individualism, the neoliberal’s building block, is deliberately promoted as the only ‘adult’ attitude; anything else is dismissed as ‘unrealistic’, ‘puerile’.

By focusing on for-profit research, the TU deliberately marginalises the Arts – ultimately, they are to disappear or become the preserve of the wealthy – and becomes a function of business, engaged only in those activities which business ‘wants’. Put another way, “the market” dictates the terms…but this is the problem: the market is being allowed (encouraged?) to control the discourse. The market becomes the ultimate arbiter. In the same way as the market has destroyed the NHS and the role(s) of the university in the UK, the TU is being positioned (by successive right-wing governments) to colonise education in Ireland. Which means we need to discuss concepts of value…


If an axe falls in the forest…

The Technology of Everyday Life

I’m starting this to open a discussion…about whatever it happens to be, but particularly if it happens to be about the human/technology interface…should we even call it that?

Hmm, off we go…again.

The whole VAR/professionalisation of football (soccer if you’re over there) seems to invite consideration of the shift we’ve seen gathering momentum over the past thirty-odd years, an integral part of Thatcherist (I refuse to call it ‘Thatcherism’ – she just applied Friedman) and post-Thatcherist capitalism: the instrumentalisation of vocation.

What we’ve seen is the introduction of/enforcing of “codes of conduct” into what were seen as vocational jobs, for example, nursing or teaching; jobs that were originally viewed as being work one went into because of social conscience, a feeling of obligation or duty to others. A desire, in the old cliché, to “give something back”. They weren’t particularly well-paid, but they were (still are?) jobs which didn’t involve the separation of self from occupation. Thus, your job becomes an integral part of who you are, a fundamental expression of oneself – the highest form of self-expression as Marx has it, and an unending source of satisfaction (defined as the opportunity to help others…yeah, that’s it). Of course, this doesn’t ‘fit’ with capitalism: to be ‘satisfied’ one must consume, but that consumption must be fleeting – a result of false perception – and, therefore, perpetuate the cycle of consumption, initially of material goods, of products, but eventually extending to people, who cease to be people in the full, subjective sense – in that we acknowledge the similarities and differences between ourselves and others – and become simple objects to be used and discarded as necessary. All summed up by Thatcher’s remark to the General Assembly of the Church of Scotland, “There is no such thing as society, there are only collections of individuals” (I could be slightly out with that quote, but I don’t think so). That final admission of rampant, instrumental individuality, an individuality which has gathered momentum again in the past few years. Hence the rejection of refugees and asylum seekers…”Oh, I’ll donate a few quid – makes me feel better – but don’t come here.” Yeah, all very Hobbes in the Quad…

So, everything is fine until I get bored, or you lose your usefulness to me, at which point I’m afraid it’s time to move on…it’s only ‘natural’, “just the way things are”…

I’ll come back to individuality in a moment, but just to explain the professionalisation thing. Our vocational jobs have gradually been ‘professionalised’: in order to be recognised as a ‘profession’, it appears that a code of conduct is required. As far as I can see, the basis of any code of conduct is mistrust (distrust?). We’ll begin from the point that, if you’re not told to do X (and, furthermore, if a penalty isn’t imposed on you for the non-performance of X) then you will not do it. I can’t trust you to look after me, in some sense or other, in virtue of the fact that we are both human persons. There must be ‘accountability’, usually legal accountability, to ensure that you do your job properly.

Of course, once a code of conduct is introduced, and enforced, human relations are disrupted, fragmented. The code becomes something I assume, like a cloak, when I walk through the door of my place of work – it becomes something other to who I am. My main function is then to ensure that I obey the code, in colloquial language, “to protect my own back”. I will do X because I have to do X, because I get paid to do X…and there’s the shift to instrumentalism. I now do my ‘job’ for money, and that money enables me to live my life, but elsewhere – in the consumerist paradise of capitalist society.

This is a result of the businessification (I love wordpress – it hasn’t underlined that in red as “not a word, you fool”) of society, the idea that business ‘methods’ can be imported into any ‘realm’ of human society in order to make it work “more efficiently”. You want the NHS to work ‘properly’? Appoint the guy who used to run Tesco – there’s no real difference between selling groceries and treating people. Hey, we can establish “internal markets”…and thirty or forty years later we can wonder, disingenuously, why the NHS is collapsing.

In teaching we’ve seen precisely the same thing. The demand has been/is for a way of “measuring outputs”. Of course, when you can’t measure what’s important, you make what you can measure important. Subjects X, Y and Z serve the needs of the business community (a digression: at the heart of our society lies an irresolvable contradiction, that of the individual vs community. Capitalism prioritises the former, yet fetishises the latter…more of that later), but the humanities? What use do they have? Where’s the profitability (the only meaningful term in a business ‘system’)? Hence the talk of “social capital” and “the social entrepreneur” (which, fairly accurately translated, means how good is your ability to talk folks into doing what goes against their own interests)…a simple integration of the humanities into the discourse of business.

We’re constantly implored to ask ourselves “What does business want?”. The answer is simple: a docile workforce that will reproduce what already exists, accept the status quo as “the way of the world”, and contribute to profitability. To that end, we’ve become mired in the bureaucratic and the predictive…a paper trail that culminates in the guarantee that at the end of a module, students will either ‘know’ or “be able to do” X, Y and Z. A bizarre notion born of the assumption of uniformity. Give us a few years, and the term ‘student’ will be replaced with ‘customer’ – requiring an even more draconian code of conduct. One object produces another. Only the money makes the alienation bearable…that and, of course, the holidays. Define ‘holiday’ in this context…

In all of this, the creative becomes untenable. Creativity, as Eisenstein argues, is born of conflict (he even bolds it!), a refusal to accept the existent, a dissatisfaction with the actual, a desire to change the perception of others – “Don’t see it like that, see it like this”. A shift from the actual to the ideal(istic), from self-interest to empathy. The capitalist system cannot accept, or deal with, this; look at the 19th century sleight-of-hand that, by the middle of that century, has marginalised the artist (of whatever medium), forced them to the periphery. We’re a long way from Shelley’s “Poets are the unacknowledged legislators of the world.”

The code of conduct in the vocational turns us into products to be consumed, destroying human connection. “You can expect me to do X for you, but only because I have to.” Oh, and if anything goes ‘wrong’, it’s the system, not me. There can be no blame because I have fulfilled my responsibilities as the system dictates. If, through some naive, misplaced sense of human duty (Bond? Affection?), I’ve attempted to go beyond what is required of me then I can be held to have transgressed against the code.

All very biblical but not, huh?

Coming soon: Revisions of Racism in Populist Society…

Machinic Thinking and Subjectivity

This is prompted by Ron’s remarks on the first AI blog…or blog on AI…

Is it the case that the apparent ‘advantage’ of AI over human persons – that it can achieve an ‘objectivity’ or ‘neutrality’ (the latter is the preferred classification of capital) – is its Achilles heel? AI feeds on what already exists, it necessarily remains within parameters that are defined by its programming (much as a chess machine or a calculator does). Its supposed conclusions or solutions are, therefore, based on a set of (implicit) rules which it lacks the capacity to challenge, to transcend.

The human person (some human persons), on the other hand, reaches a stage whereby their subjectivity – a result of their unique physical and psychological experiences and the ways in which they “put these together” – causes them to question and challenge how things are. The human person develops a conscience, a consciousness, a capacity for moral thinking that goes beyond what exists. They transcend the boundaries of conventional thought to develop unique perspectives on the world, and events in that world, formulating ideas of justice and fairness. In particular, they are able to recognise that a “rational decision” can be the wrong one because of the ramifications for others. AI, similarly to “business thinking” (ok, that’s a questionable conjunction), does not do this; I say “does not do this” rather than “cannot do this” to retain the idea that AI is programmed by someone rather than fall into technological determinism – that because X was ‘invented’ at time Y, then it inevitably developed into Z at a later time, A.

Our subjectivity is what motivates our actions as human persons, whether that subjectivity is guided by a desire for profit or a desire for justice. Of course, our subjectivity is formed by a variety of factors: our culture; our upbringing; our education; our ability to think. Given that our subjectivity is constantly changing – each present physical and psychological event modifies and alters each past physical and psychological event and each future physical and psychological event – that demand for consistency, for ‘sameness’, in decision-making becomes redundant…in fact, it becomes unreasonable. The demand for consistency can be identified for what it is: the imposition of an epistemological framework that is based on power relations. This reveals the attraction of AI: that claim of neutrality, an absence of engagement with the subjective. However, at one and the same time, this is indicative of AI’s inability to engage in human reasoning. Without what we can call the “emotive element”, isn’t ‘reasoning’ mere logic?

When one reads a novel or watches a film, we are aware of the logic of, say, the central character’s actions but part of the fascination of these Art forms is working out the emotional reasoning which explains why they do what they do. Often we think “Well, given this set of circumstances, I would not have done that”, seeing the character’s actions as illogical, but retaining an understanding of their emotional reasoning. What we are also able to ‘compute’ is how human persons react in remarkably different ways to identical circumstances.

To pin this down: our subjectivity is what makes us human persons – our ability to learn from experiences we have had (both physically and emotionally) and to extrapolate from experiences encountered in, say, fictional situations, which enables us to act in the world as if we had really had this experience, simultaneously “reality testing” our physical and emotional behaviours, modifying these where we deem it appropriate. When we experience Art, it is, first and foremost, an emotional encounter, based on ‘like’ and ‘good’, whether that encounter is as spectator or as creator: the feel of the pen in my hand; the way the light falls on the objects of my photograph; the way the clay feels to my fingertips. These feelings go beyond the conscious, or a logic, to the human. This is not, however, to mobilise yet another version of ‘inspiration’, the “blinding flash” which causes the work to spring from the imagination. It is more docile, more ordinary, in that it is not some kind of exclusive, elite quality that only “the few” feel. It occurs when I fold a towel, place another log on a fire or cook food. Ok, it may (and only ‘may’) be a lesser feeling than when I manage to frame a particular shot or draw the pattern of sunlight on a hillside, but it is similar and connected. When I perform these actions is part of my motivation that others will share my concepts of ‘like’ and ‘good’? Is this different to the thousand ways that this occurs in domestic life each day? Can we talk about ‘motivation’ at all? I create X which can then go on to be recognised by others as ‘Art’ or not, an entirely arbitrary recognition based on narrowly-defined categories…BUT for the past hundred or so years, these narrowly-defined categories have been challenged, hence, the on-going debates around high and low culture, the concept of popular culture, working-class culture, the end of culture.

Does Art, as Dickie argues, depend on where and when it is displayed? This gives too much power to the established “Art World”, a clique who ‘decide’ on the classification and worth of something displayed in the “right places”.

How does MT/AI decide on Art? Can an algorithm be written that both classifies and feels? I read this morning of the world’s first AI Art exhibition. What does that sentence mean? When I looked at some of the exhibits, all they were was (stretching a point) ‘pretty’, a juxtaposition of objects. It did not seem to me that there was anything to be gained from looking, no insight or emotion, merely a pale imitation of Dali, Ernst and Duchamp. Thus, AI once again, as it does with ideas, recycles. It simply responds and reflects rather than refracts. The social dynamic that exists between the artist and the world is, in the case of MT/AI, flat – a monotone without meaning, although even that description credits these works with a conceptual meaning that they do not possess on their own. The question becomes: Do these works have artistic potentiality? Do they mean beyond themselves?

Artificial Intelligence 2

Perhaps, instead of calling this new ideological product AI, we should refer to it as “machinic thinking”; I’m using ‘machinic’ rather than ‘machine’ deliberately because this term seems, to me, to capture the ways in which this technology is designed, and promoted, to imply the idea of “better than”, not “different to”. The end goal of machinic thinking (from hereon, MT) is to convince us that our ways of thinking, of devising solutions, of using compassion and empathy to guide our thoughts, are inferior to the shiny, (apparently) unemotional world of digital technology. One can argue here that AI’s aim is to formulate a world without morality, without conscience, without imagination – competitive individualism becomes the norm, dispensing with ideas of responsibility and obligation to other human persons in virtue of their being human.

The prevalence of the computer (in its various guises – laptop, phone etc.) in contemporary society encourages the analogy of the human person as machine. This is hardly surprising; we can see this occurring in Descartes when the human body was likened to a mechanical entity. In a similar vein, the dominant image of the human and society in the eighteenth century is that of the clock, in the nineteenth century that of the tree and its branches, in the twentieth century the machine, culminating in the likening of the human person, and the ‘processes’ involved in being human, to the computer. Is there, for example, a great deal of difference between Plato’s aviary theory of knowledge and that of knowledge as background software?

However, what is glaring here is the sleight of hand involved in our thinking about AI (call it that as the ‘accepted’ term). We are not being encouraged to think of the computer as human, but of the human as computer. ‘Intelligence’ on this model becomes simple data processing. The human person must become a machine, putting aside conscience, ethical reasoning and context. ‘Intelligence’ in AI is taken as meaning ‘thinking’. But what does it mean to think? Is it simply relating data fragments?

The human person is encouraged to conform themselves to the machine because the machine (apparently) fulfils the goal of non-emotional decision-making. The machine is more efficient because it is uncontaminated by emotion, that fundamental human flaw. The more the human person uses the machine the more they are surreptitiously drawn into becoming machinic, the more the machinic stretches out into all aspects of being human.

There are two central questions here: firstly, how do we define ‘thinking’? (Related: how do we do thinking?) and, secondly, how do we define ‘understanding’? (Related: in what sense(s) can one be said to understand something?)

The comparison, as we allow ourselves to be subjected, whether consciously or unconsciously, by the computer, is with the human mind (not the brain) as machine rather than machine as human mind. This is because the computer, the machine, can do so much that we ‘forget’ it is simply a machine – a tool that we use; as Heidegger described it, a “Being with” technology. Yet the computer, and digital technology in general, has now become so ubiquitous that we forget it is a machine. It has become a tool that uses us, that provides us with a form (the form) with which to encounter the world, what Heidegger called “Being for” technology. As we unconsciously conform ourselves internally to the machine, the machine then forms our relations to the external world, to people and objects.

This is not to suggest that “the machine” is ideologically ‘innocent’, that once invented this machine was technologically determined to develop in this way. The computer as machine serves capitalism and the quest for profit as did the factory machines of the eighteenth and nineteenth centuries. And, in the same way, the human person fulfils the role of serving and maintaining the machine, whilst simultaneously becoming psychologically part of the machine. Thus, the (capitalist) machine controls both the means of production and the flow of ideas in society. The worker literally becomes a servant of the machinery (so no change) and, in their social interactions, reflects the depersonalised, non-emotional form of the machine. However, this depersonalised, non-emotional form is represented as aspirational, paradigmatic of the “successful individual” in society – the notion being that to be successful one must discard sentiment and responsibility for others. The machine is the perfect individual: able to take decisions without thought for, or understanding of, human persons. MT is represented as “the way to Be”, held up as the apex of ‘progress’.

Now, in all of this, one must also consider neuroscience, one of its aims being to map the human brain. Yet this is problematic: we are capable of mapping electrical impulses and pathways in the brain. We can, apparently, map certain thoughts to certain electrical impulses but, one is left with a question similar to Gilbert Ryle’s in his text, The Concept of Mind: shown the Oxford colleges, the visitor asks “Where is the university?”…shown the map of electrical impulses in the brain, one asks “Where is the mind?” or “Where is the thought?” or “Where is consciousness?”…or “Where is understanding?”

I have seen these kinds of questions referred to as ‘complex’ in an array of writings on AI(MT) yet, in these, the word seems to stand in for a multiplicity of unanswered questions. Take ‘understanding’. (N.B. Here we can see the relevance of Wittgenstein’s concept of “language games”, detailed in his Philosophical Investigations.) An aside: could one read what AI(MT) does as playing games, in that one can play a game without understanding or being aware of all the rules that govern playing the game?

When one thinks, or has ‘intelligence’, doesn’t this involve understanding? Or we might say, an understanding? The qualifier is needed because of the myriad ways in which one can be said to understand something – by those who themselves understand that something. However, if I understand a something, X, differently to you, is either of us willing to credit the other with (an) understanding of X? If your understanding differs from mine, we have to compromise, continue our discussions of X until such time as we identify “common ground” (remaining aware here of Sartre’s “bad faith” [deception of the self by the self] and Marx’s “false consciousness” [deception by ideology]). In regard to AI(MT), “an understanding” is presented as “the understanding”: the shiny, unbiased, neutral ‘truth’ of the machine reached without taking into account all that flawed thinking that ‘distracts’ the human person from reaching ‘proper’ conclusions. An understanding becomes the understanding, technologically determined (we already have an example of this in our universities: roundtable discussion of end-of-semester student results has been replaced by an algorithm which apparently ensures ‘transparency’…but who programmed the algorithm? And what has been lost by this abjuration to technology?)

But back to the question of ‘understanding’. What do we mean when we say that we understand X? Is it possible to arrive at a useful definition?

Yes (well, I would say that, otherwise I wouldn’t have asked the question). Understanding comprises context, specificity and moral thinking. This last involves empathy and compassion. One must possess the ability to confine oneself, although it is essential to have a concept of oneself as oneself, to “the background” when engaged in moral thinking – to balance subjective thought with an idea of the good for others. This is generally known, I think, as a conscience: the ability to reflect on one’s actions as they impinge on self and others; to formulate theories of action and to extrapolate how these will affect self and others; to change one’s behaviour, or alter one’s thought, based on consideration of past actions and thought; to postulate an idea of “the good” then strive to achieve this, even though this idea is subject to constant modification through the processes of experience and of thought. In short, to engage in a perpetual process of self-analysis, comparing ‘facts’, theories and experiences. Back to Heidegger: one is involved in a constant process of Becoming.

Artificial ‘Intelligence’? 1

This term, “Artificial Intelligence” (from hereon, AI), has managed to become part of everyday language without an apparent challenge to, or questioning of, what it means. The area of meaning seems the right place in which to start, particularly in regard to ‘intelligence’. The ‘intelligence’ being referred to, although classed as ‘artificial’, is actually that of human persons. Once more, we have two conjoined terms which, merely because they’ve been used together so often, apparently have meaning. However, once we separate these, and begin to interrogate them, the certainties (of meaning) vanish.

What does it mean for something to be ‘artificial’? Shallow? A poor imitation of the ‘real’? This is a vital question when one uses ‘artificial’ in relation to ‘intelligence’ because, obviously, the term ‘intelligence’ has at best a series of disputed meanings, not some univocal, dictatorial (authoritarian?) definition which enables us to say that “X is intelligence, but Y isn’t”, although those on the right would try to convince us (insist) that they can say this. As the AI bandwagon swings into action though, we can see this as the ultimate goal: a “technologically determined” definition of intelligence, therefore, of truth. The argument that “runs in the background” here is that of technological determinism: Because A was invented at point X, then its development into B at point Y was inevitable, despite human desire that it be otherwise. Raymond Williams, in his book Television, effectively debunks this argument for technological determinism, arguing that simply because a technology – in this case television, but we can extrapolate to all other technologies – is invented at a certain point, this does not mean that its destiny is contained in its origin (in much the same way as, for example, we can make this argument for human persons). A technology does nothing; it is the use that human persons put it to that is important. This is the subject of study: why is a technology, W, being used in way(s) Z? What is the motivation behind this use? Who does it benefit? Who does it control? What are the implications for those who are subjected to this technology? We can also see a link to Marshall McLuhan’s Understanding Media, a text that focuses on the form of technology (again, television) and its effects: How does it alter the ways human persons think about themselves as themselves and of their relation to others in the world? Does it cause a fundamental disassociation with previous ‘pictures’ of what it means to be a human person? Which traits of the technology do human persons take on, both consciously and unconsciously? In relation to AI, we might ask how the human person assumes machinic qualities, or attempts to mimic the apparently logical workings of AI.

These questions are, at this stage, secondary. There is, however, another point of interest in regard to our current terminology: how do we distinguish between ‘machinery’ and ‘technology’? What is the difference between the two? Machinery seems to be implicated in the idea of human control, an instrument amenable to cogs, spanners and hammers, something which cannot escape human control. Technology, on the other hand, is seen as an implacable, independent force, mythologised and mystified, a force that exists independently of the human person. Here, one can observe the essential sleight-of-hand that Williams exposes: in our contemporary technological society, the suggestion is that the computer is supremely logical, an entity (we appear to accept that these blocks of plastic, microchips etc. have actual Being) that is not subject to the factors that can ‘sway’ human conclusions: emotion; desire; context; consequences.

What we must ask here is: are these necessarily ‘faults’ in our thinking? Is technology simply a step towards achieving the dream of the Enlightenment Project – a world ruled by ‘rationality’ and ‘logic’? If so, the central question still remains: whose rationality and whose logic? What kind of ideology guided, and still guides, these notions? Who does this ideology serve?

Under capitalism, we can identify a remarkably simplistic definition of ‘intelligence’ here: intelligence is that which increases profit. Put another way, that which generates wealth. But for whom? And to what end? This ‘definition’ sends us back to Heidegger’s concept of calculative thinking: how to make X happen faster and, therefore, apparently more efficiently, therefore, more profitably. AI is, taking this ‘reasoning’ into account, the ‘perfect’ creation: it enables decisions to be made without sentiment or emotion, regardless of the human cost, all the while claiming to be (technologically) ‘neutral’, the result of simple calculations that are ‘necessary’ to ensure the one and only goal that matters – profitability – because it is the fulfilment of this goal that is the guarantee of all other ‘benefits’.

The suggestion is that profit is “common sense” or ‘natural’; AI legitimises this by making it appear that neutral technology favours profit, selects it as the only ‘normal’ goal when thinking/intelligence is shorn of the emotion that detracts from being able to make sensible, rational decisions.

In this, AI, particularly the ‘artificial’, is being represented as superior to ‘ordinary’ human intelligence which cannot, because of that “emotional flaw”, achieve ‘real’ conclusions: emotion will always get in the way, cause the structure of the argument, therefore, the conclusion, to be unrealistic.

Thus, AI rewrites what it means to be intelligent and what it means to think…and, by extension (what McLuhan calls ‘form’ in regard to TV), what it means to be human.

The Class of Education

We inhabit a university system in which Diversity, Equality and Inclusion (DEI) policies proliferate. Each university must have one and, apparently, to ensure these policies are followed, there is also a requirement for Universal Design (UD). Together, these ensure that lecturers do not populate their lecture series according to their own prejudices and biases which, left to their own devices, they would, apparently, be sure to do.

Or at least this is the thinking behind this ‘innovation’ which, like the introduction of “learning outcomes” some years ago, requires that I ‘predict’ what each and every student (oops, ‘learner’ according to DEI & UD) will know after the completion of my ‘module’. I have referred to “lecture series” in the paragraph above but, nowadays, the favoured term is ‘module’…Modules are, apparently, discrete, self-contained units which, once completed, can be forgotten as the student moves on to the next. Fragmentation in action. Not only do these modules have to be formatted in a particular way, the language that one can use is prescribed by management diktat. “Continuous Professional Development” sessions are run to teach lecturers how to write these “learning outcomes”: certain words are prohibited, as are certain phrases…in other words, the discourse is controlled, dictated by people who have never set foot in a lecture theatre or, if they have, did so many years ago. What one never hears is a rational argument as to why this word but not that one.

“Learning outcomes” form, as Thomas Docherty puts it in The English Question, part of the “paper trail” of managerialism that has blighted education for over twenty years. These established a structure of control which DEI and UD continue and reinforce. Academics are not to be trusted, should be distrusted from the outset. After all, primary and secondary teachers are already engulfed by paperwork to ‘prove’ that they are “doing their jobs”, so why not extend this to third level? Another victory in the war against intelligence – conformity and uniformity are imposed by bureaucrats, and the careerists in universities who are determined to compromise the integrity of everyone else in service of their “climbing the ladder”, whose goal is a “pat on the head” from those in power (that they, one day, might become them).

This is, of course, another instance of imposing metrics on the unmeasurable. How can I possibly predict what someone else will ‘learn’ from my lectures/seminars/tutorials? I can offer combinations of: what I’ve found interesting; what has intrigued me; what the current state of knowledge in regard to this subject is (according to my research); a political analysis of the subject (bearing in mind that all analyses are political); identification of the discourse. What do I hope? That the people I meet will take these ‘pointers’, then use them in their own work. That they will develop perceptions that I have not – ways of seeing that I have missed or been blind to. These might not be immediate, but come months or years later. Education is about providing a “current state”, then seeing what people come up with.

Unfortunately, this does not ‘fit’ with the business model being imposed on education: metrics are demanded; inputs and outputs; a constant ‘judging’; conformity; uniformity; a demand that people meet ‘standards’ – no matter how arbitrary and absurd those ‘standards’ might be.

With the imposition of DEI and UD the control of academics reaches another level, what one can describe as the “supreme dream” of the bureaucrats and careerists: a standardised curriculum for each and every subject. The ‘perfect’ business model, inculcating measured uniformity and, above all, conformity. A model that produces workers who will obey, without question, the profit motive; who accept the profit motive as the only ‘realistic’ motive there is. Creativity, thinking for oneself and dissent from the ‘norm’ must be banished. In short, the narrative of university is being rewritten from one with no final act of closure to one that is sealed shut. The sham of completeness…again, a sham that transfers the inadequacies of the capitalist system to the individual: “If you don’t think like this, accept this as your motivation, pass this by regurgitating X, Y and Z, then you are a failure, a misfit whose hardship is of their own making.”

The capitalist dream über alles… a different conception to what used to happen when you went to university and were working class: the general tenor of the place was “get rid of those tarnished, dull values of yours, have these shiny new bright ones”. Once universities opened up (in the late 50s and early 60s) with the introduction of maintenance grants, we began to see changes. “Popular culture” began to be taken seriously, the notion of “High Culture” was challenged, syllabi changed radically; lecturers with “regional accents”(!) appeared, film became an actual subject (admittedly, even when I was a student in the 80s, it was still a “special subject” contained within the English department), theory became increasingly important as we moved away from the idea of accepting the metanarrative. Patriarchy was attacked, as was colonialism, capitalism and “the establishment”. The Arts and Culture were recognised as being as important as other more ‘tangible’ subjects. Lecturers were trusted to engage in self-analysis and, therefore, analysis of what they were teaching.

So, as the cliché has it, “Where did it all go wrong?” When Reagan and Thatcher took power. The beginning of populist politics – greed and petty nationalism were paraded as ‘goods’. The individual became the basic unit of society. The mainstream left crumbled – look at Blair or the craven cowardice of the Labour Party in Ireland. All culminating in the recession, the irony of which is that it was caused by individual greed yet, in true disaster capitalist sleight of hand, was used to retrench capitalism as the only ‘solution’. In the post-recession world (in Ireland “post-Celtic Tiger world”), every aspect of society had to, and must, be bent in the service of profit. The lazy academics, with their government salaries and long holidays, were an easy target. They had to be brought into line.

Which is where we are now.

Completeness as Sham

So we can see pattern narrative as the fundamental “building block” of society, which instils a need for completeness through the imposition of its fictional structure (beginning, middle, end) on ‘reality’ (‘realities’?). What also occurs here is the perception of life as a ‘quest’, a cause-and-event ‘journey’ between birth and death. Or perhaps we could see life as an attempt to restore a (fictional narrative) completeness that is ‘remembered’ from early life? We engage in a search for something which did not, and cannot, exist – something we think of as ‘lost’, yet this ‘lost’ is an ideological construct, instantiated and magnified by constant exposure to, and consequent subjugation to, pattern narrative.

Sartre may say that we choose to make our birth important, but the more interesting question is how and why we ‘choose’ to do this. Within that existentialist framework, it is simply that the human person chooses the events of their lives on which to confer importance, without any kind of mythological entity deciding this. This does not, however, account for the deterministic effects of ideology. Is this a simple case of replacement? Well, no, because the “mythological entity” is a product of historical ideology, a promise of “eternal happiness” if one bore the trials of life with stoicism – “eternal damnation” if one protested (or belonged to the wrong sect). The promise of some kind of immaterial ‘heaven’ has now been replaced by a rather more materialist concept: the idea that, if you can just make enough money, you can inhabit “heaven on earth”. Gambling promises this, as does “hard work”, the twin gods of the capitalist system – because what else are “the markets” other than a vast lottery? Or our conventional notions of gambling? And “hard work”? Something which promises an ultimate “happy end”, but one that will always remain just beyond one’s reach.

Have ‘happiness’ and “financial security” become interchangeable terms? It would appear to be the case, mainly because that aspirational idea, of financial security, is outwith the reach of so many. ‘Happiness’ seems to be a category that is defined by its rarity, as something that is unachievable by the majority. However, what must be taken into account here is that ‘happiness’, in capitalist society, is an ideological construct – a consumable, as are the consumer goods from which it is constructed. To be ‘valuable’, it must be rare, it cannot be something achievable by all, because that would diminish its ‘value’. In this, it merges the intangible with the tangible: ‘happiness’ in our “free society” means freedom to buy and, moreover, to be able to buy, to have, what others cannot. Then the next step: to convince oneself that this apparent ‘achievement’ is the result of one’s own “hard work”, the “fruit of one’s labours” as the archaic saying goes. It is not a case of “being happy”, it is a case of possessing happiness, of owning it. Put another way, one’s ‘display’ of happiness must be validated by the approval, by the envy, of the other. Their lack confirms one’s happiness. Happiness is a vital component of completeness, but that completeness is defined by the inability of the other to achieve it.

Social media, media in general, provides models of aspiration: smiling video clips, photographs – demonstrations of “the good life” – are designed to prompt thoughts of “Don’t you wish you too could be like this?”, “Don’t you wish you were here?”, “Don’t you wish you were happy like me?”. Sitcoms, for example, show us twenty minutes or so of confusion and conflict, which is always resolved by the end of the episode. Soap operas offer brief respites from the trials of everyday life before plunging us back into the desire for more and other, for difference, for an escape which will never come.

Social media causes a split in personal identity, in the conception of the self, a split which harks back to the Renaissance idea of self-fashioning: we construct a public self, then become enslaved to the maintenance of that public façade which, in turn, produces the trauma of self-perception. Our private self, our (as we see it) more authentic Being, aspires to become one with the public, to become ‘complete’ – our aspiration is to reach an ‘end’ we can live with, to mirror, to imitate, the pattern narrative by which we are surrounded/engulfed.

What this pattern does is create a sense of inadequacy, an ideological construct to which an ideological construct is the apparent ‘solution’. The way that society constructs us as ‘subjects’ begins by positing the ‘individual’ as the basic unit of society. This ‘individual’ apparently possesses free will and, therefore, is responsible for all of their actions (if, and only if, those actions conform to the generally accepted notion of ‘free will’). As the child grows up, encountering, for example, patriarchy, authoritarianism, and consumerism, they develop ‘sets’ of inadequacies, some general – the need for a ‘good’ job; to “work hard”; not to be seen as ‘lazy’; to compete, because competition is ‘natural’ – some, as they see it, applying only to them – to have ‘good’ friendships and relationships; to be sexually active (or not); to have an acceptable ‘body shape’; to be outgoing and self-confident. We could also identify these, and the many others, as ‘anxieties’ which then result in apparent inadequacies…or maybe it would be simpler to call them ideological ‘deficiencies’.

These inadequacies are then structured as per pattern narrative and, most importantly, are configured as ‘failures’ or ‘lacks’ on the part of the individual. The overarching message is that if the individual was just prepared to make the effort, these inadequacies could be overcome and ‘happiness’ would be assured.

As with everything else in a capitalist society, business is seen as the ultimate response to any ‘problem’ (a ‘problem’ which has been created by the system itself), hence, the mindfulness industry (now worth millions) and the pharmaceutical industry (already worth billions). Both of these operate on the usual “business model”: it is the individual who is responsible for themselves, in this case for their mental health. There is no question of examining the system under which the ‘individual’ is forced to live to find an explanation; whether the ‘solution’ is pills or yoga or colouring books, the blame lies with the individual’s ‘weakness’ – a term mobilised each time the ‘individual’ falls victim to the system, whether it be poorly-paid employment, lack of housing, failed relationships etc. etc.

There is, however, a ‘refuge’ from all of the inadequacies that beset one: consumerism. Buy more, buy bigger. Incorporate material goods into your journey, your quest, and gain a series of fleeting ‘satisfactions’ from TVs, cars, travel and so on. We can see this pattern working in media: take TV detective series. There will be one metanarrative that spans the entire series, yet within each episode smaller crimes will be solved, providing the spectator/reader with lesser amounts of satisfaction.

In our society, one driven by competition at all levels (whether we realise it or not), we are bombarded with instruction: how to have the perfect body; what relationships need to flourish; how to garden; how to vote; how to understand. Obviously, the latter is the metanarrative of all the others, the ‘message’ being “If you want to understand X, Y or Z, then you must understand yourself, and to understand yourself you need to understand A, B and C”. This metanarrative both causes, and purports to offer solutions to, our apparent inadequacies – a vicious, brutal circle, the goal of which – completeness – can never be completed…

The great irony here is that individuality is defined by conformity, admittedly this is “smuggled in”, but it is the default setting; pattern narrative creates a desire for safety and security – a desire to be able to impose a pattern on (arbitrary) events. Cultures, and sub-cultures, establish their own patterns of perception which, although they purport to differ, are reflections of one another. Pattern narrative causes the human person to yearn to ‘belong’ in some sense or other or, if one is unable to ‘belong’ in the way one thinks necessary, then to seek explanations or refuge in other groups…essentially “I want to associate with others like me, or to understand why I am not like them.”

The defining characteristics here are inadequacy, insecurity and anxiety – characteristics that are created by capitalism (of which postmodernism is merely another function in its attempt to obscure the power relations – the metanarrative – which it denies) as an ‘evolved’ means of control.

However, what we are seeing in contemporary society is a return to Althusser’s distinction between two kinds of state apparatus: (i) the plurality of (largely private) ideological state apparatuses, operating through education, culture, the media etc. and (ii) a single public repressive state apparatus. The latter is the use of (legal) force by the police or army of a state; any other use of violence is deemed ‘terrorism’, “mob rule” or ‘thuggery’. We need only look to the USA (Black Lives Matter; student protest camps) or Israel (the ‘actions’ in Gaza) – these are only the obvious examples at the present moment. I could just as easily cite the Miners’ Strike or the Poll Tax ‘riots’ in the UK. All of them share a common denominator: the abuse of power by right-wing governments, but abuse passed off as “the maintenance of law and order”.

The rise of populist politics in Europe – in France, Germany, Ireland, the UK – is evidence of a retrenchment of understanding, a return to the old binary oppositions of ‘us’ and ‘them’. Nationalism has again made inroads as the conservatives in each country move further to the right as they try to ‘accommodate’ those whom we might properly call fascists. The demand is to enshrine the nationalist narrative, to return to a (nostalgic) sham completeness fostered by pattern narrative as a function of capitalist ideology.

Completeness as Ideology

The psychological need for an ‘aim’ or ‘end’ is inculcated in us by fictional narrative, made ‘normal’ by our talking of Being rather than, as Heidegger states, Becoming – the latter term though being a more accurate description of “human life”. However, even accepting Heidegger’s distinction, we can see here a connection with the “quest motif” of fictional constructs, in that the central character in literature, film, music, even the speaking ‘I’ of poetry, embarks on a ‘journey’ of some kind (we can look back to the connection with the Puritan spiritual autobiography and to, for example, Chaucer’s The Canterbury Tales and the poetry of the Renaissance).

This pattern narrative though is more deeply embedded in the socialisation process: it becomes the necessary and sufficient condition of the human person needing to feel (the definition of which I’ll come back to; suffice to say that, in engaging with Art, we are intrigued by the direction of the story and we “emotion-test”, in that we learn, firstly, how to express various emotions [and how we should feel when we ‘have’ these] and, secondly, ‘acceptable’ ways of expressing these) connected to others. We are encouraged – we might say it is demanded of us – to see our ‘selves’ as incomplete (as “possessing a lack”) if we do not associate/engage with others. This is evidenced by societal insistence on having a ‘partner’, being ‘involved’ in relationships, having ‘friends’, participating in ‘groups’ of one sort or another. Those who do not participate in these ways (or in such thinking) are marginalised, seen as ‘other’. We can detect a class bias here too: working-class people will be seen as ‘dangerous’ or ‘threatening’, whilst middle-class/upper-class people will be seen as ‘eccentric’ – the latter being a far more acceptable, and understanding, term.

There is another contradiction (fragmentation) here: Capitalism insists on ‘community’ and ‘relationships’, positing these as perfect, aspirational ‘goals’, but at the same time fetishising “the individual”. Part of the definition of pattern narrative is the concept of difference to, and from, others, while insisting on a single protagonist, a prime mover. The central Aristotelian idea might be that we can imagine ourselves faced with the same situations as the protagonist, but we feel that we would react differently or in a better way (the basic premise of Reality TV).

We define our ‘selves’ not by our “similarities to”, but by our “differences from” others. My interests are, apparently, mine and mine alone, and these interests change. However, physical resemblance (in the broad, physiological sense) is taken as an indication of mental sameness: because I resemble others in the world, it is assumed that I share their mental precepts – in the case of capitalist society, that my interests are different to theirs, that each of us is self-interested. We should also note that, in terms of personal identity, continuing physical resemblance from day to month to year to decade is taken as evidence of continuing mental sameness.

In short, pattern narrative is the central concept of the socialisation process. For the narrative of citizenship to exist, and to continue, the human person must identify with the historical narrative of “their country”, see this narrative as ‘natural’, as being part of their own narrative. They must find in their own actions and thinking reflections of “national characteristics”. Yet an essential part of this “national narrative” is the individual as individual, a subject who has their own wants/desires/needs, which are unconnected with those of the other individuals who, en masse, comprise the “nation state”. The interests of these individuals stand in no relation to each other, unless ‘activated’ by the state in regard to competition with other states (economics; war; sport). If we extrapolate this to the relation between individual and state, we can see that the imposition of narrative on individual/individual, individual/others and individual/state is entirely arbitrary.

Narrative also occupies a central position in our ideas of free will and determinism and, subsequently, in the capitalist concept of ‘morality’ (insofar as one can claim that capitalism is capable of a morality – I’m simply using the term here as indicative of my point). The centrality of narrative establishes the “terror of causality”: “If I do X, then Y or Z might be the result”. Thus, we have a consequentialist narrative. However, within capitalist society, this consequentialist narrative is shorn of its “moral thinking” or “moral aspects”, becoming a simple case of asking oneself “If I do X, then what will the consequences be for me?” and “If I do X, how will my actions be judged by others?”. In regard to the latter, we can see manifestations in the current fashion of companies ‘greenwashing’ their activities or universities establishing DEI policies (and a host of others). The emphasis here is on being seen to do, rather than there being any sincerity involved. The distinction is between appearance and reality (see the discussion between Ron and me in the comments section of the previous entry) or, to put it another way, between avoiding the company/university being sued and their being able to blame an individual for “not following our stated policies”. What these policies do, of course, is establish the institution/individual relationship as one based on mistrust; unless told to do so, the employee will “shirk their responsibilities”. This is an important factor in the narrative of business, introducing ‘mistrust’ as a ‘natural’ relation between employer and employee. What this achieves is the infantilisation of the employee, who must be guided, for their own good and that of others, by the employer.

What is built into the pattern narrative is judgement: that fear that ‘others’ will judge my actions, and my (encouraged/inculcated) ‘delight’ in judging the actions of others. Thus, the consequentialist narrative is internalised through constant exposure from infancy. We fear the judgement, therefore, the disapproval of others in society; in our use of social media, this has led to us actively seeking the validation of others. However, what social media has also led to is the rise of populist politics by facilitating anonymity when expressing racist, sexist, homophobic etc. views – these being based on judgement, but also on a refusal to (try to) understand the lives of others. Social media has performed the same kind of function that radio and print fulfilled in the 1930s, albeit on a far wider scale.

We can also observe the ways in which the capitalist class have adapted narrative over the past decades. Those who do not ‘prosper’ are represented as ‘lazy’ or ‘workshy’, their poverty a result of their own lack of ‘drive’. The varied concepts of ‘success’ that once existed have been replaced by a single definition that is financial. Morality, defined as caring about others in virtue of their being human, is depicted as a weakness. Fear of “the other” is a staple item of news programmes (whether that ‘other’ be a terrorist, a refugee, an immigrant, another country etc.). The “business model” is posited as being a “universal good”, its methods applicable to healthcare and education. Personal relationships have become transactional or contractual. Relationships per se have followed these forms too. As Hobbes claimed, life is apparently “solitary, poor, nasty, brutish and short”…

What is to be done? We can identify these ‘elements’ for what they are: ideological constructs. Concepts like “human nature” and “the individual” are functions of power; they have meaning only within capitalist discourse. Dismantle/destroy that structure – or analyse the structure – and we recognise them for what they are: meaningless terms that privilege the ruling class.

The analysis of pattern narrative then reveals the ideological ‘weapons’ of the ruling class. In regard to what we might call the ‘intersection’ of rights and morality, one can argue that each citizen has a set of rights conferred on them by the state, delivered through a set of “moral narratives” – biblical, mythical, anecdotal – which the citizen then proceeds to ‘rediscover’ in the other myriad narratives which they perceive as comprising their ‘real’ lives. One can ask if rights are predicated on ought/should or ought not/should not. If the latter, then rights appear to enshrine self-interest: compare “I have the right to walk down the street without being assaulted” with “I have surrendered my right to assault others, therefore, they should not assault me”. Could we call this a ‘negative’ narrative? This would be Hobbesian, in that each of us is born with an absolute right to do anything we please, but we surrender certain rights in order to gain others. This makes moral thinking redundant: the idea is not that “Doing X is wrong”, but “I will not do X because I do not want it done to me”. This tends to imply that “If I can do X without being caught, then X did not occur”, in that there is no moral value accorded to X – we cannot say, for instance, that “If I can do X without being caught, then X is ok” because this implies a moral value and that I realise that “Doing X is wrong”. What, however, is the case if this latter formulation is true? “I know that X is wrong, but I still intentionally do X”. This means, bluntly, that I am prepared to behave immorally to benefit myself (in some sense or other). This is the ‘morality’ of capitalism, in that it is the ‘morality’ of the self-interested individual. 
Yet this is quite obviously immorality, as a necessary condition of morality is the consideration of the well-being of others – a condition which is discounted as meaningless in a capitalist system; in this system, morality is a function of power, which does not apply to the powerful, only the powerless. ‘Morality’ is sleight-of-hand, used to distract from the primacy of the economic; it is allowed to exist, as Baudrillard argues, provided it does not interfere with profit. What we see is the economic being “dressed up” as the moral: redundancies must be made to ‘protect’ “economic viability”. The economic is prioritised over the human person: redundancies are ‘unfortunate’, as is poverty, destitution, powerlessness. Capitalism corrals ‘rationality’ whilst having irrationality as its basis.

The Fragmentation of Unity

At the centre of the capitalist project lies the contradiction between the individual and community. The former is fetishised, while the latter is divided into two: (i) An (unachievable because undesirable) aspiration and, (ii), a lost “golden age”, always past, always nostalgic.

Community is the staple ingredient of the soap opera, embedded into a mythological working-class setting that parades stereotypes: honest but poor and happy; the centrality of the pub as “community hub”; small, closed spaces; criminality; get rich quick schemes; substance abuse; domestic violence; respectability; the villainous rich; the lazy poor.

The general notion is a constant reinforcement of “they may be poor, but at least they have each other”. However, when analysed, one finds a confirmation of the individual. ‘Faults’ are personal, the political and the historical play no role. The spectator/reader is invited to speculate on “What happens next?” using their knowledge of past narratives and character traits to ‘guess’ the coming patterns. This forms a conspiracy of superiority between producers and spectator/reader, the latter always being in possession of more knowledge than the characters. This is not unique to soap opera, it is a feature (perhaps the feature) of all pattern narratives. There will be no surprises, no accusations, however ‘softly’ posited, that the socio-political or socio-cultural is in some sense or other responsible for human personality or action – this will always be, in mainstream TV and film (and novels), the ‘fault’ of the individual. The unity of fragmentation resides in this.

This fragmentation is installed in us from an early age, by the relation of fictional to realist (?) narrative. The fictional narrative has its roots, in English, in the Puritan spiritual autobiography and the character narrative of the 17th century. In these, an individual at the end of their life recounts the narrative of that life as a ‘guide’ for the younger generation. Given its origins in Puritanism, this necessarily focuses on how one can lead a ‘good’ life, thus, ensuring that one becomes one of “the elect” and is admitted into Heaven (N.B. There was no way of knowing, as a Protestant, if one would “make the list” of the elect, one could only lead a good life. Catholicism, on the other hand, enabled penance and the forgiveness of sins through confession…there was also the ability, as we see in Chaucer, to buy forgiveness). What we have then, by the beginning of the 18th century, is a class of persons who believe in the idea of “individual responsibility”, and a narrative structure that enshrines this. Hence, the appearance of the novel in English with Robinson Crusoe (N.B. Many of the early novels take the name of their central character as titles, so what we are reading is a quest narrative of an individual’s experiential ‘journey’).

We can take ‘narrative’ here to mean a connected story: a series of events unified by a single individual and by causal connection. What we can distinguish is, as I say above, two ‘types’ of narrative: (i) fictional narrative, exemplified by the novel and mainstream film and, (ii), the ‘realist’ narrative of one’s experiential life. However, (i) is predicated on, and structured by, cause and effect (X causes Y). In the interests of sustaining interest, the causal chain may be terribly convoluted BUT there is, nonetheless, a necessary connection between each ‘link’ in the chain. For example: Cause (A) and Event (B) + Cause (B) and Event (C) constitute G(roup) 1; Cause (C) and Event (D) + Cause (D) and Event (E) constitute G(roup) 2, and so forth. Say we have 12 of these groups; seemingly, there is no connection between G2 and G12, yet there must be – if character X had not bought a newspaper on a particular day (G2), then she would not have been struck by a car (G12). This is the structure of fictional events.

What we must recognise is that (i), fictional narrative, becomes the cause of our thinking that the separate and distinct (discrete) events of (ii), realist narrative, are causal.

As infants/children we acquire language by being (a) told stories and by, (b), reading stories. Such stories appear to ‘reflect’ “real life”, in that they are told in a terminology, either linguistic or visual, which surrounds us in our experiential lives. The Western conceptual scheme is predicated on “making sense” of discrete events…which we might ‘translate’ as “imposing order upon”. The child hears/sees this or that story (be it a fairy tale or, say, a narrative of their parents’/grandparents’ lives – the latter being another imposition of order, the former an interpretation of the ‘rules’ governing [social] order), then this or that film/TV programme (be it cartoon or ‘educational’ narrative), each of which, regardless of content, is structured by logical causality. Thus, from an early age, the child imports and imposes this logical causality into/onto the events of their own lives. Thus discrete (random?) events become connected – partly as a psychological bid for security, partly as a result of engaging in a process that is, by most people, taken for granted.

I was once involved in a discussion about installing a 360° projection in a gallery. Now, whether this projection consisted of seven thousand images or seven, the spectator/reader (from now on S/R) will formulate connections, will construct narratives, even if the artist denies that these exist. We can also foresee an instance of a narrative being constructed if the S/R was told that a projection existed when all they could see were ’empty’ walls. I’ll forego a discussion of the concept of ’empty’ at this point.

We can see this fascination with, and necessity of, constructing narratives as a product of human psychology – the desire for predictability, safety and security. That is, to see ‘middle’ and ‘end’ follow from ‘beginning’, although it appears that all ‘beginnings’ are devised in retrospect. Hence the popularity of fictional narratives – literature, film, music, computer games – and their characterisation as “being like life” when, in fact, precisely the opposite is the case: “real life” has the structures of fictional narrative imposed upon it, and fictional narratives generate the psychological desire for predictability, safety and security.

Karl Popper claimed that what distinguished the human person from an animal was the ability to tell stories and to interpolate themselves into those stories. Thus, we are all central characters in the stories of ‘ourselves’ (my consciousness is mine; it cannot be anyone else’s, nor can I impute my ‘kind’ of consciousness to anyone else with any degree of certainty).

What we can deduce from these factors is that the overarching ‘motivation’ (inscribed by society’s use of narrative) is the human desire for ‘completeness’; that is, an epistemological desire produced by, but at one and the same time contradicted by (defeated by?), the conceptual scheme contained within our language, which is structured as an endless deferral of interconnected meanings. This ‘scheme’ appears to generate a search for an Archimedean point: a search for a stable foundation from which we can approach and make sense of the world outside ourselves. As a consequence of this search, we impose ‘sameness’ on both other subjects and objects, in that we insist on similarity even when it is not present – if it ever is. We might say that what remains the ‘same’ is the perceiving subject, not the subjects or objects of one’s perception. However, is it possible to say this, given that each perception modifies all of those which have gone before? Thus, the perceiving subject is also constantly in flux.

The Fragmentation of the Subject

Art comes, as I’ve said, from dissatisfaction and conflict, so how does it end up promoting what it critiques? Put another way, how does capitalism repurpose and weaponise artefacts which are profoundly at odds with its ideology?

By historicising these, then incorporating them into the system of “educational metrics”, coupled with the idea that developed over the course of the 19th century that the artist doesn’t live in the “real world” inhabited by ‘us’, Art’s ability to cause change and raise issues has been neutralised (in the West – the Soviet bloc had a rather different attitude: artists were imprisoned, even executed, for their critiques). Art has been reduced, for the vast majority of the population, to a series of “celebrity individuals”, cult figures and disconnected mavericks, all of whom occupy the rarefied atmosphere of “the Art World”. Print media has, for years, sneered at artefacts such as the bricks in the Tate, the white canvas slashed by a carpet knife, the shark in formaldehyde…the urinal by Duchamp. They make a determined effort to ignore the meanings of these works, focusing instead on their monetary value. They attack those who interpret Shakespeare for the contemporary age because, in the “English-speaking world” (do we just include countries where English is the first language, or do we count the colonies too?), Shakespeare is the ultimate, fetishised writer, more than a writer, a touchstone of “British genius”…

Shakespeare, we are told in school, “catches the truth of human nature” in his language and, therefore, his plays. His characters represent what people are ‘like’. This is a fait accompli, for what 13/14-year-old knows what people are like? Yet once the pattern is in place, once we’re given those points of comparison, we make them. Of course, the most insidious aspect of Shakespeare is the failure of change: disorder may occur but, by the end of each play, order is restored with little altered except the rulers’ names. The plays instruct us that action, rather than thought, is the key to ‘success’ – Hamlet and Richard II; that the upper classes are fitted to rule; that hierarchy is a virtue, and we should know our place. It is hardly surprising that Shakespeare is still, in 2024, seen as a necessity in ‘education’.

Shakespeare is the obvious example, but what of other artists? Their works are seen as historical artefacts, their meanings employed in the service of “Well, people have always thought like this. Human nature doesn’t change. You have to make your own, individual life.” What’s the message of, say, Eliza Haywood’s Betsy Thoughtless (a work, incidentally, that had to be ‘rescued’ from obscurity) in regard to the position of women? From a capitalist perspective, the novel ‘tells’ us that women have always had these problems with patriarchy because that’s just the way things are. No heed is paid to how profoundly depressing it is to read this novel in 2024, 270+ years after publication, and see little, if any, change. Much the same can be said of all the 19th century authors. They have been assimilated into the capitalist project of “proving human nature”. Simultaneously, the artist is told to stick to art – political commentary, social conscience, indicates that X is not a ‘real’ artist. Thus, the artist is subjected to claims of inhabiting the ‘unreal’, yet any attempt to be political is met with derision because they “don’t live in the same world as the rest of us.”

In film, an artist like Eisenstein is now studied in terms of form and technique rather than critique. Godard and Resnais are obscure, of passing interest on the way to Hollywood. All three are ‘taught’ as being of historical interest as ‘Arthouse’ or ‘Alternative’, posited as displaying a ‘lack’ when compared to mainstream film. Someone who becomes fascinated by them is immediately marginalised. The timeline marches on, its existence acting as a neutralising factor. Artefacts are compared, ‘influences’ discovered, connections made; all of which lead to the power of specific protests – the thing that made the artist create in the first place – being diluted.

A central problem here is the way that Art, Philosophy, Sociology and Politics were fragmented into different ‘subjects’ in the early twentieth century. Literature, for example, is studied as a self-contained “body of evidence”, complete in itself, with a distinct and separate timeline. This is something, I have to admit, that I’ve never been able to accept – I’ve always found literature to be entwined with the political, the psychological, the sociological, the philosophical. We can say the same for the other ‘subjects’ that I have just listed: surely to consider philosophical concepts and ideas without considering the necessary and sufficient conditions that existed at the moment of their production means that one has missed the point? This, of course, leads into a sociological consideration, a psychological consideration and so on. To claim that we can somehow separate ‘out’ such considerations from one another is indicative of the ways in which capitalism colonised the university in the 1920s. In order to produce a saleable ‘product’, knowledge was reduced to its component parts BUT, in order to facilitate the metrics, it was denied that these parts could come together in a coherent whole.

Philosophy is particularly bad at historicising, which is surprising given that it claims to be the ultimate ‘subject’ – an idea taken from Aristotle: apparently, the aim of man (because this is what he says in Greek, and this is how it has always been translated) is contemplation. However, even a passing acquaintance with Greek history situates this remark in a society that endorsed slavery and enshrined the inferiority of women…which rather changes it. Much the same with Kant. Are we to believe that his ideas sprang fully armed from his own head? There is a timeline in philosophy, but it focuses exclusively on prior philosophers (all of whom were, apparently, men). It’s worth noting Wittgenstein here, a man who did not formally study philosophy, but produced two of the most influential philosophical texts of the twentieth century, effectively shifting the entire focus from epistemology – what can we know? – to how we talk about what we think we know, the philosophy of language. The ‘slogan’ of Wittgenstein’s second book, Philosophical Investigations, is “Don’t ask for the meaning, ask for the use”, (re)asserting context.

Yet even Wittgenstein cannot escape metrics. I’ve answered exam questions, written essays, on what Wittgenstein ‘means’, what his position is compared to Ayer or Russell. What occurred to me then, as now, is “Should this be asking what material conditions caused Wittgenstein to write this? How does it apply to society now?”

What I’m getting at here is capitalism’s central ‘trick’: if in doubt, keep fragmenting the object (and the subject), then deny connectivity. Thus, there are more ‘products’, and fragmentation becomes a powerful way of controlling thought, therefore, controlling the human person. This is nowhere more obvious than the worship of “the individual” in our society.

In the post-WWII world, the concept of “the individual” became an ideological weapon in the war against communism (as the Americans saw it). This concept was associated with Western notions of ‘freedom’ (as Badiou says in The Communist Hypothesis, this revolves around the ‘freedom’ to own property, to become rich, which is, apparently, “…the guarantee of all other freedoms.”) yet, at one and the same time as individuality is stressed, there is an equivalent fetishisation of ‘community’ – even though this was, infamously, disputed by Margaret Thatcher: “There is no such thing as society. There are individual men and women and there are families.” Despite this, ‘community’ is still posited as something one should aspire to, even though this is unachievable in capitalist society. The idea of community is bemoaned as something lost, fallen victim to modernity. The ‘breakdown’ of community mirrors that of the family, both devices of fragmentation (perfect patterns) that create a tension within “the individual”, who is encouraged to aspire to something forever beyond their reach…mainly because it does not exist. Television, through soap operas, sitcoms and advertisements, models these fictional entities which, while using dramatic fragmentation as a device, hold out a ‘hope’ that if these characters could only do X, Y or Z, everyone would be reconciled. This is part of their attraction: the spectator/reader is drawn into constructing imaginary ways to ‘solve’ these on-screen problems. The ‘solutions’ are always represented as necessitating changes in individual psychology, rather than in the socio-political environment. The ‘blame’ is placed squarely on the individual and their ‘failings’. Television drama never asks the obvious question: What has caused this person to become like this, to behave in these ways? If this type of question is even approached, in steps the next culprit: the family…to which the obvious retort is “What caused this family to become like this, to behave in these ways?”, a question which leads us back to “the individual”. A vicious circularity that absolves environment from culpability.

What we also tend to see here, in terms of the final move in this pattern, is the introduction of ‘evil’. Thus the psychology and motivation of the thief or the abuser are explained: they are ‘evil’. That is to say, they are not “like us”; their values are different to ours (if they can be said to have values). The use of this term usually, to me anyway, indicates either a failure of thought or a refusal to think, a taking of the easy way out. It is a dismissive term that, when interrogated, has very little meaning; it indicates a refusal to understand on the part of the user (N.B. to understand is not to condone). What is it to use the term ‘evil’, other than a fragmentation of thought and of understanding?

Fragmentation plays a central role in our society. We are encouraged to divide our lives into ‘work’ and ‘leisure’, ‘family time’ and ‘me time’, ‘male’ and ‘female’ – all kinds of binary oppositions. What would happen if we were able to take note of Marx’s dictum that work is the greatest expression of who you are? A suggestion that money should not be the main motivator of what we do; satisfaction should be – a holistic approach, impossible in a society that insists on the fragmentation of ‘tasks’.

Even those professions that used to enable satisfaction are now being brought under control, using a spurious notion of ‘unity’ and ‘transparency’. That, however, is the next entry…