The Anxious Age – Is Post-Postmodernity Defined by Mental Distress?


 

 

Introduction: Countermodernism

 

In 1979 Jean-François Lyotard (1924-1998) over-simplistically defined Postmodernism as “incredulity toward metanarratives”, an incredulity which was “undoubtedly a product of progress in the sciences”.[1]  More simplistically still, this means that the theoretical machines of Modernism and before were rapidly ceasing to function as science progressed.  In Lyotard’s words, “postmodern scientific knowledge cannot be based on performativity, because efficiency must be calculated based on a stable system”. Nature and society are not stable (or closed) systems, and as such it is counter-intuitive to evaluate and accept universal truths derived from the variables of those systems. The truth-value in a stable system is inherently homogeneous and tyrannical – it will always favour the knowledge-bearer and work towards hegemonic disparity (knowledge is power).  Metanarratives, then, are told from a position of privilege or State, and any events of knowledge-based ontogenesis have their relative successes and progressions judged against a minor system of inconsistencies and innovations.  “New moves,” as Lyotard called them. As is demonstrated by dictatorships, control does not promote innovation, but rather homogenises the system.  This echoes in many ways Gilles Deleuze (1925-1995) and Félix Guattari’s (1930-1992) analogy of Chess and Go, whereby the game of chess represents the State, while Go epitomises the War Machine (the difference between a closed or stable system and an open and unstable one).[2]  Knowledge in the postmodern world is about change, adapting to it, and generating new ideas (nomadology), not adherence to an established, rigid scientific method. In 2019 we are indeed sceptical of grand narratives, yet our faith in science is perhaps greater than ever.  Mental illnesses, which are too often ascribed to biological causes, are medicated with confidence by general practitioners, and ingested blindly by the patient.

In Lyotard’s own terms, this precarious era following Postmodernism may then be defined as a blind credulousness toward the metanarrative of the sciences.  This will be neither proven nor disproven since, like Lyotard, I invite and anticipate paralogy (Lyotard again, referring to a discourse without consensus – an open system), though I do hope to add my voice to the already rich – somewhat cacophonous – pool of perspectives to be found in contemporary theory.

No discourse is ever initiated truly objectively: the writer always has a vested interest in the points made and the results found, contrary to any claim against this.  For is it not the initial spark of enthusiasm for any topic which drives that writer onwards, and is this not in and of itself self-interest?  My own vested interests are first-person perceptions of parenting a child whose behaviour is erratic, violent, recalcitrant and without context.  I propose to show how this behaviour, which is increasingly commonplace, carries not just biological hallmarks but is also linguistic in nature.  This perspective parallels Mark Fisher’s (1968-2017) belief that mental illnesses (in particular bipolar disorder) are the consequence of life in a neoliberal world rather than of a biological imbalance, and as such it is vital to the spirit of this study to compare Postmodern theorists such as Deleuze and Guattari (who maintained that culture is biological) with the latter-day writings of Fisher.

This study takes the singular hybrid form of a third-person autobiography which will jump in and out of first-person perspective and critical analysis.  It does this because in order to evaluate the minor narrative it behoves us, in the manner of Postmodernism, to also evaluate the metanarrative.  In short, while I will be questioning the sources of my information, it will be equally important to evaluate my own narrative – to question the contingencies which have occasioned my experiences and shaped my perspectives thereon.


This is by no means the first attempt I have made to link our present social, political and cultural epoch to mental health.  In 2016 I wrote:

 

“(t)he schizophrenic is subject to fragmented thinking and delusions, synthesises words which make only subjective sense to the patient; repeats words and phrases over and over, each time as if for the first.  The schizophrenic displays a lack of emotional expressions, shows little to no enthusiasm and exhibits repetitive, jarring speech abnormalities.  Studies have shown that a key environmental factor in the onset of schizophrenia is childhood separation or loss – dislocation from a previous generation.  The post-millennial condition is schizophrenic in all of these factors and more.  Perhaps the most critical similarity, though, is in the delusional impersonation of established personalities of import without prejudice to the historical or mythological frameworks in which they belong.”[3]

 

If the term “schizophrenic” is appropriate to our times – if indeed the metanarrative is, in itself, schizophrenic – then it is the state which promotes a fractured society.  Not only, then, is this a cultural matter – it is universal in its scope and implications.  Let us begin with the following statement, and allow this study to take us where it may:

 

“10% of children and young people (aged 5-16 years) have a clinically diagnosable mental problem, yet 70% of children and adolescents who experience mental health problems have not had appropriate interventions at a sufficiently early age.”[4]

 

What does this extract from a 2008 study suggest about our present age?  First, it speaks of a huge disparity between mental illness and our understanding of it.  Secondly, and more perniciously, it can only mean that healthcare professionals can no longer keep up with caseloads which grow year on year.  Numbers alone can neither add weight to a hypothesis nor take it away – if there are 10,000 children, and 10 percent of those have mental illnesses, then of those 1,000 children 300 have had sufficient intervention, while 700 have not.  And how do we qualify the sufficiency of “sufficient intervention”?  Has it been sufficient to bring the young person back from the brink of crisis and prevent self-harm or suicide, or has it been sufficient to provide them with the intellectual and rational tools to live lives relatively free of such crises?  Furthermore, the word “diagnosable” invites all manner of discourse, for is not science here limited to a pre-established metanarrative?  Psychiatry and paediatrics follow strict diagnostic criteria built on decades-old research, focused all-but-exclusively on biological study.  The linguistic question in such studies is a secondary concern, while any study of psychological abnormalities in our present age must, by necessity, bring linguistic and environmental factors more into focus.

It is for this very reason that I eschew statistical data: it belongs to the fixed sciences, those closed systems which are regulated by serotonin and its parameters of activity.
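By way of illustration, the calculation embedded in the extract above is exactly this kind of closed-system arithmetic – a minimal, purely illustrative sketch, assuming a hypothetical cohort of 10,000 children:

```python
# A minimal sketch of the closed-system arithmetic discussed above.
# The cohort of 10,000 is hypothetical; only the 10% and 70% figures
# come from the 2008 study quoted earlier.

cohort = 10_000
diagnosable = cohort * 10 // 100                 # 10% -> 1,000 children
without_intervention = diagnosable * 70 // 100   # 70% of those -> 700
with_intervention = diagnosable - without_intervention  # the remaining 300

print(diagnosable, with_intervention, without_intervention)  # 1000 300 700
```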

 


 

 

 

Episode I: PostNormal Hyper-Reality

 

The trouble begins in 1992 at the age of 14.  Nothing out of the ordinary happens that year – no bereavement, no stress…no perceivable external reason for the boy to make an attempt on his own life.  The only warning sign is a gradual onset of ennui, a sense of hopelessness and despair the like of which the boy’s parents are at a loss to account for.  One evening it occurs to him that the most sensible, rational act would be to overdose on painkillers, go for a walk and allow the drugs to destroy his organs.  When he wakes up in the hospital, the boy is overcome with the certainty that this will be the first of many such occurrences.  Over the years the boy will accumulate no fewer than six hospital admissions on his medical record, of varying degrees of seriousness.

 

In hindsight the boy (quite patently a reactivated prosopopoeia of my younger self) recalls that 1992 was likely one of the first periods of his life in which the full pernicious implications of the neoliberal orthodoxy became perceptible.  That year we marched in London in protest at the Poll Tax, and that same year (contrary to previous undertakings to abolish it outright) it was announced that the Poll Tax would be replaced by Council Tax in 1993 (thus was born the age of rebranding, which continues to the present day).

 

The 1990s saw the dominance of the SSRI (Selective Serotonin Reuptake Inhibitor) over the field of mental health and psychiatric treatment.  Commonly referred to in the US by the brand name Zoloft and in the UK as Lustral, Sertraline is the most commonly prescribed SSRI, replacing Prozac in the early Twenty-First Century as the by-word for antidepressants.  I first began taking Sertraline in 2002, but it proved no difficulty to find someone with more recent experience of the drug.  This user, who wished to remain anonymous, described the first sensation as euphoric: “slightly drunk, without the loss of inhibition.”  The subject also went on to suggest that there may be a placebo element to SSRIs: “after the initial rush of happiness, it felt like I was carrying on the course of medication just to avoid withdrawal – which the doctor warned me would be unpleasant.”

 

Perhaps we are looking at antidepressant medication in the wrong way.  Instead of providing a means of coping with the world around us, could it not be the case that SSRIs actually create docile bodies?  Slavoj Žižek famously said of John Carpenter’s They Live:

 

“…definitely one of the forgotten masterpieces of the Hollywood Left. … The sunglasses function like a critique of ideology. They allow you to see the real message beneath all the propaganda, glitz, posters and so on. … When you put the sunglasses on you see the dictatorship in democracy, the invisible order which sustains your apparent freedom.”[5]


Sertraline, and medications which act in similar ways, function as the sunglasses in this example, yet with the opposite effect.  To put the sunglasses on is to become blind to the insidious processes of control and power which govern society.  A cursory glance at the listed side-effects of Sertraline seems to confirm this:

 

  • depression, feeling strange, nightmare, anxiety, agitation, nervousness, decreased sexual interest, teeth grinding,
  • shaking, muscular movement problems (such as moving a lot, tense muscles, difficulty walking and stiffness, spasms and involuntary movements of muscles), numbness and tingling, abnormal taste, lack of attention,
  • visual disturbance, ringing in ears,
  • palpitations, hot flush, yawning,

 

As a long-term user of Sertraline, I can attest to all of these side-effects, and more.  The decreased sexual interest is of particular importance, having a Lacanian frame of reference.  Is this not the virtual definition of castration?  To repress desire in this way is to dissolve the Oedipal Stage, revealing the “real” father – which in this case is the State itself, and we can place more stock in Lacan here than in perhaps any other contemporary theorist. If we contrast the figure of authority of a century (or even half-a-century) ago with the figure of authority of today, the difference is vast.  What was once domestic and proximal (the literal father) is now global and distal (the state apparatus), and this latter has no need even to interact directly with us: it has proven much more effective for us to actively regulate ourselves.  We surrender our desires to the desires of ideology, allowing the neoliberal clinamen to prevail.  While it is no great revelation to say that governments and ruling elites control society in ever-more pernicious ways (proving this would indeed be akin to proving that water is wet), there is also the neoliberal concept of “post-truth” to contend with, which paints our political climate in colours far brighter than those of reality.  Anthropologist Alexei Yurchak coined the term “hypernormalisation” to describe the attitude of paradoxical political blindness which permeated the Soviet Union in its final years[6], and the progression from this to what Mark Fisher called “Capitalist Realism” is self-evident: while the former pretends that the climate is functional, the latter knows that the opposite is true, yet cannot imagine an alternative.  Fisher in fact takes his cue from the State Realism of the Soviet Union and its propagandist machinery: “Realism has nothing to do with the Real. On the contrary, the Real is what realism has continually to suppress.”  SSRIs are the perfect societal adjunct to this state – what better way of maintaining the illusion of stability than freely giving subjects the perception-managing drugs they crave?

 


 

 

Episode II: Psychomodernism Vs. Schizomodernism

 

The child screams.  Perfunctory sounds no longer suffice to convey the ever-more complex thought processes in his head, nor do the articulated phonics which he has learned to parrot back at his father allow these abstractions to manifest themselves.  In his frustration the child begins to slam his head against the living room wall, once, twice…until the very succession of this action has deadened the images in his mind that he cannot yet begin to grasp.  The skin on his forehead is aflame with pain and the wall retains the crimson memory of the boy’s rage.  The child’s terrified father sobs as he applies a towel-wrapped bag of frozen vegetables, grabbed in a panic from the freezer, to his son’s head.  As these incidents increase in both frequency and severity, the child’s parents naturally seek medical help, only to find themselves subjected to the most tortuous cross-examinations and intense scrutiny.

 

This is, without question, a linguistic problem.  And, sure enough, as the child’s vocabulary becomes more sophisticated, so the violent head-banging decreases.  This would, ordinarily, serve as the happy ending scenario to a troubling-yet-not-altogether-atypical parental crisis.  However, over the course of the following years more troubling symptoms develop: an explosive aversion to fire alarms, the sound of a hand dryer in a public toilet, an intolerance for clothing…all of these point towards behaviour typical of the autism spectrum.  At 7 years old the child is removed from his school following an attempt to strangle a classmate, and what follows is a nearly two-year diagnostic period during which the child is assessed by numerous professionals in order to gauge speech and language (to satisfy the linguistic question), psychology (to address mental health) and paediatrics (for autism-related issues).

 

There are also the protracted and focused attacks by the child upon his mother, resulting in the latter’s body (in particular her lower torso) being covered in bruises and swelling.  Occasionally these attacks are facial, and black eyes become common.  And then there are the secondary effects.  The child’s mother is forced to forego employment during the week, surrender her studies (ironically in social care) and the relationship between her and the child’s father eventually becomes so jaded and warped that they end up finding one another again at the end of the process – only this time they are entirely different individuals to the couple who fell in love a decade previously: they are beaten, disillusioned…all promise of future prosperity scuppered by bureaucratic torpor.

 

It is now seven years since the child began to display troubling behaviour, and two years since his parents sought help from the Umbrella Pathway, a service provided by Worcestershire County Council to “provide an assessment process for all children and young people presenting with neuro-developmental disorders which may be due to Autism Spectrum conditions (not ADHD).”[7]  Among the non-diagnostic suggestions made by professionals is PDA:

 

“Pathological demand avoidance (PDA) is a behavioural profile associated with apparently obsessive non-compliance, distress, and florid challenging and socially inappropriate behaviour in children, adolescents and adults.”[8]

 

While PDA is generally described as a behavioural profile within the autism spectrum, professionals are by no means unanimous as to whether it belongs on the spectrum at all.  It is therefore referred to as a sub-type.

 

When I informed my older brother of this prospective diagnosis, his immediate response was to exclaim “you’ve just described yourself!”  Could it then be that there is a biological element which I have passed on to my son, and which has only become recognised scientifically in his generation?  In hindsight I recall my childhood carrying hallmarks of PDA: an aversion to authority, discomfort at regulation and intense feelings of suppressed rage.

However, what if the problem lies elsewhere, in the most pernicious and overlooked social evil: standardisation?  A hallmark of neoliberalism, standardisation regulates the mainstream of the state apparatus, covering all areas of government and the public/private sector.

 

The government’s own website states:

 

 

Standardisation is the process of creating, issuing and implementing standards. A standard is a document, established by consensus and approved by a recognised body. It provides rules, guidelines or characteristics for activities or their results so that they can be repeated. They aim to achieve the greatest degree of order in a given context.[9]

 

The two words which jump out here are “order” and “repeated.”  These hallmarks of meta-power can be traced to Deleuze and Guattari’s notion of the state apparatus and the war machine, and to Michel Foucault, who would say that “order” in this context can be translated into “discipline” in order to produce normalisation and therefore “docile subjects.”

 

Episode III: Dromomodernism and Aggressive Desublimation (There Can Be No Conclusion)

 

In Precarious Rhapsody, Franco Berardi states:

 

“The acceleration of information exchange has produced and is producing an effect of a pathological type on the individual human mind and even more on the collective mind. Individuals are not in a position to process the immense and always growing mass of information that enters their computers, their cell phones, their television screens, their electronic diaries and their heads. However, it seems indispensable to follow, recognise, evaluate, process all this information if you want to be efficient, competitive, victorious. … The necessary time for paying attention to the fluxes of information is lacking.”[10]

 

In a hypernormalised world of post-truth, what better way to control a people than to bombard them with a constant strobe of information parcels?  As Twitter feeds accelerate and Facebook becomes ever-more hyperbolic, so too do our levels of anxiety.  Can PDA be rooted in a biological neurosis which is exacerbated by linguistic factors?  Paul Virilio (1932-2018) argued that “there was no ‘industrial revolution’, only ‘dromocratic revolution’; there is no democracy, only dromocracy; there is no strategy, only dromology.”[11]  Dromology is derived from the Greek dromos, which refers to the activity of racing – ergo speed and acceleration.  Dromology, then, is surely how we should countenance the flow of information in the modern age. There can be no true conclusion to this study: as linguistic, cultural and social factors multiply and accelerate, we can only wait to see how our biological and linguistic bodies cope…if, indeed, they can.

 


 

 

[1] Lyotard, J., Bennington, G., Massumi, B. and Jameson, F. (2005). The postmodern condition. Manchester: Manchester University Press.

[2] Deleuze, G., Guattari, F. and Massumi, B. (2017). A thousand plateaus. London: Bloomsbury Academic, pp. 523-551.

[3] Davis, G. (2016). No Job for a Grown Man (part six) – in Explication of the Schizophrenic Age. [online] Legally, I Own the Thoughts of the Dead. Available at: https://grumpusart.wordpress.com/2016/11/07/no-job-for-a-grown-man-part-six-in-explication-of-the-schizophrenic-age/ [Accessed 30 Apr. 2019].

[4] Children’s Society (2008) The Good Childhood Inquiry: health research evidence. London: Children’s Society.

[5] Žižek, S. The Pervert’s Guide to Ideology. British Board of Film Classification (19 June 2013).

[6] Yurchak, A. (2006). Everything was forever, until it was no more. Princeton, NJ: Princeton University Press.

[7] Hacw.nhs.uk. (2019). Umbrella Pathway. [online] Available at: https://www.hacw.nhs.uk/our-services/childrens-community-health-services/umbrella-pathway [Accessed 30 Apr. 2019].

[8] (Newson et al. 2003; O’Nions et al. 2014b)

[9] GOV.UK. (2019). Standardisation. [online] Available at: https://www.gov.uk/government/publications/standardisation/standardisation [Accessed 30 Apr. 2019].

[10] Berardi, F. (2010). Precarious rhapsody. London: Minor Compositions.

[11] Virilio, P. and Polizzotti, M. (2006). Speed and politics. New York: Semiotext(e).


Through the (Immediate) Past, Darkly

We are compelled to bookend the event, to portion months and years up into manageable, quantifiable volumes of matter and memory.  If time is the ultimate capitalist commodity, then our quantification of time is its currency, and it is in this way that the annual review-of-the-year rundowns which one can read in any given broadsheet or tabloid, or view on December 31st through a gaze of varying levels of ridicule, are – to all intents and purposes – its audit.  Is it of any value to do this from a philosophical angle?  And, indeed, wherein lies the point?  Is it in order to file away each successive year into a unitary index for the historian or sociologist to access at their convenience?  If so, are we not further commodifying our notions of time?  In order to give this interrogation some perspective, perhaps we would benefit from referring to Manuel DeLanda’s (1952 – ) excellent introduction to A Thousand Years of Nonlinear History:

 

“(I)f the different “stages” of human history were indeed brought about by phase transitions, then they are not “stages” at all – that is, progressive developmental steps, each better than the previous one, and indeed leaving the previous one behind.  On the contrary, much as water’s solid, liquid, and gas phases may coexist, so each new human phase simply added itself to the other ones, coexisting and interacting with them without leaving them in the past.”[1]


This being considered, is not our Gregorian inclination towards sectioning off units of duration rendered utterly meaningless?  Certainly it may serve to “time-map” individual and collective events, and it can be useful as a measurement of progress and decline.  It can, however, be of very little use to the contemporary thinker as élan vital.  It is futile to review a year in terms of its singularity.  What we think of as “time” is little more than the shifting of energy, the passing of matter from one state to another.  There never was, and never could be, a 2018: duration – that which we think of as “time” – necessitates each antecedent year forcing itself, as one, into the year being experienced.  This is Bergsonism at its purest (albeit, too, at its most simplified), and if we were to take that model of the present being nothing other than the past happening all at once – insofar as one may interpret such a complex idea out of its parole – then we may find a perfect analogy for our times: the Twenty-First Century can be defined by its concentrated repetition of the past, both culturally and politically, and perhaps provides us with the first significant parallels between Henri Bergson (1859 – 1941) – for decades dismissed as an antiquity of the old guard in philosophy – and Marxist poststructural thinkers such as Jacques Derrida (1930 – 2004) and, more recently, Mark Fisher (1968 – 2017).  For, if we were to take the concept of Bergson’s Duration out of the confines of analytic philosophy and place it in the broader spectrum of Critical Theory (and we surely can, as was amply proven by Gilles Deleuze (1925 – 1995)’s re-interpretations of Bergsonism), then what we are presented with is a like-for-like match with what Fisher called The Slow Cancellation of the Future[2], the “temporal malaise” which is so much a hallmark of contemporary culture that one is hard-pressed to discern between that which was created last week and the artefacts of the 1970s and 1980s.


 

Contemporary thinking, I would suggest, has all but eradicated the notion of a sole Philosopher King stood atop his plateau (to borrow the Deleuzian analogy).  This has been the case for several decades.  In his 1957 foreword to the second edition of Critique of Everyday Life, Henri Lefebvre (1901 – 1991) notes

 

“…professional philosophers generally ignored the book; for – starting with its title – it entailed relinquishing the traditional image of the philosopher as master and ruler of existence, witness and judge of life from the outside, enthroned above the masses, above the moments lost in triviality, ‘distinguished’ by an attitude and a distance.” [3]

 

Philosophy has, over the years, necessarily been a process of cross-pollination of thought.  In Elemental Discourses, John Sallis (1938 – ) writes

 

“(i)n Derrida’s texts there are many voices. Some occur as citations from Husserl, Heidegger, or other authors. Yet, in the strict sense whatever is set forth in citations is not the voice of another but rather a passage from a written text. Even if what is cited should happen to be words once heard in the voice of another, they will, in being cited, have been transposed into the written text; in this transposition the voice of the other will have been silenced. And yet, we sometimes attest that in reading the words of an author we can hear his voice behind the words, that we can hear it silently resounding.”[4]

 

Thus we may observe the clinamen of Lucretius evolve throughout the ages and become Deleuze and Guattari’s desire, accounting for the clinamen’s inclination towards capital.  Indeed, the most pertinent and evolutionary use of philosophy is to commandeer from its massive historical inventory of themes and ideas – and, it can be argued, this is how philosophy finds its true meaning (Deleuze and Guattari themselves said “the only question is which other machine the literary machine can be plugged into, must be plugged into in order to work.  Kleist and a mad war machine, Kafka and a most extraordinary bureaucratic machine ….”[5]).  One need not absorb every text by Foucault, or attend two-hour lectures on Lacan, to gain a healthy reserve of critical resources with which to formulate one’s own theories. Plato’s cave and Wittgenstein’s stonemasons serve very well as building blocks for an architecture of language and socialisation, reality and simulacra.  These ideas the modern philosopher must osmose and re-interpret, modify and apply pressure to, and for that very reason philosophical models function in much the same way as art: intense critical thought and complex abstractions simplified to the nth degree as signs, giving flesh to otherwise untranslatable concepts: art builds real architecture in Utopia and peoples it accordingly, yet it draws its strength from its ability to topple said architecture and rebuild.  Artistic movements provide the zeitgeist for this architecture, and these zeitgeists are the very agents of its destruction and reformation.  Much to Plato’s imagined chagrin, art is in many ways inseparable from critical thinking.

At any given moment, the human mind is subject to incalculable heterogeneous abstractions which superficially bear no relation to one another other than their chronological linearity – or the oft-cited stream of consciousness, that convenient one-size-fits-all coat with which lazy commentators have dressed such diverse literary figures as Beckett, Burroughs, Thompson, Joyce and Proust. Terms such as stream of consciousness exist to categorise that which has no formal category (other than, in this instance, that of literature).  But, if we again consult Bergson, thought processes are time in its purest state.

 

“Let us assume that all the sheep in the flock are identical; they differ at least by the position which they occupy in space, otherwise they would not form a flock. But now let us even set aside the fifty sheep themselves and retain only the idea of them. Either we include them all in the same image, and it follows as a necessary consequence that we place them side by side in an ideal space, or else we repeat fifty times in succession the image of a single one, and in that case it does seem, indeed, that the series lies in duration rather than in space. But we shall soon find out that it cannot be so. For if we picture to ourselves each of the sheep in the flock in succession and separately, we shall never have to do with more than a single sheep. In order that the number should go on increasing in proportion as we advance, we must retain the successive images and set them alongside each of the new units which we picture to ourselves: now, it is in space that such a juxtaposition takes place and not in pure duration.”[6]

 

The Twenty-First Century has, since 2001, been bereft of landmark political or social moments.  The key word here is “landmark,” indicating a fixed point in time after which the ideological apparatus in place before the event can no longer function, such is the impact it has on society, economics and culture.  The word “landmark” also implies space rather than time, yet this is no misuse: chronology and geography are intermingled in memory, creating those very ghosts which populate Derrida’s hauntology, and in keeping with the concept of hauntology, the most critical phenomena of the year occurred just as it was ending.  Two separate and distinct events happened no more than a week from one another at the end of December; superficially they bear little-to-no relation, but in fact they have great reciprocal significance.  Firstly, Charlie Brooker’s Black Mirror series gave us another instalment in the form of the feature-length Bandersnatch, which was closely followed by the announcement that HMV had gone into administration for the second time, His Master’s Voice now nothing but a pitiable whimper in the neoliberal wind.  As outmoded a capitalist model as it is out of touch with the times, HMV has, for decades, pre-packaged culture and sold it on as part of some great promise that what that culture represents is the very essence of what one needs in order to understand our times.


 

Within the first few minutes of Bandersnatch it becomes apparent that popular culture will never tire of revisiting the 1980s, as though that decade were both the genesis and zenith of our postmodern metanarrative.  And yet again, the past is shown to us through countless factual and technical filters – for instance, it is safe to say that nobody ever bought a Tangerine Dream album in WHSmith in the mid-1980s.  WHSmith, like HMV, represents the Harrods model of “everything under one roof,” which for a store that deals in entertainment and culture is a laughably hyperbolic claim.  Yet our memories of these shops, for those of us who had childhoods in the 1980s, portray them as precisely that, for our own undeveloped awareness of the sheer richness and variety of culture is reflected by HMV’s own limited scope of the same.  Thus, it adequately met our stunted expectations.  As culture and technology evolved at an ever-increasing rate in the latter half of the Twentieth Century, it soon became the case that this capitalist model of the third place serving as cultural nexus could never fulfil our ever-more-sophisticated understandings of culture.  This could partially explain our craving for nostalgia, as the artefacts of the 1980s remind us of the last time we were culturally satiated – the economy of craving and fulfilment was in balance (perhaps it is only in childhood that this balance is ever truly equal).  “Nostalgia,” though, as Simon Reynolds (1963 – ) points out, is translated etymologically as “homesickness.”[7]  Contemporary sociologists favour the notion of a fourth place in order to tackle the workplace/home environment crossover, but it is more accurate to re-identify The Third Place as increasingly virtual.

This is hardly surprising, since 9/11 shattered what was quite possibly the West’s final moment when an event was experienced collectively in The Third Place, and was the last “where were you when…?” moment in living memory.  As the Twenty-First Century has unfolded, global events have occurred in what feels like a steady trickle, owing not to a lack of event, but to the way in which events are now relayed to us.  In the seventeen years since the towers collapsed, the ingestion of current affairs has gradually slipped away from the static television screen and become something experienced singularly (one-on-one) through portable, streamlined devices.  Before the internet, the news was fed to us daily at precise quarters of the clock, with the 6 and 9 PM instalments reserved for in-depth investigations into the ramifications of the day’s events.  This may well still be the case, but it is now by no means how we initially learn of these events, which are continuously fed to us via the offices of internet newsfeeds which have no beginning or end, and wholesale information dumps such as Twitter.  News is no longer dropped on us four times a day around a centralised information hub (i.e. television or radio), but is now with us all day, and can be accessed from any location via mobile phones, tablets and laptops.  Wi-Fi has freed us from the necessity of the specific location, and thus the “where were you?” moment can no longer really exist, since such an occasion is marked by more quotidian, tangential social interactions (since social media, we are paradoxically no longer social beings) – history has always been made in conjunction with analogue discourse to provide context and understanding; the pause for reflection has been superseded by the knee-jerk re-tweet.
Is it any wonder, then, that cultural eruptions comparable to that of 1976 have been scarce-to-non-existent during the last decade-and-a-half, and that the cultural satellites of the punk movement can now be bought in Primark on t-shirt racks which also contain images of Miles Davis and Marvel superheroes?

In this sense we can quite easily relate the loss of modern social information exchanges and their replacement by personalised feeds of information to a Twenty-First Century flatness – to put it simply, an age when globally-relevant events are still unfolding on a daily basis but are no longer felt as shockwaves.  Without shockwaves there can be no fissures, which is where Twentieth-Century culture once thrived: jazz, pop, punk, Abstract Expressionism, Conceptual Art and the Postmodern break in general all happened as a consequence of events which were felt as they occurred, and carried real consequences, unlike the political pantomimes of today.  The ages in which these events happened had their own zeitgeist modelled from the social mood, and are remembered – perhaps rightly or wrongly – for their cultural and social values.  In an age which has had no real shockwaves or fissures, a void has inevitably been created which has no atmosphere, zeitgeist or – crucially – human analogue.  Since domestic concerns are primarily centred around economy, the average Western citizen concerns him or herself with financial survival and the waning scope for prosperity.  Let us, for a moment, contemplate a stratum of people working not for prosperity or an elevated standard of living, but only in order to cling onto the standard of living they already have.  The opiate of the masses has been superseded by a cold bucket of water, terror at a knock upon the door.  The clinamen of capital (its desire) is absolute subjectification.  In 2018, Brexit once more proved itself to be that very subjectification, spreading fear and hatred across the UK and using similar (albeit more sophisticated) tools to divide the country as did Germany in 1939.  Brexit is more pernicious than the campaigns of history, however: there is no single identified common enemy, no one sub-section of society singled out for persecution.  Rather, it plays to the worst fears of all social stratifications, always with one lingering threat – you will lose what you have. Again, as with the trial of Adolf Eichmann in 1961, one can discern a palpable sense of what Hannah Arendt called “the banality of evil,” a clear attitude of “rather you than me.”  Brexit has played into the neoliberal ideology in the only way it could: divide and conquer.  But in recent years it has become the norm for events to resemble past situations.  Occupy, it can be argued, was itself a modified sit-in, grafted from the late 1960s onto the present day and given an Economics degree.  Where it has prospered – as opposed to the disenfranchised, disconnected youth of fifty years ago – is in its organisation and the clarity of its voice, both of which can be attributed to technological agencies unimaginable in the last century.  We have lost our sense of the epoch-making event, the galvanising force to attempt something different: the rule book is no longer torn up, so much as it is re-told through post-millennial perspectives.


October saw the release of Peter Jackson’s They Shall Not Grow Old, a technically astonishing colourised documentary marking the centenary of Armistice Day.  Nothing can detract from the visual and journalistic achievement, although one could also read the film via Jean Baudrillard and liken the process to, for instance, the endeavours of Japan’s Ōtsuka Museum of Art, where only precise facsimiles of well-known original works are displayed.  The museum is, of course, anything but that: indeed, one might say that it is part PowerPoint presentation, part PT Barnum grotesquerie.  Or perhaps the film is more akin to the Abbey of Saint-Michel-de-Cuxa, rebuilt by John D. Rockefeller, Jr. in Upper Manhattan using building materials salvaged from what remained of the original abbey in Southern France, itself rebuilt in 840.  The latter suggestion is given more weight when one considers that much of the original footage found in They Shall Not Grow Old was filmed using arcane hand-crank cameras which struggled to maintain a steady 12 frames-per-second, which is why the original films appear so jerky.  This jerkiness has been offset by high-end digital trickery to save the World War One soldiers from an eternity of coming across like “…Charlie Chaplin-type figures.[8]”  This means that roughly fifty percent of what one sees in Jackson’s film is not original footage at all, but very sophisticated computer animation.
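To see why roughly half the footage must be synthetic, consider the retiming arithmetic alone – a minimal sketch, in no way Jackson’s actual pipeline (which used sophisticated motion-compensated interpolation; here plain numbers stand in for frames and a crude blend stands in for the interpolation):

```python
# A minimal sketch: retiming 12 fps footage to 24 fps means one new
# frame must be synthesized between every pair of originals, so close
# to half of the output is not original footage at all.

def retime(frames, source_fps=12, target_fps=24):
    """Naively retime a clip by inserting blended in-between frames."""
    factor = target_fps // source_fps  # 2: one synthetic frame per original
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append((a, "original"))
        for i in range(1, factor):
            blend = a + (b - a) * i / factor  # crude stand-in for interpolation
            out.append((blend, "synthetic"))
    out.append((frames[-1], "original"))
    return out

clip = retime(list(range(5)))  # five "frames" of source footage
synthetic = sum(1 for _, kind in clip if kind == "synthetic")
print(f"{synthetic}/{len(clip)} frames are synthetic")  # 4/9, i.e. ~half
```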

When Jackson says “I wanted to reach through the fog of time and pull these men into the modern world, so they can regain their humanity once more”, he misses the mark somewhat, for what has actually happened is that these veterans have indeed been brought out of the past, but only in a digital suspension: half-human, half-computer-generated chimeras, they hang on the screen like exhibits from the Ōtsuka Museum.  Similarly, much of the audio track to They Shall Not Grow Old is actors’ dialogue, translated via the offices of a deft lip-reader who no doubt spent as many hundreds of hours reviewing the original footage as Jackson’s team did animating it.

Curiously, one can with great ease finish watching They Shall Not Grow Old and immediately begin watching the first episode of Peaky Blinders (set in 1919, one year after the Armistice) without any disruption in either narrative or visual quality.  Peaky Blinders itself is an example of history’s reworking and re-presentation: its non-diegetic soundtrack is entirely of the Twenty-First Century, and consists of artists aping the late 1970s and early 1980s.

 

There can be no table of contents for 2018, nor can it be reviewed month-by-month.  The writer cannot simply disclose a year as a series of events which range in importance or ramification.  I certainly have not done this (nor would I ever wish to).  Charlie Brooker, when not writing episodes of Black Mirror, will scan the year in a linear manner in his New Year’s Eve Wipe, but for serious discourse this can never fully articulate the essence of a twelve-month duration.  I will, however, borrow one recurring sign-off from Brooker:

 

“That was (2018)…now go away.”

 

[1] DeLanda, M. (1997). A Thousand Years of Nonlinear History. New York: Zone Books, pp.15-16.

[2] Fisher, M. (2014). Ghosts of my life. Winchester: Zero Books, pp.21-39.

[3] Lefebvre, H. (1991). Critique of everyday life. 2nd ed. London: Verso, p.5.

[4] Sallis, J. (2018). Elemental Discourses. Bloomington: Indiana University Press, p.13.

[5] Deleuze, G. and Guattari, F. (2004). A thousand plateaus. London: Continuum, p.5.

[6] Bergson, H., Ansell-Pearson, K. and Ó Maoilearca, J. (2002). Key writings. New York: Continuum, pp.49-50.

[7] Reynolds, S. (2012). Retromania. London: Faber and Faber, p.50.

[8] Ilse, J. (2019). Prince William attends World Premiere of “They Shall Not Grow Old”. [online] Royal Central. Available at: http://royalcentral.co.uk/uk/cambridges/prince-william-attends-world-premiere-of-they-shall-not-grow-old-110509 [Accessed 2 Jan. 2019].