#103: The Story About the Lone Renegade Scientist Showing That Everyone Else Is Wrong

The resistance to pandemic abatement measures has revealed the strength of a very popular narrative: the story of the lone hero, in this case the lone renegade scientist who knows something that nobody else knows, something that shows that everyone else is wrong, and that there is a different reality waiting to be discovered underneath what you still think is real.

There are several of these types out there. Some claim the pandemic is not dangerous at all, that there are all kinds of coronaviruses, and that this one is no more dangerous than the others. Others claim that somehow, magically, the virus will become less dangerous over time (sure, evolution may point to such a possible pathway, but it is not guaranteed – or have AIDS, the plague, measles, Ebola, or smallpox become cuddly, harmless pets in the meantime? I don’t think so…). Still others claim that the vaccines are uniquely dangerous, and that everyone else – the pharmaceutical industry, politicians, journalists, the vaccinated – is in on the conspiracy.

Only the lone, renegade scientist can help us here?

This is called romanticism. In today’s newly romantic era, the romantic hero is typically a comic book character, or somewhere on a spaceship far, far in either the past or future.

Romantic narration – not unrelated to psychotic narration (Flor and Kneis 2007) – is a category of fiction, not reality. Let us not mix methodologies!

How does reality work, how does science work?

Be wary of arrogant geniuses who tell you that what they are saying stands against the orthodoxy of established opinion. Anyone who even claims this knows nothing about science, about the scientific method and process. They must not be taken seriously, for they know not what they are doing.

Science is collaborative; it is always conducted within an established methodology and community. That’s why scientific or academic writing is so hard! You need to have evidence, you need to tie your work into the long line of researchers before you, and you need to open yourself up to ruthless (let me repeat: ruthless!) criticism, so that once your ideas pass muster, everyone can accept your contributions as legitimate.

The romantic hero cries into the wilderness like a mental patient.

The scientific researcher willingly endures science boot camp for their entire academic life, with rewards few and far between, and with academic versions of drill and persistent criticism making all of us, hopefully, better.

If you read something by someone saying “What you are hearing from me stands at odds with the entire scientific/medical/political establishment,…” – simply stop reading. It will not be worth listening to. Ever.

#93: Don’t Picture This: The Trouble With Selfies

My phone’s front camera is probably very confused. It has been severely underused. The back camera is not very happy either, but it gets used occasionally when I have forgotten to take my regular camera with me and need to take a picture. Very rarely does the front camera get to make a video call, but I prefer to use a real computer for that.

I don’t even really know what a “filter” is on the phone or in one of these photo apps, or why I should be using one. That’s what Photoshop is for, isn’t it? Of course I know, and I mockingly pretend not to know. But somehow, I also don’t.

It’s not that I don’t take any pictures ever. I’m a pathological picture taker, and I have spent quite some time thinking about Susan Sontag’s book On Photography, which successfully and disturbingly analyzes said pathology. Ideally, this reading is combined with Plato’s discussion of the power of writing as well as his metaphor of the cave, Walter Benjamin’s article “The Work of Art in the Age of Mechanical Reproduction,” Marshall McLuhan’s Understanding Media, and Neil Postman’s Technopoly and Amusing Ourselves to Death.

Combining the insights gained from this tour de force of media theory, it becomes clear that it is too simplistic to cling to the idea that technology is basically neutral, that it is just a tool, and that everything depends on how we use it. Technology is far more than that. Its influence on our lives is thoroughly transformative. It lets us believe that it does our bidding while in fact we yield to its influence and needs just as much. We do have some control over how to use it, but even its availability changes the very ways we can think about it.

Do I need to know something by heart, or is it enough to be able to look it up? Do I focus on experiencing something in real life, or do I focus on recording it, transforming it from a continuum of active being to a frozen moment in time, a mere snapshot, something that will serve as a substitute for reality? What does this kind of technological reductivism do to the world thus captured, and what does it do to what we think about this world and the living beings contained therein? What does it do to people if they are mainly understood through their media representations? What does it do to their notions of self if those representations are even created by themselves?

Whoever sat pretty for Leonardo when he created the Mona Lisa may rest safely in their grave, shrugging off whatever Leonardo may have seen in them and shown of them in his famous painting. But what does this do to the person creating a self-portrait? We know that Vincent van Gogh probably did not gain in personal happiness through painting his self-portraits. He represented himself as himself. If you subscribe to the idea that Leonardo may actually have painted himself as a woman, as Lillian Schwartz suggests (which actually sounds fascinating and has some level of plausibility), then Leonardo perfectly understood that any visual representation is always an interpretation. Certainly, Van Gogh knew that also.

Artists understand that even if you depict yourself or represent yourself, you are never being authentic: There is always something else happening. The self on the page, or in the picture, always has to be seen as a lyrical I. The “me, myself and I” that you see in self-reflective and self-portraying art is never the real self. It is a deliberately chosen perspective, a snapshot at a specific time, a setting in a particular scene, an inauthentic moment posing as authentic for a very clear artistic and dramatic purpose.

Reality cannot be captured, it can only be represented. Umberto Eco illustrates this in his short story ridiculing the creation of a map of the empire at a 1:1 scale. The map would completely cover and crush the reality beneath it, as it would take over the entire space of the empire itself. Similarly, if we rely on nothing but representations of reality in order to understand it, we will limit ourselves to understanding the representations rather than reality itself. Granted, sadly, we need media and representations to even be able to conceive of reality. Our perception itself mediates the world around us. We are always sitting in Plato’s cave; we can never access reality as such. But we can understand that our tools of perception, and the media we use to facilitate our perception, in turn influence that perception also.

As Susan Sontag describes it, “Photographs really are experience captured, and the camera is the ideal arm of consciousness in its acquisitive mood.” This acquisition, this pictorial conquest, this obsessive need to claim some form of ownership of the world by possessing it through images eventually destroys our relation to the world. We substitute experience with representation, concreteness with abstraction, the world with pictures of the world, the self with pictures of the self.

This act of substitution, of representation, certainly affects how we see reality. The self, specifically if it is mainly communicated through pictures, will eventually conform to the pictures. As with many things, this is probably not a problem unless done to excess, but it can still fundamentally change how we see ourselves.

There is a reason – probably more psychological than religious – that some cultures have looked with suspicion at photographing people, or at depicting an image of divinity. That which can get captured, depicted, and represented so easily will lose its mystique, its transcendental qualities or – following Benjamin – its aura. Something happens once we fixate our selves through pictures. It happens both for others and for ourselves. But how do you represent yourself without losing some sense of self? Especially if this is a constant exercise in the performance of the self?

Should we see self-photography as an art form? For some, it certainly is; for others, the definition of art would probably have to be stretched a bit. But art can certainly be an escape clause here: it requires, though – as illustrated above – a conscious act of deliberate self-distancing from the image of the self as performance.

Yet the context of such pictures of the self certainly matters also – whether we should see them as more artistic self-portraits, or as what is commonly described with the less high-brow term “selfies.” As soon as a selfie is posted on social media, the battle for audience reactions begins. How many people like my picture? How many don’t? How many are seeing it? Is the picture being noticed? At what point, though, do these questions turn into something more personal? How many people like or dislike or notice me? Am I, as a person, liked, or is it the representation that is liked? Should my self – if a specific representation is liked – conform to the representation? Should I myself become the image I have put out there as an allegedly authentic image of myself (or of my self)?

Maybe this is the key: if the pretension of authenticity is taken at face value, selfies may well turn from a possible work of art into an act of introspection through outside judgement, and become an exercise not of play but of masochism (or its psychological twin, narcissism). We know that social media should probably better be described as anti-social: all too frequently, it is an exercise not in sociality but in sadism. Artists all throughout time have suffered from bad reviews, and have tortured themselves through their art. Maybe we should thus see the selfie as the revenge of the self-portrait: may the same level of scorn be heaped on John or Jane Q. Public now as was heaped on artists throughout the ages.

But that is certainly not something to be endorsed. Personally, I am staying out of the selfie game. There are plenty of ways to indulge in self-loathing; I certainly don’t need to document this in pictures on a regular basis.

#82: Only Logic Will Help Us Out of the Pandemic

At Warm Springs Indian Reservation, Oregon

These are not the easiest of times. We are still in the middle of a global pandemic, which is worsening in its impact due to various mutations of the virus that keep calling the effectiveness of our vaccines into question. Many people are suffering due to the disease, directly or indirectly, and all of our lives are in turmoil.

It is easy in such moments to despair. Frankly, it is difficult to maintain sanity and retain focus even for those of us who have been able to keep surviving, albeit in a reduced fashion. The psychological toll is nevertheless real, and the temptation to just ignore all caution and act as if we could return to normal is completely understandable.

That’s why it is important, maybe, to remind ourselves that this reality is actually, indeed, our reality, for very concrete reasons. Logic may help us here.

  1. The pandemic is real. For whatever reason, it spread out of China, whose government was not forthcoming for months about the cause and specific design of the virus. (To be clear: The blame lies with the government, not with the people of China, who are just innocent victims of a cruel regime. The hatred for people of Asian or specifically Chinese descent is wrong – as is all hatred. But anger at the government is justified).
  2. The virus spreads through droplets (which can be controlled with masks, maintaining distance, cleaning of surfaces) and aerosols (which can be controlled with even better masks, even greater distance, cleaning of surfaces, avoidance of indoor meetings, frequent and efficient changes of air in rooms if you have to be indoors). Some viral variants are more prolific than others.
  3. The virus enters through the respiratory system, but its effects are on the entire biology of the body. Some of its effects are probably debilitating for a long time, in both adults and children.
  4. The virus can be transmitted by symptomatic and asymptomatic persons.
  5. The virus can be deadly for everyone, but most likely for older people and those with serious medical conditions which may or may not be known to those affected.
  6. The infection spreads exponentially, not linearly, if given enough room (a brief numerical illustration follows after this list).
  7. The effects of the viral infection become clear only after a certain passage of time. From infection to symptoms to potential hospitalization to potential death, it takes several weeks. Current infection data will map onto deaths with a significant time delay. We will have to act now in anticipation of something that may or may not happen in a few weeks. This has proved to be a major challenge, as people keep questioning the lethality of the virus.
  8. Vaccination needs to happen as fast as possible. Without it, viral spread continues and new mutated variants can evolve that may evade the protections given by the vaccine.
  9. All our various hygienic and lockdown countermeasures are in place BECAUSE OF EVERYTHING BEFORE. We cannot pretend that this is not real, even though we are suffering the consequences of both the pandemic and our reactions to it. We need to constantly adapt our reactions to the situation, be more vigilant than we’d like to be, and remain in a state of lockdown and protectiveness till this is over for everyone, globally.
  10. If we ignore safety and just pretend it is not happening, the pandemic will still spread, people will still get sick, will die, and the number of mutated variants will increase to eventually counteract even our vaccines. This must not be allowed to happen. For this, we need to think logically, long-term, and need to have patience – and governments need to help their people to maintain that patience by supporting their existence.
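
To make the difference between linear and exponential spread (point 6 above) concrete, here is a minimal, purely illustrative sketch in Python. The starting count, the fixed weekly increment, and the growth factor of 1.5 are invented numbers for illustration, not epidemiological estimates.

```python
# Minimal illustration (not an epidemiological model): linear growth adds a
# fixed number of cases per step; exponential growth multiplies the current
# count by a fixed factor per step. All numbers are made up for illustration.

def linear_growth(start, increment, steps):
    """Each step adds a fixed number of new cases."""
    return [start + increment * i for i in range(steps)]


def exponential_growth(start, factor, steps):
    """Each step multiplies the current case count by a fixed factor."""
    cases = [start]
    for _ in range(steps - 1):
        cases.append(round(cases[-1] * factor))
    return cases


if __name__ == "__main__":
    weeks = 10
    lin = linear_growth(100, 100, weeks)
    exp = exponential_growth(100, 1.5, weeks)
    print("week  linear  exponential")
    for week in range(weeks):
        print(f"{week:>4}  {lin[week]:>6}  {exp[week]:>11}")
```

Even with this modest factor, the exponential curve overtakes the linear one within a few weeks and then keeps pulling away; that is the whole point of acting early, before the numbers look alarming.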

This is our moment as a global community, our make-or-break moment. We need to fight this together, or else there may not be much that needs saving at the end of this depressing period in human history. Let’s prove we can rise to the occasion as one world.

#79: The Need for the Public Understanding of Humanities and Social Science Theory

Words are easy. They are not formulas. You should just be able to read them and understand them instantly. Or so it goes.

We seemingly are living in a time when all the things talked about in the humanities and the social sciences in recent decades are finally having their day in the public consciousness. Words like “race,” “gender” (not “class,” which is never really of interest), “narrative,” “history,” “construction,” “capitalism,” “discourse,” “inequality,” “equity,” etc. are thrown around with such ease that you would think the entire world had just taken advanced graduate theory classes.

But of course, this is not the case. What has happened is that some of these terms – completely taken out of their “habitat,” their historical and philosophical context – have been unleashed as memes into the wild, devoid of their caveats, conditions, footnotes and complications – devoid of all the things that make up the equivalent of a mathematical formula.

The perception that the “talking” and “writing” sciences should just be understandable “as is” appears to have made the rounds, and any complexity is denied, as it would be deemed to make this new pseudo-discourse boring, take all the fun out of it, and ruin the possibility of monetizing the outcry.

If you have been wondering, should you have been reading anything on this blog so far, what it is that I am actually doing, then you are not alone. It took me, myself and I an entire 23 years to comprehend what I have been on about on my blog and in my research. My real interest in this format seems to be the Public Understanding of the Humanities and Social Sciences.

I am trying not to be too pedantic, to have a bit of fun, to not be too dogmatic, to never be mean, and to always be open to new ideas.

Speaking of “idea” – isn’t that a difficult term? Ah, but I just promised not to be too pedantic, so there’s that for now…

#73: The Destruction of Creativity through “Social” Media

There used to be a magical time. You will remember it if you remember using MS Internet Explorer version 2 and above, Netscape Navigator 3, and the mighty Netscape 4 (Communicator). Search engines (AltaVista, Lycos, Metacrawler, etc.) and catalogues (Yahoo!) were two different things. Inktomi was not just an allusion to the Lakota spider-trickster (the original spiderman?) but the gold standard in search technology. Google was a nerdy new thing whispered about by tech-mages to their students (and it still believed in “don’t be evil”). Modems limited your internet speed, and if you had a chance to sit at a university computer, you hit the mother lode in terms of connection speed – and even a Windows NT machine!

I have a “bio-birthday” and a “web” birthday: on January 8, 1998, I built myself an awkward-looking home at GeoCities. This was the place to be, back when GeoCities had not yet been destroyed by yahoos who did not understand it. There were “neighborhoods”: I chose the sci-fi neighborhood (“Area 51”) and a “station” within it. Back then, it was clear that the internet was built for two things: Star Trek and pornography. I was definitely not doing the latter, thus Star Trek it would be. You can see early design versions of my site at my Layout History pages.

You had to learn how to do HTML, CSS, make graphics, logos, etc. You had to create your own content, thus you had to write, take pictures (analog!), scan them, reduce their filesize, create thumbnails, etc. Then, in a mad dash, a noisy modem session would be needed to upload the whole shebang to the respective FTP server. To find likeminded people, you joined a webring.

Web sites were wild areas for experimentation, and everyone who made one had to learn autodidactically, and had fun doing it. This was your own space, and it would be as good as your skills (and your sense for layout and content) were. You built your own identity. Your web site was an achievement, and once you felt it was good enough to accompany you for longer, and once you found the money to pay for it, it was time to get real. You dealt with InterNIC directly for your domain name, and would eventually get real server space.

The point is, you needed to learn, you developed skills, you learned about the nitty-gritty of the web. These were transferable skills. You got to play, create something for yourself, and interact with like-minded people – who would actually get to e-mail you.

You can still do all these things, and some of them work better now. But there are monsters out there sucking all the creative energy out of the room in order to display shadows of it in their own space. MySpace provided a portal that still allowed some wackiness to survive and let people personalize the interface a bit more, but it was also the first step into a corporate world. Eventually, Facebook and all the rest created a world in which their own portal was basically supposed to encompass the internet. Now we also have apps that are graveyards for photos, short videos, and shameless self-promotion, all to create ad revenue.

Your own web site works for you; but to Facebook (and services like it, wrongly called “social” media), you are the drone stuck in the matrix giving it life. You are completely dependent on social media platforms and their designs, their rules, their monetization. No skills are required, nothing transferable needs to be learned, other than how to use a stupid (in the sense of limited-purpose) app on a phone. You cannot easily control your content, how it displays, how it will be read, and you engage with others in a monetizable manner where each of our “likes” feeds an algorithm to give you more of the same.

This is the end of creativity, or rather, it is the seduction of easiness that allows for the end of creativity. You can still get your own web site. Or even a blog (not the same, but better than social media). But most won’t, because we humans are all creatures of convenience nowadays, and why make the effort when minimal mock-effort is enough?

Why give up a personal space on the web that is really yours to shape for the simulacrum or rather poor parody of such a possibility on so-called social media?

#72: Can We Trust The Media?

I. Introduction, because this is a Longer Text and it Needs Headings

There appears to be a sense among many people that there is a problem with “the media.” Trust in media seems low, and there is a societal division with regard to which media are seen as reliable, and which as misleading or fake. These divisions appear most frequently along partisan lines. If it weren’t such a serious problem, it would be quite humorous to see how different media (and their supporters) criticize their competitors as being unreliable, yet steadfastly believe in their own reliability (and so do their supporters).

Indeed, we seem to have moved away from a consumer attitude towards media, and instead to a supporter attitude. The media you consume defines you more than ever before, it seems.

As to the criticism, news media specifically engender a set of response archetypes:

  1. I trust everything media sources supporting view A are saying, and distrust everything from view B. I read to reinforce the views I already have, whether consciously or not. A media outlet that has proven trustworthy in the past will be given the benefit of the doubt; but if a media outlet (and its corporate or ideological sponsors) is suspect for a variety of reasons, I steer away from it.
  2. I am generally skeptical of everything I read, see and hear. I try to verify everything I read, even from news sources that I am more skeptical about.
  3. I do not believe anything from establishment media – whatever their alleged ideological background – and am relying instead on alternative forms of information.

These are, of course, stereotypes. Nobody falls into any category neatly, and people may change their views over time, depending on life circumstances, mood, or social circles. In general, position 1 may be the most common. Position 2 is probably aspirational, and position 3 is the biggest source of social division right now. Even if you are in opposing ideological camps, as long as you are still in position 1, you inhabit the same universe as everyone else. Media are essentially self-referential, and hardly an hour goes by in which those holding view A do not somehow reflect on and position themselves towards view B, and vice versa. Opposites do not just attract, they require each other like opposing parties in a game – and just like in a game, you hope both are playing fair, but you suspect they won’t always be (except that your team is always right…).

But let us contextualize this critique a bit more. What do we mean by “media”?

II. A Brief Excursion into Media Critique

A medium is that through which information is channeled, through which the world becomes represented to us. We have no direct means of accessing the world; even our sensory organs mediate existence to us, and our brain interprets it. In his Allegory of the Cave, Plato describes our reality as seeing merely shadows of reality projected on the wall of a cave, while never being able to see reality itself directly (Republic 7.514a). Through education, specifically philosophy, he hopes we can unshackle ourselves from that scenario and see the real world, and see the ideas and the divine on our own without needing to rely on representations of them – viz. without the need for media.

Plato gives a second example (Phaedrus 274e-275b) when discussing the consequences of writing, and comes to a depressing conclusion:

“You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.”

Furthermore, once the speaker is replaced by an author who is no longer available for conversation, texts replace conversation. This eliminates the human element from the equation, creates distance, and opens up the possibility of distortion – of truth, of nuance, of interpretation and dialogic engagement. Culture – as transmitted through media – ceases to be a community-centered activity; it becomes an industry.

Theodor Adorno (yes, we are making a more than 2200-year jump) pointed out the dangers of such a culture industry, informed by his experiences with the Nazi propaganda machine, but worried about the possible rise of a machinery of entertainment, disinformation and commodification of truth in the West.

Both Marshall McLuhan (“the medium is the message”) and Neil Postman (“Amusing Ourselves to Death”) pointed to the properties inherent in technology itself that shape any message mediated through it. We cannot ignore that the purpose of television is always entertainment. As Postman notes, in our obsession with the Orwellian scenario of constant state surveillance, we have given in to Huxley’s scenario of the voluntary abdication of truth in pursuit of entertainment.

Gloomily, Michel Foucault, echoing Theodor Adorno and Max Horkheimer, kept reminding us that any form of speech, any discourse, is imbued with a form of power. There is no neutral information, no neutral discourse, no escape. All we can do is become aware of the power of discourse, and to keep this power dimension in mind always.

Finally, though, there can be a source of constructive optimism as well, if we follow Jürgen Habermas’ relentless exhortation to create a new and functioning public sphere, now that the old ones (the Greek Agora, the Roman Forum, the Renaissance and Enlightenment-era salons of thinkers and dreamers, the old-style newspaper landscape) are hopelessly lost. If we aim for deliberative democracy, aim for the recognition of each other’s humanity, and create an ideal space for discourse, we may just find a solution and not have to abandon all hope in this post-Dantean infernal chaos.

What are we supposed to learn from this?

There is no neutral medium, and no media product – whether news, web sites, television, film, books, etc. – should be consumed uncritically. While some media products may manipulate to a high degree, there is no media that is not in some way manipulative or biased, whether through active commission of lies, or through omission of unwelcome truths.

III. Today’s Culture Industry: News Media and Corporate Ownership

While the critique of media pertains to non-news items as well, most criticism is reserved for news media; this is a short-sighted approach. If we follow the money, we will see that old-style media critique is still relevant.

By considering corporate ownerships and relations, we may gain some insight into possible influences on news reporting that may compromise the neutrality of some, if not all, news outlets. That does not mean that all news and commentary originating from them is tainted or unreliable, but it may point us, as the audience, towards being a bit more skeptical overall about what is reported and how, and what is left out. Money talks, and if corporations and governments are involved, there might be a specific bias. Ties to non-democratic countries that fight against complete freedom of the press (like China, Russia and Qatar) will certainly limit perspectives. Also, in cases where a company owns several news sources, you may find yourself in a universe of similar news stories re-confirming one another. Big media conglomerates also demonstrate that there are clear corporate ties between news, entertainment, and technology companies.

Current corporate ties for major news sources are as follows (see also: Wikipedia, TitleMax):

  • CNN: AT&T, Warner, HBO, Turner Broadcasting (upcoming theme parks in Zhuhai and Beijing, China, non-democratic)
  • FOX NEWS: NewsCorp – which means New York Post, Wall Street Journal, The Times (UK), The Sun (UK); the non-news division of FOX belongs to Disney now
  • MSNBC + NBC News: Comcast, Hulu, Universal, Telemundo (upcoming theme park in Beijing, China, non-democratic)
  • ABC News: Disney (operates theme park in Hong Kong, China, non-democratic)
  • CBS News: National Amusements, Viacom, Paramount, Comedy Central, Nickelodeon (upcoming theme park in Chongqing, China, non-democratic)
  • PBS News: Corporation for Public Broadcasting, donor- and subscriber model, solidly trying to be neutral
  • Yahoo!: Verizon
  • LinkedIn: operates a censored Chinese branch
  • Washington Post: Amazon
  • Russia Today: controlled by Russian government (non-democratic)
  • Global Times: controlled by Chinese government (non-democratic)
  • Deutsche Welle: controlled by German government
  • France24: controlled by French government
  • BBC: independent from UK government
  • Al Jazeera: controlled by Qatari government (non-democratic)
  • Breitbart, Parler: Mercer Family Foundation
  • Guardian, Boston Globe, New York Times, The Hill, Politico: currently independent

This list is certainly incomplete, and is just supposed to illustrate the complexity of the problem.

IV. Ideology

We certainly know that the ideology of a news channel or paper plays a role (what I described as view A and view B above).

I am split on what the consequence of that is. Do viewers already gravitate to a specific view, and then consume news confirming their bias? Or is their bias created by a one-sided diet of news? I suspect this is a chicken-and-egg problem.

One further criticism, consistently raised and very valid, is the increasing blending of news and commentary. One could also add the undue influence that any editorial stance – even if it is confined to a commentary section – may exert over the entire enterprise.

Is news supposed to help people make up their own mind, or is it telling them what to think? Can it even make them do that without risking that they switch the channel, or buy a different paper?

V. The Decline of News Journalism

The real story here, of course, is the decline, if not outright destruction, of real journalism. I am talking about newspapers – not because I am old-fashioned, but because that is where most real journalism actually still happens.

Television news is first entertainment, then commentary, then news – of course that is a polemical opinion, but especially in the American context it rings true. (In the German context, it depends on the channel – the publicly funded, license-fee-based channels ARD and ZDF do have functioning newsrooms and focus on news. Private channels like SAT1, RTL and PRO7 are more about entertainment. The situation is probably similar in the UK with regard to the BBC vs. independent television news.)

I may have watched too much Superman and been influenced by its Daily Planet, but aren’t newspaper newsrooms still more important than we all think? Where does your news come from? Who is typically cited on TV? News agencies (Reuters, DPA, etc.) and newspapers, I presume.

More and more, big newspapers are being streamlined, newsrooms made smaller, commentary enslaved by Twitter and Facebook, content syndicated, and small newspapers are disappearing or being managed in bulk. This destroys the very fabric of our society. Who reports any more on local corruption and malfeasance? It is certainly still happening, despite there not being any stories about it in the local newspaper – if you still have one.

If the media are still to be the fourth estate, they need to exist in all parts of the country, report on everything possible, and monitor and critique every single aspect of society, culture and politics.

VI. Balanced Skepticism

Returning to the sentiment with which I began: Are we now in a position where we seriously cannot trust the media at all anymore? No. But we all need to do our work – conceptually, and financially, by supporting the news sources we consume and trust.

Can we trust the media? As long as the media still mainly trusts us to make up our own mind, I would say yes, and would welcome the diversity of voices that can be heard on all sides of any debate.

In the end, there is only one truth. There are facts and non-facts. Something is either true or false. Beyond those distinctions, though, there are grey areas of opinion, commentary, selection bias, spin, framing, etc. We need to be aware of the limitations of each news source, and we need to do our work as citizens and look beyond just one source of news. This is the only scientific and democratic attitude that can prevent us from being lodged too deep in our own filter bubbles.

Thus, if a news item only occurs within a specific news ecosystem and is ignored or not reported everywhere else, this should raise concern. If there is a definite slant in opinion and commentary all the time, it may affect what is reported in the supposedly neutral news as well. That does not mean there has to be bias, but we should assume there can be. An overall skeptical attitude is always a good thing.

That being said, we can indeed be skeptical of everything, but we also must put this in perspective, and our skepticism should be balanced. Reality is murky, and just as we cannot trust everything automatically, we also cannot distrust everything automatically. We need to follow the saying that “It pays to keep an open mind, but not so open that your brains fall out,” as suggested by Carl Sagan. I have come to believe that it pays to listen to Carl Sagan most of the time.

#54: The Dictator as False Messiah: A Belated Review of Game Of Thrones Season 8

I.

Game of Thrones has been a mainstay of recent television mania. Each year, the excitement built up more and more, and for the very last season, expectations were high, and, as it goes with genre shows, fans had their very concrete ideas about how things should develop. This now is a very belated, spoiler-filled review.

The reaction to the conclusion of season 8 of Game of Thrones has typically not been kind. In a nutshell, it runs like this: “All the buildup. All the pathos. All the scheming. And it ends like this. Why?”

That’s basically the criticism. Well, you can see it that way. You could also just accept that this is what is happening, and ask what it is meant to accomplish and say. Rather than allow frustration over the disappointment of one’s own expectations to govern one’s opinion about the show (i.e., how does that make us feel?), we might instead learn something from the experience (i.e., what might that possibly mean?).

We accept audience frustration in the short run – which drives the popularity of basically anything done by Christopher Nolan (Memento, Inception, Interstellar, Tenet), the current master of surprise turns of narrative ever since M. Night Shyamalan (The Sixth Sense, Unbreakable, Signs, The Village) lost favor. The gimmick of the unreliable narrative and of surprise turns of events seems to work well with audiences if it comes in the form of a movie-length experience. But if it comes in the final run of an 8-year television series, audiences that have fallen for the trick seem to be frustrated. The same happened with Lost and its absolutely brilliant (and apparently equally misunderstood) ending. Maybe there’s a lesson here: limit surprise twists to movie or even television-season size (like Doctor Who under Steven Moffat – but even he suffered from frustrated audiences).

Or, in short, people who loved Game of Thrones so much that they named their daughters Daenerys or Khaleesi probably were put off a bit by the ending, one would hope.

The spoiler for this review is as follows: It Makes Perfect Sense. In fact, the entire series can be read as a playbook that lets us understand how people fall for a genocidal dictator, how they end up supporting a violent revolution that in fact runs counter to their interest, and how nothing good can ever come out of supporting someone pretending to be a messiah that will solve all your problems.

II.

Daenerys “Dany” Targaryen sees herself as a liberator. She has been abused, she had to fight for survival, she rose to the top, became Khaleesi (basically, a female Khan); she is a sympathetic character overall, but she has always had a cruel streak. She coldly watches her brother being brutally killed (Season 1). She locks Xaro and Doreah in their own vault in Qarth to die (Season 2). She kills the slavers of Astapor (Season 3) and has the slavers of Meereen crucified (Season 4). She feeds a Meereenese nobleman to her dragons (Season 5). She kills the Khals that threatened to abuse her (Season 6, and yes, that was sadly very satisfying). She kills Randyll Tarly and his son (Season 7). She kills Varys (Season 8, now we’re finally suspicious). At every point, these are all signs of what’s to come.

Even if some of her victims are bad people, it is telling that Daenerys’ default answer is cruelty. The audience typically likes it because the show is pulling a Hannibal Lecter – audiences tend to identify with the protagonist, especially if they are good-looking, charming, played by a great actor, or can claim to fight for the greater good. Her fight is not for justice, it is for revenge. She clearly delights in the violence; it is visible. Every season gives us a reminder of her character. She may have been a victim in the past, but she has become a perpetrator of violence and cruelty. And like every cruel person in history, she has willing accomplices unable or unwilling to stop her.

III.

Tyrion as Daenerys’ advisor basically plays the role many philosophers have played when trying to appease a brutal tyrant. As Plato fails with Dionysius of Syracuse, Socrates with the Thirty Tyrants, Aristotle with Alexander, Cicero with Augustus, Machiavelli with Cesare Borgia, Thomas More with Henry VIII, Voltaire with Frederick II, Robespierre with the terror he himself helped unleash, Trotsky (no innocent either) with Stalin, and, arguably, Heidegger with Hitler, and Oliver Stone with Castro and Putin, the philosopher or artist typically cannot keep the brutal tyrant from being a brutal tyrant. They may delude themselves into believing they are blunting the blow, convincing the inconvincible, preventing the worst. In the end, they never do. In the end, they may soil their reputation by getting too close to power, and by enabling the tyrant and providing legitimacy to a reign of terror that they should have seen coming. Cicero and Thomas More finally stood up against tyranny and paid the price. Heidegger is still being read, but with well-deserved disgust. The Faustian bargain hardly ever pays off.

On Game of Thrones, Tyrion’s fate – as likeable as he might be – should be much harsher. He should have seen what was happening, but he has gotten himself ever deeper into the dark shadows of questionable morality. When escaping King’s Landing, there was no need to kill his father Tywin, who was quite incapacitated at that moment, sitting in his bathroom. Tywin may have been a bad father, but killing him – as emotionally pleasing as this may have been for Tyrion – was unnecessary, and it eventually led to the downfall of the city. Always the ultimate narcissist, Tyrion shows his lack of morality. The years of being humiliated by his family finally lead him to his breaking point – or do they finally reveal his true, evil character? In order to seek personal vindication, he ushers in the destruction of the city that never loved him. Naturally, he will partner up with the other murderer in the show. Tyrion realizes, too late, that he has been playing the Goebbels to Daenerys’ Hitler.

And Jon Snow, he indeed knows nothing. He is the idiotic Siegfried character, duped by the Burgundians (by political power), having abandoned his Brünnhild (Ygritte), lusting after Gutrune / Kriemhild / Daenerys, manipulated somehow by Hagen (now there’s a reason for Tyrion being a dwarf!). Enough Wagner, but it’s certainly fun to cross-read these texts. Jon is hopelessly in love, seduced by Daenerys, and once he realizes the difference between right and wrong, it is rather late. In his very original defense, he did indeed die and was raised from the dead, so he might just as well be dead inside.

IV.

In defense of all of them, if such a defense can be justified: this is the story of a world gone mad. It is not easy to maintain your morality under such circumstances. But this is precisely when it counts. Morality in good times is meaningless if it is not challenged. Morality shows up when it matters most, when you have to decide not in favor of your own selfish survival or comfort, but in support of the greater good. It matters whether you give in to the seduction of a violent quick fix, or whether you seek the path that is complicated, painful, laborious, and time-intensive. Put differently, do you save yourself, or do you save your soul?

Difficult times are no excuse. This is not about surviving a concentration camp or some other liminal experience; this is about the point where you choose to become a perpetrator in order to avoid being a victim. You may not have a choice when you are ordered to shoot somebody. But you can always aim to miss. Historical evidence shows that most soldiers in battle actually do everything that keeps them from killing. Ironically, Star Wars was right all along: most stormtroopers fail to hit their target, and it may just be deliberate. Human beings tend to know what is right and what is wrong.

There has to be a caveat here: We ourselves cannot know how we would act in these circumstances. For good reason, we are talking about exceptional situations. It should not be ours to judge too readily, lest we be judged also. We all make mistakes, we are all fallible, we are all human. What I am talking about here are deliberate and coherent patterns of cruelty, displayed by the protagonists of a keystone television show. This is not about characters under momentary duress; this is about characters deliberately and knowingly committing or condoning violence. It’s the difference between self-defense and murder.

V.

The show has always been a historical allegory, initially seemingly about the Fall of Rome, but additionally now about World War II.

The gravity of history is unforgiving. Tragedy is when characters end up doing the wrong thing despite having tried everything to avoid doing it. No matter how much they may have wanted to change, they cannot change their nature. Jaime Lannister realizes this. Daenerys Targaryen realizes this. They give in, because that was always their supposed destiny. They are weak, they surrender to the darkness inside. Contrast this with Arya and Sansa Stark: both have suffered greatly, and yet they eventually beat the darkness and grow beyond the need for revenge. There is always a choice.

Daenerys has always been not just violent, but outright cruel, sadistic, indulgent in violence, almost a mirror image of Ramsay Bolton. We were warned time and again. She has always been nothing but a combined version of Julius Caesar, Attila the Hun, Napoleon, Adolf Hitler, Joseph Stalin, Pol Pot, all in one package. She is all dictators. She is all deluded violent revolutionaries. What brilliance to make her into an attractive young girl to allow the audience to fall for her. She has taken all the tragedy of her life and turned it not to wisdom, kindness or compassion, but into a weapon. Once she has the chance to release it, she does.

Like Ahab, she is mad in her pursuit to break the wheel. Yet the wheel of history cannot be broken. This has always been the truth of the show. It had to be revealed eventually, and shockingly, and the audience had to be punished for believing otherwise.

The business of dictators is seduction, and Daenerys has certainly seduced us into (false) hope, just as the show and its producers have seduced us, almost soma-laden, into believing that the wheel of history can be broken, that violent and unprincipled psychopaths shall lead the way to the revolution, and that everything will be all right.

No, it won’t. It never will be. That way, there’ll only be dragons.

#43: “Worst Persons” in the World: Hate is the New Normal

I still remember Keith Olbermann’s sometimes ironic, sometimes dead-serious takedowns of politicians and people in the public eye. His show “Countdown with Keith Olbermann” on MSNBC (2003-11) started, supposedly, as a parody of the style of Bill O’Reilly’s show “The O’Reilly Factor” (1996-2017), then on Fox News. While Olbermann never achieved the brilliance which Stephen Colbert displayed in his all-out O’Reilly parody on the “Colbert Report” (2005-14, sorely missed!), he perfectly captured the tone not just of his time, but even more so of the time we’re graced (or cursed?) to live in.

Olbermann’s key segment was called “The Worst Person in the World,” in which he provided regular, and predictable, character assassinations on live television. If you had never heard of “argumentum ad hominem,” an argument directed at the person rather than the issue (“argumentum ad rem”), here it was, celebrated with gusto. It captured the time perfectly. The administration of George W. Bush, as it had to survive its rocky start after a contested election victory, and then the attacks of September 11, 2001, was a frequent and convenient target. Olbermann, the perfect showman, seized the moment and provided regular attacks against the people whose politics he did not agree with. This was something not seen before in such a drastic and caustic style, putting even O’Reilly (whose show I really did not care for) to shame. As an all-out celebration of vicious partisan commentary, the show was a success – but what may then have been a welcome outlier to some seems to have become the norm now, not just in journalism but in everyday life. As a former sports commentator, Olbermann seemed to have forgotten the saying that you may hate the game, but not the player.

(To not throw Olbermann under the bus completely: he also had moments of true profundity, and changed the discourse with his powerful defense of gay marriage by just stating, in its baffling and utterly revealing simplicity, that it is just about love, and the freedom to love who you want. He also calmed down the part of the nation that listened to him with endearing readings from James Thurber’s fables.)

But back to the point about argumentative style.

Everybody you don’t like is now the worst person in the world, everything you don’t like is the worst thing in the world; liking may exist, but merely disliking something, or not caring for or about something, is out. You either like it or hate it. Hate is the new normal, and declaring whom you like or hate is expected in everyday discourse. At the same time, the idea of “liking something” has been turned into a consumerist and corporatist tool that has completely destroyed its original meaning. Can I really “like” a certain brand just as I “like” the comment somebody made?

This culture of constantly declaring your positionality is disturbing, as it removes all sense of productive ambiguity and expects everyone to blast out their opinion into the world every chance they get. Even more disturbingly, you are now supposed to have an opinion about everything, and are tied to this opinion forever. Maybe you liked brand A in the past, now you like brand B till something better comes along or you become nostalgic. Maybe you agreed with position X back then, now favor Y, and in the future tend to return to X or choose Z.

What happened to the idea of changing your mind? If information or societal circumstances change, should we not be allowed to adjust our positions? Is not an opinion something that should be a momentary snapshot of serious judgements, made depending on the specific moment in history? Do we not change over time? Are our tastes and preferences supposed to be constant? I cringe every time people are expected to like only the style of music that was popular when they were growing up. How limiting. I guess I have been growing up for several millennia then, appreciating music since ancient Egyptian styles. I don’t believe in limiting our horizons.

The attack on other opinions is, of course, always waged in the name of democracy, on all sides. This is nothing new, of course:

“For, to state the truth in few words, whatever parties, during that period, disturbed the republic under plausible pretexts, some, as if to defend the rights of the people, others, to make the authority of the senate as great as possible, all, though affecting concern for the public good, contended every one for his own interest. In such contests there was neither moderation nor limit; each party made a merciless use of its successes.” (Sallust, Conspiracy of Catilina, ch. 38)

Ironically, this pretense of democracy promotion directly feeds into consumerism and marketability. Only if you voice clear opinions and preferences can market analysts and pollsters make sense of you. Thus we surrender our capacity for a truly democratic exchange of ideas – which requires that our opinions change from time to time – in order to succumb to commercial market pressures and to make a mockery of an honest marketplace of ideas.

Olbermann and O’Reilly eventually ended their respective engagements, whether voluntarily or not. O’Reilly fell due to personal failures, supposedly. For the eventual end of Olbermann’s show, I credit Ben Affleck’s brilliant Saturday Night Live skit, in which he took down a hapless landlord for discriminating against his cat, “Miss Precious Perfect.” Affleck revealed Olbermann’s pomposity, and defanged, or rather, declawed it. Humor is always more seditious than righteous indignation, and Jon Stewart’s sobering critical voice is sorely missed.

The legacy of this ad hominem style, sadly, seems to continue with – quite literally – a vengeance.

Aren’t we tired of it yet?

#22: There Are No “Alternative” News Sources

The dissatisfaction with “established” news sources is real. There has been a worrying trend in media towards heavy editorializing, partisan bias, and selective reporting. All this is happening in newspapers and television news, as well as online.

But the answer to that problem is not to gravitate to “alternative” news sources in which all these problems are compounded to a much higher degree.

The answer to problematic media is to diversify your media intake, but also to make sure not to select even less reliable news sources. Either something is news or it is not. There is no “alternative” news, just as there are no “alternative” facts. There can be different interpretations of the same news and facts, just as media can have selection bias.

Ideally, selection bias can be overcome by indeed looking at news from different angles, and by maintaining all-out skepticism. Selection bias is nothing new, and has been a regrettable feature of newspapers forever. You know that if you gravitate to a more liberal agenda, then the New York Times and the Washington Post are for you; and if you are on the more conservative side, it’s the Wall Street Journal and the Washington Times. The same goes for television, and for internet news.

Most problematic, however, are state media run by dictatorships. Russia Today, Ruptly, Sputnik, Al Jazeera, the Global Times, etc., may sometimes indeed carry legitimate stories, but they will not criticize the regimes in their own countries, and they try to spread biased, Soviet-style disinformation about the West. Regrettably, some of these sources of “news” are becoming more popular amongst Western youth who prefer critical reporting about their own countries – which is fine, but you need to keep it in perspective. Soviet-style sedition campaigns work by eroding trust in Western democracy and making dictatorships sound more appealing in contrast.

There is no “one news source” that explains everything. That is when we enter the realm of conspiracy theories and “alternative” news. The internet may make illegitimate content seem legitimate very easily, but again, diversification is the key here. Despite all the problems with established media, there is no alternative to solid and competitive journalism; everything else is just someone’s private opinion.

#21: Media: Don’t Tell People What To Think

Journalism is one of the most important activities in any country. Freedom of the Press, Freedom of Speech, both are cornerstones for any successful society, not just for democracies.

Without a free press and free speech, no society will survive successfully for long. Dictatorships that disallow one or both of those crucial components of public and civic life will eventually fail, because they close themselves off from the truth and get stuck in a restrictive worldview that cannot map reality correctly. If a country fails to listen to all sides, to praise and criticism, to all factions, to all possible opinions, it will also fail as a country.

Similarly, if a country’s citizens fail to listen to all sides, to praise and criticism, to all factions, to all possible opinions, they will fail as citizens, as human beings, and they will also fail their country.

The function of the press is as follows, at least from my perspective (I am no trained journalist):

  1. Gather and publish information that informs on an important issue.
  2. Deepen a discussion of that issue, and add analysis and disinterested evaluation to it, to draw reliable knowledge out from the information.
  3. Make a judgement on what happened, based on the information, and your knowledge of the wider context, and try to make that judgement in the best non-partisan way, sine ira et studio, without anger or passion, so that evaluation can happen without unnecessarily falling into a partisan camp.
  4. Give people the facts, but do not tell them how to evaluate them. You may say, “in my opinion, this is x,” but do not assume everyone should draw the same conclusions. Let people come to their own conclusions – if you have laid out your case successfully, they may just as well agree with you. If they don’t, they always have the right to exercise their own freedom of thought. People have a right to disagree without being demonized.
  5. Do not use primitive click-bait ways to draw people in with an incomplete headline, using the hook-line-and-sinker approach all too common now. “You wouldn’t believe what I found in my driveway today, Click here (and here, and here, and here, and watch the ad here, and – what was this about again?)” – anyway.

News MUST be neutral. Commentary MUST be fair. There are no sides; truth is the only side of the journalist, and truth is always neutral. Ad hominem attacks against specific people ignore the complexity of political life. Don’t think you can easily label a person you don’t like or agree with in a way that puts that person – rightly or wrongly – into the anathema corner of human discourse. Things are more and more oversimplified, people’s assumed identities determine whether they matter or not, and dissent suddenly has to be partisan.

This is nuts. Don’t tell me what to think. Don’t pretend you can read other people’s minds. Don’t demonize the side you don’t like. Don’t even tell me which side you like! I should not need to care!

You telling me what I should think in order to be a decent human being (according to you!) is precisely how socialist and fascist dictatorships talk to their people. Right-think, wrong-think, doublethink, etc.

The media, if they behave like that, are a problem. They need to fix this themselves, and we all need to realize that we, the people, need to hold the media as accountable as we do the other important parts of our democracy.

The key words here are truth and credibility. But the truth has many sides, and it belongs to no party, group, identity, or belief system. A journalist will run a story even if it is unpopular and goes against the network’s or newspaper’s editorial opinion. A journalist will not just placate the base and hope it will offer support when the power structures change. A journalist is equally liked and disliked by all, but respected for reliable information, the unvarnished truth, and contributing to the knowledge of all.

We have a long way to go still, it seems.