This is the Dark Souls of a short post about a YouTube video

Oh my word. I am pretty sure that all human development to this point has led to this moment.

I genuinely appreciate being able to get jokes like this. My belated discovery of the wonder of Dark Souls last year, in addition to fostering an addiction to complex games essentially focused on esoteric storytelling and exploration, finally granted me entrance to a fascinating subculture of Dark Souls adoration that tends to be a lot more fun than such a description makes it sound. See above.

Admittedly, the above video probably seems insane to someone who has not played a lot of Dark Souls, thought about Dark Souls, read about Dark Souls, and then played more Dark Souls. Yet it brings me joy on a busy Monday.

Grading All Day…

Today I reap what I have sown, and in between prepping for tomorrow’s classes I have plenty of grading to do. It allows me to make a silly, intentional Chris de Burgh malapropism:

Break out the Diet Coke and away we go…

More blog posts to come in future, I promise. For now hang out while I read fifty takes on Urban II’s speech at Clermont in 1095 and another thirty or so on the Meiji Constitution of 1889.


Deus Vult

I found this article in Polygon fascinating. It discusses the odd conflation between an alt-right adoption of the term “Deus Vult” and the growing online community around the game For Honor. I find it fascinating in no small part because I gave my students Robert the Monk’s version of the famous Pope Urban II speech that gave life to this particular term, and did so with no knowledge of the online alt-right connotations.

Allegra Frank describes the historical nature of the phrase as “obvious and inextricable”, which is not a bad way of putting it in terms of its use in the game. I cannot help but find it all a bit disturbing, frankly. I had read previously of the alt-right community’s interest in monarchy and medieval standards of governance and (supposed) moral justifications for autocratic rule. Their approach, needless to say, is ahistorical and embarrassing. I’m not sure how you could read a version of Urban II’s speech in the early twenty-first century and come away thinking he was on to something exciting or that it in any way gives us a useful blueprint for the future. The source is inseparable from its context, and it takes a remarkably willful rejection of that context to draw parallels with challenges the world faces today.

As for For Honor, it seems the fans of the game use the term as part of their own celebration of their community online. Go for it, I say. There is something to be said for the contest over what terms mean, and I am grateful that even in the savage land of wild untamed memes there is genuine competition between representations.

The Holocaust and the Edges of Denial

This past Friday the White House shared a public statement to commemorate International Holocaust Remembrance Day, as it does every year. This year’s statement, however, differed notably from previous ones: it did not mention the Jewish victims of the Holocaust. When asked about this, White House spokesperson Hope Hicks asserted that the Trump administration is inclusive, so inclusive in fact that it feels it is important to recognize the millions of non-Jewish victims of Nazi horrors, including the physically and mentally disabled, homosexuals, Roma and other groups.

Setting aside for a moment the astonishingly hypocritical claim to inclusiveness, this statement is very troubling. Ms. Hicks is completely right to point out that the Nazi Party turned its savagery on many groups other than Jews, but it is an unwarranted and unnecessary leap to then try to re-contextualize the Holocaust as a broad murder program aimed at a large number of groups. It was not. The Holocaust and Hitler’s “Final Solution” focused singularly on the global Jewish community. The extension of state-driven genocidal practices to other groups the Nazis found distasteful does not alter that fact.

At first I assumed that this was a simple case of a costly error borne of a lack of expertise and experience, and that the Trump administration could not bring itself to admit to making a mistake; see also the mess this weekend derived from a poorly worded executive order and a foolish (truly foolish, beyond belief) decision to include green card holders in already restrictive and arbitrary policies on “vetting” entrants to the country. However, there were grounds for skepticism, and as John Podhoretz points out, this odd and ahistorical desire to insist that not only the Jews suffered because of the Holocaust has an extensive history.

It is rubbish, and it is extremely bad history. Obviously, the willingness and eagerness of the Nazis to extend a message of hatred and eugenic exclusivism to a large number of groups that did not fit their entirely fictional Aryan race ideal is worthy of note and offers important lessons. Victims of their cruelty deserve to be remembered. However, the idea that the Holocaust was not the end result of a plan conceived from the very start as an attempt to solve a Jewish “problem” completely ignores historical fact. It ignores the many, many things Hitler and his cronies said, and had been saying, long before he became Führer. Hicks and her boss’s rather pathetic attempt to appropriate awkward language of inclusivity merely highlights a clumsy attempt to do an end run on the historical facts of the Holocaust, which should lead reasonable people to assume that someone involved in the drafting of that statement has a problem with Jewish people, acknowledged openly or otherwise, and that others who read the statement lacked the faculties or spine to fix it before sharing this message with the public.

If one truly wanted to be clear that the horrors of the Holocaust lie not just in the attempts by a state and a people to eradicate an entire group of human beings but also in the extension of that methodology to various groups considered unacceptable to a stated norm, then there was an easy way to do this. Mention those who died at the hands of the state because of their sexuality or ethnic origin as an additional comment in support of the central horror of the Holocaust: the death of six million Jews and a meticulously constructed system that sought to kill many millions more.

To reiterate, this is among other things just bad history. You do not need to be a historian of the Holocaust or be familiar with the historiography to be able to point out the problem here. This is another evolution of the David Irving school of trying to chip away at the edges of the accepted historical argument. Irving and his fellows liked to cast doubt on the numbers of victims and dance around the reality that they were effectively denying the Holocaust. Irving even sued a historian for correctly pointing out that this is exactly what he was doing (a series of events recently dramatized on film). Dressing the omission of Jewish victims from a statement on the Holocaust in newspeak masquerading as post-identity-politics humanism simply moves the practice of Holocaust denial from casting doubt on well-established historical fact to seeking to undermine historical consensus. There is, in theory, room to maneuver here, but only if you are predisposed to argue that Jewish people are somehow over-represented in discourse on the Holocaust. If you are so predisposed, I am sorry to inform you that you are wrong. It was one of the great sins of human history, it has shaped all of our discourse on genocide and ethnic cleansing since, and adding to the long list of victims in the public consciousness merely helps spread the word of the sheer depth and breadth of its horrors. Those horrors, I am sorry to say, have expansive borders. There is no need to reduce the reality of the Holocaust’s defining anti-Semitism to further illuminate them.

Apocalypse Now, Again

A lot of information is hitting me at the same time right now. People are making an Apocalypse Now game, Francis Ford Coppola is involved, and there is a lot of talk about how video games are ready as a medium to add something of genuine meaning to Coppola’s initial artistic offering.

Says the auteur genius behind Apocalypse Now, The Godfather (Parts I and II) and Jack:

Forty years ago, I set out to make a personal art picture that could hopefully influence generations of viewers for years to come. Today, I’m joined by new daredevils, a team who want to make an interactive version of Apocalypse Now, where you are Captain Benjamin Willard amidst the harsh backdrop of the Vietnam War. I’ve been watching videogames grow into a meaningful way to tell stories, and I’m excited to explore the possibilities for Apocalypse Now for a new platform and a new generation.

As I said, a lot there.

First off: it’s great they have a roster of talented people who have worked on good games before, but that doesn’t guarantee anything.

Secondly, I’m not really sure how making a game with Willard as a protagonist really adds to the original vision. I’m also not much of a fan of the phrase “interactive recreation of Willard’s journey” being used on the Kickstarter page. It doesn’t really mean anything of course, and doesn’t commit people to much, but it does reek of another video game existing mostly as a prematurely vestigial appendage of an existing work of art.

In fairness, what a work of art. My Vietnam class this past fall (which I will write up soon, I promise) watched Apocalypse Now and it remains stunning. If anything, I recommend that people who have seen the film at a young age watch it again after a decade or two. It was an entirely new experience, akin to the one I had reading Brave New World in my 30s, a testament to the film’s enduring artistic merit.

It means something, something tangible, that Coppola would make such a show of his investment and confidence in the project. There is no reason not to take him at his word. The game has the potential to be a genuine crossover as well, and if there is a medium best suited to mixing together established classics and something new, I think the video game is right up there with the novel. It is of course very difficult to do.

It also begs the question of what the game is setting out to do artistically. Apocalypse Now, in addition to being a tour de force from a young director in his prime, is a film seeking to explore the American misadventure in Vietnam in ways both explicit and indistinct. The film is soaked in the symbolism of American failure and confusion, but Willard’s journey enters the metaphysical. Apocalypse Now is as much about the limits and arrogance of modernity as it is the morality of the American war in Vietnam.

So, what to expect from a game produced in the early twenty-first century, amidst swings of populism in the West and with a democratic consensus seemingly enshrined forever in the postwar twentieth century now under siege? I’m not sure I know, really. I want it to work out, but it is not clear just yet what this game is going to be. I am far from won over by comments like this one:

“It’s like Fallout: New Vegas on acid in the middle of the Vietnam War.”

In fairness, they are trying to sell a game they are not able to start making yet. They deserve a lot of leeway. I am pulling for them, but I hope this proves to be a lot of pre-sale guff that makes something interesting possible.

Game of the Year (Not this year, last year)

‘Tis the season of GOTYs, or rather ’twas before I spent Christmas having fun and not writing and then dove into an intensive January of teaching (more on that soon), and now I come to share my own. As has been the case in recent years, my enjoyment of video games is nowhere near as closely tied to the annual release calendar as it once was. I have no regrets about not being up to date, but it does make my deliberations on what my favourite game of the year was more complicated. I suppose I could just give up on such deliberations, but why would I do that? It’s fun.

So yes, I often pick two or three games that came out the year before, or earlier. Rather than get too far into the weeds of what should and should not qualify as a Game of the Year, I will instead just go ahead and share my game(s) of the year, using my complete failure to match the latest and greatest releases as an excuse for categories, a cheap move that lets me write more and have more fun. Starting this evening, let me talk about my game of the year for a little while.

Game of the Year: Dark Souls

Yes, Dark Souls. Yes, the same Dark Souls that came out in 2011. This is probably my most egregiously anachronistic game of the year pick yet. My love affair with Dark Souls, though I had no idea that is what it would become, started earlier still when I tried my hand at Demon’s Souls in 2009. It didn’t go well. I subsequently tried Dark Souls, which also did not go well, and then bought Dark Souls for PC so that I could reinstall it every six months, play for an hour and get annoyed, and give up on it again.

This went on for a few years, and during that time I read, listened to and watched countless tributes to the game and its sequels. I assumed that the game just was not for me after all, that I looked for something different in games, that perhaps, as this problem gradually became worse, I just did not have the time to spend committing myself to the game. I did not have it in me to be miserable for hours just to get to the next boss. Then it clicked.

Dark Souls is not about bosses at all. At least, not for me. I play the game somewhat atypically perhaps; I shamelessly dig into wikis, and the single moment that led to my finally understanding what makes these games so special came from a decidedly skeevy suicide rush to grab an armor set that my character in theory would not need for hours but that basically gave me just enough room to finally figure out how this whole thing works.

So… that’s the secret about Dark Souls. It’s not about the difficulty at all. I mean, the difficulty adds something special for people who are into that sort of thing, but it’s not the difficulty of the game that makes the game itself special. It’s the discovery, and the disarming intricacy with which the world you explore has been built. The Dark Souls environment makes sense in a way that compounds over one’s experience of the game. Lengthy, frustrating sections of the game come in retrospect to provide nostalgic recollections of one’s pathway through the story, a story that is nebulous by design and reflexive and adaptive as a result. You can take the Dark Souls story or leave it, you can read a lot into it or move on. It sits there and doesn’t really care if you are interested or not. In a video game world where your hand is held, gripped solidly by the AI as part of an extended tutorial that might last for hours, Dark Souls is refreshing. Invigorating. It’s the best kept secret in the medium, despite everyone shouting about it. If you read this and haven’t played Dark Souls yet, give it a shot, and keep trying until it sticks. You’ll thank me.

The Curious Expedition and plans to come…

I missed this on Tuesday, but Rock Paper Shotgun named The Curious Expedition as best “roguelike” game of 2016 as part of their consistently excellent annual Advent Calendar.

Congratulations to The Curious Expedition and its developers, Riad Djemili and Johannes Kristmann!

I’m quite taken with the game myself and must write about it soon. I am planning to give it to my world history students this coming spring as the assigned text for a short assignment. I’m not sure just yet what form that will take, but I have to figure it out soon and look forward to sharing. For now I will point out what attracts me to the game for the purposes of discussing world history.

You choose a historical figure, such as Marie Curie or Johan Huizinga, and depart for a corner of the “unexplored” world (in eighteenth and nineteenth century parlance) to uncover a map full of jungles and wild beasts, hidden temples and “natives” who will trade with you. Essentially, The Curious Expedition will give my students the chance to simulate the act of discovery within a specific discourse of western identity and the modern. This is something Bob Whitaker and I discussed once on the History Respawned podcast when talking about the ways that No Man’s Sky chooses to present information to the player and simulate the act of discovery.

There’s a long way to go in figuring out how this will work, and I have no choice but to play the game some more as I conduct my research.

I have a couple of posts to write, really: what I hope to get out of The Curious Expedition and how it is becoming easier to assign games anyway. It is easier and easier to find interesting games on PC and Mac (and sometimes on iOS/Android, too), and the indie games revolution means there is no shortage of options.

It also makes for a more controllable student reading experience. See here for Graham Smith’s take on this particular game’s accessibility:

It’s a game you can pick up and play immediately. You don’t need to play a tutorial, it has crisp graphics and a simple UI, and a single session can be brought to a satisfying conclusion in 15 minutes.

Perfect. As much as I would like to give my students No Man’s Sky, have them play it for twenty hours and then come back to talk about how the game, despite being set in space in a vaguely defined future, essentially recreates extremely old-fashioned modes of creating ownership of the world through discovery and subsequent orientalism, that’s a big commitment to assume from my students. It is also a major assumption about the hardware they have available and what they are willing and able to spend money on.

So, The Curious Expedition it is! I look forward to writing more about it… soon…

Thinking about the dissertation

This is a post I began to write in early August, but let fall through the cracks. I have decided to share it today, and I have resisted the urge to edit too much of my thoughts from the heady days of August 2016.

I want to write a little bit about my work this summer in a future post, but in planning out that particular post I started thinking more and more about my PhD dissertation. I defended it five years ago this month, an anniversary that, had you asked me about it a couple of years ago, I would have anticipated with a profound feeling of fear, largely because that was my reaction to everything at that particular moment. I cannot be sure if it was a condition of academic life, collateral emotional and psychological debt from being the parent of a small child, existential fear that everything really IS going to pot (though being a historian mitigates that particular bit of irrationality) or just a natural derivative of whatever it is that is wrong with me that drove me to attend graduate school in the first place.

In any case, as the anniversary does finally come into view, I am happy to say that my sentiments are mostly gratifying. It turns out that all this work we run around doing, particularly early in our careers, can make us feel very good if we just stop and acknowledge it.

This is also to a significant degree due to the fact that the dissertation, the defining and central non-sentient element in my life for several years, is finally well on track to becoming a book, and, having navigated some of the more difficult phases of that particular journey, I have suddenly acquired a gentle, world-weary view of these challenges worthy of an elderly character in a science fiction novel or an associate professor mystified as to why today’s graduate students seem to find everything so much more difficult than when he went through the process, taking to the rivers of academic thought as a salmon coursing deep through its waters and leaping, with drops and splashes of creativity exploding and falling where they may, into the bright clean air.

Before I wade into the beatific intellectual retirement that is removing all the pain of writing what finally became my first book and filing it under the “articles of evidence supporting the clear unambiguous fact John Harney is a genius,” let’s take a moment to reflect on the dissertation, and what went into writing it. I find myself writing something I did not plan to write this morning and certainly did not think I would ever write five years ago, or ten, but I hope that some graduate students and prospective graduate students might come upon this and find it useful. As always, your mileage will vary, which is a neat and useful but syntactically rather uninspiring way of saying I understand I do not have all the answers. This is how I experienced the dissertation and how I think of it now.

The Dissertation vs. The Comprehensive Qualifying Exam

The great advantage of pursuing a graduate degree in the United States is that you are given time to read, to think, to engage in discussions with your peers and with professors, to really dig into existing theory in your field, to write papers, to become a Teaching Assistant and experience the initially odd sensation of handing out grades to undergraduates. It is all about time. If you are fortunate enough to be fully funded, you have a lot of time.

I was not fully funded. I did not feel that I had time. As a result, I was driven to complete the dissertation from the start, long before I knew with any confidence my dissertation topic. This translated directly into a particularly driven attitude towards my coursework and, in particular, the preparation for and completion of my comprehensive qualifying exams. For this, I am unapologetic and I encourage all graduate students to do the same. Yes, the learning of things is important and should not be reduced to a tawdry list of checkmarks, but you can still take a positive view of all the work you are doing while getting it done. That is to say, it IS important to reflect on the value of the expertise you are developing but it is also okay to identify clear endpoints, milestones and the like. In effect, you can treat your exams as a task but look forward to them, not dread them.

In theory, anyway.

My comprehensive qualifying exam involved providing reading lists to each of the three members of my committee, writing historiographical essays to be included in my portfolio, and defending this work orally. The whole thing took about a year, not including the various note-taking and collection of books and journal articles I engaged in during coursework. My attitude to time and in particular my focus on completing the exam as soon as possible was, in my opinion, a very good approach to the exam itself. It mitigated anxiety, at least, but it also made it clear that this was a temporary state of being and one that would come to a timely end with an appropriate amount of work.

The same approach will not necessarily work for the dissertation, because the dissertation is an entirely different beast. You will schedule your defence, sure, but you don’t schedule it as soon as you advance to candidacy. People will give you a vague idea of how long you are supposed to be in graduate school, and you have other people in the department who have been there longer, but there are no clearly defined rules about how long it will take. Nor should there be. The dissertation is going to take as long as it takes. So if, like me, you have taken a rigid approach to time management and you possess a clearly defined idea about when you hope to get your degree, you are going to generate a lot of stress for yourself.

On the other hand, not being rigid brings its own stress, a type of pressure that almost everyone I have met who has written a dissertation experienced regardless of their attitude to writing the bloody thing starting out: a truly horrid combination of the unrelenting sensation that you have not done enough work, a continually expanding effluvium of guilt, and the prolonged gradual existential realization that this thing will never actually come into existence and that you are by extension useless.

If that sounds dramatic, well… it is. I am not exaggerating all that much though. The basics of time management still work, you understand, but the context that surrounds the dissertation can become crippling in a hurry, and being in control of the process this week guarantees no such thing next week or next month. The good news is that recovery from such dark spots is equally frequent. My advice for people writing dissertations focuses on reducing such dark spots as much as possible, and you do that by managing the context.

Why are you writing it?

Yeah, why ARE you writing it? Have you given that enough thought? I mean, really. I know you think about it all the time, but why are you writing the bloody thing? Is it just a means to an end, and if so… to what end? Do you want this to be a book (or an article, or a series of articles, depending on your field)? Are you focusing primarily on your degree?

Allow me to use the extremely broad objective of producing a monograph as a shorthand for all the various publication possibilities here, as it is the standard in my particular field. You have a number of challenges ahead of you, not least of which is the fact that it is incredibly unlikely your dissertation will be ready for publication, regardless of what your advisor says. If s/he is also focused on helping you to produce a book, to the extent that a conversation with an acquisitions editor is scheduled before your defense, I say good luck and stick with your advisor. It is more likely however that your advisor and the rest of your committee, though very much pulling for you and supportive of the idea that your dissertation ultimately take the form of chrysalis to the butterfly of the next step of publication, will have either their own set of ideas of what your work should be or their own interpretations of their obligations to you, to themselves, and to the university.

So. There are some basic motivations here that helped me that I hope will help you. I urge you to cultivate a stubborn determination to finish the thing, whatever might happen. I also urge you to be patient with yourself, as much as you can be within all reason. Finally, I urge you to understand that writing a book is not a consistent project and that there will be highs and lows and successes and failures. These crosses are yours to bear alone, even if you do benefit from a supportive group of people around you willing to discuss both your work and their own.

We return to the question of why you are writing this thing. I should point out that this objective can change. For me, the objective became simply to do whatever it took to get my degree. This came back to haunt me later when it became time to edit towards a book, but it got me to finish my advanced degree in the first place, so… you choose your battles, you accept the results and you move on. Regardless of whether you are confident in your ability and fervent in your desire to produce the next important monograph in your field or whether you are fighting off the insecurities so many of us face in graduate school, large hulking and cacophonous banshees screaming at us to reveal the mediocrity we are sure is there, you need to know what you are doing. My advice is thus simple, and direct: figure that out. Trust me, though: no decision is final and you are not the fraud that little voice inside your head keeps insisting you are. That voice is a jerk and should be repelled with clear writing schedules followed by exercise and some quality time with friends involving beer or Netflix or whatever floats your boat.

I lived in Austin and would get up and write, go for a cycle in the afternoon, and see people in the evenings, most of the time. A common piece of advice is to write three hundred words a day. It’s a good starting point, but honestly the number of words is up to you. I would keep it low; I chose three hundred words as my target and on good days blew past that number and just kept going. It helped on the bad days, though.

But do ask yourself: why am I doing this? The answer can be high-minded or mundane, intellectual or procedural, but you need an answer. Do not give in to a narrative of failure, where the only option left to you is to mitigate a sure disaster. Understand what a dissertation is supposed to be, and try and find out what on earth a book is supposed to be (though perhaps give the former task a higher priority). They are different, but one can in fact become the other. It does happen.

The Advisor

What point you are at while reading this (applying to programs, just finishing your first semester at a program, looking back on three decades as a PhD candidate) has a significant effect on how useful or not this short piece of advice might be to you, but I will give it anyway: get on the same page as your advisor.

Every relationship between a graduate student/candidate and his or her advisor is different, the one thing in common among all these relationships being that it is one of the most important factors in that student’s continued progression towards receiving the degree. Some advisors have students over for beers early and basically hang out with them, some task their students with extremely important research activities such as selling furniture for them on Craigslist. My advisor was somewhere in between these two points, though happily for me closer to the BFF scenario. At the same time, we were not BFFs. My advisor was great, and supportive, and I look back on my time as his student fondly. The best thing I can say about him, however, is that I knew where he stood and how he expected things to go.

For example, when I strolled into his office and informed him I was taking my comprehensive qualifying exams in February he smiled and told me that I was not going to do that because I was not ready. This would prove useful later, when I organized a defense in the full knowledge he had no interest in letting it happen if he was concerned I had more work to do to receive the doctoral degree.

What is the advice here? If you already have an advisor, recognize that they are who they are. If you are applying to a program, give some serious thought to who your advisor is going to be. If you have not been in a program long, remember that you can in fact change advisors, though that may depend on the dynamic in your program or between you and multiple members of the faculty.

Ultimately, remember that your advisor’s name will be on your dissertation, and treat that fact accordingly. Assuming they are operating in good faith, which they almost unfailingly are, they are working with you on the dissertation not just out of support for you but also out of the need to do right by themselves. Use this in whatever way can help you finish the dissertation. For me, it was a simple thing: I went in to his office to see him regularly, creating deadlines by when I had to have something meaningful to share. It also meant sending him extremely rough drafts early on, which I think terrified him but got me to where I needed to be.

Life after graduate school

There will, in fact, be one. You will have a life after you leave graduate school, whether you take a degree with you or not. The dissertation will not defeat you: you will either write it or you will not. It is not a signifier of your ability or your worth. It is a dissertation. If this was easy, everyone would do it. That does not mean you have to do it, but it does mean you should give yourself some respect for getting as far as you have. Frankly, it’s doable: I finished my dissertation and lived to tell the tale. And I sometimes start sentences with conjunctions, and often use italics with no sense of decorum whatsoever.

Death Stranding teases more historical allusions

He has, as they say, gone full Kojima.

That can mean a lot of things, of course. Hideo Kojima’s status in the video game community has only improved since his recent falling out with Konami, giving strength to a running online in-joke/slogan/act of defiance. Even Kojima’s famous new friends like to get in on it.

That status was vaunted already; as the mastermind of the Metal Gear Solid series, Kojima can do no wrong in the eyes of many, a walking, talking Exhibit A in the video-games-as-art debate. His “cinematic” style and deliberate injections of philosophical exploration into genre-defining stealth games have made him a legend. Consequently, if you like video games and enjoy reading about them, you can in theory have no reaction to or opinion about Hideo Kojima. But you probably do.

I have my own opinions, of course, though I find them difficult to pin down. Most recently, I find myself more excited about his new project Death Stranding than I ever have been about a Metal Gear Solid game, including Metal Gear Solid 4: Guns of the Patriots, about which I got rather excited indeed. Kojima’s latest has come to us in a pair of thoughtfully crafted trailers as likely to be leading viewers astray as advertising any clear elements of the upcoming game. The casting of Norman Reedus, (the apparently extremely nice) Mads Mikkelsen, and Guillermo del Toro is attention-grabbing, but something in this latest trailer caught me more than the intriguing hints at a plot exploring the divide between life and death, or the star turns.

The World War II iconography here is unmistakable. Undead soldiers march behind a Lovecraftian tank in an almost cavalier allusion to Nazis trooping through Europe. The skies belong to the enemy, del Toro’s character trapped, hiding on low ground and looking to the sewers beneath.

Kojima could have a lot of fun with this. Like many people my age who like video games, I failed miserably to avoid being snatched up by the Metal Gear Solid games early on; far and away the element in the series that appeals to me most is Kojima’s fascination with the present and future of international relations. More specifically, the Metal Gear Solid series very clearly rests on the product of a Japanese mind, with Kojima’s homeland occupying an intriguing place in late twentieth-century geopolitics. During the Cold War, Japan rose from an American-sponsored workhouse of Asia to a superpower sitting on the Pacific Rim, its power and influence derived directly from its economic strength. Japan had no army to commit to direct conflict, and it stands alone in the nuclear age as the only nation to have suffered a nuclear attack. All of this seeps through the Metal Gear games: Solid Snake’s famous declaration that “war has changed” in Metal Gear Solid 4 represents the game’s central premise neatly, with the character’s narrative and the broader geopolitics of Kojima’s near-future dovetailing with one another, while Snake’s sense of loss and discombobulation essentially mimics that of a postwar Japanese state in a world dominated by the possibility of conflict. The post-2001 world brings as much doubt to Tokyo and Nagoya as it does to New York and Chicago, compounding the complications of forming a coherent Japanese worldview that includes a clear sense of national self.

Seeing Kojima take these influences and throw World War II into the mix is promising. In the years since MGS4, Japanese prime minister Shinzo Abe has pushed harder for a more formal reconstitution of a Japanese military, a decision that reflects not a shift in Japanese popular opinion but rather the realities of his own political debts and somewhat concerning historical interpretations. The series that made Kojima a legend centers entirely on the creation of, and resistance to, enormous weapons of war, the Metal Gears of the title. It seems, perhaps, that Death Stranding is more interested in what makes human life valuable. By evoking the imagery of Nazi stormtroopers and using flying machines in the sky to represent a totalitarianism of pervasive observation and control, Kojima borrows from well-worn territory, but it is a well that seems slow to run dry. Time will tell how much of this comes from the man’s showmanship, but personally I look forward to more clearly historically inflected world building with much greater enthusiasm than I do the roles laid out for the famous faces we have seen so far.

Trump in my classroom

This started as a post, on November 9th, titled “the day after the night before.” Since then, I have written only one post for the blog. That is not particularly odd, given this blog’s previous record, but it does interrupt what had been a slow burn into regular updates. I am disappointed by the interruption, and surprised: I usually write about things that upset and worry me. This time, my writing has suffered.

There are many reasons I am concerned, if not frightened, that Donald Trump will be the next president of the United States. However, the post I sat down to write on November 9th was not about how I felt after the election, but what it had been like to walk into class that day.

It was difficult, more difficult than I expected. I did something I had not done in class before: I shared with my students some of the specifics of my politics, before expressing my own disappointment with what had happened. I was specific about why I was so disappointed, and I talked about the difficulty of having a wife at home crying, mostly from fear at what the future might bring and what this means about the country. I am the only white person in my household but also the only immigrant. I take solace in the fact that Trump would say anything to get elected, but I share my students’ disappointment that he could say some of the things he has said in the last few years and still win an election.

In the end, I waited, not least because of the difficulties around all of this. I do not think everyone who voted for Trump is a racist. In fact, I know that not to be the case. It is important, I think, for us to realize that the problem is not that a Democrat lost or a Republican won, but that it is alarming that such a clearly unprepared person, a person of such limited ability, has become president. Mostly it is important to understand that this has not all happened in the last month. It takes time for such phenomena to germinate and ultimately manifest.

I was going to move on, and never write this post at all, but here it is. If nothing else, I wanted a small memorial to the moment I walked into a classroom holding a silence I had never heard before. I’ve often read variations on the phrase “heavy silence,” but this was my first time experiencing one this deep. Students had tears in their eyes. Whether I liked it or not, Donald Trump was in my classroom. He will not go anywhere soon. For me, the response is simple: I will teach what I have always taught. We will read about and discuss how good ideas go bad and how bad ideas proceed regardless; we will talk about the many ways we experiment with the relationship between the individual and the state; we will talk about the importance of cultural production and the meaningfulness of ideology in social and individual interactions. We will talk about all these things, and I will encourage my students not to take arguments at face value, whether those arguments appeal to them or repel them at first. I will continue to be a historian, and we will talk about all these things and go out into the world.