Missives from back pages and ones closer to the front

At the risk of this becoming another entry in a genre I continually declare to have relinquished forever, that of the blog post centered on the problem of my lack of writing, I wish to write for just a little bit on my lack of writing.

Rather, I want to try something new: my latest attempt to just write, without it needing to be something good or clearly functional. Although I have more than enough ideas to write five such posts a week, well… I’m not writing them. So here we go, a little miscellany of ideas from my brain to keep this poor little blog alive in attention and imaginary pennies for another day.

This is at least partly influenced, by the by, by my reading quite some time ago now of Nick Hornby’s The Polysyllabic Spree, comfortably my favourite thing he has written: a collection of articles he wrote over the course of a year on the things he was reading. I have recently somehow found the time to read beyond the boundaries of the classes I am teaching this semester, and I have thoughts on such things, with many blog posts planned. Planned but not written. I’ve always liked the idea of writing about my reading, if only because it will in theory force me to read more. So for now, allow me to share:

    • I read Neuromancer! Why the exclamation mark, you may ask? I, like many people who read, have a pretty significant backlog of books that I have bought with every intention of reading (or, in the case of Dickens, re-reading) but have not. My success not just in beginning Neuromancer but in actually finishing it is thus extremely rewarding, or at least creates associative feelings of such achievement in my brain.


    • I liked Neuromancer! Why the exclama… I did that bit already. Actually, I was not at all sure I would like Neuromancer after a couple of pages, and was if anything less sure sixty pages later. It did just enough to keep me, though (I am an extraordinarily willful and fickle reader these days, shorn of all guilt by fatherhood, work and other important things) and by the time I finished it I was very happy indeed.


    • I was particularly struck by elements of the novel that are noticeably clichéd in a 2017 context but of course were not in 1984, largely thanks to William Gibson’s success in foreseeing our use of virtual space (or cyberspace, to use Gibson’s word) and his immense influence on so much work to follow, particularly in the cyberpunk genre, which he basically invented. Or helped invent. It was intriguing how quickly I recognized my initial frustration at cliché as anachronistic and embraced the tropes: trenchcoats, mirrored glasses and cosmically spiritual Rastafarians are quite enjoyable when you instruct your brain to forget they’ve been done a million times since, and worse. I would strongly recommend Neuromancer though, because it has a lot more going for it besides. A couple of Gibson’s characters have been tried since but with far less success.


    • I put Neuromancer down and, feeling I was on a roll, picked up The Crying of Lot 49, another backlog book, but never actually started it and got into Infinite Jest instead. I’ve owned Infinite Jest for years, and it’s funny: for a very long time I read nothing out of an astonishingly elevated and rather misinformed sense of dedication to literary snobbery, which involved rejecting all modern literary fiction, or at least anything written during my lifetime. I know, I know. Ridiculous. I do still have something of an issue with pretentious literature. I was pleased to find that the first fifteen pages of Infinite Jest are nowhere near as pretentious as I worried they might be, and are actually not as pretentious as the book’s introduction by another author who will here, at least, remain nameless. I can see why the book is so popular with literary types, though. Even early on it really is extraordinarily impressive, and Wallace writes with confidence and a genuine sense of fun, though that seems quickly lost in all the things he does that so impress everyone.


    • I’m not sure how long I’ll stick with Infinite Jest. I’m already thinking of dipping in and out every month or two, but I don’t trust my memory enough. I might pop into Dave Eggers’ The Circle, which has sat on my bookshelf for months now but has never tempted me. Recent trailers for the upcoming film have alerted me to the fact that The Circle features a nefarious character whom several people, including Tom Hanks, decided should be played by Tom Hanks in a film adaptation. This merits some form of revisiting, and given that the trailer gives off a distinct “Weren’t those Da Vinci Code movies a BLAST?” vibe, a rather important tweaking of one’s expectations. Or a reduction of one’s biases and preconceptions, fair or otherwise.


    • Finally, all this reading has put me in the mood to write, which is great fun, and brings up an old problem: the consistent concern over, and self-correction of, my tendency to write a little differently once I’ve been reading with a decent rate of regularity. Am I writing more confidently, more adventurously, or more pretentiously? All three? Should I care? The short answer is no, as is, I think, the longer answer, but all the work in between the two is more interesting to me than to you, and so I think I’ll call it a day.


Thinking about the dissertation

This is a post I began to write in early August, but let fall through the cracks. I have decided to share it today, and I have resisted the urge to edit too much of my thoughts from the heady days of August 2016.

I want to write a little bit about my work this summer in a future post, but in planning out that particular post I started thinking more and more about my PhD dissertation. I defended it five years ago this month, an anniversary that, had you asked me about it a couple of years ago, I would have anticipated with a profound feeling of fear, largely because that was my reaction to everything at that particular moment. I cannot be sure if it was a condition of academic life, collateral emotional and psychological debt from being the parent of a small child, existential fear that everything really IS going to pot (though being a historian mitigates that particular bit of irrationality) or just a natural derivative of whatever it is that is wrong with me that drove me to attend graduate school in the first place.

In any case, as the anniversary does finally come into view, I am happy to say that my sentiments are mostly gratifying. It turns out that all this work we run around doing, particularly early in our careers, can make us feel very good if we just stop and acknowledge it.

This is also to a significant degree due to the fact that the dissertation, the defining and central non-sentient element in my life for several years, is finally well on track to becoming a book, and, having navigated some of the more difficult phases of that particular journey, I have suddenly acquired a gentle, world-weary view of these challenges worthy of an elderly character in a science fiction novel or an associate professor mystified as to why today’s graduate students seem to find everything so much more difficult than when he went through the process, taking to the rivers of academic thought as a salmon coursing deep through its waters and leaping, with drops and splashes of creativity exploding and falling where they may, into the bright clean air.

Before I wade into the beatific intellectual retirement that is removing all the pain of writing what finally became my first book and filing it under the “articles of evidence supporting the clear unambiguous fact John Harney is a genius,” let’s take a moment to reflect on the dissertation, and what went into writing it. I find myself writing something I did not plan to write this morning and certainly did not think I would ever write five years ago, or ten, but I hope that some graduate students and prospective graduate students might come upon this and find it useful. As always, your mileage will vary, which is a neat and useful but syntactically rather uninspiring way of saying I understand I do not have all the answers. This is how I experienced the dissertation and how I think of it now.

The Dissertation vs. The Comprehensive Qualifying Exam

The great advantage of pursuing a graduate degree in the United States is that you are given time to read, to think, to engage in discussions with your peers and with professors, to really dig into existing theory in your field, to write papers, to become a Teaching Assistant and experience the initially odd sensation of handing out grades to undergraduates. It is all about time. If you are fortunate enough to be fully funded, you have a lot of time.

I was not fully funded. I did not feel that I had time. As a result, I was driven to complete the dissertation from the start, long before I knew my dissertation topic with any confidence. This translated directly into a particularly driven attitude towards my coursework and, in particular, the preparation for and completion of my comprehensive qualifying exams. For this I am unapologetic, and I encourage all graduate students to do the same. Yes, the learning of things is important and should not be reduced to a tawdry list of checkmarks, but you can still take a positive view of all the work you are doing while getting it done. That is to say, it IS important to reflect on the value of the expertise you are developing, but it is also okay to identify clear endpoints, milestones and the like. In effect, you can treat your exams as a task to look forward to rather than dread.

In theory, anyway.

My comprehensive qualifying exam involved providing reading lists to each of the three members of my committee, writing historiographical essays to be included in my portfolio, and defending this work orally. The whole thing took about a year, not including the various note-taking and collection of books and journal articles I engaged in during coursework. My attitude to time and in particular my focus on completing the exam as soon as possible was, in my opinion, a very good approach to the exam itself. It mitigated anxiety, at least, but it also made it clear that this was a temporary state of being and one that would come to a timely end with an appropriate amount of work.

The same approach will not necessarily work for the dissertation, because the dissertation is an entirely different beast. You will schedule your defence, sure, but you don’t schedule it as soon as you advance to candidacy. People will give you a vague idea of how long you are supposed to be in graduate school, and you have other people in the department who have been there longer, but there are no clearly defined rules about how long it will take. Nor should there be. The dissertation is going to take as long as it takes. So if, like me, you have taken a rigid approach to time management and you possess a clearly defined idea about when you hope to get your degree, you are going to generate a lot of stress for yourself.

On the other hand, not being rigid brings its own stress, a type of pressure that almost everyone I have met who has written a dissertation experienced regardless of their attitude to writing the bloody thing starting out: a truly horrid combination of the unrelenting sensation that you have not done enough work, a continually expanding effluvium of guilt, and the prolonged gradual existential realization that this thing will never actually come into existence and that you are by extension useless.

If that sounds dramatic, well… it is. I am not exaggerating all that much though. The basics of time management still work, you understand, but the context that surrounds the dissertation can become crippling in a hurry, and being in control of the process this week guarantees no such thing next week or next month. The good news is that recovery from such dark spots is equally frequent. My advice for people writing dissertations focuses on reducing such dark spots as much as possible, and you do that by managing the context.

Why are you writing it?

Yeah, why ARE you writing it? Have you given that enough thought? I mean, really. I know you think about it all the time, but why are you writing the bloody thing? Is it just a means to an end, and if so… to what end? Do you want this to be a book (or an article, or a series of articles, depending on your field)? Are you focusing primarily on your degree?

Allow me to use the extremely broad objective of producing a monograph as a shorthand for all the various publication possibilities here, as it is the standard in my particular field. You have a number of challenges ahead of you, not least of which is the fact that it is incredibly unlikely your dissertation will be ready for publication, regardless of what your advisor says. If s/he is also focused on helping you to produce a book, to the extent that a conversation with an acquisitions editor is scheduled before your defense, I say good luck and stick with your advisor. It is more likely however that your advisor and the rest of your committee, though very much pulling for you and supportive of the idea that your dissertation ultimately take the form of chrysalis to the butterfly of the next step of publication, will have either their own set of ideas of what your work should be or their own interpretations of their obligations to you, to themselves, and to the university.

So. There are some basic motivations here that helped me that I hope will help you. I urge you to cultivate a stubborn determination to finish the thing, whatever might happen. I also urge you to be patient with yourself, as much as you can be within all reason. Finally, I urge you to understand that writing a book is not a consistent project and that there will be highs and lows and successes and failures. These crosses are yours to bear alone, even if you do benefit from a supportive group of people around you willing to discuss both your work and their own.

We return to the question of why you are writing this thing. I should point out that this objective can change. For me, the objective became simply to do whatever it took to get my degree. This came back to haunt me later when it became time to edit towards a book, but it got me to finish my advanced degree in the first place, so… you choose your battles, you accept the results and you move on. Regardless of whether you are confident in your ability and fervent in your desire to produce the next important monograph in your field or whether you are fighting off the insecurities so many of us face in graduate school, large, hulking and cacophonous banshees screaming at us to reveal the mediocrity we are sure is there, you need to know what you are doing. My advice is thus simple and direct: figure that out. Trust me, though: no decision is final, and you are not the fraud that little voice inside your head keeps insisting you are. That voice is a jerk and should be repelled with clear writing schedules followed by exercise and some quality time with friends involving beer or Netflix or whatever floats your boat.

I lived in Austin and would get up and write, go for a cycle in the afternoon, and see people in the evenings, most of the time. A common piece of advice is to write three hundred words a day. It’s a good starting point, but honestly the number of words is up to you. I would keep it low; I chose three hundred words as my target and on good days blew past that number and just kept going. It helped on the bad days, though.

But do ask yourself: why am I doing this? The answer can be high-minded or mundane, intellectual or procedural, but you need an answer. Do not give in to a narrative of failure, where the only option left to you is to mitigate a sure disaster. Understand what a dissertation is supposed to be, and try and find out what on earth a book is supposed to be (though perhaps give the former task a higher priority). They are different, but one can in fact become the other. It does happen.

The Advisor

What point you are at while reading this (applying to programs, just finishing your first semester at a program, looking back on three decades as a PhD candidate) has a significant effect on how useful this short piece of advice might be to you, but I will give it anyway: get on the same page as your advisor.

Every relationship between a graduate student/candidate and his or her advisor is different, the one thing in common among all these relationships being that it is one of the most important factors in that student’s continued progression towards receiving the degree. Some advisors have students over for beers early and basically hang out with them; some task their students with extremely important research activities such as selling furniture for them on craigslist. My advisor was somewhere in between these two points, though happily for me closer to the BFF scenario. At the same time, we were not BFFs. My advisor was great, and supportive, and I look back on my time as his student fondly. The best thing I can say about him, however, is that I knew where he stood and how he expected things to go.

For example, when I strolled into his office and informed him I was taking my comprehensive qualifying exams in February he smiled and told me that I was not going to do that because I was not ready. This would prove useful later, when I organized a defense in the full knowledge he had no interest in letting it happen if he was concerned I had more work to do to receive the doctoral degree.

What is the advice here? If you already have an advisor, recognize that they are who they are. If you are applying to a program, give some serious thought to who your advisor is going to be. If you have not been in a program long, remember that you can in fact change advisors, though that may depend on the dynamic in your program or between you and multiple members of the faculty.

Ultimately, remember that your advisor’s name will be on your dissertation, and treat that fact accordingly. Assuming they are operating in good faith, which they almost unfailingly are, they are working with you on the dissertation not just out of support for you but also out of the need to do right by themselves. Use this in whatever way can help you finish the dissertation. For me, it was a simple thing: I went to his office to see him regularly, creating deadlines by which I had to have something meaningful to share. It also meant sending him extremely rough drafts early on, which I think terrified him but got me to where I needed to be.

Life after graduate school

There will, in fact, be one. You will have a life after you leave graduate school, whether you take a degree with you or not. The dissertation will not defeat you: you will either write it or you will not. It is not a signifier of your ability or your worth. It is a dissertation. If this were easy, everyone would do it. That does not mean you have to do it, but it does mean you should give yourself some respect for getting as far as you have. Frankly, it’s doable: I finished my dissertation and lived to tell the tale. And I sometimes start sentences with conjunctions, and often use italics with no sense of decorum whatsoever.

Discovering the American Pastime

My first memory of baseball is of seeing the game on my grandmother’s television in Cork city in 1987 or 1988. It was the New York Mets. Don’t ask me how I remember this, as I remember nothing else about the game. I don’t even recall what time of year this was, so I don’t know if it was spring training, a regular season game or even the post-season. This was Cork city, home to great teams of both hurling and Gaelic football, the center of old-fashioned rugby union in the south and, as Ireland’s second biggest city, soon to become a stronghold of Irish soccer. Baseball was not on the radar of many people in Cork.

My grandmother had MultiChannel, a local service that provided about sixteen channels via a small black box on top of the television. Otherwise one could watch two channels over terrestrial television, channels often referred to dismissively in schoolyards as “Bog One and Bog Two”, though they had been a bounty to the generation before us that had been happy with one station. The MultiChannel was popular because it gave affordable access to English stations, which by the time I stood in front of my grandmother’s television watching baseball added a whopping four stations to the original two. Beyond that lay a number of stations with minimal effect on me, until Sky One’s rigorous syndication of Star Trek: The Next Generation a few years later. The French-language station Canal Plus, the German station Sat 1 and the sports channel Eurosport rounded out the coverage. This latter station hosted the baseball, as was its wont; Eurosport presented itself as a channel with a continental European slant on things, but as far as I could gather it was left with the dregs of sporting events the terrestrial channels chose not to cover. Eurosport was the place to go for winter sports, off-season track and field and the occasional bit of cycling. On this occasion the channel showed baseball.


I have to confess that the game, or rather the few minutes of the game I watched, had little effect on me. I had no context, for one thing; I knew nothing about the successful 1986 Mets. I had no idea who Darryl Strawberry was, let alone Keith Hernandez. In fact, I would become familiar with both of these figures on television before I became aware of their sporting legacies, Strawberry as the suck-up to Monty Burns on The Simpsons and Hernandez as an object of Jerry Seinfeld’s smitten affections. Baseball was supremely foreign in a way that is hard for me to fathom now, having lived in the United States for almost a decade, fallen in love with an American woman and become a soon-to-be father of an American child. He will likely see his father as some kind of odd Irish weirdo. That’s certainly the aura I hope to cultivate, anyway. I now spend weekends watching American football and much of my summers watching baseball.

My conversion to baseball was by far the more recent and in some ways the more profound. I developed a taste for American football at a young age, despite the fact that this was an extremely rare taste to develop on my own in late 1980s/early 1990s Ireland. So isolated was I in my fandom that when I met an Englishman in Sheffield in 2002 who also liked the sport, we bonded immediately. He was in my wedding. There were other factors too. We both liked metal, for instance. However, he was the first person I had met and hung out with who was open to the idea of staying up until four am drinking beer and watching the Dallas Cowboys. He was already doing it before I met him, in fact.

He also liked baseball. I didn’t. Baseball was too arcane for me, and as a result I had long since dismissed it, even questioned its viability as a sport. It was foreign in the very real sense that it appeared quintessentially American, and although I was far more amenable to American culture than most of my European contemporaries, sport provided a final frontier of sorts. American football was different, or perhaps it filled my quota of Americana weirdness to that point; basketball seemed like a waste of time and ice hockey close enough to a European sporting ethos to seem attractive but never quite convincing enough for me to start paying attention. America was off on its own, a strange oasis of odd, commercialized sporting enterprises that rejected my one true love: soccer.

It was that connection that helped draw me in, in the end. I started working on sports history during my doctoral work and a good friend asked a simple question: “Why do they play baseball in Taiwan?” This drove a project that ended up, of all things, comparing baseball in Taiwan to Gaelic Games in Ireland in terms of their political and social significance. It also started an interesting process for me personally. In order to be able to write about the sport effectively, I reasoned, it was no longer acceptable for me to practice an aggressive ignorance about the sport’s culture and the culture of those who appreciated it. I set myself a crash course on the game and began to read: Joe Posnanski, Howard Bryant, John Thorn, David Block, Bill James, the ubiquitous Moneyball by Michael Lewis… There are many more that I have yet to read but surely will. I watched Ken Burns’ nine innings (and his tenth), and I actually started… watching games.

I had been to baseball games before. I always enjoyed them as a spectacle. I was, at baseball games at least, the person I had always hated at soccer matches, the very person that marketing departments desired to go to the games: somebody willing to buy a ticket and spend a little bit of money in the stadium, and… well, I think that’s all marketing departments wanted. My first ever Major League game was in 2000, when I attended a Giants-Dodgers game at Pac Bell Park. I remember a Dodgers fan being escorted from the stadium after the Giants fans surrounding him had been unpleasant, which struck me as highly unfair; Americans didn’t seem to be very good at this rivalry thing. On the train to the stadium, we were briefly joined by a man wearing a t-shirt bearing a motto he was only too happy to repeat at volume: “Duck the Fodgers!” I wasn’t overly impressed.

My friends, all young Irishmen like me, wanted to depart from the cheap standing area we had bought our way into and take nicer seats in the stands. Being an abject coward, I argued against the approach until it became clear that people were leaving with worrying regularity. I’ll always remember the large man leaving with his daughters just after the seventh inning stretch. He approached us purposefully and yelled “dirty hippies!” Being, as we were, twenty-year-olds in unkempt jeans and cords with pretty liberal haircuts (in a maintenance sense, not a political one), I assumed that he was focusing his ire upon us. Then the hand came up and I performed one of the first un-ironic high fives of my life. He took his daughters and left; we took his seats.

Marvin Benard hit a three-run homer in the bottom of the ninth to win the game. We thought it was cool. In 2008 I witnessed Marlon Byrd hit a walk-off grand slam at the Ballpark in Arlington to break a tie with the Yankees. I lost my mind. I often try to recreate Benard’s homer in my mind and try and generate a more suitable response, but it’s impossible of course. We thought it was cool. We went on with the rest of our evening. I’ve often told the story since, though until recently I claimed that it was Barry Bonds, and not Benard, who hit the winning home run. The truth is that Bonds didn’t even play that day. Now, this could be an example of the perfidy of memory, but in truth I think it’s rather more mundane: Bonds is a more interesting story, so I choose to remember Bonds.

So what connection could there be to soccer in all of this? As I suppose would be typical of my interests, the history of the game drew me in. Baseball, as it has existed and continues to exist in the northeast of the United States, is a game of loyalties and, latterly, of betrayal. The American phenomenon of teams moving locations, such anathema to my European sensibilities, has as its ultimate example the departure of the Brooklyn Dodgers for the west coast. I suppose we can talk about the Baltimore Colts stealing away in the night, but the Dodgers are the better story, the better example of trauma inflicted on an entire community. The Dodgers led an exodus west, to that place where the sun always shines and excited men demand high fives of scared Irish college kids before leaving a game with two innings to go. Boston and New York are home to the game’s history, where urchins in flat caps broke windows and scurried off from the wrath of their neighbors. That resonated with me because I did the exact same thing, though with a soccer ball, and in less oppressive settings than the poverty of urban New York at the turn of the twentieth century. Baseball fans care about their teams as well, and this meant something to me. Yes, fans of other sports care about their teams, but it’s just so sanitized in the American experience. The NFL has been too successful in monetizing itself, in recreating the sport as a television product. The NFL champions the fan that trades in body paint and mugging for the camera. The advertisers focus on the concept of fan loyalty to a lesser extent, but it’s all on the surface. Basketball threw its lot in with the superstar model and that’s been successful too. But baseball…

Baseball is nerdy. It’s weird. That arcane nature that so turned me off as a child is a large part of its appeal for me now. Look at the people who write about baseball, who talk about it on television. They’re generally not the cool kids. Don’t get me wrong; cool is overrated. I identify more closely with them. In that sense, I have come to baseball not because the sport reminds me of soccer, or because the culture of baseball resembles the culture of soccer (though I believe it does in some key ways), but because the way I have come to feel about baseball mirrors the way that I feel about soccer. In the case of both sports, I enjoy intellectual discussions based on a rational approach and I willingly give in to the enjoyment of completely irrational impulses. I’m not a fan of the concept of “narrative” in sports because I feel it’s a concept utilized and followed a little too closely by people whose job it is to write about sports. However, I will confess to wanting a specific team to win for no real reason, to liking certain players and disliking others, to having a completely irrational (and unfair) dislike of an entire organization for no apparent reason. My love of baseball is exactly that, or has become that: love. It will never displace my love of soccer, but it’s come closer than I ever thought possible. My enjoyment of baseball and of soccer, and my practice as a fan of both sports, makes no sense. That’s the way I like it.

Towards a Personal Musicology, Pt. 1

A close friend and I often joke that our taste in music crystallized in 2003. I suppose one could argue that ossification first set in around then; as much as we supposedly have our tongues in cheek, the truth is that I haven’t made a huge amount of effort to discover new music in the last decade. Not that I don’t discover any, of course; it’s just that I don’t make finding new music a priority.

Part of the problem is the simple fact that I don’t listen to as much music as I used to. I listen to news programming an awful lot more than I did twenty years ago and I have a regular rotation of podcasts to get through week to week. Such programming takes up all of my listening time: a fairly sizeable commute to work and periods of physical exercise.

I rarely listen to music at other times, at least not actively. I don’t pay particularly close attention to music playing in bars or restaurants when I’m out and about, typically because I don’t like the music enough to become particularly curious. There’s nothing new there. Where my appreciation of music was once driven by a need for discovery, it’s now driven by a nostalgic tendency to celebrate the familiar and, often, to denigrate the unfamiliar. This seems reasonable to me… I mean, there hasn’t been any good music produced since 2003.

For a long time I’ve accepted this state of affairs, though I’ve refused to condemn my love of music to my teenage past.  I don’t believe music is a product for the young, as much as those who control the music industry may believe this to be so. Beyond that however, I don’t think about the medium as much as I used to and I don’t listen to music as much as I used to. I feel that I’ve lost much of my authority to speak about popular music in any kind of critical way.

Now, that’s pushing things, just a little. When I consumed music in great amounts, particularly throughout my teenage years, my critiques of popular music forms I didn’t like were not necessarily of a notably high intellectual level. In fact, I mostly consumed popular music through a mechanic dominated by my own ideological perceptions and an emotional appreciation of the form. In that sense, I didn’t have much authority in the first place. Of course, we’re getting into sticky territory here, especially when talking about a popular cultural form. Do I really have to have a solid background in music theory to point out that I think the majority of popular music produced today is garbage? Surely the clear disposability of the medium lends itself to such criticism.

There’s also an additional level here that makes me a little uncomfortable, specifically the idea that my distaste for current popular music is driven as much by a growing generational gap as by anything else. My immediate riposte to that would be that I used to be considerably harder to please; if anything, my taste in music is considerably more populist now than it once was. You’ll notice, however, that I am completely failing to get away from a personal understanding of music. This calls the notion of being critical into question, really. When I complain about the contrived shallowness of contemporary pop, I am not necessarily criticizing the form in the context of what has come before but am rather reacting negatively to what I see as an unopposed victory for commercial interest in a supposedly creative process. This is the constant. This is what drives me crazy. That opposition to commercialism in music, or at least to the commercialism run rampant that has utterly corrupted the popular music scene and rendered plagiarism a virtue, assuming you have the connections, derives from a personal interpretation of popular music’s function and importance more than a clinical interpretation of the form’s relevance in modern society.

This brings me to the concept of a personal musicology. Increasingly, I find myself wanting to engage with popular music once more and to write about it critically. I have little interest in acquiring a particularly substantial understanding of present-day popular music, however, and I lack the tools (particularly in the field of music theory) to analyze popular music in a manner that I would find satisfactory. Instead, I’d like to think about my own reactions to popular music and the evolution of the thought processes that feed those reactions: a musicology not of the self but of my self. This is of course pushing the limits of propriety regarding personal vanity just a little bit, but hey: this is the Internet and it’s the twenty-first century.

Having said that, there’s no merit to writing glorified diary entries and posting them to the Internet, though I’m not sure that’s possible anyway. I can’t simply recreate the reactions of my twelve-year-old self to hearing Radiohead for the first time or of my sixteen-year-old self discovering The Velvet Underground. It’s just as well, though. The truth is that writing essays based on how I view those discoveries now should be far more interesting. That’s the goal of this little series that I’m starting today: to go back to the start and make the journey back to the present anew, asking questions all the way. As I’ve already written here, I understand my appreciation of music to be driven mostly by ideological factors, but I’m looking forward to remembering the passion of my teenage years in particular with the context of my 20s already behind me. My personal musicology is not over, as much as I might believe that 2003 marked the end of my active participation in popular music as a medium. It continues to evolve. As an historian, surely I should study my past to understand my present? I hope you join me and I hope you find the journey interesting.

A short post on the tyranny of the single author

I found out about THE ROOM today and I’m not sure my life will ever be the same. Adam Rosen wrote a piece in The Atlantic yesterday discussing whether or not the film should be considered outsider art. Rosen makes some very interesting points about the nature of outsider art and its merits, but he has done me a great service; without Adam Rosen, I would never have seen the following YouTube clip:

[embedded YouTube clip]

Mesmerizing, isn’t it? I’m not sure what’s more surprising about this little piece of cinematic history: the fact that the film cost six million dollars or the very fact of its existence at all. The close-ups of beards, the oddly emasculated man with his tuxedo-wearing friends hovering around him, not to mention the dialogue and acting, would certainly disabuse someone of the idea of the film being more than it appears. I mean, look at this:

[embedded YouTube clip]

Surely they couldn’t have written that, acted it, shot it and edited it, and had it come across that way on purpose?

Right?

Of course when I say “they” I’m being inaccurate not only because it’s far too general and unspecific a term but also because this masterpiece belongs to one auteur: Tommy Wiseau. I know very little about Mr. Wiseau, but I wonder what his film says about the dangers of having no checks whatsoever on one’s creative impulse.

I don’t take criticism well. I never have. Nevertheless, it’s a vital part of the writing process, at least when one is moving towards publication. In the academy, double-blind peer review is taken for granted. An article that hasn’t been subjected to review in this manner is essentially worthless. This approach is predicated on the need to maintain accuracy and the application of appropriate academic rigor. Its main side effect, direct feedback, is actually more useful in the humanities. That is, unless you’re trying to write an article that claims Stalin was really a woman and you need to get it out there. Reviewers will look for appropriate methodology and use of sources, of course, but they’ll also ask keen, insightful questions such as “what on earth are you trying to say here?” and offer important observations like “this is a great idea but in its current form it makes me want to find you and stab you in the eye.”

That’s how I read reviewer comments anyway. 

The point is, writing improves when somebody calls you out on what you’re doing. This makes terrible work slightly less terrible, acceptable work something approaching good, and good work very good. It’s just not typical for one person to control all aspects of the creative process and actually produce something decent.


Okay, THE ROOM is on its own level.

The frustration of writing

I find writing frustrating.

I love writing.

I find writing frustrating.


It’s a simple cycle really, and one would hope that simple cycles would be easier to bring to a halt or at least to direct in a friendlier direction. That’s not the case with my writing, though; I’ve found that when I write, I write across all levels. Writing for work, writing on this blog, even writing fiction. It all comes together. Each activity feeds the other. 

When I don’t write, I make excuses. I imagine that my professional writing is sucking the energy away from my blog writing or my (more or less consistently unwritten) fiction writing. I think of my writing in terms of finite capital that must be spent wisely, but that’s not how writing works, at least for me. If I were able to manage my writing capital, my inspiration, my energy, then perhaps I could stockpile the capital over a course of days. It wouldn’t be an issue that I hadn’t written anything in three days, as I would now have a bounteous reserve on which to call, a guaranteed two to three days of eloquent frenzy across my keyboard.

That’s not how it works. At least, that’s not how it works for me.

I go back and forth about what I want this blog to be, in my own head. Recently, or rather only a few posts ago, I made a decision to write longer form stuff. Not ten thousand words or anything, but somewhere between one and two thousand a post. I had found that my writing was creeping upwards in that direction anyway and it didn’t seem like a stretch. Something funny happened, however. I focused on writing longer posts but continued to hand myself excuses I wouldn’t accept elsewhere. I tacitly accepted a lowering of standards because, after all, “it’s just a blog.” I ignored the fact that this perspective directly contradicted the idea of writing longer posts in the first place.

Quality isn’t quantity of course, but weeks later I find myself frustrated by the whole endeavor. I wrote a post on the important but flawed video game GONE HOME a couple of weeks ago, only to insert a caveat as a minor foreword to the post over a week ago and ultimately to remove the post from the site altogether. I didn’t change my mind; the post said what I wanted to say, but it didn’t say it well. That drives me crazy.

Part of writing is accepting the perils of the freedom that comes with it, especially when writing in public, which blogging very much is. So, I will jettison my idea of writing longer pieces once or twice a week, because the result was a long post once in a blue moon. I won’t try and guarantee a fixed number of posts a week either, because that just drags my writing in the opposite direction. The decision I’ve taken, essentially, is to lighten up and just write. That means no more apologies for posts like this one. I still hold fast to the conviction that blogging about yourself (or writing about yourself at all, really) is supremely boring as a rule, but I’m happy to bend the rule if it means I’m writing. Yes, I’m writing about myself here, but I’m writing about myself writing and therefore I am writing about writing, or at least the impulses that drive for or against it.

The childish thrill I received from using the word “writing” so much in that last sentence helps convince me that this is something I like to do. The astonishing thing about writing is that the worst thing you can possibly do is not write. Failing to write typically comes from fear, and that fear only compounds over time. Writing is liberating for the writer in the personal space. At the same time, writing without any kind of contribution to others is a waste of time and by definition bad writing. So, although I will no longer run screaming from the idea of writing a post such as this one, I will remain committed to staying “on topic” as much as possible. In other words, if you visit this blog now and again you’re going to see more than your fair share of three hundred word posts that point out Apollo Creed is an amazing character.

In short, my advice to anyone who is frustrated with the writing process is this: write. I know that such advice is trite, obvious and by no means original. It is the best advice I’ve ever received however. Talk to you soon.

The Problem with Hannibal

Hannibal Lecter is a fascinating character.

Not really news, is it? I mean, he’s an exceptionally popular character. Which isn’t exactly the same thing as “fascinating”, but still. I suspect Thomas Harris will take it. In fact, I know that he will. Here’s how he describes his relationship with Lecter in the foreword to the 2000 paperback edition of his novel RED DRAGON:

“By the time I undertook to record the events in Hannibal, the doctor, to my surprise, had taken on a life of his own. You seemed to find him as oddly engaging as I did.”

As the decades have passed, and we’ve gotten to know Lecter better, we’ve really become quite enamored with him. I’m talking about all of us here. The audience. Not the royal we but the… the actual we. The inclusive we, I suppose. Or, you know, the editorial we. If you consider yourself a regular reader (and if you don’t, please consider considering… I will actually write more). Lecter did a terrible thing, or rather a terrible thing happened to Lecter: he came to be seen as easy money. A guaranteed hit. Bums in seats. And, to be fair, that’s exactly what he was, at least until HANNIBAL RISING. But there seems to have been some kind of collusion throughout the English-speaking world to pretend that never happened, so I think we’re still good.

The problem with Hannibal, or at least my problem with Hannibal, and this is the crux of the whole thing, is that he had ceased to be interesting long before HANNIBAL RISING. Indeed, he became utterly ineffective as a character at the very outset of Harris’ third book, the appropriately titled HANNIBAL.

It’s an infuriating book. Harris’ first two books defied my expectations. That is, I thoroughly enjoyed them. Harris’ work is consistently (and uncannily) similar to the films subsequently based on his writing, to the point that they sometimes read like novelizations. Until, in each case, RED DRAGON and SILENCE OF THE LAMBS do their own thing a little bit. This is clearer in RED DRAGON, when the entire novel shifts focus away from Will Graham and gives us insight into what makes the killer tick. Harris goes out of his way to show us where this man came from, what drove him and what created him. We are encouraged to sympathize and perhaps even empathize a little, and we understand Dolarhyde just that bit more than we thought we would. It makes his subsequent transformation (and the deliberate attempt at transformation that follows that, and drives the novel) all the more terrifying. The Tooth Fairy is not some abstract concept, some B movie type construct of violence and predation on decent people. He could, perhaps, have been a decent person. He could, perhaps, be a decent person again. He chooses not to do so, although the degree to which he truly enjoys any level of choice is open to the reader’s interpretation. Ultimately, what makes Dolarhyde scary, and what elevates RED DRAGON beyond lazy “psychopath” characterization that has since become so typical in our popular culture, is that he is not all that different from you or me. Sure, he’s probably stronger, unless you work out a lot too. He clearly has significant problems. He’s not a bogey man. He doesn’t have secret powers. He is simply willing to do things that you or I (and I’m guessing as to your character here, but I hope for your sake I’m guessing correctly) would not do. He’s aware of the consequences and he doesn’t care. Admittedly, he doesn’t want to get caught; he cares that much. But morally speaking, he’s in a different room entirely.

This drives RED DRAGON and helps us to understand what Will Graham is going through. His gift and his great weakness is that he understands what makes these people tick. Essentially, he is asked to enter that room where Dolarhyde (or at least, his alter ego the Tooth Fairy/Dragon) resides. Graham hates this, because he is a decent person and he recognizes the price he may have to pay. Although the crimes committed in RED DRAGON are grisly, the book’s conflict is rooted in human relationships and human fears.

Lecter ties into this dynamic very nicely indeed. As a kind of adviser to Graham, one that wishes to elevate himself to confidant if only to procure better opportunities to taunt and injure Graham psychologically, Lecter is a truly excellent character. As with so many effective characters (for example, everyone in THE SUN ALSO RISES), we don’t really see all that much of Lecter. The reader is given some information and the rest is left blank. We know he’s “Hannibal the Cannibal” and we know that he’s killed a lot of people. We know about his past with Will Graham. We don’t know a lot more about him. Lecter is scary in a different way from Dolarhyde, but not because he is in some way a more effective killer (though this is more or less directly expressed). He is scary because he reminds the reader that Dolarhyde is not the only one. He’s not a one in a million freak. One in a few thousand, maybe. Not one in a million. The cruelty that is within us produces monsters like this. Lecter is the supreme monster because he revels in what he is and derives enjoyment from it.

This particular element of Lecter’s personality is further developed in THE SILENCE OF THE LAMBS. The clearest example is his desire to learn more about Clarice Starling’s adolescent traumas, but my favorite is his interaction with Senator Martin, mother of the abductee whose survival (or possible death) drives the novel’s narrative. Lecter openly prods at Martin and seeks to cause her pain so that he can absorb it. Lecter is not like us; he is not constrained by anything remotely related to sympathy or the acknowledgement of the feelings of others. He doesn’t care about Senator Martin. He doesn’t care about her daughter, just as he didn’t care about Dolarhyde’s victims and he didn’t care about Graham. Even antagonism seems beneath Lecter, though he seems to reserve it for Jack Crawford and his gaoler, Dr. Chilton. In THE SILENCE OF THE LAMBS, we get to see what Lecter can do, and it is terrifying. He’s intelligent, but again, his power comes from not being limited by morality. Killing does not affect him psychologically, and he may enjoy it if killing someone takes a certain skill or if the victim has acted or behaved in a way he considers gauche. However, he may also enjoy killing for reasons entirely his own that he chooses not to share either with other characters or with us, the readers.

So far, so good. Harris even went and had a little fun with Lecter, implying the death of the rather odious Dr. Chilton. This was slightly shaky ground, really, but it worked in the context of the book. Lecter was entertaining, and both of the Lecter novels to this point had featured significant gore; the audience was being implicated by dint of their apparent hunger for more of this type of thing. Being expected to cheer, if only a little, for Lecter’s dispatching of Chilton was a little unsettling, but it worked. It worked both within the context of the novel’s narrative and within the context of the relationship between the novel and its audience. It was a nice little nod to the character himself. Go on Hannibal. Have a little fun. Despite the audacity of the moment, it was executed with restraint. Chilton’s death, after all, happens off the page. It is left to one’s imagination, though hopefully you won’t spend an awful lot of time imagining the specifics.

In THE SILENCE OF THE LAMBS, again, the killer is truly terrifying. Gumb is a monster, plain and simple. Again, like Dolarhyde, he is athletically superior to an average sedentary person. Not exactly a hard sell: you’re not going to overpower people if your typical daily activity consists of watching Netflix while eating M&Ms. That’s at least one way in which I can assure you I’m safe to be around, by the way. He is not, however, superhuman. He is scary, again, because he will do things a reasonable person will not. Gumb is quite similar to Dolarhyde in that he is also driven by the desire for transformation, though whereas Dolarhyde’s obsession with the Dragon was a metaphysical journey, Gumb has taken the clinical rejection of what he sees as his own gender confusion as an affront and subsequently developed his own rationale for the murder of innocents. Gumb’s actions are extreme and reprehensible, but they are believable. Well, at the very least, they push the boundaries of believability. They don’t quite break past those boundaries.

HANNIBAL changes everything. For one thing, we finally have Lecter centre stage. A lot of things about this make sense. Lecter himself would have no problem with the limelight. If the author is prone to converse with his characters, Lecter would surely suggest that he be given an appropriate opportunity to shine. However, there is a lot to be said for not giving the audience what they want. Harris, of course, believes this, which is what led to an utterly appalling ending to his novel. Somewhere along the way he either missed or chose to ignore the fact that he had fallen in love with his character as well. See above.

This isn’t a case of too much of a good thing either, because HANNIBAL delivers neither what was so impressive about Lecter nor what was done so well in the first two books. Sure, we get to know more about Lecter and his habits and his wonderful educated manner and his European sensibilities. He’s essentially the hipster Jesus, if you don’t count all the horrible murders. It brings us no closer to why Lecter is an interesting character, however. Funnily enough, the Bryan Fuller show does all of these things, even daring to luxuriate in Lecter’s cannibalism and his apparent willingness to subject his unwitting dinner guests to his taboo gastronomic proclivities. HANNIBAL the novel does not do this, despite the fact that Lecter spends a significant portion of the book living in Florence and generally being a better European than the Europeans around him, speaking Italian better than Italians and reading medieval texts in the evening. It would be too much, it would take us too far into seeing Lecter not as an eccentric character but as a pretentious one, if not for one thing:

Hannibal Lecter is Superman.

This is a significant problem. Now, Lecter doesn’t wear a cape, obviously, and he can’t fly (though this sometimes seems about the only thing he is unable to do). He also, obviously, lacks the Man of Steel’s compassion for humanity. What Lecter DOES have, in no particular order, is a collection of the following attributes:

  • Superhuman reflexes
  • Superhuman strength
  • Superhuman intelligence
  • Superhuman linguistic ability
  • Copperplate handwriting (this is mentioned so often I have decided to include it here)

Lecter is not even remotely scary because he is no longer interesting. He is a bogey man. Yes, Lecter will do things we will not do, but he is also capable of things we are not capable of, not just morally but physically and intellectually. This diminishes him as a genuine threat because it makes him impossible to relate to on any level. The average reader can handle a character with more intelligence, more street smarts, more education, what have you. The average reader isn’t a jerk. I don’t see how said reader can possibly relate to a character the author continually tells him/her is the super very best at everything OMG. It ceases to be entertaining.

Harris makes some other decisions that at first I thought might have been an attempt to counter-balance this issue, but I soon realized were just more evidence of his insistence on cementing the novel’s reverence for Lecter. Lecter, we learn, saw his sister murdered (and eaten) at a young age. He only kills people who have failed morally in some way (this was news to me, given what little we were told in the first two books). His antagonist in the novel is a former victim who raped his own sister when they were children and who mitigates the fact that he is no longer physically capable of pederasty by making children cry and drinking their tears. I didn’t make that up. That happens. In a novel written by an adult, for other adults to read.

Note that this victim of Lecter’s attempted murder is an antagonist, as Lecter effectively shares the role of protagonist with Clarice Starling. Again, there’s nothing wrong with this in principle. However, the story does not clarify what its new direction is. Is the object here to catch Lecter? It becomes apparent early on that this is not so. This book is about Starling, to a more clearly defined degree than was THE SILENCE OF THE LAMBS. This is one of the things I like about the book… let me correct myself. It’s the only thing I like about the book.

However, I like an objective and an attempt, not the result. The result is half-baked, poor in the extreme. Starling’s nemeses are weak-willed, chauvinist idiots. That in itself would be fine if they weren’t so damn disappointing. Krendler, whose brief appearance in THE SILENCE OF THE LAMBS provided an effective evocation of the frustrations of bureaucracy for successful individuals, particularly successful women, becomes in HANNIBAL a pathetic chauvinist who practices the put-downs he hopes to deliver to Starling in advance. There is very little interesting about Krendler. Harris essentially points to him, saying to the reader “Look, isn’t this guy a JERK? What a jerk, right?” And, yes, Krendler is a jerk. He’s just not a very interesting one.

Meanwhile, Starling is being propositioned by almost every male character she encounters, a mechanic that at first appeared to be pointing to the endless crap a young woman has to put up with in a male-oriented world but becomes so commonplace that it says more about Harris’ own proclivities than anything else. Just in case you didn’t get the message, Harris essentially shouts it from the rooftops: “ZOMG Clarice is so HAWT, guys!”

It’s pathetic. One of the things I liked about Starling was the manner in which her gender affected her everyday life and how the chauvinism of the men that surrounded her had a genuine effect on her career. I’m not sure why she has to be a supremely attractive person for those problems to exist. By the end of the novel it becomes rather difficult to get upset at Starling’s sudden descent into nonsensical behavior, because everything about the ending is so bad, so poorly thought out and so utterly disrespectful of the reader. And of the characters the author created, for that matter.

I won’t bother to spoil the ending, though not out of any desire to preserve your enjoyment of the book, as I’ve certainly been tossing spoilers left and right. I don’t have the energy to write about it. Yes, it’s that bad. I’m sure Harris feels he challenged his readers. He insulted them. You know, these books have enough odd things about them. Harris’ approach to race tends to be patronizing, apparently assuming an all-white readership (why is “black” a defining characteristic of a young man cutting the grass whom we never hear about again? Couldn’t he have been wearing a work shirt, or something?), and I’m still not sure why Margot, the body-building lesbian rape victim sister of the mutilated, pederastic moray eel enthusiast, exists. We didn’t need a book that no editor had the gumption to gut (or, more appropriately, eviscerate), a book that spends hundreds of pages singing the praises of a murderer and then lets him get the girl. I guess I did give it away. Ah well. There’s a half-assed thematic stab at Lecter trying to regain his sister through metaphysical substitution, but I find it hard to believe many readers thought Lecter was going to kill Starling. He was such a damn nice guy at that point.

So, Lecter has been ruined as a character. Or at least he would have been if not for Mads Mikkelsen and Bryan Fuller. The NBC show salvages Lecter largely because it avoids all of the problems inherent in HANNIBAL. Hannibal Lecter is scary because he does things that seem inexplicable to a normal person. He commits acts of grievous cruelty and either enjoys the very aspect of those acts that would turn most of us away or, in what is perhaps more unsettling still, ignores the cruelty of the act altogether and meets his results with neither disdain nor relish but with curiosity, sated yet left wanting, driven to discover more. This is the great success of Fuller’s Lecter. He doesn’t do what the audience might want him to do. Lecter does what Lecter wants to do.

Constructing Arguments

Constructing an argument is an important skill. For any interpretation, accusation or conclusion to hold water it must make sense and be supported by evidence.

It’s not exactly a radical point of view, is it? However, a distressingly large number of people completely misunderstand how this process works. They don’t understand what constitutes supporting evidence. They don’t understand the concept of a clear thesis driven by identifiable facts. They don’t understand the importance of being able to cite these facts from reliable sources.

The notion of how sources work, in fact, gets to the heart of it. A lot of people utterly fail to understand that you simply cannot believe everything you read. The Internet has exacerbated the issue. In these first years of my career as an educator, I have already shed my assumption that succeeding generations would be savvier with technological methods of gaining information. “They’ve grown up with Google,” I thought; surely this would mean that inquisitive minds would find answers all the easier and thus reach logically sound conclusions more directly.

Well, I’m afraid that hasn’t been the case. One should certainly be skeptical, but at the same time one absolutely should draw a reasonable interpretation from overwhelming evidence. Most importantly, there are basic rules to follow: do some research, construct a hypothesis, conduct more research to test the hypothesis. Done. It’s straightforward. I consistently tell my students that they must question everything they read, including books that I give them to read. Apply the process: read the book, construct a hypothesis, read more course materials and look over lecture notes, return to the hypothesis. Talk with me to explore questions and issues with the text or the narrative being presented. Yet, just as consistently, I receive essays that repeat what is stated in the books, sometimes verbatim. Sometimes they even compound the inconsistencies of the historical sources.

Take Edgar Snow’s Red Star Over China. This book, originally published in 1937, details the significant period the author spent with the Chinese Communist Party’s guerrilla soldiers in 1936. It’s a fantastic source on the early years of Chinese communism, just after Mao Zedong elevated himself to unquestioned leadership of the Chinese communist movement. It gives us priceless information on significant milestones in Chinese communist history, particularly the Long March of 1934–1936, when the communists narrowly avoided complete obliteration at the hands of their enemies, the Nationalists, and successfully regrouped in advance of the successful communist revolution of 1949. It’s a great book. It’s a fascinating book. It’s a flawed book. Snow had fantastic access to Mao Zedong. Access so fantastic, in fact, that one could argue you’re getting Mao’s version of events more than you’re getting Snow’s. The book does not present a balanced view of the Chinese Communist Party.

That doesn’t lessen the book’s importance or its usefulness in a history class. Far from it; reading Red Star Over China encourages students to start asking questions. Why was Snow so enamoured with Chinese communism, as indeed millions of Chinese people would become? Why would Mao Zedong be so interested in having a Western correspondent travel with the Communist military? What are the issues with taking the book at face value, and how should events that occurred later in the twentieth century affect our interpretation of the book (or not)?

All great questions. Many students ask these questions or at least address them in passing. The best students take them on directly. There are always students who utterly fail to do this, however: students who write essays describing Mao Zedong as an unqualified genius and the savior of modern China. Now, I don’t have an issue with taking the view that Mao Zedong’s achievements must be recognized. You can’t just ignore the man’s many flaws, however, just as you can’t decry him as a dictator who never accomplished anything. More pertinent to analyzing the book, you need to be critical of the author. Does what the author is telling us stack up in light of what we know about that period? I don’t need a specific answer from the student, but I do need them to ask the question. I’m not complaining about my students here. I’m seeking to illustrate my concern that producing a high percentage of graduates with a genuine ability to engage in critical thinking remains a significant challenge.

I bring all of this up because I am increasingly horrified by the popularity of the 9/11 Truth movement. Part of me feels rather strongly that I need to get past this, but I find it immensely troubling. These “truthers” actively ignore an overwhelming collection of evidence supporting the established cause of the attacks: an extremist terrorist group with a clearly stated aim of attacking the United States hijacked four passenger jets in an audacious and unprecedented attack on the World Trade Center and the Pentagon. In the weeks and months that followed, many questions were asked. Experts were called in, and conclusions were drawn. It’s clear. In fact, it should never have been in doubt. It never has been in doubt for most people.

Yet, “truthers” ask, how can we know for sure? What about the fact that there is no clear evidence that a plane crashed into the Pentagon? What about the “dancing Israelis”? What about this one Dutch expert on controlled demolitions who gave an opinion based on viewing video footage and then “mysteriously” died in a car crash? What about the owner of the twin towers taking out an insurance policy on the buildings? What about…

I can’t go on. All of these questions have simple answers. Sometimes the simple answer really is the right one. “Truthers” decry people who believe the overwhelming evidence as sheep being led to the slaughter by… whom, exactly? It’s not always clear. Some talk about the US government, some talk about an international government. My favourites are the ones who clearly imply that it was a plot to justify war in Afghanistan and Iraq but insist that they are not necessarily saying that it was a plot by the US government. Ah, trying to have it both ways. That old chestnut. In any case, they believe they are in the right for asking questions. They are independent thinkers. They dare to be different.

Fine. Be different. However, you can’t ignore the evidence that is there. “Truthers” rely on misquotes, straw-man arguments and fraudulent junk science, such as the famous assertion that the twin towers could not have collapsed because the temperature did not reach the point where steel would melt. Steel doesn’t need to melt to collapse: it loses structural integrity at much lower temperatures than those required for it to melt. However, such explanations are instantly dismissed as pro-government propaganda. Well, obviously. Forget the ethical issue here: that arguing for a government conspiracy rather willfully ignores the genuine tragedies experienced by thousands of families. Let’s look at the arguments. Let’s look at what’s not mentioned in the “truther” argument.

The Middle East. As Phil Mole argues (very effectively, I might add), there may well be a certain comfort in essentially pretending that the Middle East doesn’t exist. And, although there are many “truthers” outside the United States, there is something incredibly insular about this particular conspiracy theory. It maintains the United States’ position at the centre of the narrative. Asking questions is all well and good, but in this case the “truthers” are asking the wrong questions in the first place. We know that Osama bin Laden and Al Qaeda sought to attack the United States. Looking back through history (and really, you don’t have to look back that far), one can see a pretty poor track record for Western nations in the Middle East, particularly for the United States in the postwar period. The rise of Islamic fundamentalism as a unifying ideological dynamic has come in the wake of Arab nationalism, itself a product of numerous factors: the detritus of the imperial mandates of the interwar period; the ham-fisted creation of the state of Israel, alongside the already fraught position of the Zionist communities resident in the area before it; the frequently Machiavellian policies of both superpowers during the Cold War; and the portrayal of American society as a bacchanalian den of sin among certain religious communities globally.

So many factors in the history of the Middle East support the conclusion that the events of 9/11 were the result of a terrorist attack that the only way to argue otherwise is to completely ignore the fact that the Middle East exists. There is a world out there, beyond the borders of this country. The failure of so many Americans to engage with that fact is something that must be addressed.

It all comes back, finally, to the point that constructing an argument is an important skill. You simply cannot parrot back information that you read somewhere, once. You simply cannot ignore all the evidence that runs counter to your conclusion. You must acknowledge such evidence. I tell my students all the time: bringing up a credible counter-argument, even briefly, will often actually strengthen your own argument if you have constructed it well. “Truthers” will argue that the establishment has failed to do this in regard to their claims. That’s not true. Many of the conspiracy theories have either been debunked (controlled demolition) or clearly did not hold water in the first place (“dancing Israelis”).

Arthur Conan Doyle famously wrote that when you “have eliminated the impossible, whatever remains, however improbable, must be the truth.” The terrorist attacks of 11 September, 2001 were unfortunately all too possible. By all means, question why the attacks had to happen. Question American and European policy in the Middle East dating back to the Ottoman Empire. Question the validity of the case for war in Afghanistan or Iraq in the years following the attacks. Question the use of Ground Zero in political turf battles in New York and beyond. Question whatever you want. We live in a free society, a society that will only benefit from intelligent criticism of our history, our cultural values and our ideological beliefs. Constructing a flawed argument on selective evidence brings no value whatsoever.


Delayed Inspiration

You know, you wouldn’t think it from the regular lengthy, um… sabbaticals that I take from this blog,* but I spend a fair amount of time thinking about it. Mostly, I spend time wondering what I’ll write about, or I’ll have an idea I want to write about but be sick of it by the time I sit down, or I think of writing but guilt prevents me because there’s lots of other writing that I’m supposed to be doing, or reading, or researching, or what have you.

*”This blog” is, in my mind, one continuing blog that has simply moved sites a lot through its various incarnations as random outbursts occasionally yelled in blog form. Maybe I’ll unify them all some day. I’m building this up a bit much. There’s no massive back catalogue of blog posts. But still.

The funny thing is, and I feel like I say this far too often, that writing on this blog doesn’t slow my other work down at all. If anything it helps fuel momentum to the point where I’m writing more across the board, or doing more research or at the very least spending more time reading. 

So, I’ve recently been inspired yet again to start writing some more. A nice Twitter conversation with @ladonnapietra recently led me down a rabbit hole of reading awesome things, particularly awesome things by the aforementioned @ladonnapietra and also a person named Cleo. Then, of course, there’s always the Incredible Hulk. I wonder if he capitalizes the definite article in his name?

I wonder if there’s a band called The Indefinite Article? Or An Indefinite Article? Which would be more clever? I’ll go with neither, on that score.

So anyway, a big block for me with this blog has always been trying to figure out why the heck it even exists, really. There’s the issue that some posts can take a fair bit of time, but that’s never a massive consideration. I tend to blog in pretty fast chunks, five hundred words or so. If I go over a thousand words, something has happened. Usually, I’ve seen a Twilight film. No, the main thing is definitely an overriding sense of “What should I write about and why would anybody care?” Generally, when I ignore this I just write about Rocky films or a trailer I just saw, or whatever. The blog is alive when I write these things, which is considerably more favourable than when the blog is dead. If a blog has as many lives as a typical cat, this blog is the online iteration of some kind of all-powerful witch doctor cat. With a banjo.

The first image that Google supplies when searching for “witch doctor cat banjo” on 19 September, 2012.

I’m not sure why, but the concept of a witch doctor cat makes me think of a Gabriel Knight game. We’re talking about witch doctor housecats, by the way. Obviously.

So, I read these wonderful things and they get me to thinking about what I would like to write, and how I would like to write. I’m interested in storytelling but I can’t discuss it with half as much authority or be half as interesting as the Hulk. Obviously this is a small blog that nobody reads, but hey. There are any number of blogs out there that do their thing better than whatever this thing is. But then, I’m missing the point of having a blog, aren’t I?

The line between inspiration and outright thievery can be a vague one, I suppose, but then I’m not really in danger of stealing anything from anyone. I don’t think I could face it. Whatever reason I’ve had for persisting with this damn thing would be gone completely.

I wonder if this blog will ever get its act together? When it does, what percentage of posts will be posts talking about trying to get this blog’s act together? If you’re reading this, go ahead and assume it’s all going to improve very quickly.

So, without further ado, posts coming up:


  • Reflections on The Knife of Never Letting Go and a debate on whether to continue with the Chaos Walking trilogy.
  • At least one post on the Rocky films, which may or may not include a post loudly praising the talents, charisma and political decisions of Apollo Creed.
  • A post about something to do with the history of China, because I’ve been considering doing an actual history-themed post forever but have never got around to it.


See you guys soon.


Procrastination: An extremely overused word. (A very short post on writing)

I was going to write about board games today. Actually, I was going to write about board games last Friday, and again yesterday…

And then I remembered how this blog keeps dying and how much that annoys me. So, a short one today.

Writing. I had a good friend who finally opened my eyes about writing last year. He made me realize that it’s about enjoying writing, about enjoying creating worlds and thinking about storytelling. He helped me get past my paralysis and my self-loathing (most of the time). I don’t write as much as I should but I write now.

I recently gave up on a project that has been in my head for years and years. Not given up on it forever, but put it to the side for a while. Now I’m writing something that is essentially a bit silly. I’m concerned that it’s derivative (it is), I’m concerned that I don’t know what I’m doing (I don’t) and I’m concerned that it’ll be terrible (…). I’m writing though, and I have a story in my head, and it’s about the distant future. Essentially it combines my reading experiences of the last year, which have been mostly defined by Dune (and several of its sequels), with my research into the lives of American missionaries in China in the 1920s. There you go.

Dune. I have to write about Dune some more in this space. What a wonderful book. It frequently came to mind as I taught students about the ebb and flow of East Asian civilizations last year. Wonderful stuff.

One last little point in among these little overlapping points: the trick to writing is to write. This advice is written everywhere, funnily enough, but its importance cannot be overstated. This short little blog post is a piece of writing. I am writing. Write as often as you can and whenever you can. Don’t put off writing because you think you need a clearer idea of what you want to do. Don’t put off writing, end of story.