Shameless Flogging

What would you do if the world kept ending, and you were the only one who knew it?

That's my brilliant editor's tagline for my new short story "Strange Matter", available now in SCI PHI JOURNAL #3. My pitch was China Syndrome meets Groundhog Day, but his is better.

"Strange Matter" is the hardest sci-fi tale I've written so far, and it includes plenty of food for thought--both technical and philosophical.

I've recommended Sci Phi Journal before, and not just because they publish me. The reason I submitted my story to them in the first place is that they're a much-needed platform for stories that entertain while holding to high standards of speculative thought. I encourage you to give them your support--not because they deserve it, but because you'll get top-shelf entertainment and a rigorous mental workout.

Thanks for indulging me, and happy Christmas!


Servile Art vs. Liberal Art

First, I want to thank Daddy Warpig, Remi, and Dorrinal from Geek Gab. They made co-hosting the show one of the most entertaining and rewarding experiences of my fledgling literary career. My hat's off to their listeners for keeping me on my toes with thought-provoking questions.

On to the main topic. I joined a conversation over at John C. Wright's blog about how to adequately pay writers (which is, as you'd imagine, a subject of great interest to me).

Taking a page from Catholic philosopher Josef Pieper, I argued that the value of writing is beyond material compensation.

I was somewhat surprised that other commenters found this claim controversial. On further reflection, my failure to clearly distinguish my position from that of groups like Authors United (who argued incoherently that books aren't consumer goods) probably didn't help the audience's disposition.

To start again on the right foot, I agree that books are commodities subject to market forces. An equally vital observation is that writers don't produce books; publishers do. (Some may object: "What about self-published writers?" Note the dual job description. When an indie writer writes, he's a writer. When he publishes what he wrote, he's a publisher.)

So the question at hand is, what's the writing itself--the creative act--worth? As usual, Aristotle points the way to an answer. He distinguished between work done in service to something else--the servile arts--and activities performed for their own sake--the liberal arts.

Since servile work is all about utility, it's pretty straightforward to appraise the results and compensate the worker accordingly. (If I produce a pair of shoes, my compensation should be based on the fair market price of shoes.)

But dispensing a just reward for art performed as its own end gets tricky. Oscar Wilde declared that art is useless. A Modernist filtering that statement through his utilitarian bias would conclude that art is therefore worthless. In fact, he'd have it backwards. There's a good reason that wage slaves live for the weekend, and that industry keeps churning out labor-saving devices.

Philosopher Peter Gibbons famously held that free time is best spent doing nothing. He's half right:

Everything that makes life worth living comes directly from intellectual and contemplative pursuits--activities performed during leisure time. The vast majority of people need to work to finance their free time, but the means used to attain something derive their value from the desired ends. Therefore the servile arts are subordinate to the liberal arts.

If the intellectual pursuits responsible for art, culture, and the things that make human life, well, human, exceed the value of servile work, how can we adequately compensate the dreamers and music makers?

We can't.

As a product of the intellect, art is informed by a spiritual principle. Since no material reward can match the value of a spiritual good, applying the shoe standard above results in a false analogy.

Not even trying to reward artists for their art is a manifestly unjust and culturally suicidal non-option. The only thing for it is to embrace the tension and do the best we can. Casting off our grey materialist shackles and acknowledging the invaluable worth of art will sweeten whatever we as consumers can offer artists, even if it's only a widow's mite.

Not convinced? Consider the art of a society based on the idea that men are defined by work:

And the art of a society informed by the belief that men are defined by reason and free will:

I'm content--grateful, even--for my IPs to be rewarded according to the market value of the media they appear in. I'd gladly write for free (in fact, I spent the last several years doing just that). I've done my job if my stories entertain someone, whether or not he's gracious enough to toss a dime in my hat (though I won't spurn his generosity).

This post shouldn't be read as a program to increase royalties or justify government regulation. It's a plea for artists and patrons of the arts to ask big questions. What makes life worthwhile (insert obligatory Conan quote)? Do people live to work or work to live? Is the worth of work (and the man who performs it) based solely on the monetary value of what's produced?

Human beings are free, rational beings. It follows that the more intellectually informed and freely performed a deed is, the better it exemplifies human nature, and the greater its worth in human terms. Note that there's a matter of degree here. All work done by humans is at least partially informed by reason and performed with free will.

So it isn't just artists whose work is invaluable. To some extent, it's everybody's.


Geek Gab

I'm scheduled to appear on the Geek Gab podcast hosted by the geektastic Daddy Warpig!

In case you don't know (for shame!), Geek Gab is an internet radio show hosted by the Alpha Geek himself, Daddy Warpig. The show's format is simple. Our gracious host discusses games, movies, comics (anything and everything in geek culture) with a select panel of guests. We'll also be taking live questions from the Twitter and chat room audiences, so if you've got any burning questions for me, don't hesitate to ask.

Geek Gab airs on YouTube Sunday nights from 7PM-8PM Eastern. I'll be joining the festivities on December 28. Hope to hear from you!

Update: my Geek Gab appearance has been moved up to Sunday, December 21st at 7PM EST.


We Don't Need No Education

I've come across a number of articles by professional authors that deal in part with how much, if any, formal education is required to write professionally. (Specifically, should aspiring authors take creative writing classes, pursue English degrees, attend writers' workshops, etc.?)

The consensus seems to be: "Get educated enough to know proper grammar; then ditch academia and learn the rest by writing."

This advice contradicts the message touted by the host of elders, media figures, and educators charged with guiding me during my formative years. I've always believed that their efforts were well-meant. I've since learned that they were wrong and my more experienced colleagues are right.

Like most writers, I displayed a love of reading from a young age. I produced my first crude short stories in grade school. By the time I started high school, the idea had dawned on me that I might pursue writing as a career.

My enthusiasm began to fade as I slogged through the advanced English curriculum. The creative writing class I elected to take smothered the feeble remnants of my aspirations.

I bet this sounds familiar: being forced to read dreary novels like The Scarlet Letter and The Catcher in the Rye by teachers who worship "literary" fiction and scoff at speculative fiction. That kind of environment was pretty disheartening to a kid who was then devouring the original series of Dune novels. The unstated yet clear message was that sci-fi and fantasy were for childish philistines.

Though I tried to like the literature my creative writing teacher foisted on me, it never did take. So I eventually gave up on the assigned reading, regurgitated the requisite number of interpretive short stories, and coasted by with a B minus, certain that writing was the last thing I wanted to do for a living.

By popular demand, the teacher did say we could write one spec fic story at the end of the semester, but he got behind schedule and rescinded the promise. It's a shame too, because one guy in my class was really good. He'd led the revolt against the interpretive fiction monopoly, and he kept working fantastic and sci-fi elements into his stories through the cunning and brilliant use of surrealism, schizophrenic narrators, etc. Before the class ended, he broke down and submitted a full-on high fantasy short.

I should've followed my classmate's example. Instead I muddled along under the false belief that speculative fiction is somehow less worthy than "real literature". But despite my teachers' best efforts, it turns out I'm a writer. Sometimes I wished I weren't, but I finally accepted that there's nothing I can do about it. I also accepted that I write science fiction and fantasy; not nihilistic "realist" claptrap about vapid twits coming to terms with their disillusionment.

How much education does a writer need? I learned 99% of the grammar and spelling that I still use by eighth grade. If not for high school English and creative writing, I'd probably be ten years further along in my writing career than I am now.

Once again, Pink Floyd shows us the way.


Sci Phi Journal

When this age of instantaneous global information exchange began, it seemed inevitable that every fandom should have its needs served. For years science fiction and philosophy enthusiasts like myself were the sole exception. At long last, Sci Phi Journal has brought our intellectual famine to an end!

Helmed by a visionary man of letters who understands science fiction's special relationship with philosophy (after all, every proper sci-fi story begins with the philosophical question "What if?"), Sci Phi Journal brings you fascinating short fiction by new and established SF authors, as well as thought-provoking essays that plumb the deeper meanings of pop culture touchstones. In classic dialectical style, questions for further thought and discussion follow each story.

Full disclosure: Sci Phi Journal recently became the first publication to purchase a short story of mine at a professional rate (perceptive regular readers may have noticed the change to this blog's description in the header). My endorsement of the magazine doesn't stem from this business relationship. I've supported Sci Phi Journal since I purchased the first issue, and reading it convinced me to send them a short story submission.

In a field that has lately struggled under the heel of overt message fic and superficial, hackneyed narratives, Sci Phi Journal is more than a breath of fresh air. It is a literary David battling a host of philistines for the noble goal of bringing you thought-provoking entertainment.

The first two issues are available now in a variety of electronic formats. Buying copies now via the following links won't just provide hours of wonder and intellectual challenge; it will also help to ensure the continued availability of such rare sci-fi delicacies.

Sci Phi Journal #1

Sci Phi Journal #2


Squirrel Chasing

This post marks a milestone. It's the first time I've blogged about a Twitter exchange. Trying to peddle logic where the medium ensures the participants will just end up talking over each other isn't my idea of time well spent. John C. Wright likens fruitless argument to chasing a squirrel.

Yesterday my habitual meekness came into conflict with the only character traits that override it: my enmity toward theological error and my utter contempt for theological error being spouted from a position of smug epistemic closure.

So a guy popped up in my Twitter feed complaining about Vatican corruption. I've got no problem with honest criticism of the Church's financial and administrative apparatus. Many in the hierarchy must agree, since institutional reform is one of the most often cited reasons Francis was elected.

Anyhow, another commenter hijacked the thread with a non sequitur about how belief in God is "preposterous" and answered believers who objected by implying that they're "gullible"--the subtext being that he's not.

The thread-jacker further explained:

Believing in #God is like still believing the earth is flat BECAUSE texts, traditions & teachers say so [sic]

In the unlikely event that the above statement's manifold flaws don't pop out at you like Sick Boy and Begbie jumping out of a cupboard, here's a scorecard.

  • Non sequitur: following a comment about the Vatican needing administrative reform with a denial of God's existence doesn't advance the discussion. It derails it.
  • False analogy: the earth's shape is an empirical question. God's existence is a metaphysical question.
  • Chronological snobbery: glibly dismisses Scripture, Tradition, and all theistic philosophers on the implied charge (by way of connection with flat-earthism) of being "outdated".
  • Ad hominem: note the underlying implication: "Believers are like flat-earthers (in that both are idiots)."
  • And most damning of all, begging the question: the statement assumes a negative answer to the question being discussed, viz. "Does God exist?" and posits that presumption as evidence of itself.
I illustrated the commenter's circular logic by asking what if all the great teachers he summarily dismissed were right, only to be met with:

That would be bizarre, [sic]

Pointing out that he answered circular logic with more circular logic, I asked the question he begged, namely "Why?" His answer was a clumsy attempt at forcing believers to either deny God's existence or blame Aleister Crowley's debauchery on God.

Philosophically inclined readers may recognize this gambit as an appeal to the Problem of Evil. There are really only two arguments against belief in God, and this is the more successful one because it mainly operates at the emotional level.

The standard Argument from Evil goes like this: "How can you believe in God when there's so much evil/suffering in the world?"

Anyway, 140 characters aren't nearly enough to properly demolish this red herring, but luckily I've got more than enough room here.

I understand why people take such appeals to sentiment seriously. Who hasn't suffered a grave injustice or witnessed horrific suffering (at least on TV)? Among the several points that people who field this argument miss is that Christianity was instituted to address the evil and suffering in the world, and that it gives the only solution consistent with reality (short version: Christ purifies evil by taking it upon himself and joins our sufferings to His, making them redemptive).

Yet an unstated assumption behind pointing out evil to disprove God's existence is that the reality of evil has somehow escaped believers for centuries. That premise is absurd, as a cursory glance at Scripture, the testimony of the Church Fathers, or the writings of theologians will abundantly show.

On the logical level, an appeal to the Problem of Evil has two fatal flaws. First, it relies on the unstated premise that a universe with evil/suffering is incompatible with God. Not only does the argument fail to demonstrate this claim, it's roundly refuted by Christians themselves, who've always acknowledged the realities of both God and evil.

Consequently, the Problem of Evil doesn't disprove God's existence. It doesn't even try to. At most, it's an attempt to disprove God's omnipotence and/or goodness.

But here's the rub. Trying to disprove God's goodness/omnipotence by pointing out evil is inescapably self-defeating. The argument's force comes from the implicit scandal of an all-good, omnipotent God allowing evil and suffering. Sounds reasonable, right? The problem is that this sense of scandal relies upon the assumption that an all-good, omnipotent God does in fact exist.

Follow me on this one. Christian theology teaches that God is the source and measure of all good. Evil lacks independent existence. It's an imperfection--a parasite on the good. Therefore neither good nor evil has any meaning without God. This means there's no way to deny God's existence because you're scandalized by evil without simultaneously denying the evil that scandalized you in the first place.

I anticipate objections along the lines of, "What about people who don't accept the Christian metaphysic of good and evil?" Remember the context. The Problem of Evil was framed as an attempt to refute Christian belief in God. Unless your argument addresses your opponent's actual position, you're arguing against a straw man, which achieves nothing.

What's more, people who don't at least partially and unwittingly accept the Christian view of good and evil are quite rare in the West. Some appeal to moral systems based on evolutionary, behavioral, or economic theory, but then they log off and continue living their lives according to objective norms of right and wrong.

Besides, anyone who sincerely does think that all morals are reducible to quirks of natural selection, cultural conditioning, or class struggle doesn't really believe in good or evil at all; just what's useful/detrimental to the propagation of the species, the society, or the party.

So here's a bit of rhetoric for you. Look at this image.

And tell me honestly that your reaction was just an arbitrary evolutionary response, a result of social conditioning, or purely political.


The Tunnel of Indoctrination Indubitably

I recently had the displeasure of learning that Bradley University, my alma mater, will be hosting an "interactive event"/propagandistic carnival attraction called the Tunnel of Oppression.

The production is staged by the campus Office of Multicultural Student Services, which claims that it "...highlights contemporary issues of oppression in a creative and realistic avenue. It is designed to introduce participants to the concepts of oppression, privilege, and power by guiding them through a series of scenes."

In contrast, an RA at DePauw University described her participation in various scenes showing how "...religious parents hate their gay children, Muslims would find no friends on a predominantly non-Muslim campus and overweight women suffer from eating disorders."

And to leave no doubt who's really exerting power in the name of oppression, the same young woman said, “We were told that ‘human’ was not a suitable identity, but that instead we were first ‘black,’ ‘white,’ or ‘Asian’; ‘male’ or ‘female’; … ‘heterosexual’ or ‘queer.’ We were forced to act like bigots and spout off stereotypes while being told that that was what we were really thinking deep down.”

This tunnel begins to sound less like an "interactive event" intended to "...educate and challenge participants to think more deeply about issues of oppression throughout society" than like an intersectionalist version of Willy Wonka's harrowing boat ride, albeit minus the fun.

Indeed, the mention of "introducing participants to the concept of privilege" gives the game away. Disqualifying dissent by invoking an opponent's alleged "privilege" is nothing but a rhetorical dodge used to preempt rational thought, not promote it.

Looking beyond rhetorical claims to the actions of those who make them is the best way to discern someone's true motives. The claim that the Tunnel of Oppression is designed to foster tolerance disproves itself: the event is based on the hypocritical notion that entire groups of people deserve to be vilified for the actions of others with whom they share only superficial traits like skin color or nationality, and its organizers dismiss the common humanity in which true equality is grounded.

Words are but the shadows of actions, and by their actions, the tunnel's promoters betray their real aim of indoctrinating minority college students into seeing themselves as eternal victims who are helpless without institutional aid, and brainwashing white, male, heterosexual, and religious students into an unearned sense of worthlessness, guilt, and shame.

The BU Multiculturalism Office will be happy to know that their propaganda has worked--though perhaps not as they hoped it would. I am ashamed. I'm ashamed of my alma mater--of the fact that I spent even one red cent of tuition money on an institution of higher learning that employs its considerable monetary and social power to shame and mislead the students who likewise pay enormous sums for that privilege.


My Favorite Time of Year

It's time once again for the Drunken Zombie International Film Festival. If you have the means, I recommend that you stop by Landmark Cinemas in Peoria, IL on Friday, November 7 and Saturday, November 8 for a worldwide selection of the finest independent horror films.


From Contemplation to Emotion: The Decline of the Novel

Let me preface this post by referring you to this outstanding essay by Mike Flynn. It covers the transitions in thought, art, culture, politics, and more that occurred between the Ancient, Medieval, Modern, and Post-Modern ages. All six parts are worth your time, but Part 5, dealing with the rise and fall of representational art--especially the novel--is the most pertinent to the following article.

As recently as the years immediately following WWII, Thomist philosopher Josef Pieper diagnosed the cultural ills afflicting the West as stemming from the absence of leisure. Pieper argued that not only do we in the West suffer from a lack of leisure time, we've lost touch with what the concept really means. The near certainty that everyone reading this paragraph has so far interpreted "leisure" as "idleness" supports Pieper's contention.

If leisure isn't the same as slacking off, what is it, and why does it matter? Both Pieper and Aquinas held Aristotle's account of leisure as definitive. Aristotle's concept of leisure agrees with the current notion in that both see leisure time as a state of freedom from external coercion and demands (specifically, those related to work).

Where the classical and contemporary views diverge is on the purpose of leisure. Contra Peter Gibbons, freedom from servile work isn't best spent doing nothing. Aristotle identified the ultimate purpose of human life as happiness, and he equated happiness with the intellectual act of contemplation.

So we work in order to buy ourselves leisure time, and the best possible use of this freedom from work isn't idleness, but contemplative intellectual activity. Pieper makes a strong argument that strenuous labor is important, but mostly to the degree that it affords people leisure time, since leisure facilitates the intellectual activities that produce culture.

Perhaps you can see how the philosophy of leisure relates to literature. Every midlist author aspires to quit the daily 9-5 grind in order to write full-time. Authors who write for a living don't actually work in the classical sense. They are people of leisure--and we should be thankful, since they are the makers of culture.

Book sales are in decline, especially in the West. That's unsurprising considering the loss of esteem for, and understanding of, leisure. Flynn points to the novel's origin as an effort to reproduce the early Modern resurgence of representational art in literature.

St. Benedict's Triumphal Ascent into Heaven, Melk Abbey
Early Modern representational paintings--with their near-photorealism, attempts to engage all five senses, and plethora of characters--were meant as subjects of meditation intended to raise the intellect toward objective truth. For the same reason, most novels written before the late 19th century tended to feature lush description, voluminous exposition, and large casts of characters--all told by objective, omniscient narrators. Novels, as originally conceived, were objects to be contemplated.

The twilight of the Modern Age has seen a near-total revolt against logic, objectivity, and contemplation in favor of sentiment, subjectivity, and emotion. Novelists have been obliged to modify their craft to keep up with public tastes, and the inherent difficulty of bending the medium so far out of its original shape is compounded by the shortening of the Post-Modern attention span.

It's now vital to hook readers within a book's first page. Description must be short and sweet, and exposition light. Readers can't be bothered to remember more than a handful of named characters, and the presence of a single protagonist whose tale is told in the first or close third person voice is the boilerplate standard.

These innovations are touted as proof that storytelling has improved over earlier forms. Novels are now more capable of holding readers' attention by depicting characters they can identify with, thus forming deep emotional attachments.

It's true that contemporary readers find books written according to present storytelling doctrine more appealing than novels written in older styles. That doesn't mean the new way is better; just that audience tastes have shifted. Contemporary readers prefer emotional stimulation to intellectual stimulation, and they themselves lack the leisure time needed to fully engage with a book like War and Peace.

The constantly declining sales of books may signal that the Post-Modern rejection of representational art has exceeded the novel's ability to accommodate it. We may, as Flynn predicts, be occupying a transitional period between the death of the book and the advent of a new mode of immersive, interactive storytelling.

On a personal note, Pieper and Flynn have helped me come to a self-realization that explains my general indifference to most fiction published since the mid-60s. Unlike most readers, who now complain of books being too baroque and populous, I chafe at current novels' concessions to Post-Modernity. Grand, sweeping epics like Dune, The Lord of the Rings (better yet, The Silmarillion), and John C. Wright's Count to Eschaton series hold my interest far better than today's taut, claustrophobic, and brazenly emotionally manipulative fare.

My greatest pleasure in reading comes from meditating over a book. I'll habitually read the same passage over and over till I'm satisfied that I've extracted all of its implications. The Church has long practiced the devotion of lectio divina, in which reading progresses through four stages: lectio, meditatio, oratio, and finally, (with divine aid), contemplatio. Any text can be approached in this way, and doing so greatly enhances the effect that is the true aim of speculative fiction: to kindle the human longing for Eden and for distant stars.

Whether future narrative forms will provide equally apt subject matter for contemplation remains to be seen.


Frank Discussion

I recently had the pleasure of watching Frank by quirky comedy director Lenny Abrahamson. The film follows an aspiring young musician named John whose yearning to escape his dreary workaday life is fulfilled when he's unexpectedly offered a keyboardist gig with avant-garde jam band Soronprfbs (no definitive pronunciation is ever established).

John is quickly spellbound by the musical genius of Frank, the band's enigmatic lead singer, and launches a social media campaign to realize Soronprfbs' star potential (ignoring Frank's pathological shyness, which is so severe that he wears a fiberglass mask at all times).

Analysis: I highly recommend seeing this movie while it's still in theaters. Failing that, it will make a worthy addition to anyone's home film library. Populated with engaging, sympathetic characters, and exploring universal themes via comedy grounded in those characters--not mindless slapstick or vulgarity--Frank is best described as "refreshing".

My only caveat to recommending Frank concerns a preconception underlying its central theme--the conflict between artistic integrity and marketability. Though I hold to the observation that being good at art is of little avail if you're bad at the business, Abrahamson (and writers Ronson and Straughan) seem to argue that an audience can only be found at the cost of artistic freedom.

Having only seen Frank once, I may be misreading its contention that artistic legitimacy and popularity are mutually exclusive. However, one troubling assumption that drives the conflict is the notion that some people have talent, some don't, and that innate difference is what separates "real" artists from posers. The film demonstrates this false dichotomy by contrasting John's industrious yet futile songwriting efforts against Frank's near-effortless compositions. Dismissing John as a parasite rings somewhat false when his efforts to build an audience garner modest success, especially considering Frank's vocal desire for people to love his music (instead, John's fatal error comes when he insists that the band change their sound).

As Tom Simon brilliantly wrote, "Talent is the Snark; but the Snark is actually a Boojum, and the name of the Boojum is Luck. People do not want to believe in Boojums, so they try very hard to hunt for Snarks."


Songs of Relevance

Like five hundred million other iTunes subscribers, I was recently issued U2's latest studio album Songs of Innocence. Which is why you're reading this, since I probably wouldn't have bought the record on my own.

It bears noting that I've been an avid fan of U2 since I was seven. They've long epitomized the best tradition of Christian rock, viz. a band that pursues artistic excellence while letting their faith influence their work, instead of building makeshift songs around sentimental religious cliches.

That said, the latest album, like its two immediate predecessors, is OK ("Every Breaking Wave" qualifies as a minor classic, but it's the only one of the new songs to reach that level). Any other contemporary band would be pleased to have produced so workmanlike a record.

But U2 is not any other band. They've won more Grammys than any other rock & roll artist. Their ambitious and long-running live tours have been attended by more people than any other musical act's. They've written multiple songs that have perfectly reflected, and in turn shaped, culture. They are the last in a noble line that descends directly from The Police, The Who, and The Beatles.

An OK album from U2 is equivalent to a failure from any other band.

Why has the Biggest Band in the World fallen into artistic stagnation? The current album's troubled production may give us a clue. Tentatively titled Songs of Ascent, the record was originally slated for a 2010 release. Conscious efforts aimed at exceeding the disappointing sales of their previous album eventually split recording sessions between three separate productions and four different producers. The travails surrounding Bono and Edge's involvement in the Spider-Man: Turn Off the Dark musical, and the sudden departure of U2's original manager Paul McGuinness, likely presented severe distractions.

The band reportedly voiced concerns while recording Songs of Innocence about their ability to stay relevant. They cited the lack of enthusiasm that greeted No Line on the Horizon as a sign that U2's music was losing touch with its audience. I posit that the first warning signs came earlier--ten years ago, with the release of How to Dismantle an Atomic Bomb.

My argument requires a brief historical recap. U2 gained a reputation for constantly reinventing themselves throughout the 80s and 90s. During this period (which, not coincidentally, overlapped with the peak of their growth and creativity), the band managed the nigh-miraculous feat of conducting bold technical and stylistic experiments while retaining thematic integrity and brand recognition.

And then they faltered, unnecessarily, over the stumbling block that is 1997's Pop.

Many fans and critics misidentify Pop as a bad album when it's actually an unfinished album. Prematurely scheduling tour dates led to a rushed production, which made itself heard at release and during the first leg of the tour. U2 was pilloried, quite unjustly, for delivering a substandard product. (My response: check out live versions of the songs from Pop as performed during the latter stages of PopMart to witness the material's strength, untapped on the album, that was finally realized after the band had time to hone the songs onstage.)

Unfortunately, U2 took the unjust criticism of their third experimental album to heart. I like All That You Can't Leave Behind as much as anybody--more than some, in fact--but I don't buy the common wisdom that the album succeeded because it disowned Pop and hearkened back to the band's Joshua Tree glory days.

First, All That You Can't Leave Behind doesn't owe as much to The Joshua Tree as most people say. To my ear, its greater lyrical maturity and production style sound more like Achtung Baby. Second, Pop's successor won over the public less by renouncing its predecessor than by happening to resonate with the popular zeitgeist in the wake of 9/11.

Which brings us back to How to Dismantle an Atomic Bomb--the album that does overtly copy The Joshua Tree. It proves that U2 learned the wrong lesson from their two prior albums. They misinterpreted the backlash against Pop and the artificially exaggerated enthusiasm for All That You Can't Leave Behind as cues to stop experimenting, stop taking risks, and retreat into a "Classic U2 Sound" that they perceived as safe territory.

If U2 seeks a remedy for their declining relevance, they could do worse than ending their self-imposed exile to a musical fortress built on well-trodden ground. Right now, the greatest risk is to avoid taking risks.


Sandman Overture #3

I recently finished the third and most recent issue of Neil Gaiman's triumphant return to The Sandman. (For the sake of the sole reader who doesn't know what The Sandman is, it's the horror/fantasy comic book series that established Gaiman's place among the greatest storytellers currently working in any medium.)

[Spoiler Alert]

Issue 1 sets quite an intriguing plot hook with Dream discussing his own murder with several other versions of himself. Issue 2 ends on a cliffhanger when Dream--accompanied by his feline counterpart--sets out to confront the culprit and perhaps avert the end of the universe.

The story promises an escalation from "intriguing" to "bombshell-dropping" in the last panel of Issue 2, when Cat-Dream mentions the possibility that he and regular Dream will meet their father.

Issue 3 lets the suspense deepen by opening with a sequence that shows various cosmic-level players--some of Gaiman's own invention and some recognizable to those familiar with DC continuity--who've gathered in anticipation of the eschaton. (Reading of an expansionist culture that seeks to protect all life by spreading an immortal, sentient cancer served as an apt reminder of Gaiman's top-tier authorial prowess.)

The story does return to Dream (which, despite his status as the eponymous hero, has never been a guarantee), who is travelling alongside...Dream...across a planet-sized "bridge" reminiscent of the Old West. After an obligatory meeting with the triple goddess, Dream dispenses some of his patented cruel and unusual punishment to a band of cannibals and gains his second (first?) companion in the form of an orphaned girl called Hope.

Gaiman promised that Sandman Overture would tie up some of the myriad loose ends that were left dangling when the original series ended its run in 1996. So far, the new miniseries has made good on that promise in some pretty major ways--but none so major as the twofold revelation we get in Issue 3 when Hope asks the Sandman to tell her a bedtime story--his best one. The proffered tale worthily justifies Dream's title as Prince of Stories.

On the whole, Sandman Overture #3 continues the superb storytelling of the previous two issues and their legendary predecessor. Only one flaw mars this otherwise flawless gem--the omission of a payoff to the implied promise that Dream would visit his father. Fortunately, it seems likely that this oversight will be resolved in the next issue. Delaying fulfillment of the promise any longer would risk turning suspense into frustration, and Gaiman's much too clever for that.


Drunken Zombie at Gen Con

Proving again that late is better than never, my excellent colleagues at the Drunken Zombie Podcast have posted the recordings we made during our trip to Gen Con. Unlike last year, we didn't find the time to do much recording at the actual convention, but hopefully you'll find the conversations we recorded in the car entertaining.


Arguments People Think Are Logical Fallacies But Aren't

To someone with formal training in logic, the internet can be a strange place indeed. It's analogous to being a pharmacist at an Old West medicine show. People guzzle down snake oil and keep going back for more.

One of the major reasons for these continued errors is many commentators' habit of parroting phrases they've seen online without taking the time to really understand what these concepts mean. As a public service, I'll cover a few of the more commonly misunderstood ideas, including often distorted logical fallacies.

Let's start by defining what "argument" means. Contrary to popular misuse, arguing doesn't mean verbally attacking someone or browbeating a debate opponent into shutting up. An argument is just two or more people trying to reach the truth through dialogue--trading premises back and forth.

Since this post is primarily concerned with logical fallacies and the misidentification thereof, we need to talk about syllogisms--the category of arguments that logical fallacies apply to. A syllogism is a form of deductive argument constructed from two or more premises leading to a conclusion that, if the premises are true and the form is correct, must also be true.

Syllogism Example:
A) All popes are Catholic.
B) Jorge Mario Bergoglio is the Pope.
Therefore, Jorge Mario Bergoglio is Catholic.

The picture of Chewbacca fighting Nazis while riding a giant squirrel illustrates two important points. The first is the need to define logical invalidity. There's a widespread misconception that "invalid" means "untrue". That's not necessarily the case. A syllogism is valid if its conclusion follows from its premises, even if the premises and/or conclusion are completely false. Validity refers only to an argument's form, not its truth value.

Here's a false yet valid argument:
A) Everything that's brown is a bear.
B) My car is brown.
Therefore, my car is a bear.

The conclusion is false because premise A is false, not because the syllogism is structured improperly. The false conclusion still follows from the premises.
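For readers who think better in code, here's a minimal toy sketch of the same point. Everything in it--the little "world" of properties, the `all_a_are_b` helper--is my own invention for illustration, not any real logic library. It shows that the inference rule is truth-preserving (valid) even in a world where one of its premises happens to be false (unsound):

```python
# Toy model: each thing is a set of properties; "All A are B" is checked
# against an entire (hypothetical) domain of things.

def all_a_are_b(world, a, b):
    """True iff every thing with property a also has property b."""
    return all(b in props for props in world.values() if a in props)

# A world in which premise A ("everything brown is a bear") is FALSE:
world = {
    "my_car": {"brown", "car"},
    "smokey": {"brown", "bear"},
}

premise_a = all_a_are_b(world, "brown", "bear")  # False: my_car refutes it
premise_b = "brown" in world["my_car"]           # True: my car is brown

# The FORM is valid regardless: in any world where both premises hold,
# the conclusion must hold too.
sound_world = {"smokey": {"brown", "bear"}}
if all_a_are_b(sound_world, "brown", "bear") and "brown" in sound_world["smokey"]:
    assert "bear" in sound_world["smokey"]  # conclusion follows from the form
```

The point of the sketch: falsifying premise A changes the argument's soundness, but nothing about the `if ... then ...` structure, which is where validity lives.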

Got it? Doesn't matter. I'll discuss some fake logical fallacies anyway.

1. Godwin's Law
The Chewie picture's second lesson. Godwin's Law predicts that the likelihood of Hitler/Nazis being invoked in an internet argument is directly proportional to the duration/intensity of said argument. Comparing someone to Hitler in a combox is hyperbolic and cliched, but it's not necessarily fallacious. The following valid argument shows why:

A) An evil act is evil regardless of who the perpetrator is.
B) Hitler committed genocide.
C) Hitler's genocide was evil.
D) Stalin also committed genocide.
Therefore, Stalin's genocide was evil.

So you can say that anyone who brings up Nazis forfeits the debate, but doing so is a rhetorical--not a dialectical--move.

2. No True Scotsman
No True Scotsman is a rhetorical device that isn't formally fallacious (only informally). It's not a structural flaw in an argument, but an attempt to dodge an unwanted conclusion.

Jack: No writers are Libertarians.
Bob: I'm a writer, and I'm a Libertarian.
Jack: Well, no real writers are Libertarians.

On the other hand, No True Scotsman arguments can still be valid.
A) Everything that flies is a bird.
B) Bats fly.
C) Therefore, bats are birds.
D) But all true birds have feathers.
E) Bats don't have feathers.
F) Therefore, bats aren't birds.

3. Reductio ad infinitum
One effective way to disprove an argument is to show that it necessarily leads to an absurd conclusion. An argument that concludes to an infinite regress is one such absurdity. Nevertheless, the web abounds with claims that Reductio ad infinitum is a logical fallacy.

The reason that some folks call shenanigans on this type of reductio argument is because Aristotle and Aquinas use it in their proofs for God. As Dr. Edward Feser definitively shows, the same people who dismiss Reductio ad infinitum as fallacious commit the very real Straw Man fallacy in the process.

The confusion seems to arise from Bertrand Russell cribbing David Hume's incomplete treatment of classical First Cause arguments. Their Straw Man refutation goes like this: "If everything has a cause, and God caused the universe, what caused God?"

Russell's counter-argument would be valid if it addressed what Aristotle, Aquinas, et al. actually said. Neither the Aristotelians nor the Scholastics ever claimed that "Everything has a cause." That canard is a gross simplification of sophisticated arguments that are really more like, "Everything that exists contingently must receive its being from something that exists necessarily." The accusation of special pleading leveled at the straw man utterly fails against the original argument.

As for why an infinite regress is absurd, consider a train long enough to circle the equator, made up entirely of boxcars with no engine. Is the idea of those cars going anywhere on their own rational? Exactly.

So ends the post. Hopefully it has gone a short way toward elevating the state of online discourse. If you can think of any more non-fallacies, leave a comment below. I may do other posts on this topic.


Gen Con Recap

As promised, here's my Gen Con 2014 after action report.

Over the weekend I joined the Drunken Zombie Podcast in their now traditional excursion to Indianapolis, where they put on a couple of short horror film marathons at Gen Con. Just like last year, I drove. I call my car the Drunken Zombie Mobile Studio because we always record a few segments while in transit. (Look for this year's episode at DZ's site soon.)

Nobody was sure if the trip was going to happen until a week or so before the con. Somehow, we got our personal, work, and money issues under control, but that left us rushing to make travel arrangements at the last minute. The best deal we could get on accommodations was at a hotel twenty-plus minutes away from the con, but it was only fifty bucks a night, so who am I to complain?

Work schedule conflicts meant skipping Thursday to show up on Friday. There was a bit of a mix-up over our badges, but GMHQ sorted it out in time for me to catch a fascinating panel with Dave Wolverton. (If you're like me, you're probably most familiar with his work in the Star Wars Expanded Universe, but he's written everything from novels and short stories to screenplays and video games.) Dave taught Brandon Sanderson and Dan Wells, and when it comes to writing, nothing but world-altering knowledge bombs come forth from his mouth.

I've been attending conventions of various kinds for fifteen years now and have experienced a diminishing return on fun at most of them starting with the second year. Gen Con bucked that trend--hard. I'd deeply enjoyed seeing pros like Brandon Sanderson, Mary Robinette Kowal, and Scott Lynch last year; and since they and many other writers skipped Gen Con this year (some because of Worldcon), I lowered my expectations for the 2014 Writers' Symposium. I couldn't have been more wrong.

In addition to the venerable David Farland, the literary tag team of Jim Butcher and Larry Correia rocked my weekend like a jug of nitroglycerin launched from a trebuchet. These guys supposedly hadn't met before, but their mutual entertainment-first work ethic and aversion to authority makes me suspect that they're twins separated at birth (yes, that kind).

DZ ended up having a great turnout for their mini film fest, despite a few scheduling snags, and every random congoer I talked to expressed a keen interest in indie horror films. I think it bodes well for the Drunken Zombie International Film Festival in November.

Now that I've recovered from my weekend of dizzying nerdery, I can say that I had as much--if not more--fun at Gen Con this year as I did last year. I got career advice from borderline psychotic authors, the chance to try out some cool games, and best of all: three days of mind-blowing fun with some of my best buds. Clearly, my life can only go downhill from here!


Get a Different Table

The winners of the 2014 Hugo Awards were announced yesterday. I heartily congratulate the authors of the winning works. As for the other nominees, take heart. Being nominated actually is an honor--especially considering how the whole process is structured.

Much is being made of the losses suffered by Opera Vita Aeterna and Warbound. Partisans on both sides seem to have forgotten the purpose of Larry Correia's Sad Puppies campaign. The point was to prove that the Hugos aren't an objective standard of literary quality, but a reflection of Worldcon attendees' tastes (since the winners are decided by the membership's popular vote, this conclusion should spark no controversy).

The record clearly shows that Larry never expected to win. His strategy hinged on provoking his detractors so the public could see their vitriolic response. Said reaction was duly provided when he and Day were nominated, and their defeats occasioned a second, definitive round of bad sportsmanship from their critics.

Some fans disappointed by Sunday's results are lobbying for an even bigger Sad Puppies effort next year in an attempt to outnumber the traditional Worldcon membership. As someone who rooted for The Wheel of Time (and a fan of Larry's), my take is: if you're not welcome at the table, don't call more football players over to crowd out the drama club kids who were there first. Get a different table.

I'm aware of arguments that favor invading Worldcon to "save SFF". Though saving science fiction and fantasy is a worthy goal, the Hugos are the wrong battlefield. We need only look to Jordan and Sanderson's loss for proof that the set of "Worldcon voters" is not identical to the set of "people who read SFF". If The Wheel of Time's sales are any indication, the former group is a tiny subset of the latter.

Worldcon is a free association of likeminded SFF fans who give awards to works that satisfy their collective tastes. Let them enjoy the right to continue their tradition. Some see the phenomenon of "message fiction" beating more popular fare in awards contests as an ill omen for the genre. Fear not. The paroxysms shaking legacy publishing in the course of the digital revolution make it unlikely that debut and midlist works--which usually seem to sweep the Hugo Awards--will appear in print for much longer.

Authors who primarily wish to preach a message will always have a digital platform. But SFF is part of the entertainment industry, where the real crown goes to the best entertainers. Activism isn't needed. Market forces have already determined that the future belongs to the Jordans, Sandersons, and Correias of genre publishing.


Going to Gen Con

I'm heading to Indianapolis today as part of Drunken Zombie's Gen Con excursion. We'll be running a mini festival of short horror films by independent filmmakers. There are two showings scheduled: one on Friday evening and one on Saturday afternoon, so please stop by if you're a fan of independent horror who's attending the con.

Besides helping DZ run their event, I plan to spend a lot of time at the Gen Con Writers' Symposium. An unfortunate scheduling conflict with Worldcon means that many major authors will be in London this year. Fortunately, several authors I'm really excited about have decided to remain stateside for Gen Con, including Jim Butcher, Larry Correia, and Dave Wolverton.

I'll be back with a full report after the con.


The Primacy of Speculative Reason

Yeah, yeah, but your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should.
-Dr. Ian Malcolm

Americans' habitual contempt for speculative reason never fails to dismay me, though our great country's myopic fascination with pure practicality is hardly surprising. A brief survey of our history reveals a clear preference for asking, "What should be done/how should we do it?" over "Why should we do this/what does it mean?".

From before the time of Plato, through Aristotle and Aquinas, the chief concern of Western philosophy was to address important questions through dialogue based on appeals to first principles (i.e., speculative reason). This noble tradition's downfall can be traced to the work of a single German philosopher. No, it's not Karl Marx. To pinpoint the moment when speculative reason toppled from its throne, we must go back yet another century to the work of Immanuel Kant.

Frustrated by the perceived lack of stability in classical metaphysics (despite probably having read very little of it), Kant restricted the sphere of rational knowledge to experience and empiricism--despite the fact that doing so requires an appeal to sources of knowledge beyond experience and empiricism. Likewise, he failed to anticipate the catastrophic results of undermining natural law-based ethics while absolutizing personal autonomy.

If you're a typical postmodern Westerner, you probably couldn't care less about anything in the post above (except for the Jurassic Park quote--man, is it amazing how well that movie holds up or what?). You can be certain that I understand your deeply ingrained impatience with history, ontology, and philosophy in general. Rest assured that I'll explain why you should be gravely, intimately concerned with the airy notions that a bunch of Greeks and Germans discussed in the forgotten dark age that gripped the world before last Wednesday.

Exhibit A in my case for speculative reason is this article by Matt Saccaro. I cite this piece as a perfect example of 1) the practical reason-fueled utilitarian bias that dominates American culture and 2) the self-refuting absurdity of that bias. In support of his proposal to cut liberal arts disciplines from college curricula, the author argues that these fields of study serve only to shelter "intellectual cravens" unfit for science, technology, engineering, and mathematics degrees. Removing "soft disciplines" like literature, fine arts, etc. would keep the riffraff out of college and in their rightful place as blue-collar laborers.

Mr. Saccaro's belief that "the realities of the 21st century world make it true" that students whose natural gifts and dispositions lead them to non-STEM vocations have no business in college is less self-evident than he assumes. I could build a counterargument based on declining STEM job security due to the glut of outsourcing and work visas, along with the need for authoritative standards in fields like law, education, and yes, art; but that would mean first accepting the current zeitgeist's false, biased terms. No, all that's needed to show the faults in Saccaro's position is to ask, "How do you know that?"

Setting aside the flagrant hubris of pigeonholing all human beings into either STEM or Intellectual Craven categories (I'll take Mr. Saccaro's identification of skilled tradesmen with college washouts more seriously when he demonstrates enough skill to install water and gas lines for a laundry room without flooding or blowing up his home), I'll point out that asserting the supremacy of STEM fields over the liberal arts involves a value statement. That is, to avoid circularity, arguments from utility must appeal to principles discovered through metaphysics. Practical reason depends on speculative reason.

I couldn't cast a silver bullet more lethal to utilitarian bias than the one Saccaro uses to shoot himself in the foot:
There are two possible fates for the American postsecondary education system. One is for it to maintain its current status as a factory that produces debt-slaves and baristas that can recite Emmanuel Kant’s passages from memory. The other is for Universities and Colleges to become leaner, more-functional institutions that remove all unnecessary coursework, and focus only on what matters.
That whirring sound is Kant spinning in his grave.