Tuesday, April 9, 2013

Some stuff, not really a blog.

Hey there!

No blog today, as I have a pretty serious case of writer's block and about five projects to finish. The overload is making me shut down. It's not pretty. Working eleven hours a day for no pay and an audience of zero isn't the most rewarding thing on the planet, either. You are out there right? Is this thing on? Is the comments section broken? Are you following my Twitter feed? Sharing with your friends? It's cuz I'm a shitty writer, isn't it? LOVE ME DAMMIT!

Told you it wasn't pretty. Back tomorrow, I promise.

For now, enjoy this leaked footage of Far Cry 3: Blood Dragon, starring Michael Biehn. You read that right. Bask in the glory of the Biehn's gravelly, dulcet tones. That is, if you can hear past the player, who seems to think someone, somewhere, is interested in him swearing at the game in heavily accented English.

Monday, April 8, 2013

Irish Coffee, April 8th, 2013


My weekend involved a few regrets, many drinks, and a gaggle of IMAX dinosaurs. This morning will involve a few espresso shots, many crunches, and some personal time with Dishonored, a game I'm finally getting around to. Good luck with your week and see you tomorrow.

I should just write up all of my predictions. That way, when facts finally come along to support them, I can firmly lay claim to my desired title of “gaming soothsayer.” Because predicting the future is groovy, and who the fuck else is going to voluntarily call themselves a “soothsayer?” 

It seems, based on the latest rumors, that the next Xbox will offer two models: one subscription-based and running $300, and a non-subscription flavor clocking in at $500. It also seems that a persistent internet connection is “confirmed,” or as close to such a thing as we can hope for this month.

Now to my “prediction,” which doesn’t relate directly to the coming console cycle, but to the world of tomorrow.


Presently, tablets are in a race to provide power comparable to that of the game console. While it will surely take them a few years, they will eventually catch up, and the landscape for home gaming will change completely. The Steam Box and the Ouya are bringing mobile games to the television, and it takes little more than a Bluetooth-capable controller and an HDMI cable for any compatible device to output in a similar fashion. Devices like the Razer Edge.

The titles that can benefit from this coming together of devices don’t run anything like their AAA console counterparts, save for those running on Razer’s holy-shit-why-ow-my-wallet-expensive gaming tablet, but time will cure that particular affliction. What this will create - and rather unceremoniously - is not the death of the game console, but its inevitable transformation. The game console market will change as well, because if the tablet that replaced your laptop can hook up to your television and run a AAA game, why on Earth would you buy a separate machine to do that job?

The world of mobile gaming and independent publication will create an environment in which not just Sony, Nintendo, and Microsoft create game consoles, but Motorola, LG, and Apple do as well. And how, pray, do these mobile companies provide a yearly or bi-yearly upgrade to the people who can’t afford a $700 smartphone? Contracts. Sign with Verizon for two years, and that $700 Droid is now $200 ($150 with a mail-in rebate! Act now!)

Microsoft’s only real problem, if today’s rumors prove to be true, is that they jumped the gun. This will certainly be the last generation of the traditional game console, but it is still a generation of the traditional game console. A move such as this would make perfect sense once technological convergence has had its way with the gaming world, but we’re not quite there yet. Further, Microsoft is in fact releasing a traditional game console, not the jumped-up, TV/Oculus Rift-ready tablet of tomorrow. The market is simply not ready for this sort of thing yet, nor is Microsoft’s console.

All signs point to the final console cycle being dominated by Sony, bookending their superiority with the PSX and the PS4. If even half of the rumors surrounding Microsoft’s next console prove to be true, I’ve already made my decision.

Also, my 30th birthday is next month, and I’m hoping someone buys me a Vita. Hint hint.

Friday, April 5, 2013

Irish Coffee, April 5th, 2013

Livid. I'm fucking LIVID this morning. I should calm down, but if I drink alone before noon it'll be time to seek help and I hate hospitals. I think I'll just go outside and kick a tree until it falls over. Also, happy Friday! I'll try to get a blog up this weekend, but if I don't manage it, see you on Monday morning!
Mahalo.

Rut roh Raggy. Someone’s been a dipshit.

Meet Adam Orth, creative director at Microsoft, who is probably being unceremoniously spanked as we speak. If there is such a position in Microsoft’s PR department. I can’t imagine why there wouldn’t be. He took to Twitter to espouse the virtues (if I can be so generous) of an always-online console. The conversation in question can be found here. Shortly after this exchange, Orth set his Twitter feed to private.

Microsoft has been very tight-lipped about their next-gen offering, but rumors have been circulating that the coming console (codenamed Durango) will require a persistent internet connection. What this means, for the uninitiated, is that the console will essentially cease to function as a game machine if your internet connection drops or you don’t have internet, period. Think of the problems you might have if you live in a rural area. Something Orth seems to have a problem with as well:

If there’s one thing gamers have proven, it’s that we want control of our content. For evidence of this, look no further than the clusterfuck that was SimCity’s launch. Within a week, the more technically inclined of our ilk had found a way to make an “always online” game function offline, and publicly humiliated the game’s publisher. Uh-oh. EA, you silly little liars. Someone call the spank team from the PR department.

So, naturally, when Orth started saying stupid things in a public forum, the gaming community went into White Knight Mode. Here are just a few of the reactions from Reddit, from whence this story came:

Now, it’s not just that this guy is a towering douchebag (he is) or that game consoles with persistent internet connections are a bad idea (they are), it’s that this is yet another demonstration of the ever-widening disconnect between the corporate structure of game publication and the average gamer. Can they not see the signs? Do they not understand that the AAA title is rapidly becoming a relic, and that gamers can and will take their money to independent markets if they aren’t treated with at least some basic respect? This isn’t the movie industry, and companies like EA and Microsoft need to realize that. Video games and their distribution platforms are far too mercurial an entity to be controlled with an iron grip.

Additionally, in regard to Orth’s tweet that he should be able to say what he wants on Twitter: You have the words “Microsoft Creative Director” in your Twitter description, you fucking child. You are therefore responsible for the stupid shit that oozes out of your brain. If you want to spout this tripe at the populace, create a different Twitter account that doesn’t have your name or the name of your employer on it and that doesn’t list your position. See if anyone wants to listen to your bullshit then.

Sorry, I went a little off the rails, there. It just makes me unbelievably fucking angry when rich, bitch-made little cocksuckers like Orth take offense to the idea that someone might be poor or live in a rural area. I lived in a rural area for most of my life, and I was poor, and if I still lived on Daufuskie Island, I wouldn’t be able to own a console with a persistent internet connection, if I could even afford the thing in the first place. Why don’t you go somewhere in rural America and try saying this shit to someone’s face? See how well that goes for you, you impotent little shit.

Welcome to the real world, Sweet Billy. You’re going to be pantsed and stuffed in a locker.

Thursday, April 4, 2013

No blog today

Hello all! ...All five of you. Comment or something; I feel so lonely here.

No blog today, as I'm putting the finishing touches on an article about empathy in video games. Shouldn't be a bad read, and should be finished this afternoon. I'm also editing that particular piece, not writing it. An important distinction, as I gouged the tip of my index finger with a six-inch German chef's knife yesterday. It's my kitchen Excalibur! How could it betray me? Anyway, typing hurts.

I'll post an update when my article's online. Full blog tomorrow!

Wednesday, April 3, 2013

Irish Coffee, April 3rd, 2013


Oh boy! A blog that actually makes sense and has a point! I'm going to start calling this "Wednesday for Thought." Or something clever. Leave me alone, I'm finishing up an article for submission to gamesbeat.com, putting together my first video rant, and trying to keep my cat from going completely mental (I don't think she's had enough catnip this week.) Once more unto the breach. See you tomorrow!

I have a problem, and it’s all TED’s fault. Every time I come across a link to a TED talk, I go down a six-hour-deep rabbit hole of educational videos. When I emerge, I swear I’m an expert in cognitive psychology, astrophysics, or international politics. Someone give me a tweed jacket with leather elbow patches, I’m ready for my students. I might think I’m in A Beautiful Mind, but it’s more like the episode of Futurama where Fry thinks he’s a robot.

My most recent adventure to the bowels of TED.com did lead me to something fairly interesting, however.


If you don’t have seventeen minutes to spend watching that video, I’ll give you the gist of it: Cognitive researcher Daphne Bavelier runs down the positive effects of action video games on the brain, mostly as they relate to vision and multitasking. After espousing the benefits of moderate action game use, she goes on to say that she’d like to develop a game that was not only fun to play, but that embedded the beneficial qualities she’d just finished covering.

And this got me thinking about games that haven’t just improved my ability to track multiple moving objects or bolstered my reaction time, but that have really taught me something subconsciously. One experience in particular stood out: Assassin’s Creed II.

During college, I had the good fortune of traveling around Europe for a semester on a literary tour. About a month into the trip, I found myself in Venice. After getting settled in, I had some time to myself and decided to go for a wander. I didn’t know the city, and wanted to grab a beer at a local joint and soak in the atmosphere, as is my wont when I’m in a new place.

And wander I did. I was completely lost. Compounding my trouble, it was three days before the start of Carnevale di Venezia, and the streets weren’t exactly empty. So I kept turning, down one narrow alley to a dead end, back again, and down another narrow alley. Lost. That is, until I rounded a corner and found myself in Piazza San Marco.

Purty, ain't it?

Almost immediately, the entirety of ACII came flooding back, and I knew exactly where I was. Even without the benefit of Ezio’s parkour skills, I managed to find my way from Piazza San Marco to the Rialto Bridge, where I promptly wandered into a smoky bar full of locals, cheap beer, and salmon mousse. Memory synced.

It bothers me that more games haven’t taught me something as valuable as even my basic geographical understanding of Venice. Games rarely teach me much of anything. In some sort of ethereal way I might come to better understand character development or people in general, or I might find something like Bioshock Infinite’s city of Columbia interesting enough that I spend some time studying American exceptionalism, but games rarely teach me anything in the way that Assassin’s Creed II taught me the layout of Venice.

And perhaps that’s what games need. Not to suddenly bear the burden of the classroom, but to own up to a social responsibility inherent in their existence as an artistic medium. If games can improve the lives of those who play them, isn’t it the responsibility of developers to do so? If they already improve the cognitive functions mentioned in Bavelier’s TED talk, why can’t they take it a step further and give the end user not just an experience, but some tangible knowledge as well? 

Assassin’s Creed II was far from an “educational video game” in the derogatory sense, but it certainly educated me.

Tuesday, April 2, 2013

Irish Coffee, April 2nd, 2013


Something has my dander up this morning, and I like it. Oh, and I mention BBC's Sherlock in this morning's post, and I can't recommend it highly enough. All six ninety-minute episodes are on Netflix. You have no excuse.

Games are stupid. Or at least they assume their audience is stupid. I suppose one hand washes the other in this case. There is an ever-shrinking bubble of technical limitations placed on the modern game designer, yet the under-utilization of modern techniques has left us playing the same games in the same way for well over a decade. It’s something like buying $100 worth of Kobe beef to make a chicken-fried steak. A complete waste of talent.

I recently finished watching BBC’s Sherlock series on Netflix, and if anything can be taken away from that show, it’s that the modern television audience is quite a bit smarter than it was when CSI debuted thirteen years ago. Even if the audience isn’t any smarter, show creators are clearly assuming their audience is more intelligent and acting accordingly. While there is a glut of stupid entertainment out there for the shoveling (I’m looking at you, American reality TV), there is certainly a market for “smarter” shows like Sherlock. Where are my smarter games?

Someone call Baker Street.

Before you jump all over me in an effort to point out that there are plenty of “smart” games, let me just say: Yes, but they’re indies, and if I play one more quirky side-scroller in an effort to escape the Michael Bayification of the modern AAA title, I’m going to beat a hipster to death with their own pomade can. Even when taking indies into account, there aren’t many games that assume the player is smart. Hand-holding tutorial sequences, hyper-obvious objective markers, plots that (even when complex) are explained to us like we’re five. Sure, there are plenty of games that are a little clever, a little complex, but nothing on par with the complex, intelligent storytelling in other mediums. Where's our Gatsby? Our Citizen Kane? Our Sherlock Holmes? At this point I’ll even settle for a Fight Club.

Now this could be chalked up to difficulty of conveyance for the video game. Game designers can assume their audience is intelligent all they like, but if the player happens to be a chimp that got a hold of a game controller or my girlfriend on tequila night, no amount of complexity is going to keep the avatar from jumping up and down in a corner and spinning around in a circle. But I don’t know that this is much of an excuse anymore. The entertainment world seems to care less and less about the lowest common denominator. They’re taken care of: just strap on TLC or Bravo like a feed bag.

The top-tier games (as far as budget is concerned) are running anywhere from one hundred to three hundred million dollars. What we’re being told, essentially, is that nowhere in that budget is there room to develop new and interesting ways for games to tell a story. I call bullshit. Even when considering corporate interests such as broad appeal and profitability, there is still plenty of room to improve the delivery mechanism for interactive storytelling.

And yet, there seems to be a dearth of any real effort to do so.

How is it possible, with all the money thrown into game creation and all the intelligent people developing games, that the closest we can get to a Sherlockian detective tale is L.A. Noire? A game so poorly executed that any review over a 7.0 must have been the result of a reviewer taking pity on the careers of the D-list actors called in to have their heads digitized. “Look! Over here! A clue! Press ‘A!’ It’s the green button!”

We deserve a better class of video game, dammit. Not because we’re all so smart, but because game creators should stop assuming that we aren’t.

I don’t want my games to come from a feed bag, I want Kobe, dammit.

Monday, April 1, 2013

Irish Coffee, April 1st, 2013


Not funny, kind of lame, and very short. Welcome to Monday's blog! No, this isn't an April Fool's prank. I don't think I have the faculties for such a thing at the moment. Try as I might, I can never write a decent blog on Monday. Funny seems to leave my fingers, and each word is a mountain to climb. Though today's issues might have more to do with the ten-or-so pounds of Easter Dinner's baked mac and cheese presently residing in my distended gut. Fat-glazed fog. Least I wrote this fucker. Oh, and mild Tomb Raider spoiler ahead. See you tomorrow. 

I just finished playing Bioshock Infinite, but I want to talk about Tomb Raider, a game I finished just minutes before stepping onto Irrational’s floating city. I tend to feel a little empty inside when I finish a particularly engaging single-player campaign, and while this feeling was equally pronounced in both Tomb Raider and Bioshock Infinite, something bothered me quite a bit more about Tomb Raider's departure.

Been one hell of a week.

Tomb Raider’s story certainly isn’t more engaging than Bioshock Infinite’s, and I don’t know that the former has the smarts of the latter. I say “don’t know” because I don’t think even Irrational Games has a firm grasp of the paradoxical clusterfuck that constitutes the closing minutes of their latest offering. But Tomb Raider suffers from an often-used, and unfortunately necessary, feature shared by many action games: after the end credits roll, you can once again enter the game world.

Ostensibly, this feature exists for the collection of items you might have missed and for general achievement hunting, but by its very nature it casts a cold pall over the game you’ve just finished playing.

No one wants a game world to feel empty. The entire goal of game design (from indie to AAA monster) is to make a world feel populated and alive. As mentioned, I often feel a little empty inside when I finish a particularly engaging title, and I jump at the opportunity to go back to the game world to fill the void. All that I find when I utilize this feature is a shadow of the world I lived in minutes before. Never more empty, never more devoid of life, my avatar trapped in a purgatory it can never escape. There is no goal marker, there is no ending, there are no battles to fight. There is no “life” for the character anymore.

The unfortunate side effect: If I choose to stop playing the game for an extended period of time after a brief visit to the netherverse, the netherverse becomes the defining memory of that game. Lara didn’t escape the island with her friends, she’s still trapped there. Hunting for a finite supply of collectibles in an infinite world. Fucking depressing, really.

And I’m more bothered by the departure of Tomb Raider for this reason. Its characters didn’t mean as much to me as those in Infinite, nor did the plot hold the depth of Infinite's. But seeing Lara running around a lifeless game world left a bad taste in my mouth. The empty island almost begging me to start a new game.

But this is a personal problem, and not a broader game industry issue. Not much more can be expected from me on a Monday morning. I have completed two pretty fantastic AAA titles over the past week, each one draining my emotional reserves with its passing. I had a damn good time with both, but I’m going to need some more time with Tomb Raider before I can move on to a third game.

I’ll be funny tomorrow, I swear.