And so Apple’s attempt to convert all my electronic gadgets into sleek, glossy slabs of sci-fi magic continues, as I cave in and finally buy an iPad.
I no longer have a laptop, so my only two options for doing computery things were my desktop PC and my iPod Touch. As you can imagine, the former isn’t very mobile and the latter has a variety of limitations, so I needed something in between.
I did debate getting a new laptop, or even a notebook, but to a certain extent I still ran into the mobility issue. You see, I do a helluva lot of reading. Not just ebooks, but magazines, blogs, web sites, etc. And being fairly active on Facebook, Google+ and Twitter, I like to share and discuss that content as much as possible. And, more often than not, most of the opportunities I get for reading, sharing, and discussing, tend to occur when I’m not in front of my desktop computer.
The iPod Touch is very good at handling all of this, and its retina display is pretty nifty, but the display screen is still relatively small and typing anything longer than a tweet or quick email can become a chore. Given the sheer amount of reading I like to do, a laptop simply wouldn’t have been a viable option; I’d have quickly fallen back to using the iPod Touch to read stuff. So it looked like a tablet device was the smartest choice.
It didn’t take me too long to decide which brand to go for. There are some nice Android-based tablets out there, but having lived with an Android smartphone for the past year or so, and run into numerous problems with it, I decided to pass up that corner of the marketplace for now. The nook Tablet is one of the best devices in its class, but it’s not really powerful enough to run the sort of apps I need. So that only really left one contender, and given how useful I’ve found my iPod Touch, the iPad remained the only logical choice.
At this point I’ve only had it about 24 hours, yet I already feel perfectly at home with it. Well, it does have the same operating system as the iPod Touch, so that’s pretty much a given. And most of the apps work the same way.
I was a little disappointed to find that Reeder, my news reader of choice on the iPod, behaved a lot differently on the iPad. I subscribe to more than 1,300 blogs via Google Reader (more about that in a future blog post), so I’m very particular about news reader apps. I like to be able to punch up a label and see a list of all the news feeds with that label. The iPod Touch version of Reeder allowed you to do that, but the iPad version will just group articles by feed or chronologically. With some labels containing up to 50 blogs, that would mean a lot of scrolling to find what I’m looking for, so being able to punch up one of those blogs as quickly as possible is quite important. (Edit: Reeder kinda gives you a label/tag view, but it treats them like photo gallery images, where each label is represented as a thumbnail which you can pinch open or closed. It looks nice visually but isn’t very practical for browsing large numbers of feeds.)
Thankfully, a bit of digging around the interwebs led me to a wonderful app, exclusive to the iPad, called Mr. Reader. It does everything that the iPod version of Reeder does, but has a lot more bells and whistles. If they ever bring out an iPod version I’ll most likely be switching to it on that platform too.
Given how unsatisfied I’ve been with my Android phone, once Verizon open the gates and let me upgrade, I’ll more than likely be switching to an iPhone. See how easily I’ve slipped into Apple’s grasp? But while they may even talk me into getting a MacBook in the dim, distant future, I’m going to draw the line at desktop computing. I’ve been a hardcore PC gamer for almost 20 years now, so unless something happens to kill off the PC gaming industry, my desktop allegiance will remain with Windows.
And that’s all it boils down to really—picking the most suitable tool for the job. Brand loyalty’s all well and good, but at the end of the day I’ll go for whichever device lets me do what I need to do as quickly and efficiently as possible. You’ve won this round, Apple. Five years down the line? All bets are off.
If you’re anything like me, not only do you have access to more than one gaming platform, but you also like to take advantage of sales, deals and other such discountery in order to bolster your video game collection.
This is good. We like games. We like good games. We like good, cheap games. Unfortunately, the planet we live on isn’t terribly sporting and refuses to increase the number of hours available to us on any given day, so at some point that collection becomes just a little too unwieldy. We can’t possibly play everything through to completion, so choices need to be made, priorities readjusted, lovers spurned, etc.
In the past I’ve cultivated a rather nasty habit of starting a new game, getting anywhere from 10-90% into it, then gleefully abandoning it once something newer and shinier comes along. In short, I’m a gaming polygamist. Which I guess is a fancy-pants way of saying that I’m easily distracted.
A casual glance at the PlayStation 3 games on my shelf reveals a startling number of titles that have long since been abandoned, simply because I bought something else (probably many something elses) before I finished them. Deus Ex: Human Revolution, Red Dead Redemption, LA Noire, Fallout: New Vegas. They all stare accusingly back at me, demanding to know how they wronged me. A similar glance at the (gulp) 230+ games on my Steam account earns similar disapproving looks from those I grabbed during a Steam sale and have barely touched since.
Why do so many games in my collection remain unfinished? Has my attention span shortened over the years? Am I buying more games than I have spare time to play? Or are developers at fault for failing to deliver a sustainable gaming experience?
I pondered these questions a couple of months ago after I noticed just how many games I’d purchased since Thanksgiving. To that end, I made a conscious effort (I guess you could call it a belated New Year’s Resolution) to see more games through to completion, or at the very least play through one game at a time. I started out well, playing through the PC version of Alan Wake from start to finish in less than a week, before moving onto Uncharted 3: Drake’s Deception and then spending a week or so with Journey.
I’ve currently clocked up about 16 hours with STALKER: Call of Pripyat. I purchased it when it first came out, but the desktop PC I had at the time wasn’t quite up to snuff, so it wasn’t until I put a new system together in February that I finally found myself with the opportunity to go back to it. I’m very much near the end of that game now, although I have been a little bit naughty and have started to overlap with a new play through of LA Noire. In fact, Rockstar’s 1940s detective epic has become something of a bedtime ritual for me, where I’m able to play through one new case each night. The game’s episodic nature lends itself quite well to shorter, concentrated bursts of gameplay, which contrasts nicely with Call of Pripyat’s more voracious time consumption.
Even if I do manage to stick with more titles through to the end, or play fewer games simultaneously, I don’t think I’ll be able to completely curtail my bargain hunting tendencies. But I am being mindful of reducing the quantity of sale items I end up purchasing. Sometimes it’s too tempting to grab a title simply because it’s down 75% to $4.99 rather than because I actually have a burning desire to play that game any time soon. So now my bargain hunting will be conducted with an eye to choosing titles I’m more likely to play.
Thankfully I’m not alone in all this madness. Many of my online peers report the same problem—too many games, too little time to play them. I guess we should count our blessings that our favorite hobby consistently delivers products of an exceptionally high quality, otherwise this problem wouldn’t exist in the first place.
More shitty games, please, developers? It’s the only way some of us may ever hope to catch up with our backlog.
I finally got round to seeing The Hunger Games last week.
It’s not the kind of movie I’d normally run out to see at the theater, but having been subjected to the TV spot trailer in excess of 1800 times (yes, I worked it out) within the last month, thanks to my day job, I really needed to get the damn movie out of my system.
Verdict? It was okay.
I’d previously read about half the novel and skimmed the rest. I found the story to be derivative, but Suzanne Collins had a fairly engaging prose style that kept me going. The movie itself had a vague TV movie whiff about it, although I’d long since resigned myself to the notion of a watered down interpretation of the novel once I heard that Gary Ross (previously responsible for Seabiscuit and Pleasantville) was involved.
Jennifer Lawrence did a remarkable job with the material at hand, but losing Katniss’s first-person narration from the novel ensured that we only ever got to see a very superficially-rendered Katniss. The odd blub or two aside, the movie version of Katniss never really seemed emotionally connected to the unfolding drama. In the novel, she’s very much plunged into inner turmoil and conflict, all of which is deftly handled, but we never really see that translated to the big screen.
I’ll avoid drawing the obvious comparisons with the much meatier Battle Royale, but one thing The Hunger Games did remind me of is the post-apocalyptic young adult fiction written by John Christopher throughout the sixties and seventies, particularly The Guardians and Wild Jack. While the plots of those books have very little to do with The Hunger Games, the broader themes at play are very similar: a world divided by the haves and have-nots in the wake of some post-apocalyptic tragedy; young adult protagonists making the transition from one half of that world to another, inciting the underdogs to overthrow the technological elite, etc.
Now, given that this is primarily a video game blog, I’d be remiss if I didn’t address the obvious question: why isn’t there a video game tie-in?
Well, apparently Lionsgate are keen to expand the franchise into video game territory. Not only does it make financial sense (it’s unusual for a $300M grossing movie to not have a video game tie-in), but it makes logistical sense too, given that the eponymous Hunger Games event within the book/movie is essentially one huge deathmatch.
Of course, anyone familiar with the source material will automatically recognize the biggest hurdle facing any developer brave enough to tackle this property: how do you present a game in which a 12-year-old child will inevitably kill, or be killed by, another child? You can already hear a tsunami of outcry and indignation from certain interest groups beginning to well up at the merest hint of such a possibility. So what’s a game developer to do?
To ignore the Hunger Games event itself would be nuts, because every narrative thread and character arc converges there. It’s the primary focus of the novel (and movie), and sidestepping the event just to avoid a controversy wouldn’t be doing the source material justice. So assuming the game is about the Hunger Games event, how do you depict teen-on-teen violence without earning the game an M rating (or equivalent) and incurring the wrath of the Perpetually Indignant?
Should the game pull back, mere milliseconds from the moment of a kill, and depict things implicitly rather than explicitly? Call of Duty and Battlefield fans wouldn’t be happy. They’re quite used to shoving shotguns up one another’s assholes and dancing in the post-trigger-pull shower of viscera that follows. People get shot or sliced and diced in The Hunger Games just like any other deathmatch game, but how far do you go to portray that?
I’m asking a lot of questions here but not really giving any answers. The Hunger Games event isn’t something that’s supposed to be entertaining. While it’s a nationwide TV event, it’s only the pampered power elite who gain any sort of pleasure from it. The friends and families of the Tributes who watch on the giant screens erected in their respective Districts simply watch in numbed silence, praying that their sons and daughters make it out alive. Given that any video game’s primary intent is to entertain those who play it, it will be interesting to see how that discrepancy is addressed.
Any prospective Hunger Games video game should be about survival, compassion and constantly require the player to question their role in the event and the society that allows it to continue, year after year. Only once that strong, contextual backdrop is in place can the developer start to explore just how far they’re willing to take the violence. We’ll just have to wait and see if anyone out there is up to this task.
You know what I like most about STALKER: Call of Pripyat? It doesn’t hold your hand.
I think we’ve become rather too accustomed to hand-holding within the last ten years or so. These days you’d be hard-pressed to find a game that doesn’t want to walk you through the first hour or two, telling you what all the buttons do, introducing you to an array of gameplay mechanics, and doing its best to shoehorn in some exposition (or even keep the plot moving!) as it does so. Some games do this a little more invisibly than others, but that hand-holding phase is still there.
Call of Pripyat grabs the comfortable crutch of the tutorial phase and snaps it in two over its knee. You get a brief FMV sequence that serves as the intro to the game before you’re thrust out into the world with six vague objectives to get you started; otherwise you’re left to your own devices. Granted, there’s a major plot line you can follow, if you decide to focus on the yellow missions, but there’s a huge array of optional side missions that really allow you to see everything the game has to offer.
Regardless of whether you doggedly stick to the main narrative or embark upon an intricate web of tangents, one thing quickly becomes abundantly clear: you’ll need resources in order to survive. Once again, the game doesn’t really give you any pointers as to where to locate these resources. It’s up to you to work out where they can be found and how much risk you’re willing to take on in order to claim your reward. You won’t get very far in the game unless you manage the essentials: ammo, food and medicine.
Call of Pripyat isn’t one of those games that scatters these resources around like cheap candy. You need to find them, work for them, trade them. For the first five or six hours of the game, you’ll be hanging onto every bullet for dear life, painfully aware that every bullet that doesn’t hit its target is yet another resource being thrown down the drain. Carelessly wander into an area of high radiation, without adequate protection, and you’ll be cursing yourself when you have to use one of only two or three notoriously expensive anti-radiation kits. Similarly, if you take a bullet and start bleeding, you’ll wish you exercised more caution before you needlessly exposed yourself to enemy fire, reluctantly using up one of your rolls of bandages or precious medical kits.
But the game is fair. After a while you’ll start hammering out your own particular trade routes between locations, know where certain resources can be found, how to stockpile the useful items and sell off the ones you’ll never use. You’ll become more adept with your weapons of choice, learn to make every bullet count, and know which risks are worth taking. Eventually, those narrative-progressing missions that previously seemed unattainable will fall within your reach, taking you to a new, foreign region of the map in which everything you’ve previously learned may not necessarily stand you in adequate stead.
So here I am, edging around the perimeter of the Jupiter Plant on a wet, grey and windswept morning. Thunder rumbles ominously overhead as I edge round the crumbling concrete facade of this apparently derelict building, knowing only that somewhere inside is a clue about where I should head next if I want to remain on the trail of the military convoy that disappeared somewhere into the heart of the Zone. I’m well prepared: I have adequate medical supplies, enough ammo for my shotgun, scoped rifle and back-up pistol to see me through a major skirmish or two. All my equipment has been repaired and upgraded with a few enhancements. I have enough drugs and anti-radiation pills to see me through one or two minor mishaps, and plenty of food should hunger strike. Anything else I need will have to be procured along the way; I may be lucky enough to stumble upon another Stalker’s cache, or I may have to resort to salvaging what I can from dead bodies.
Other than the booming thunder and lashing rain, the outer perimeter of the Jupiter plant remains eerily quiet. I peer around the corner of a pillar, hoping to gain some sense of what lies before me. A flash of lightning and… was that something moving in the wild grass that has sprouted from the cracked asphalt of the courtyard? I risk another glance. There they are: two wild dogs wandering across the abandoned concrete plateau. I remain still, aware that any sudden movement could alert them to my presence, watching them as far as I can without moving from my vantage point, attempting to make a mental note of where they could be heading. I may need to make a dash across that courtyard should an emergency arise.
But for now I’m heading into the building via an open doorway. Darkness looms within. I turn on my flashlight, switch out my rifle for the shotgun and cautiously proceed inside. I can only pray that I’m adequately prepared for whatever I find within.
Oh look, some vague rumors about the technical specs of the next Xbox console.
As expected, the internets are now all a-twitter (or a-plussing, if that’s your thing) about what this could all possibly mean for the fate of Microsoft’s next big hope. Technophiles have been quick to point out how laughably underpowered the new console’s proposed video card is, at least compared to what’s currently available to desktop PC gamers. Naturally, they’re overlooking the fact that the Xbox 360’s Xenos video card was comparably underpowered back in 2005, but we don’t want that to get in the way of a good whine.
The level of negativity this rumor is attracting does beg one question though: what was everyone expecting?
Ever since the Sony PlayStation and Sega Saturn arrived on the scene — long enough ago to make me feel depressed about how old I’m getting — console manufacturers have been playing a never-ending game of catch-up with PC hardware manufacturers. As each new console rolled off the production line, PC gamers would be there waiting to point, laugh, and compare tech spec e-peens. But in recent years that gap has been closing, at least where visuals are concerned. And, let’s face it, that’s the only yardstick most people use when it comes to assessing just how “advanced” a console is. What’s its maximum resolution? What shaders does it have? How many magical video card things does it do per second? Will the next Final Fantasy game finally look like the FMV of my dreams, dammit?
Not that many PC gamers are in a position to claim a significant technical advantage. The average PC gamer isn’t even playing games in 1080p right now. In fact, fewer than 10% of gamers who use Steam are playing in 1920 x 1080 or above. When the next generation Wii, Xbox and PlayStation arrive on the scene, every console gamer will be playing in 1080p. Now who’s playing catch-up?
Relax, guys. It all boils down to this: PC gamers and console gamers are going to be on pretty level footing when it comes to how pretty everything looks. Sure, those ten percenters will have insane resolutions and enough anti-aliasing to turn a rough day around — and they’ll remind us of this fact every damn opportunity they get — but we’re all one big happy family now.
Now that we can stop worrying about how many polygons the neighbors are throwing around every second, let’s turn our attention to some of the things we’ve been neglecting lately, such as AI, physics, sound propagation, narrative/mechanical cohesion, etc. Working on those areas is what’s going to deliver more interesting gaming experiences, not the ability to count a space marine’s bountiful supply of nasal hairs.