It's rumored that if you give a mouse a cookie, he'll probably ask you for a glass of milk. They also say that if you give a geek a $250 gift card, he's bound to ask for some FLAC audio to complement the mountain of Newegg boxes on the doorstep. Okay, maybe they don't say that just yet—I mean, who gives a mouse a cookie anyway? That's just asking for a rabies shot.
Recently, I was presented with a Newegg gift card and decided to take the unexpected opportunity to upgrade my digital universe. After maxing out the RAM in my ThinkPad and investing in a high-speed CF card and USB reader for the DSLR, a large chunk of my windfall remained for the pièce de résistance: some respectable headphones.
Until now, the most I'd ever spent on a head-mounted audio device was about $30 for some Sony in-ear buds. For the past couple of years, I've been rocking a pair of inexpensive Sennheiser HD 201s, which deal-hunters can commonly score for less than an Andrew Jackson. While the 201s are a steal for their asking price, they also represented the weakest link in my computer's audio chain. My Audioengine A5 speakers happily project the clear sound produced by my Asus Xonar DX sound card, but the 201s seemingly shrug and exclaim, "meh."
With a $200 budget, I set out to find the perfect headphones to replace ambivalence with goosebumps. I didn't expect the final decision to be so hard to make.
My first stop was Best Buy—what was that groan for? While I never had any intention of making my final purchase from the retail giant, most stores have several high-end demo models available to sample. Online reviews and customer feedback are great resources, but actually hearing headphones first-hand is important when choosing your audio poison.
The first headphones I grabbed off the shelf were a pair of crimson Beats by Dre Solos. I see kids wearing the Solos all over the various airports I frequent, and I figured they must be alright. The verdict took less than five seconds: utter crap. The Solos look cool, but I'm convinced my 201s sound better. For grins, I placed the $300 Beats Studio cans over my ears for comparison. They sounded infinitely better than the red-banded Solos but were well beyond my budget. Frankly, they didn't have the sound signature I was looking for, either.
As I moved down the rack, testing out cans from the likes of Sony, Bose, and Klipsch, I found qualities in each that I liked. Nothing really got me excited until I reached the end of the line, where a pair of unassuming Sennheiser HD 380 Pros awaited. As I flipped through the demo music, the 380s consistently produced the sound signature I was seeking: tight bass that wasn't overwhelming and boomy, with excellent mid and high frequency response. The sound was just right, but after wearing the Sennheisers for only a few minutes, my head was already feeling the squeeze of the clamp-like headband.
I walked out of the store empty-handed that day, but the trip had not been a complete failure. I knew that I wanted to stick with the Sennheiser sound; it was just a matter of finding headphones that didn't also pull double duty as a workbench vise. Many Google searches and countless customer reviews later, I settled on Sennheiser's HD 558s and sealed the deal for $180.
The Sennheiser HD 558s are a direct successor to the 555s we use for listening tests here at TR. They have a circumaural design that completely surrounds the ear instead of resting on it. The open-back cans aren't sealed against sound leakage, producing sound that feels more like it's surrounding you than being driven into your skull. People in the vicinity will be able to hear your music, though. You probably don't want to bring open-back headphones into the office unless you're at war with a neighboring cube.
Unlike on the Sennheiser HD 380 Pros, the sturdy plastic headband on the 558s doesn't try to narrow your noggin. On my head, the 558s are just tight enough to feel secure but loose enough to be worn comfortably for hours. Adding to the comfort, the ear cups are covered in a plush velour material; they surround the ears instead of mashing them into the side of your head.
For most users, the 1/4" stereo plug is going to be inconvenient. I have to install the included 3.5-mm adapter to use the 558s with any of my audio gear. Apart from A/V receivers and high-end sound cards, the vast majority of today's devices feature smaller 3.5-mm audio ports. In my opinion, it would make more sense to have a 3.5-mm plug with an adapter that steps up to a 1/4" jack.
Trailing the connector is a detachable 10-foot cord that's free of coils. A quick twist to the left and a downward tug allows the cable to break free from the left can. A removable cable lets users roll over the cord with their office chairs without ruining the headphones as a whole. It also gives Sennheiser the opportunity to sell different versions of the cord as it sees fit.
When it comes to audio quality, I couldn't be happier with the 558s. They sound similar to the Sennheiser HD 380 Pros but have a little less bass and crisper highs. The audio is clear enough that I've been playing a new game called "guess the bitrate" when listening to my MP3 collection. I've even noticed artifacts in tracks encoded with bitrates as high as 320kbps.
As I ramp up the volume, I can hear the different instruments get more pronounced, as if I were walking toward the stage. With lesser headphones, like the 201s I replaced, cranking the volume makes the overall mix louder but not necessarily any clearer. This only works up to a point; as you approach the 112 dB ceiling of the 558s, you're really just increasing the pain, not the clarity.
After I was done amusing myself with various MP3 bitrates and deafening volume tests, I decided to see how my new headphones fared with different sources. For an impromptu test, I queued Girl Talk's All Day album in FLAC format on both my laptop and my Xonar-equipped desktop. I hit play on both devices simultaneously and switched between them a number of times to appreciate the difference. The Xonar DX in my desktop system produced a noticeably brighter and crisper rendition of the music, while the integrated audio of my notebook sounded a little murky in comparison. Despite this fact, I still maintain that money is better spent on upgrading one's speakers or headphones before adding a discrete sound card. It's difficult to hear much of a difference when your speakers sound muddy already.
I have to admit, I'm somewhat smitten with Sennheiser's HD 558 headphones and fear I may have taken a step down the expensive path of the audiophile. You don't realize what you're missing until you've heard your music through great headphones or speakers. As much as I adore my Audioengine A5 speakers, the 558s provide a much more complete soundscape to my ears. Whereas the speakers are tasked with filling the entire room, the cans focus on what matters most: my eardrums.
I'm still getting goosebumps and hearing new things during the piano-and-guitar finale of Atreyu's Lip Gloss and Black, even though I've listened to the song at least a dozen times through the 558s. That kind of music rediscovery is a testament to the quality of the headphones.
During my shopping adventure, I discovered that people have more than a few opinions about their preferred audio gear. I'd love to hear yours.

Why do mobile displays get all the love?
With few exceptions, it seems, the gadgets hogging the media limelight today tend to be of the fingerprint-collecting variety. Tablets, phones, tablet-phones, e-readers, and anything else manufacturers can saddle with a media decoding chip and a touchscreen saturate the headlines on a daily basis. Much like during the netbook onslaught of yesteryear, the portion of my brain devoted to caring about such things is reaching full capacity. There is, however, one aspect of these shiny slabs that continues to pique my interest: the displays.
In this regard, mobile devices have become the daddy's girls of the electronics world. Where PCs have to pinch pennies to score a rusty set of wheels when they reach legal driving age, tablets and phones wake up on their 16th birthday with a brand new Mustang convertible in the driveway. In PC land, aspect ratios keep getting squished, cheap panels abound, and accurate color reproduction and contrast are relegated to budget-busting models. Looking at laptops, the situation is even more dire. Apparently, we all missed the board meeting where it was decided that 1366x768 ought to be enough for anybody.
Smaller mobile devices, on the other hand, are going through a display renaissance of sorts. Manufacturers are doing everything they can to increase pixel counts, and vivid panels based on IPS and OLED technology permeate the market. Apple's new iPad introduced the world to a 2048x1536 resolution spread across 9.7" of IPS-fueled liquid crystal goodness, and many Android handsets now stretch 720p resolutions across 4" screens. Despite these advances, here I sit, cruising at 35,000 feet, banging out a blog post on a 12.1" screen with a mere 25% of the new iPad's pixel count.
The average LCD monitor has slowly but surely gained extra color-changing dots over the years. However, the upticks have been somewhat less than dramatic. We've basically just stretched the field of 1024x768 and 1280x1024 monitors into squat, 16:9 panels with 1366x768 and 1920x1080 resolutions. Screen sizes have increased to accommodate the additional pixels, but the number of pixels per inch (PPI) has remained largely unchanged.
In stark contrast, the humble smartphone screen has seen its pixel count skyrocket from 320x240 to as much as 1280x720. That's up to a 12X increase in the number of pixels crammed into displays that have grown perhaps only an inch or two larger. As a result, smartphone screens have managed to ramp up not only their resolution, but also their PPI.
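Pixel density is easy to compute from a panel's resolution and diagonal size, if you want to check the numbers yourself. Here's a quick Python sketch; the 15.6" diagonal for the budget laptop panel is my own assumption for illustration, not a figure from the post:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: length of the pixel diagonal divided by the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1366, 768, 15.6)))   # a typical budget laptop panel: about 100 PPI
print(round(ppi(2048, 1536, 9.7)))   # the new iPad's display: about 264 PPI
```

The gap between those two figures is the whole story in one pair of numbers.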
Why does increasing the pixel density matter? Because adding more pixels per unit area makes everything look better, from pictures to icons to text. High-PPI displays are particularly good at smoothing out the jagged edges of fonts, resulting in crisper text that's easier to read. This added fidelity is particularly beneficial to handheld devices equipped with relatively small screens, so it's no wonder smartphones have been the first consumer devices to feature higher pixel densities. But why are smartphone and tablet panels advancing at such a breakneck pace while I can still count the individual pixels on my laptop?
If I were a betting man, I'd drop my hard-earned dough on the theory that, at a given density, smaller panels offer much higher manufacturing yields than their larger counterparts. LCD panels are created in large batches, with many displays occupying a single sheet of glass. Think of it like CPU fabrication, but on two-meter glass substrates. Larger panels mean fewer screens can be squeezed out of a single substrate. Entire panels must be scrapped if defects exist, even if they only cover a relatively small area, making defects costlier with larger displays. With smaller screens, manufacturers can squeeze more panels onto a single sheet of glass, reducing the amount of waste due to localized defects.
Beyond yields, there is also the matter of capacity. If manufacturers are able to saturate their production lines with smaller displays that offer good yields and consistently sell for a tidy sum, there is little incentive to risk the time and materials on larger panels that may be less profitable. Some vendors have dabbled in high-PPI desktop displays; over a decade ago, IBM pimped a 22.2" 3840x2400 IPS panel to the medical imaging community. That's a niche market where exorbitant prices are common, though.
Tackling the manufacturing issues involved in creating large, high-density panels is only half the battle. Software compatibility is another hurdle that must be overcome. As much as I want to live the illustrious life of an Apple hater, I feel like the Mac maker has approached the high-resolution conundrum the right way with its Retina displays. By starting with a usable base resolution and increasing the number of pixels by a factor of four in the same area, software can use simple scaling to take advantage of the extra pixels without making all the icons and menus microscopic.
By contrast, the high-density display on Sony's Z-series ultraportable runs into issues. Even with an impressively dense 1920x1080 pixels under its 13" belt, the screen isn't conducive to the same seamless upscaling of fonts and icons as the iPhone or iPad. To accomplish that feat, the base resolution would have to be set at a paltry 960x540 to offer the ideal factor-of-four upscaling to 1080p.
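The factor-of-four arithmetic is worth spelling out: doubling each dimension quadruples the pixel count, which is what makes the scaling clean. A trivial Python check using the Z-series numbers:

```python
base = (960, 540)                      # base resolution needed for clean scaling to 1080p
native = (base[0] * 2, base[1] * 2)    # doubling each dimension...
pixel_ratio = (native[0] * native[1]) // (base[0] * base[1])

print(native)       # the Z-series' native 1920x1080
print(pixel_ratio)  # ...yields four times the pixels
```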
Just the other day, Cyril reported on Windows 8's upcoming scaling features for high-density displays. This scaling is designed to allow 10.1" and 11.6" tablets with 1920x1080 resolutions to display fonts and GUI elements smoothly and macroscopically. The end results remain to be seen, but I would really prefer to see PC manufacturers set their sights on 4X scaling rather than cutting corners with odd upscaling ratios.
When might we reasonably expect high-PPI goodness to permeate laptops and desktop monitors? Sooner rather than later, I hope. Higher-resolution screens have started seeping into premium ultraportables, and Apple is expected to incorporate Retina panels in its next line of MacBook laptops. Stand-alone monitors will probably take longer to catch up, which is a shame considering how many of our readers are pining for higher pixel densities on their desktops.
The success of the fledgling 4K video format will be another crucial factor in the adoption rate of high-resolution displays. The native resolution of 3840x2160 fits the factor-of-four criteria perfectly and provides a new, exciting canvas for video artists to paint on. As a matter of fact, YouTube already hosts a small collection of 4K videos for your bandwidth-crushing pleasure.
What if you don't really care about 4K video just yet? What if sharper text doesn't get your motor running? What else could a dramatic increase in pixel density do for the PC? Make games look better, of course. Adding pixels allows for more detail, and making pixels smaller reduces the need for antialiasing algorithms that can slow down rendering—although, on the flip side, your graphics card will have more pixels to render. With more pixels comes more computing responsibility.
Well, I'm landing now. Time to wrap this up. While my jet-lagged ranting probably won't get super-duper-dense LCDs into our notebooks and desktops any faster, perhaps we can still make a difference. Gentlemen, gather your picket signs and MRE pouches! At the very least, we can raise awareness about the obscene, Volkswagen-sized pixels commonly found on consumer notebooks. Occupy Best Buy! Who's with me? *crickets*

Windows 8-Ball Consumer Preview
There aren't many experiences in life more disconcerting than waking up and not knowing where you are or how you got there. I was treated to this uncomfortable experience once during college, following a late night of studying for my History of American Beverages exam. I'm feeling some serious déjà vu right about now, only this time I've awoken to the Windows 8 Metro interface instead of an unkempt farm field and curious geese.
This past Wednesday, Microsoft opened the floodgates on its Windows 8 Consumer Preview. I've been very interested to see how Microsoft would merge its desktop and mobile paradigms ever since I got my grubby mitts on a Windows Phone 7 device and discovered that, hey, this isn't half bad. Having carried a Windows handset for the past few months, I've come to appreciate the quickness and ease of use provided by the Metro interface. I was pretty psyched to see how it would blend into the PC experience, so I downloaded the 64-bit Win8 image from Microsoft's website.
Without any fancy touch-based hardware lying around, I ventured into the parts closet and dusted off an unused Core 2 Duo tower. After installing a spare hard drive, I flipped the power switch, slapped in the Windows 8 install disc, and beheld... a fish? I'm still not sure what that was all about. Perhaps it was a metaphor, foreshadowing the feelings of aquatic extrication to come.
The setup routine was pretty straightforward and intuitive. Unlike prior versions, Windows 8 asks you to cough up an awful lot of personal information to obtain a precious Live ID. The routine stops short of asking for credit cards and first-born children, but it still feels invasive. My email address was requested early on—with a promise not to send me spam. However, later in the install process, I had to uncheck a box that would automatically sign me up to receive MSN's "special offers."
After a couple of reboots and a trip to the coffee maker to refill my mug, the monitor went black. I could hear the hard drive grinding away, frantically seeking the last few bits of information needed to launch my first Windows 8 experience. I took a long sip of my coffee. As the mug dropped out of view and the screen came back into focus, I found myself in a spartan blue field surrounded by curious colored tiles.
Coming directly from Windows 7, I honestly didn't know what to do beyond clicking tiles to launch the corresponding applications. Right-clicking the background only brings up an option for "All Apps," which displays several cluttered columns of icons for your installed applications. Right-clicking live tiles doesn't do much, either. You get a menu bar across the bottom of the screen with a scant few options to pin and unpin things. As a power user, Metro's lack of immediately available options makes me see red. When options do present themselves, an inordinate amount of mousing is required to accomplish simple tasks.
Things get a little more promising as the mouse pointer finds its way to the lower left corner of the screen. Upon entering this hot-zone, an obnoxious "Start" graphic pops up, then promptly disappears if you try to center the mouse pointer on the image. This behavior feels extremely unnatural. I'm used to pop-up graphics persisting as long as the pointer remains over some part of the image. To actually click the start graphic, the cursor must remain in the bottom corner's tiny hit zone.
Once you've managed to click on the Start icon, Metro fades away and reveals the familiar (albeit more angular) Windows Aero desktop with one glaring omission. Over the past 17 years, Microsoft has taught millions upon millions of people that clicking the Start button is an essential entry point to any computing activity. Surprise! It's gone. There isn't even an option to bring back the old girl. The registry hacks enabling a Win7-style Start menu have even been removed from this release. Instead, we're left with an empty space on the taskbar and a usability puzzle to solve.
When I arrived at the desktop for the first time, with no desktop icons and nothing yet pinned to my taskbar, I was unsure how to proceed. Out of habit, I moused to the empty lower left corner anyway. Lo and behold, that same out-of-place Start graphic popped up and promptly disappeared the instant I left its minuscule hit zone. Clicking the graphic brought me right back to the blue, tiled, dumbed-down Metro UI. Arghh! It's extremely disorienting and distracting to be thrust into a completely different interface every time you need to launch an application that isn't pinned to the taskbar or desktop.
Hot spots in the corners are something of a theme in both the Metro and desktop modes. Mousing to the upper left corner will show you a small preview of the last active Metro app that was running. Holding the cursor against the screen's edge and moving downward will eventually reveal all of one's running Metro apps. From here, one can left-click an app to call it up or right-click to close it. Frustratingly, there is no option to close a Metro app when you're actually using it.
Holding the cursor in the upper right corner of the screen reveals the "Charms bar". This bar is a list of consistent buttons, akin to what you'd find adorning the bottom of an Android device. There are dedicated buttons for Search, Share, Start, Devices, and Settings. It's nice to have a persistent Settings button that changes context depending on the running application, but that's a Metro-specific function. The Settings context doesn't change in desktop mode, and the Charms bar tends to be invoked inadvertently when closing maximized windows.
In its current form, I feel Windows 8 is woefully inadequate for desktop power users. At best, the Metro tiles can be organized into groups and used like a restrictive version of Stardock's Fences. For touch-enabled devices, however, Windows 8 will truly be able to shine. Swiping from the edges of the screen to access menus is more natural and intuitive than having to drive your mouse pointer all over creation to call up and click on options that seemingly never pop up near the cursor's present position.
After playing with the OS, it's painfully obvious Windows 8 should be marketed purely for touch devices. The fact that it can run regular desktop applications may suggest otherwise, but even in desktop mode, everything tries to get you back to the Metro interface as soon as possible. In fact, desktop mode feels a lot like a virtual machine, existing for those rare moments when you need to dock your tablet to a keyboard and get some meaningful work done. Admittedly, I could happily put up with the annoyances of the desktop interface if I were only using it in such short bursts.
For devices without touch capabilities, things will get a little dicey. It would be unfortunate if Microsoft decided to force this touch-optimized interface on its corporate customer base. Windows 8 feels positively schizophrenic when used with the keyboard-and-mouse combo common among business users. Unless Microsoft has plans in place to simultaneously support and promote Windows 7 as its desktop-oriented OS, a "Professional" version of Windows 8 with the ability to turn off the Metro enhancements and reinstate the Start menu would be a smart move. At this point, I wouldn't even consider purchasing the OS for a desktop or laptop on which I needed to be productive. It's simply the wrong tool for the job.
On the plus side, even this early release appears to be extremely stable and well polished. If hardware makers build Windows 8 devices that can play to the strengths of the operating system, you might find me changing my tune regarding usability. I think it would be universally awesome to have an Atrix-style smartphone or a dockable tablet capable of running full-blown Windows programs in a pinch. There's still much work to be done before we can all carry our computers in our pockets, but this appears to be the path Windows 8 is taking.

The $300 ultraportable experiment
Unless you've locked yourself into the Apple ecosystem and thrown away the key, laptop shopping can be a daunting prospect. A dizzying array of options cater to almost every possible niche: ultrabooks, netbooks, thin-and-lights, desktop replacements, ultraportables, convertibles, lions, tigers, and bears—need I go on? Within each category, countless models from numerous brands are on the prowl, stalking your wallet with varying features and levels of quality.
I recently embarked on a mission to snare a 12" or smaller traveling companion. My freshly penguinized 15" HP laptop isn't exactly tray table friendly, and the geeky voices inside my head were crying out for something with a little extra horsepower. Beyond this, my humble list of demands included a decent keyboard, 128GB (or so) of SSD goodness, matte body panels, and reasonable battery life. To make things interesting, I decided to see if those demands could be met with a budget of only $300.
After searching extensively, I found that brand-new machines in the $300 ballpark weren't exactly bending over backwards to satisfy my needs. Nearly all of the contenders at this price point are outfitted with anemic Atom processors and enough glossy parts to make a Zamboni blush. SSD? Dream on. Decent keyboard? You might as well draw some letters and numbers on a soggy loaf of bread. What's a discerning laptop shopper on a budget to do?
Disappointed, but hardly surprised, I ripped a page from the car buyer's handbook and tried my luck at the used lots instead. Because good laptop keyboards are notoriously hard to come by, I started my search with the Lenovo X Series. First stop: the ThinkPad X200. Several units were available near the upper limits of my price range. However, there would be no room in the budget for upgrades, and I'd be saddled with a used battery and a mechanical hard drive. I kept looking.
Dropping back a generation to the X61 gave me cheaper options, but they were still too expensive to accommodate my planned SSD upgrade. I trudged on, dipping my sifting pan into the eBay river until I finally struck gold: $87 shipped for an X60 devoid of its hard drive, battery, and power adapter. I placed my bid and bit my nails as the auction clock ticked down. Three days later, a laptop-sized parcel was waiting on my doorstep. The foundation had been laid.
A ThinkPad X60 with a 12.1" 1024x768 display, 1.83GHz Core 2 Duo processor, and 1GB of DDR2-667 RAM is quite a bit of kit for 87% of a Benjamin. The CPU easily outpaces the similarly clocked Pentium M chip in my HP laptop, and it can obliterate any Atom-based pretender. The crowning jewel has to be the keyboard, though. Some would argue that Lenovo itself doesn't make keyboards this good anymore. The shell I received obviously spent much of its life in a docking station, and as a result, its keyboard and chassis are nearly pristine.
As luck would have it, my parts closet already housed another 1GB stick of 667MHz DDR2, which slotted into the ThinkPad without fuss. The necessary 20V power adapter was also on hand, courtesy of my electronics hoarding tendencies. I would only have to hunt down a battery, an SSD, and an operating system.
Battery shopping presented me with a conundrum. Purchasing an official Lenovo battery would seriously eat into my remaining budget. Alternatively, I could gamble on a cheaper, aftermarket model. I've used one of those in my HP laptop without issue for over a year and a half, but I've also had an aftermarket battery die after only a week. Those are 50/50 odds. In the end, I put down $25 for an aftermarket X60 battery. After two weeks of use, the battery is holding up well and offers between three and five hours of run time depending on what I'm doing—usually long enough to hold me over between outlets.
At this point in the game, I had spent only $112 out of pocket. The extra RAM and power adapter would have added another $30 to the total, leaving $158 in the budget. Armed with this knowledge, I impatiently headed over to the local Micro Center to seek out an SSD. Reasonably priced options in the 64GB range were plentiful, but I came to go big or go home. After enlisting the aid of a friendly sales associate, I eventually found my 120GB contenders: an OCZ Vertex Plus and a SanDisk Ultra SSD.
Neither offering was familiar to me. Because the X60 sports a first generation SATA link with only 150MB/s of bandwidth, either SSD would be fast enough to saturate the interface. In need of a higher power to guide my decision, I busted out my trusty Galaxy S and fired up its Newegg app. Customer reviews would determine which drive got to ride shotgun on the way home. The verdict was swift and decisive. The SandForce 1222-based SanDisk drive beat its Indilinx-powered competitor to a scrambled pulp, five eggs to three. With the SSD in hand, I grabbed my customary bottle of Bawls from the beverage cooler and handed over $162 to the girl behind the front counter.
My new mini-laptop was nearly ready to take on the world. All it needed was an operating system to make sense of things. Having exhausted my budget, my OS options were limited to Linux or an unused Windows XP license I had on hand. Despite the suggestive Windows XP sticker affixed to the bottom of the X60, I opted to roll with a fresh copy of Ubuntu 11.04. The decision to use Linux was mostly based on the lack of TRIM support in Windows XP. Most modern Linux distributions (running kernel 2.6.33 or greater) support TRIM in some fashion, with a simple modification to the fstab file.
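For the curious, this is the sort of one-line fstab tweak I mean: adding the discard mount option to the SSD's entry. The UUID and surrounding options below are placeholders for illustration, not my actual setup:

```
# /etc/fstab entry for the SSD's root partition (UUID is a placeholder)
# The "discard" option enables online TRIM on ext4 with kernel 2.6.33+
UUID=01234567-89ab-cdef-0123-456789abcdef  /  ext4  discard,errors=remount-ro  0  1
```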
To accommodate the handful of Windows applications I need to run, I dusted off my existing copy of Windows XP Pro and installed it on a virtual machine using VirtualBox. Although my ThinkPad now runs the software I need, even the cheapest new netbooks come with some version of Windows 7, which would have been my preferred OS if the budget allowed. I also would have preferred to avoid the pathetic Intel GMA 950 integrated graphics. While it's enough to handle some basic Compiz interface effects, I'd be better off gaming on an Etch-a-Sketch. Fortunately, gaming is not what I bought this computer for.
Overall, I've been extremely happy with my ultraportable number cruncher. Despite its age, the X60's Core 2 Duo will soundly thrash any modern netbook that gets in its way. As icing on the cake, the extra oomph provided by the SSD makes this system feel impossibly fast for something that cost only $300.
As I tickle the fantastic keys on my "new" machine, I'm convinced I've made the right choice. It may not be as sleek as an ultrabook or as small as a netbook, but this X60 has the size and performance to suit my needs. I set out to prove that $300 can buy a lot more portable computing than most people think, and I feel like I've succeeded. Taking the DIY mentality to notebooks probably isn't for everyone, but there's a lot of value to be found on the path less traveled.

The misadventures of Blu-ray in HTPC land
If you'll humor me, I'd like to take a brief break from all the CES coverage to complain about the sad state of Blu-ray on the PC. We've been, err, blessed with this format for more than five years now, yet the simplest task of watching a movie is still frustratingly complicated and expensive. I've taken a live-and-let-live position on the issue, but recent events have broken the proverbial camel's back and prompted this outpouring of negative emotions.
I should probably preface this rant by explaining the current state of my home theater. In short, it's a mess. In recent years, the bulk of my income has been diverted to finance causes like college, vehicles, rent, and occasionally some beer and ramen noodles. As a result, the home-theater setup I've assembled is really just the culmination of random, opportunistic acquisitions over the years. This motley collection consists of an aging 4:3 projector, some respectable tower speakers, a pre-HDMI Pioneer receiver, several game consoles, a DVD player, and a basic HTPC. If I want to watch something at a resolution above 800x600, I turn to my desktop PC.
Notwithstanding the high-def deficiencies of my living room, Blu-ray discs appeared in my stocking over the holidays. Even though my primary PC has had a Blu-ray drive for a couple of years now, the number of Blu-ray movies in my pre-Christmas collection is relatively small. My enthusiasm for the format waned after installing the drive and wasting several hours of my life discovering that my monitors were too old and lacked the necessary HDCP support. Very few things frustrate me more than having hardware that's willing and able to do the job, but can't because of artificial restraints.
About a week ago, I realized there was an Asus Blu-ray drive collecting dust in my parts closet, so I decided to give the ol' HTPC some upgrade love. A full HD projector is on my short list of near-future toy purchases, and I figured that preparation would be the Boy Scout thing to do. My current projector uses an analog VGA input, and I vaguely remembered reading somewhere that HDCP specifically required a digital connection to crash the party. In addition, software developers have had a couple of years to refine their playback solutions. What could possibly go wrong this time?
The short version: after two hours of researching and downloading various software titles to watch my new copy of Terminator 2, I shelved it and popped in the DVD version instead.
Now, my HTPC is nothing special—it's powered by a 2.7GHz Athlon II X2 running on a 785G motherboard, but it certainly has enough grunt to decode a Blu-ray movie. As it turns out, the issue I encountered was an artificial restriction present in newer versions of PowerDVD that limits Blu-ray playback to HDCP-compliant digital outputs. The VGA rebellion had been crushed. You win again, copy protection. Best three out of five?
A few days later, I thought I'd found salvation in an older version of PowerDVD that still supports analog outputs. For whatever reason, that version crashed upon loading the Blu-ray menu. After a few more swings and misses on the software front, I finally stumbled upon a trial version of Arcsoft TotalMedia Theatre 5 that actually worked. At last, my shiny circular coaster could project its Schwarzenegger-laden wares onto my wall.
To celebrate this victory over my digital oppressors, I fired up T2 and sank into my Sumo beanbag with a can of Mountain Dew and the butteriest bag of popcorn I could find. The celebration was short-lived, though: I soon realized that the trial would eventually expire, forcing me to shell out $99 for the privilege of watching my movies in Arcsoft's theatre. No thanks.
Because I don't have a lot invested in Blu-ray movies at this point, I'm not sure I ever will. Already, the vast majority of my viewing is dominated by streaming services like Netflix and Hulu Plus. I don't even subscribe to basic cable anymore, opting instead to funnel the savings into a fatter Internet pipe. By the time I get around to upgrading my home theater equipment to something more awesome (and high-def), HD media streaming should be coming into its own, rendering Blu-ray superfluous for my purposes.
The inevitability of streamed entertainment doesn't mean that content producers want to make the transition easy for us, as Cyril recently lamented. I get the funny feeling that the ongoing back-alley knife fight between producers and distributors is partially intended to prop up Blu-ray sales while the industry figures out the most efficient way to extract every possible cent from streaming content.
Still, things didn't have to end up this way. While production companies and studios blunder about trying to boost Blu-ray sales and reduce piracy with iron-fist tactics, they are actually shooting themselves in the foot. Had my original attempts to play Blu-ray discs on the PC been successful, I guarantee that I would own more than a handful of titles today. Even today, if I could play Blu-ray discs reliably using VLC or Windows Media Player, I'd be more inclined to purchase more Blu-ray titles. I doubt that I'm alone in this situation.
To promote the format, Blu-ray content providers should encourage and subsidize the development and distribution of playback software instead of locking the format down so tightly that people look for a path of lesser resistance. I can't think of a similar garden with comparably high walls surrounding it; Apple may tightly control its products, but it opted to bypass Blu-ray in favor of streaming.
I'm not here to argue the financial merits of such a move. Undoubtedly, many good trees were lost to analysts printing projections of various ways to pimp Blu-ray media and licensing agreements to the masses. I'm simply here to state that, from a PC enthusiast's perspective, I think things could have been done a lot better. There is no reason that Blu-ray and streaming services shouldn't coexist on an HTPC. Streaming HD feeds are likely to remain highly compressed in the near future, ensuring that physical discs offer a superior experience, at least from a quality perspective.
Unfortunately, there are many question marks regarding the future of Blu-ray on the PC. HTPC communities have been clamoring for native playback support in the forthcoming Windows 8, but so far the outlook is grim. There are also some extensions in the works for the popular media playback software VLC. Libbluray is an open-source "research project" library for VLC that enables some Blu-ray playback capabilities on PC, Mac, and Linux. The development of this library is still in its infancy, and it must be compiled from source, which will scare away most casual tinkerers. Hopefully, the library will become easier to use as the project matures.
For now, I think I'll continue taking a wait-and-see approach to Blu-ray. If good playback software becomes available at a more palatable price point by the time I upgrade my projector, I will probably take the plunge. I like the consolidation and expandability that HTPCs offer over their set-top-box counterparts; I just wish somebody would throw us enthusiasts a bone.

Mining away hours of my life
Hi, my name's David, and I'm an addict. It's only been about ten minutes since my last fix, and already, the ants are crawling all over my body. Why does my neck itch? Oh man, I forgot to change my spawn point back to the main base. Hold up, this will only take a second...
...and I'm back. Crisis averted. I swear this all started innocently enough with one quick hit of a free online demo. What harm could that possibly do? Shelling out $20 to support an indie developer and snag a reasonably priced game is justifiable, right? Configuring some server software to run 24/7 on the home file-server and setting up a nightly Cron job to back up my progress, that's smart, no? Oh, it's 4 AM on a work day again. Crap, addicted.
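That nightly Cron job is nothing fancy. Here's a minimal sketch of the kind of backup script I mean—the paths and schedule shown are hypothetical examples, so point them at your own server's world directory:

```shell
#!/bin/sh
# Archive a Minecraft world directory into a dated tarball.
# All paths here are hypothetical -- adjust them to your own layout.
backup_world() {
    world_dir="$1"    # the server's world directory
    backup_dir="$2"   # where the dated backups accumulate
    mkdir -p "$backup_dir"
    stamp=$(date +%Y-%m-%d)
    # -C archives the world relative to its parent directory,
    # so the tarball contains just the world folder itself
    tar -czf "$backup_dir/world-$stamp.tar.gz" \
        -C "$(dirname "$world_dir")" "$(basename "$world_dir")"
}

# A crontab entry to run this every night at 4 AM might look like:
#   0 4 * * * /home/user/bin/backup_world.sh
```

Nothing exotic, but it means a server crash or an overenthusiastic TNT experiment only costs a day's progress at worst.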
Fortunately, I'm not alone. There are currently about 4.4 million fellow addicts paying the same dealer to satisfy their craving for this innocuous-sounding diversion: Minecraft. I've even managed to get some of my friends hooked and burning away their spare time busting rocks on my server, but why? What is it about this simplistic realm that keeps us signed in well past any reasonable bedtime?
When explaining the game to the uninitiated, I'm often met with a quizzical stare and uncomfortable attempts to change the subject. At best, the conversation will elicit the response, "...and that's fun?" Now, I've never been much of a salesman, but pitching the idea of collecting and relocating various digital cubes for hours on end is a tough sell just about anywhere. That said, I'm going to try to outline my fascination with the game, if only to admit my dependence and start down the road to recovery.
There are three primary gameplay modes in Minecraft: creative, survival and hardcore. Creative mode is useful for ambitious projects requiring infinite resources and a focus on design rather than resource gathering. That mode is fun in its own right. However, I find the lack of adversity and challenge makes any accomplishments seem somewhat hollow and unfulfilling. By contrast, survival mode introduces such hardships as death, resource scarcity, and time management, which add a hint of strategy into the mix. Honestly, 99% of the times I've logged into the game, my mode of choice has been survival. I enjoy the added challenge it presents, and the objects I construct block-by-block seem somewhat more legitimate when I've served hard time collecting the required resources. Hardcore mode exists for the truly masochistic among us. It is similar to survival mode but offers a higher degree of difficulty, and you only get one life to live. Once dead, the world you've created, along with the hours of your life spent creating it, are simply wiped away.
Here's the gist of your mission in survival mode: begin gathering the resources around you to build shelter and infrastructure before the sun goes down, because at nightfall, monsters will begin spawning and make your life more difficult. After collecting various resources, you can begin crafting new items out of your inventory. That allows for the creation of tools and enhanced building materials to aid in your construction projects.
So, in a nutshell, you mine assorted resources and craft them together. Minecraft. Get it?
To keep players on their toes, the developer has implemented a rather fast day-and-night cycle. Daylight lasts for approximately 10 minutes, during which the amount of light present is usually adequate to ward off any monsters. Those monsters can spawn in just about any location where light levels fall below a certain threshold, though, so even during the daytime, you can run into trouble in dimly lit rooms. Because the monsters can kill your character, some strategy must be employed during long overnight or underground travels. In standard survival mode, being killed causes you to respawn at the last bed you slept in or at the nearest spawn point on the map. However, when resurrected, you are relieved of all your tools and resources that were on your person at the time of death. If you're lucky, you can return to the place of termination and retrieve these items. Still, planning ahead and stashing unneeded or valuable resources before embarking on long journeys or dangerous activities will prevent much frustration.
Other ways to maim and kill your character exist: fire, lava, falling, drowning, and arrows to the, err, patella, courtesy of other players on the server. One must therefore be constantly cognizant of one's environment whilst whittling away the landscape for resources.
However, this vein of strategy borne from fear of death is not the reason millions of people have dropped an Andrew Jackson for the privilege of playing the game. Nope, the primary focus is on creation. The ability to manipulate nearly everything in the fully destructible environment is akin to having an infinite number of digital Lego blocks at your disposal. You are only limited by your imagination and the upper and lower boundaries of the Minecraft world. Elements exist that allow for various degrees of automation and the creation of complex machines, and some have even gone so far as to produce 8- and 16-bit processors using Minecraft elements as transistors.
As for me, I have been toiling away on two main projects: building and upgrading a respectable house and digging a big hole. Architecture is fun in this environment and is part of the natural evolution of nearly everyone's Minecraft experience. The large hole, on the other hand, is just an extension of some pathological need of mine to dig large holes. It probably explains why I choose to live about as far away as one can get from any coastal beaches, but that's a blog post for a different day.
Unless you're predisposed to reclusive behavior, Minecraft's single-player mode will only entertain you for so long. The real fun begins when friends and family can all work together on the same map to build a virtual world from the ground up—literally. The ability to share accomplishments and collaborate with others is the addictive catalyst that keeps me logging in for more.
In order to create a common world that can be shared among many users, special server software must be downloaded and installed. Like the game itself, the server software is Java-based and can run on just about any machine with a Java interpreter and loads of RAM—Minecraft will graciously nibble on as much memory as you can feed it. Since my primary PC goes to sleep during idle periods, I decided to install the server software on my Ubuntu-based home file/web server, which I leave running 24 hours a day. If you're looking for something a little less permanent and easier to manage, there is also a GUI-based server for Windows that is designed to get you up and hosting quickly.
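To give a feel for what "loads of RAM" means in practice, here's a sketch of the sort of launch command I use on the Ubuntu box. The jar filename and heap sizes are assumptions—use whatever you downloaded and however much memory you can spare:

```shell
#!/bin/sh
# Launch the Java-based Minecraft server with a generous heap.
# The jar filename and heap sizes are example assumptions.
launch_server() {
    jar="$1"
    if [ ! -f "$jar" ]; then
        echo "server jar not found: $jar" >&2
        return 1
    fi
    # -Xms and -Xmx set the initial and maximum Java heap sizes;
    # the server will happily nibble on as much as you give it.
    # "nogui" keeps it console-only, which suits a headless box.
    java -Xms512M -Xmx2048M -jar "$jar" nogui
}
```

On a headless server, wrapping that call in a screen or tmux session (or an init script) keeps it running after you log out.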
Whether or not others can access your server over the Internet is highly dependent on your ISP and its open-port policy. By default, the server software listens for and accepts requests on port 25565, but that can be changed as desired. Once the software has been configured and a port number has been pointed to the appropriate internal IP address in your router's settings, people should be able to access your newly hosted world by feeding their Minecraft client your server's public IP address (which can be easily discovered with a quick Google search for "what is my IP").
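For reference, the listening port lives in the server.properties file that the server generates on its first run; the value below is the default:

```
# server.properties (generated on first run)
server-port=25565
```

Forward that same port number to your server's internal IP address in your router, and remote players should be able to connect.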
So, there you have it: Minecraft. Are you convinced yet? As I read back over this post from the perspective of an unfamiliar bystander, I still find myself asking, "...and that's fun?" You know what, unfamiliar bystander? Don't take my word for it. I know a guy, who knows a guy, who can hook you up with a free online demo over at Minecraft.net. Just don't come back to me complaining when it's 4 AM and you're $20 poorer.

A quick look at Gunnar's glasses for computer junkies
Quick poll: everyone who regularly squints at a computer monitor or television at least eight hours a day, raise your hand. Do I hear nine hours? Ten? Twelve? Depressingly, you'll still find my hand waving timidly in the air.
A long time ago at a corner desk far, far away... I came to the realization that my eyeballs are constantly fixated on the same glowing rectangle well beyond any time frame that could be considered healthy. Working alongside others who share the same fate, I've noticed that everybody has their own techniques for combating the inevitable symptoms of visual fatigue. Some of my colleagues take regular breaks to refuel on coffee or to plant a Nerf dart in someone's back, while others print their work and turn off their monitors for a period of time. Recently, a new trend has crept into the office: Gunnar computer glasses.
It didn't take long to succumb to my cat-like curiosity and snag a pair of my own. Anything that might assuage the eye-strain-induced headaches I've lived with over my many years as a relentless computer junkie would be a welcome addition to my daily routine. If the glasses could provide even half the purported functionality of their marketing bravado, I figured the somewhat steep entry price could be rationalized. After all, I don't hesitate to shell out extra dough for high-quality keyboards and mice that enhance the tactile link to my computers. Why not invest a few bucks in the outbound interface between the computer's monitor and my eyes, as well?
The glasses will run anywhere from $79 to $189 depending on the style and retailer. My particular model, the Phenom, is listed at $99 online. With a little comparison shopping and coupon code-clipping, I was able to get the final price down to $85.
People rocking a pair of Gunnars are easily singled out in a crowd, thanks to the distinctive yellow-tinted lenses. The amber tint is only one ingredient in the secret sauce that Gunnar terms i-AMP lens technology. i-AMP is made up of four obnoxiously named (and trademarked) components: diAMIX lens material, iONik lens tints, fRACTYL lens geometry, and i-FI lens coatings—I promise the caps lock key is not acting up.
Ignoring the awkward capitalization for now, these four technologies are intended to mesh together to offer the wearer better contrast, a warmer color spectrum that's easier on the eyes, a protective barrier that keeps eyes moist, and a dose of anti-glare properties. The lens material was designed to cut the harsh light emitted by the backlit displays and fluorescent tubes commonly found in the average workplace.
Do these face-huggers actually work? In short, yes! Having been blessed with 20/20 vision, I am not accustomed to wearing anything beyond cheapo sunglasses outdoors, so it took about half an hour or so for my eyes to relax upon donning the Gunnars for the first time. Part of that time was spent getting used to the small amount of magnification provided by the lenses. Once my eyes calmed down and my brain had time to adjust its internal white balance to compensate for the yellow tint, the improvement was evident.
The glasses seem to work best for long-haul computing or gaming sessions. Despite using them every day for the past couple of weeks, my eyes still need a little bit of time to adjust each time I put them on. Short bursts of emailing or perusing the news over lunch don't give my eyeballs enough time to fully relax before taking the Gunnars off again. As a general rule of thumb, I'll slap them on for pixel stare-downs lasting 30 minutes or longer.
So far, the thing that's impressed me the most about these glasses is their ability to make on-screen motion less painful. The motion doesn't necessarily look more fluid, but my eyes can follow it with less effort. The eye strain I feel with constant scrolling has been reduced substantially, and I find it easier to track specific lines of text while traversing endless pages of code.
The Gunnars have had a significant impact on my gaming stamina as well. The ability to quickly acquire and track a target with one's eyes is essential in most 3D games, but that task gets tiring after a while. Even in casual games, animations and repeated motion can wear down your eyeballs. As a Minecraft addict, I sometimes lose all track of time; five or six hours later, my head pays the price. If I'm wearing the Gunnars during these sessions, my brain and eyes feel less exhausted, allowing me to refocus on the real world a little bit faster.
Gunnar claims the glasses lock moisture in by blocking outside drafts and promoting natural blinking. Apparently, we tend to blink more when our eyes are relaxed than when we're squinting, which keeps our peepers naturally watered down. I never noticed any issues with dryness prior to wearing the Gunnars, so it's hard to tell if they make a difference. Some unscientific experimentation showed that I can stare at a single point without blinking quite a bit longer with the glasses on. However, this claim feels like marketing exploiting a phenomenon that occurs behind the lenses of most glasses.
Despite my current infatuation with these spectacles, they do have some drawbacks. First, and perhaps foremost, the yellow tint is counterproductive when working with images or projects where color accuracy is important. For programming, productivity applications, web-browsing, and even gaming, absolute color accuracy is often an afterthought. However, as soon as Photoshop launches on my PC, the glasses have to come off. Thankfully, the vast majority of what I do on a computer makes no demands for color purity. For die-hard graphic designers and the like, Gunnar does offer a clear "CRySTALLINE" lens that sports the same anti-glare coating and moisture protection without the shift to warmer colors.
Another point of frustration for me is the fact that Gunnar felt the need to print its name in the upper corner of the left lens, similar to Ray-Ban shades. Unfortunately, this corner is well within my peripheral view, and I constantly see a slight blur from the lettering in my field of vision. Why create a product whose sole purpose is to reduce eye strain and then introduce a permanent blurry spot into the equation? Part of the initial adjustment period my eyes require with these glasses can be attributed to attempts to focus on and then compensate for the blur. Gunnar, if you're listening, please ditch the lettering on the lens itself. Your name is already printed in two other places on the frame in case I forget.
Overall, I have been pleasantly surprised by the Gunnars, which have become an integral part of my work day and play time. I was somewhat skeptical at first, despite positive testimonials from co-workers, but am glad I took a gamble on this product. Although I am satisfied, the opportunity to test-drive a pair before proceeding to checkout would have been nice. If you know somebody who uses these glasses, ask to borrow them for an hour or so before passing final judgement. If they do seem beneficial, a wide variety of frames exists to suit most stylistic persuasions. The only target market seemingly excluded from the lineup is the affluent monocle crowd. For shame.