As technology becomes a more important part of everyone’s life, hardware has, ironically, fallen by the wayside. People are more interested in gadgets than ever before, but that hasn’t translated into respect for what’s inside them.
This doesn’t mean the demand for hardware has decreased, however. On the contrary, delivering better technology while keeping it invisible to the user is a huge challenge, and all the major hardware manufacturers, from Apple to NVIDIA, are working at breakneck speed. That has made 2013 a great year for hardware enthusiasts – even if you wouldn’t know it at first glance.
NVIDIA GeForce GTX TITAN
The struggle for dominance between AMD and NVIDIA has gone on for years and, in a way, has become predictable. One company releases a new, faster chip, and it is inevitably countered several months later by something even faster. This encourages rapid progress, but it also reduces the video card itself to little more than a host for the latest and greatest silicon.
With the TITAN, however, NVIDIA deviated from the script. Instead of simply releasing a new, faster chip, as it had always done before, the company set out to build a true flagship – a premium product from top to bottom. The result was the TITAN.
Several things set the TITAN apart. The most immediately apparent is its construction: mostly metal rather than the cheap plastic of most video cards, with a shroud that doubles as a heatsink for the blower fan. This lets the TITAN run at full load without making much noise, a dramatic departure from the obnoxiously loud video cards of the past. And the TITAN, unlike most cards sold today, is explicitly an NVIDIA product; resellers can slap their name on the box, but the card itself is sold without deviation from NVIDIA’s reference design.
Though released early in 2013, the TITAN is already outclassed: the GTX 780 Ti and AMD’s latest and greatest both exceed its performance. But the TITAN’s influence on NVIDIA’s card design remains strong, and it has raised the bar for third-party video card manufacturers seeking to sell customized cards.
PlayStation 4 and Xbox One
There are many reasons why the PlayStation 4 and Xbox One deserve to be listed, and what’s inside is one of the most important. Consoles of the past have always been customized products, built almost from the ground up to meet specific performance targets. That tradition, long a defining feature of the console market, has now been abandoned in favor of hardware very similar to that found in a PC.
In the short term, this will allow a much closer relationship between the PC and the console than ever before. Games will be easier to port, encouraging the proliferation of multi-platform releases (already a common sight), and apps and services once restricted to the PC will start to appear on home consoles in the living room. The Xbox One shows particular promise in this regard because it runs a Windows kernel, which means developers should find writing software for Microsoft’s console familiar.
A longer view opens even greater potential and questions whether the PC and console should even be treated as separate platforms. Might a game be played on a PC, saved in the cloud, and then finished on a console? Could developers start selling an all-access pass that provides a version for every platform at the same price? Might multi-player finally span platforms, rather than separating console and PC players into their own playgrounds? And who will be first to hack Windows or Linux onto a PS4 or Xbox One? The answers to these questions could completely change how gamers play.
Intel’s 4th-Gen Core Processors (Haswell)
The latest Intel processors, branded 4th generation Core and also known by their development code name, Haswell, have not significantly improved performance over the 3rd generation. That was never the goal. Instead, the 4th-gen was designed to make PC laptops and convertibles more competitive with the battery life smartphones and tablets provide. Measured by this metric, the new processors have been a massive success.
Some of today’s best computers, like the Dell XPS 12 and the 13-inch MacBook Pro with Retina Display, can manage almost seven hours of continuous web browsing with display brightness somewhere between 50% and 100% of maximum. Throw in a few idle pauses or less demanding tasks, like word processing, and you can expect to exceed ten hours; a 13-inch MacBook Pro with Retina Display left at idle with the display on can last over twenty hours. That’s better than some tablets and many smartphones!
These gains are even more impressive in light of what used to be acceptable. Most computers sold in 2010 struggled to crack the four-hour mark, and inexpensive laptops were lucky to manage much beyond two hours of life. Today’s laptops, even budget models, last at least twice as long. This could give tablets legitimate competition – if Microsoft ever delivers a touch interface users enjoy.
Razer Blade
Efficiency may, at first glance, seem the bane of geeks everywhere. It’s the reason the latest MacBook Pro is actually a bit slower than the old one, it’s why Intel’s newest processors aren’t much quicker, and it’s why the rate of advancement in 3D graphics has slowed. But efficiency has benefits, and no product illustrates them better than Razer’s Blade laptop.
The first generation of the Blade was released in 2012 and tried to provide hardcore gamers with a legitimately powerful PC that was also highly portable. Though remarkably thin and light, it also offered lackluster performance because powerful processors and GPUs created more heat than the Blade’s slim chassis could handle.
Just a year later, however, Razer has debuted the second generation of Razer Blade laptops, which pairs Intel’s Haswell processors with the latest 700-series GPUs from NVIDIA. Both are radically more efficient than their predecessors and, as a result, the new Blade can keep pace with much larger laptops despite being only 17 millimeters thick.
Razer’s Blade proves that gamers, power users and hardware enthusiasts can have their cake and eat it, too. Mobile performance no longer implies hopelessly bulky notebooks but instead can be contained in tiny systems that even offer decent battery life. Compromise is so 2012.
4K Displays
Like it or not, the resolution wars are in full swing, spreading not only to televisions and tablets but also to laptops and monitors. Increasing pixel count is a relatively simple way to make a product look more attractive on store shelves, and manufacturers are pursuing it with all possible speed.
The fact that “4K” looks good in brochures doesn’t mean the extra pixels are mere marketing, however. Increased resolution has a very obvious effect, improving the sharpness of things many users take for granted. Fonts are of particular note, as they look impeccably smooth on a 4K display, but games also benefit. So many pixels packed so close together produce an image so sharp that anti-aliasing becomes almost unnecessary.
Granted, there are a few problems. Current 4K monitors are absurdly expensive (the 32-inch ASUS PQ321Q retails for over $3,000), Windows has trouble scaling to such high resolutions, and the four-fold increase in pixel count (relative to 1080p) strains even the most powerful video cards when modern 3D games are played. Even so, high-resolution displays are here to stay and determined to make visible pixels a thing of the past.
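The pixel math is easy to verify. A quick sketch (the resolutions are standard 1080p and UHD “4K”; the 32-inch diagonal matches the PQ321Q mentioned above):

```python
import math

# Total pixels at 1080p versus UHD "4K".
fhd_pixels = 1920 * 1080           # 2,073,600 pixels
uhd_pixels = 3840 * 2160           # 8,294,400 pixels
ratio = uhd_pixels / fhd_pixels    # exactly 4.0 - four times the work per frame

# Pixel density of a 32-inch UHD panel: pixels along the
# diagonal divided by the diagonal size in inches.
ppi = math.hypot(3840, 2160) / 32  # roughly 138 pixels per inch

print(f"{ratio:.0f}x the pixels of 1080p, {ppi:.1f} PPI")
```

Four times the pixels means roughly four times the rendering work per frame, which is why even flagship video cards struggle at 4K in demanding games.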
Many of the advancements in 2013 set the stage for greater advancements in 2014. Consoles that run x86 hardware open up a world of possibilities, more efficient processors pave the way for new types of convertible and tablet PCs, and high-resolution displays will lead to amazing image quality that movies and games have only begun to capitalize on.
What do you think? Will these innovations in hardware have an impact, or was there another contender in 2013 that could be even more important? Let us know in the comments!