Zotac's Steam Machine is ready to power your living room
— 10:00 AM on March 4, 2015

Despite a long series of delays, Valve is still finding partners for its Steam Machine hardware initiative. One of the latest entrants is Zotac, whose Steam Machine SN970 packs what looks like some potent hardware into a form factor similar to a bigger Zbox. Have a look:

The exact nature of the hardware inside the SN970 is shrouded in mystery for the moment, unfortunately. Zotac's press release says the SN970 contains a "sixth-generation Intel CPU," which probably means this box is built around Intel's upcoming Skylake CPUs. On the graphics front, Zotac says the SN970 has "a discrete GTX level graphics card with Nvidia's Maxwell GPU." Again, not much help.

Maximum PC did get up close and personal with the SN970, however, and they say the GPU inside is a GeForce GTX 970M with 3GB of VRAM. Whether that's enough processing power to substantiate Zotac's claim that the SN970 is capable of "smooth 4K gaming" with "graphics sliders on ultra" remains to be seen.

Each SN970 will come with SteamOS preinstalled, along with one of Valve's Steam Controllers. Maximum PC says a fully-equipped SN970 will run $999, which seems like a lot of money for something that's meant to take the place of a console. Zotac didn't provide a release date, but Valve news site SteamDB reports that a number of Steam Machines are slated for a November release.

1 comment — First by sweatshopking at 10:03 AM on 03/04/15

BitTorrent Sync exits beta, offers free private cloud storage
— 6:00 AM on March 4, 2015

Cloud-based storage makes it easy to share files between multiple users and devices, but what if you don't trust third-party servers with your data? One option is Sync, a private cloud system from the people behind the BitTorrent file-sharing protocol. In development for over two years, Sync can share files across a wide range of PC, mobile, and NAS platforms. The latest iteration, version 2.0, finally delivers what BitTorrent Sync VP Erik Pounds describes as a "final product" devoid of beta branding.

There's a fancy promo video and everything:

Sync, er, syncs files with direct, device-to-device transfers wrapped in a comforting layer of encryption. Folders can be shared not only between devices, but also between users, making Sync an intriguing option for both individuals and groups. There's also a Pro tier that's licensed specifically for business use.

The free version of Sync 2.0 is limited to 10 folders, but there are no caps on folder size or transfer speeds. For $39.99 per year, the Pro tier offers unlimited folders, better permission control, and additional support, among other perks.

If you want to puff your own cloud, Sync 2.0 is available for pretty much every operating system: Windows, Windows Phone, OS X, iOS, Android, Fire OS, Linux, and FreeBSD. Compatible apps are either already available or coming soon from all the big NAS vendors, as well.

10 comments — Last by odizzido at 9:47 AM on 03/04/15

Valve's $50 Steam Link looks like a Chromecast for games
— 12:49 AM on March 4, 2015

Valve's bid for the living room includes more than just standalone Steam Machines. At the Game Developers Conference yesterday, the firm announced Steam Link, a $49.99 device designed to stream games from local PCs.

According to the press release posted by Steam Database, the device supports 1080p streaming at up to 60 frames per second. Valve promises low latency, though the official product page (which has since disappeared) notes that Steam Link is designed for folks with a "fast home network." A wired network connection—or a very fast wireless one—will likely be required to get the best experience.

The final hardware will look something like this. Source: Valve

The press release and product page are surprisingly bereft of details on the actual hardware. However, the product renders show a slim, compact device that appears to be fanless. I count one HDMI output, one Ethernet jack, and three USB ports, one of which isn't pictured in the image above.

The number of USB ports suggests Steam Link is compatible with third-party controllers. Steam Link will also be sold with Valve's own controller for an additional $49.99, but it won't be available for a while. Valve says Steam Link is due in November, just in time for the holidays.

Valve co-founder Gabe Newell has been pushing in-home streaming since 2013. The functionality has been part of the Steam client for almost that long, and it works reasonably well over my home gigabit network. Streaming games isn't as good as playing them natively, of course, but Steam Link may still be able to deliver a compelling experience given its low asking price. $50-100 is a lot less than the cost of even a low-end gaming PC.

25 comments — Last by wierdo at 9:17 AM on 03/04/15

Nvidia introduces Shield set-top box with Android TV
— 10:14 PM on March 3, 2015

Tonight at GDC, Nvidia CEO Jen-Hsun Huang introduced the latest in the company's lineup of Shield devices: a set-top box running Android TV.

The set-top box is officially called just Shield. This newest Shield is based on Nvidia's Tegra X1 SoC, which has eight ARM CPU cores in a big.LITTLE configuration plus Maxwell-based integrated graphics with 256 stream processors. Thanks to its H.265 (HEVC) decode capability and an HDMI 2.0 connection, the Tegra X1 allows the Shield to handle 4K video output at 60Hz, which should be nice as 4K TVs become more prevalent.

Huang says Nvidia is committed to providing a great gaming experience on the Shield. To that end, the company will be curating games within its own storefront on Android TV. The hardware sounds up to the task, too. Huang says the Shield's Tegra X1 should provide about twice the performance of the Xbox 360 while consuming one-fifth of the power.

Those claims aren't just hot air, either. Nvidia demonstrated Borderlands: The Pre-Sequel, The Talos Principle, Doom 3: BFG Edition, and Crysis all running fluidly on the Shield. That's impressive performance for a 10W chip. All told, Huang claims that there will be over 50 titles in Nvidia's Shield store at launch.

The Shield itself is an ultra-thin, anodized aluminum slab with some fancy green lighting highlighting its fractured-looking outer fascia. Gaming input is handled by the same Shield controller we know and love from other Shield devices. Aftermarket add-ons include a stand to hold the Shield vertically and a remote for issuing voice commands to Android TV's Google Now features. In its base configuration, the Shield will cost $199, with general availability scheduled for May 2015.

34 comments — Last by not@home at 10:09 AM on 03/04/15

Tuesday Night Shortbread
— 8:01 PM on March 3, 2015

Eight is Enough

  1. Kroll Ontrack survey reveals solid state disk (SSD) technology highly adopted, but not infallible
  2. Microsoft: A first look at the Windows 10 universal app platform
  3. Softpedia: PC gaming expected to grow to 35 billion dollars by 2018
  4. PC Gamer: Valve announces $50 Steam Link streaming box and SteamVR for November release
  5. Polygon: Valve announces Source 2 engine, free for developers
  6. WCCFtech: Stardock's DirectX 12 game looks so good that the partners wouldn't believe it was real
  7. VideoCardz: Khronos releases OpenCL 2.1 provisional specification for public review
  8. WCCFtech: Find out if Qualcomm's still got it; Exynos 7420 vs. Snapdragon 810 benchmarks analyzed


16 comments — Last by Krogoth at 9:23 AM on 03/04/15

With LiquidVR, AMD aims to make virtual reality more fluid
— 7:39 PM on March 3, 2015

Virtual reality is a hot topic at this year's Game Developers Conference, and AMD has its head in the game, too. Today at the show, the chipmaker announced LiquidVR, a set of tools designed to bring about the holy grail of virtual reality: a "motion-to-photon" latency low enough to make the experience subjectively seamless—an effect VR aficionados call "presence."

AMD says this quest involves optimizations across "the entire processing pipeline," from the GPU to the display hardware on VR headsets. Here are the main features of the LiquidVR 1.0 SDK, in the company's own words:

  • Async Shaders for smooth head-tracking enabling Hardware-Accelerated Time Warp, a technology that uses updated information on a user’s head position after a frame has been rendered and then warps the image to reflect the new viewpoint just before sending it to a VR headset, effectively minimizing latency between when a user turns their head and what appears on screen.
  • Affinity Multi-GPU for scalable rendering, a technology that allows multiple GPUs to work together to improve frame rates in VR applications by allowing them to assign work to run on specific GPUs. Each GPU renders the viewpoint from one eye, and then composites the outputs into a single stereo 3D image. With this technology, multi-GPU configurations become ideal for high performance VR rendering, delivering high frame rates for a smoother experience.
  • Latest data latch for smooth head-tracking, a programming mechanism that helps get head tracking data from the head-mounted display to the GPU as quickly as possible by binding data as close to real-time as possible, practically eliminating any API overhead and removing latency.
  • Direct-to-display for intuitively attaching VR headsets, to deliver a seamless plug-and-play virtual reality experience from an AMD Radeon™ graphics card to a connected VR headset, while enabling features such as booting directly to the display or using extended display features within Windows.

The official LiquidVR page mentions cutting motion-to-photon latency to "less than 10 milliseconds." That means delivering a solid 100 FPS to the user's eyes—and it's about in line with the target I heard Oculus quote at AMD's APU13 conference a couple years back. I seem to recall Oculus mentioning tricks like time warping, as well, which it said would enable low latencies without requiring the GPU to sustain triple-digit frame rates.
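A quick sanity check on that math: a 10-ms motion-to-photon budget implies a sustained frame rate of at least 100 frames per second, since each frame has to be rendered and displayed inside the budget. The helper below is purely illustrative.

```python
# Illustrative only: a motion-to-photon latency budget (in seconds)
# implies a minimum sustained frame rate of 1 / budget, since each
# frame must be produced within the budget.
def min_frame_rate(latency_budget_s: float) -> float:
    return 1.0 / latency_budget_s

print(min_frame_rate(0.010))  # 10-ms budget -> 100.0 FPS
```

Techniques like time warping relax this requirement by re-projecting an already-rendered frame with fresher head-tracking data, rather than rendering a brand-new frame every 10 ms.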

Developers (and users, too) can sign up to learn more about LiquidVR here. AMD has also posted a LiquidVR video on YouTube, but there's not much in there besides a back-to-the-basics explanation of how VR works.

11 comments — Last by ronch at 8:54 AM on 03/04/15

Gartner: Apple overtook Samsung as top smartphone vendor last quarter
— 3:33 PM on March 3, 2015

It's that time again: in the endless horse race of quarterly smartphone sales, we now know who's a nose ahead and who fell off the pace—at least, according to one firm's numbers. According to a new report by market research firm Gartner, Apple narrowly took the per-vendor unit share crown from Samsung in the fourth quarter of 2014. Have a look:

Company   Units sold,      Market share   Units sold,      Market share
          Q4 2014 (000s)   (Q4 2014)      Q4 2013 (000s)   (Q4 2013)
Apple     74,832           20.4%          50,224           17.8%
Samsung   73,032           19.9%          83,317           29.5%
Lenovo    24,300           6.6%           16,465           5.8%
Huawei    21,038           5.7%           16,057           5.7%
Xiaomi    18,582           5.1%           5,598            2.0%
Others    155,701.6        42.4%          111,204.3        39.3%
Total     367,484.5        100.0%         282,866.2        100.0%
Source: Gartner.
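For a sense of scale, the unit figures above translate into year-over-year swings like so (a quick illustrative calculation from the table, not part of Gartner's report):

```python
# Year-over-year change in units sold, computed from the Gartner
# figures above (units in thousands). Illustrative only.
def yoy_growth(q4_2014: float, q4_2013: float) -> float:
    return (q4_2014 - q4_2013) / q4_2013 * 100

print(round(yoy_growth(74_832, 50_224), 1))  # Apple:   49.0 (percent)
print(round(yoy_growth(73_032, 83_317), 1))  # Samsung: -12.3 (percent)
```

In other words, Apple grew unit sales by roughly half while Samsung's shrank by about an eighth, which is how the two ended up neck-and-neck.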

Apple has plenty to be happy about of late. Thanks to the most recent round of iPhones, the company posted record results last quarter, with $74.6 billion in revenue and $18 billion in profit. What's more, Apple's rise to parity with Samsung comes even as the average selling price of iPhones continues to increase, according to research firm IDC's similar report from January.

Samsung isn't sitting still, though. As we reported yesterday, the new Galaxy S6 and S6 Edge feature the same metal-and-glass construction as Apple's high-end devices, plus higher-resolution screens and more photon-hungry camera optics. Nonetheless, Apple's iOS is still a unique selling point, and Samsung is still using a version of its much-maligned TouchWiz Android skin on the latest Galaxies.

No matter which enormous multinational corporation you favor, I'm curious to see whether Apple can sustain its growth—and whether the Galaxy S6 will be enough for Samsung to move back into first place. I suppose we'll find out next quarter.

43 comments — Last by BabelHuber at 9:26 AM on 03/04/15

Unity 5 wants to be the game engine for everyone, everywhere
— 1:40 PM on March 3, 2015

Today at GDC 2015, Unity released the latest version of its eponymous game engine: Unity 5. The company emphasized the egalitarian philosophy and broad cross-platform compatibility of its engine, and it talked up the power and performance that the new version of Unity brings to developers of all sizes, from indie to AAA.

Key features in Unity 5 include a powerful built-in DAW-class audio editing platform; 64-bit support, which allows for bigger and more complicated game worlds; real-time global illumination based on the Enlighten lighting engine; and physically based shading support, which makes it easier to create convincing-looking interactions between light and simulated materials like wood and stone.

I'm not a game developer, but the internal demonstrations and the guest developers' games that Unity featured did look (and sound) impressive.

Republique Remastered, one of the featured games built on Unity 5.

Keep in mind this isn't just a PC-targeted engine, either: developers using Unity 5 can target up to 21 platforms, including consoles and mobile devices. The Unity engine's "write once, run anywhere" capability was a major selling point during the keynote, and Unity emphasized the labor savings that cross-platform deployment provides.

Unity is looking to the future of gaming, as well. Oculus founder Palmer Luckey came on stage to announce that built-in Oculus Rift and Samsung Gear VR support is in the alpha stages for Unity, and a beta version of this platform support will be made available to Unity developers later this month.

Unity also showed one of its PC-targeted demos from last year's GDC playing in a web browser using WebGL. The company teased the prospect of being able to share a full-fat 3D game across the web with the click of a link.

Unity's keynote highlight reel, as shown at the GDC presentation.

With all of this power on hand, one could be forgiven for thinking that building on Unity is an expensive proposition. However, the company is making the full-featured engine available for free to small developers with less than $100,000 of revenue or investment backing. The professional version of Unity isn't free, but it is affordable; developers can pay either $75 per month or a $1,500 one-time fee. The professional version includes features for commercial studios, such as Unity Cloud Build, which makes it easier to coordinate development efforts for distributed teams, and Unity Analytics, which provides "actionable insights into your players' behavior." Whether you choose to develop with the personal or professional edition of Unity, both are royalty-free.

Putting a full-featured, cross-platform development engine in the hands of anyone who wants it is exciting. With Unreal Engine 4 now available free of charge as well, developers seem to be faced with an embarrassment of riches when choosing their development platform these days. If you're interested, you can download Unity 5 today.

26 comments — Last by njoydesign at 9:30 AM on 03/04/15

ARM and Geomerics announce Enlighten 3 engine
— 11:00 AM on March 3, 2015

GDC — In 2013, ARM invested in a company called Geomerics, which provides a middleware lighting engine for real-time graphics and games. This week, at the Game Developers Conference in San Francisco, ARM and Geomerics announced a new version of that lighting engine, Enlighten 3.

Enlighten is distinguished from other lighting algorithms by its ability to achieve what is more or less the holy grail of real-time graphics: global illumination with multiple bounces, essentially a simulation of how actual light acts in an environment. Of course, given the limits of current hardware, Enlighten has to take some shortcuts in order to achieve a reasonable facsimile of multi-bounce global illumination. Still, ARM says Enlighten can scale across multiple platforms and classes of hardware, from Windows-based desktop PCs to game consoles to Android mobile devices.

ARM's involvement in this space might seem like an odd fit, but lighting is one of the key fundamentals in graphics. ARM says Geomerics "influences and informs" its processor roadmaps, no doubt including the plans for its Mali GPUs, which are widely used in mobile devices these days. Nvidia, another contender in the GPU space, has its own global illumination middleware solution known as VXGI.

Enlighten is already the primary lighting routine for the popular Unity game engine, and it's available as a licensable option for Unreal Engine 4, as well. Version 3 of Enlighten adds a number of new features, including improved indirect lighting with certain types of geometry, support for richer simulated materials, and better transparency.

ARM and Geomerics have released a couple of demos showing Enlighten's new features in action. Here's real-time global illumination at work:

And this "Subway" demo shows a number of other new features.

In addition to the integration with major game engines, Geomerics is also releasing a new lighting editor known as Forge that allows stand-alone editing and fast visualization of light environments.

Enlighten with Forge includes a software development kit, and Geomerics says its software is "already integrated into many leading in-house engines."

8 comments — Last by Milo Burke at 9:09 AM on 03/04/15

Vulkan is the low-overhead future of OpenGL
— 10:38 AM on March 3, 2015

Another piece in the next-gen graphics API puzzle has fallen into place. The Khronos Group has formally announced Vulkan, the API formerly known as glNext. The open standards body revealed its intention to rebuild OpenGL as a low-overhead API in August, and Vulkan is the result. I'll let the press release fill in the details:

Vulkan is a unified specification that minimizes driver overhead and enables multi-threaded GPU command preparation for optimal graphics and compute performance on diverse mobile, desktop, console and embedded platforms. Vulkan also provides the direct GPU control demanded by sophisticated game engines, middleware and applications with the cross vendor performance and functional portability resulting from simpler, more predictable drivers. The layered design of Vulkan enables multiple IHVs to plug into a common, extensible architecture for code validation, debugging and profiling during development without impacting production performance; this layering flexibility is expected to catalyze strong innovation in cross-vendor GPU tools.

In another significant announcement today, Vulkan and OpenCL 2.1 are now sharing core intermediate language technologies resulting in SPIR-V; a revolution in the Khronos Standard Portable Intermediate Representation initially used by OpenCL™, now fully defined by Khronos with native support for shader and kernel features. SPIR-V splits the compiler chain, enabling high-level language front-ends to emit programs in a standardized intermediate form to be ingested by Vulkan or OpenCL drivers. Eliminating the need for a built-in high-level language source compiler significantly reduces GPU driver complexity and will enable a diversity of language front-ends. Additionally, a standardized IR provides a measure of shader IP protection, accelerated shader load times and enables developers to use a common language front-end, improving shader reliability and portability across multiple implementations.

More information on the API is available in this overview presentation (PDF). The Game Developers Conference is also hosting two sessions on Vulkan this Thursday. One will provide a technical preview of the API, while the other promises demos and interaction with the folks behind the standard. Scott is at GDC this week, and I expect he'll be attending at least one of those sessions.

Vulkan remains a work in progress, but the initial specification and first implementations are due later this year. The Khronos Group says it has made "rapid progress" since last summer, with "significant proposals and IP contributions received from members." Some of those contributions came from AMD, whose low-overhead Mantle API shares a similar focus. AMD Gaming Scientist Richard Huddy told us in August that the firm had done "a great deal of work" with the Khronos Group on the next-gen OpenGL spec. It's unclear how much of Vulkan is derived from a mind meld with Mantle, though.

AMD isn't the only hardware company with a hand in Vulkan development, of course. Intel, Nvidia, ARM, Qualcomm, and Imagination Technologies are all part of the group behind the standard. Interestingly, the Khronos Group says it also experienced an "unprecedented level of participation from game engine ISVs." Valve is even presenting one of the GDC sessions on Vulkan.

54 comments — Last by njoydesign at 9:32 AM on 03/04/15

Video shows Microsoft's Project Spartan browser, Cortana in action
— 10:07 AM on March 3, 2015

Microsoft confirmed the existence of Project Spartan, its Internet Explorer replacement, back in January. At the time, the company touted the integration of its Cortana digital assistant with the new browser. Now, thanks to a leaked release in the hands of the folks at WinBeta, we know a little more about how Spartan will work with Cortana. Have a look:

It appears that Cortana will automatically notify the user whenever she can provide contextually useful information (in this case, the details for a restaurant), at which point one can click the Cortana symbol in the address bar to pull up a sidebar with the relevant info. WinBeta says one can also invoke Cortana when needed for small tasks like looking up info on a neighborhood or defining a word. Handy.

While WinBeta warns that this build of Spartan isn't yet available to the public and is likely subject to change, I'm impressed by the browser's clean interface and fluid animations. Cortana also looks quite useful. Maybe Spartan will be good enough to tempt me away from Chrome and Firefox for a while when it's released with Windows 10 later this year.

34 comments — Last by MadManOriginal at 8:51 AM on 03/04/15

AMD changes plans for public Mantle SDK, hints at evolution of API
— 6:00 AM on March 3, 2015

At PDXLan in November, AMD Gaming Scientist Richard Huddy reportedly promised that the firm would release a public Mantle SDK by the end of 2014. The in-house API has been available as part of a private beta since May, but the nitty gritty details of the supposedly open standard still aren't accessible to the public. Now, an official blog post attributed to Vice President of Visual and Perceptual Computing Raja Koduri suggests AMD has changed its plans for the public SDK—and possibly for the very future of the API.

2015 "will be a transitional year for Mantle," Koduri says. He suggests game developers have shifted their focus to DirectX 12 and glNext, and he clarifies Mantle's future with the following points:

  • AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.
  • Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
  • Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
    • The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.

So, AMD's "intention" to release the SDK has been replaced by an "effort" to widen the definition of open. Or something. Koduri plans to clear up that obfuscation at the Game Developers Conference on Thursday. Scott is at GDC this week, and he should be able to get us the straight dope on exactly what's going on.

In the meantime, Koduri says AMD will release a Mantle "programming guide and API reference" this month. The 450-page document will provide a "detailed look at the capabilities we’ve implemented and the design decisions we made," he adds.

111 comments — Last by AJSB at 9:51 AM on 03/04/15

End is in sight for Intel's contra-revenue efforts
— 4:38 PM on March 2, 2015

Intel may soon stop pursuing mobile market share by offering massive discounts on Atom processors. In a press briefing about the new Atom x3, x5, and x7 CPUs last week, Aicha Evans, Corporate VP and General Manager of Intel's Communications and Devices Group, suggested that such "contra-revenue" tactics won't be necessary with some of the new chips—and that, in the future, we shouldn't expect Intel to engage in these tactics at all.

In general, first of all, that was a very calculated move. I think our Chairman—the Chairman of our board—which was a pretty tough moment for us, stood up and said, "Look, this is something we're gonna have to do to get into the market and be relevant in the conversation." So, long term, you should not expect to be seeing that.

Now, nothing happens overnight. You don't lose 100 lbs overnight, right. So, some of the product will still have a little bit of a BOM [bill of materials] offset this year, but in general, we are now starting to be able to be in a position where we don't have to automatically offer contra revenue in order to make up for some of the BOM offset issues we have.

And I think we've been public also at the Investor Day . . . in November, and we set a very public [profits and losses] goal of $800 million of improvement in this space. And that comes from the mix, and that comes from the fact that we don't have to offer contra revenue in the SoFIA line and in some of these different lines, because we're getting more and more efficient in the BOM and the overall platform design and pre-integration.

The contra-revenue strategy has cost Intel quite a bit of money. Last quarter, revenue for the chipmaker's Mobile and Communications Group was -$6 million. The division only brought in $202 million in revenue for 2014 as a whole, down from $1.38 billion for 2013.

But Intel is hardly strapped for cash: its net income for 2014 was a cool $11.7 billion. And more importantly, the company could soon reap substantial rewards from its enlarged foothold in the mobile market.

48 comments — Last by Klimax at 3:43 AM on 03/04/15

Phanteks announces enthusiast-friendly Enthoo Evolv ITX case
— 2:32 PM on March 2, 2015

While we haven't yet had an opportunity to review a Phanteks case here at TR, the company has been making waves among enthusiasts with its Enthoo line of cases. Today, the company has added an ITX-only version of its Enthoo Evolv mini-tower to the lineup, the Enthoo Evolv ITX.

The Evolv ITX looks like it has just the sort of features I like to see in enthusiast-oriented ITX enclosures. The case appears to include a modular 2.5" drive sled, a modular 3.5" sled, and two 3.5" modular bays. The top radiator mount looks like it slides out for easy radiator installation, and Phanteks caters to the custom liquid-cooling enthusiast with a dampened pump bracket that can be installed atop the hard drive bays.

Most interestingly, the Evolv ITX doesn't skimp on room for air cooling. Though the only included fan is a single 200-mm unit at the front of the case, the Evolv ITX is shallow enough that the single big fan should be enough cooling for most builds. There's also room for two more 120- or 140-mm fans up top and one more 120- or 140-mm fan in the rear. Unlike most ITX cases I've tested, the Evolv ITX can accept tower-style coolers up to 200 mm in height, which means overclockers shouldn't have to resort to liquid cooling by default.

Even so, there's plenty of room for liquid coolers in the Evolv ITX. Phanteks claims the case can accept radiators up to 240 mm in length in the front of the case, up to 280 mm in length up top, or up to 140 mm in length at the rear.

The Evolv ITX includes the usual complement of rubber cable grommets and Phanteks' trademark hook-and-loop cable straps for clean cable management. The Evolv ITX also has its complement of USB ports and audio jacks located at the front of the case rather than the right side (as with its bigger brother, the Enthoo Evolv).

At 9" (230 mm) wide by 14.8" (375 mm) tall by 15.5" (395 mm) deep, the Evolv ITX is reasonably compact, too. Overall, the Evolv ITX appears to be one of the more enthusiast-friendly ITX cases on the market, and I'd definitely like to get my hands on one.

Phanteks says that the Evolv ITX should be available starting this month for $79.99.

Hat tip to Hexus for the Phanteks press release.

24 comments — Last by Chrispy_ at 6:33 PM on 03/03/15

SanDisk unveils microSD card with a whopping 200GB capacity
— 11:58 AM on March 2, 2015

Moore's Law is an amazing thing, folks. Yesterday, SanDisk announced its latest high-capacity Ultra-branded microSDXC card, which crams 200GB of storage into a card smaller than a penny. That's a bit staggering to think about.

The SanDisk Ultra 128GB microSD card. The 200GB version looks exactly like this, but says 200GB.

SanDisk claims that the 200GB capacity is a first for a microSDXC card, and the company says a "new design and production process that allows for more bits per die" helped make the jump from 128GB to 200GB possible. SanDisk claims 90MB/s transfer speeds for the Ultra 200GB, as well, though the company's press release didn't describe how that figure was determined.

While 200GB of storage for a phone may sound like overkill, 4K (or 2160p) video capture is an increasingly common feature on mobile devices. 4K TVs and monitors will probably become more mainstream in the next few years, as well. With four times the number of pixels of 1080p video, 4K content will chew up storage space accordingly, so SanDisk might be ahead of the curve here.
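The pixel math behind that claim checks out, and a rough back-of-envelope estimate shows how quickly even 200GB fills up. The 50 Mb/s recording bitrate below is my own assumption for illustration, not a figure from SanDisk:

```python
# 4K UHD vs. 1080p pixel counts, plus a rough capacity estimate.
uhd_pixels = 3840 * 2160           # 8,294,400 pixels per frame
fhd_pixels = 1920 * 1080           # 2,073,600 pixels per frame
print(uhd_pixels // fhd_pixels)    # 4 -- four times the pixels

card_bits = 200e9 * 8              # 200GB card (decimal gigabytes)
bitrate = 50e6                     # assumed 50 Mb/s recording bitrate
print(card_bits / bitrate / 3600)  # roughly 9 hours of 4K footage
```

At that assumed bitrate, a 200GB card holds somewhere around nine hours of 4K video, so heavy shooters could still fill one in a vacation's worth of clips.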

Along with this huge card, SanDisk announced an updated version of its Memory Zone app, which can monitor internal storage utilization and automatically move files to the microSD card as needed. SanDisk says that this app is compatible with most Android devices, and it sounds useful to me. Manually managing files on a mobile device isn't my idea of a fun time.

Hopefully, aspiring 4K videographers and SanDisk alike can still find a home for microSD cards in mobile devices in the years to come. Apple's iPhones and Google's Nexus 6 lack microSD slots, and so does the new Samsung Galaxy S6.

SanDisk says the Ultra 200GB microSDXC card should be available in the second quarter for $399.99. That price will get you a bundled SD adapter and a 10-year limited warranty.

32 comments — Last by DPete27 at 9:27 AM on 03/04/15

Unreal Engine 4 now free for everyone
— 11:51 AM on March 2, 2015

Free-to-play games are all the rage these days, so it's only fitting that development tools appear to be headed in the same direction. Epic has revealed that Unreal Engine 4 is now "available to everyone for free."

The announcement comes nearly a year after Epic started selling Unreal Engine subscriptions for $19/month. The UE4 development community has "grown tremendously" since then, the company says, and lifting the monthly fee is expected to encourage even more developers to get in on the action.

As far as I can tell, Unreal Engine 4 hasn't been watered down or laced with freemium features. You still get the full engine source code and the same suite of tools used by Epic's own staff. The only catch is that Epic wants a small slice of sales from any commercial products created with the engine. Developers are on the hook for 5% of gross quarterly revenue after the first $3,000.
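As described, the royalty works out to a simple formula. Here's a sketch based on my reading of the announcement, not Epic's official terms or calculator:

```python
# Sketch of the UE4 royalty as announced: 5% of gross revenue per
# calendar quarter, after the first $3,000. My own reading of the
# terms, for illustration only.
def ue4_royalty(gross_quarterly_revenue: float) -> float:
    exempt = 3_000.0
    return round(max(0.0, gross_quarterly_revenue - exempt) * 0.05, 2)

print(ue4_royalty(2_000))   # 0.0 -- under the exemption
print(ue4_royalty(50_000))  # 2350.0
```

So a game grossing $50,000 in a quarter would owe Epic $2,350 for that quarter, while hobby projects under $3,000 pay nothing.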

Anyone with an Unreal Engine 4 subscription will be getting a partial refund on their monthly bill. All subscribers past and present will also get a $30 credit for the Unreal Engine Marketplace. The online content store is loaded with game assets, including environments, models, materials, and effects.

31 comments — Last by odizzido at 8:44 AM on 03/03/15

Sony's waterproof Xperia Z4 takes on premium tablets
— 10:24 AM on March 2, 2015

Sony is the latest tablet maker with an uber-slender Android slate. The Xperia Z4 measures just 6.1 mm thick, allowing it to slip into the same dress size as Dell's Venue 8 7000 and Apple's iPad Air 2. It's a relative featherweight, too, at only 13.9 ounces (393 grams). And, unlike most tablets, it can be dunked under water for up to 30 minutes with no ill effects.

Source: Sony

The Xperia's lithe, watertight frame is wrapped around a 10.1" display with a 2560x1600 resolution. The underlying IPS panel covers 130% of the sRGB gamut, according to Sony, and it's 40% brighter than the display in the Xperia Z2. If the backlight is bright enough for outdoor viewing, the Xperia could be the perfect poolside tablet.

Qualcomm's Snapdragon 810 SoC lurks under the hood, serving up a big.LITTLE combination of quad ARM Cortex-A57 and -A53 cores. The 64-bit cores scale up to 2GHz, and they're backed by Adreno 430 graphics. Sony also outfits the tablet with 3GB of RAM, 32GB of storage, and a microSD slot that can take memory cards up to 128GB. Everything is powered by a 6000-mAh battery rated for an impressive 17 hours of video playback.

And there's more.

The Xperia Z4 tablet works with the Remote Play functionality built into Sony's PlayStation 4 console. Games can be streamed from the console over Wi-Fi, and the PS4's DualShock controller can be used as a gamepad. The controller presumably works with native Android games, as well.

On a more productive note, the tablet can be paired with Sony's BKB50 keyboard. This Bluetooth-based unit has a hinged dock with 130° of tilt freedom. It's about the same weight as the tablet, which could make the combo a little tippy at extreme angles, but at least the clamshell doesn't require a kickstand to stay upright. Sony claims the keyboard's 430-mAh battery is good for about 60 hours of typing.

Pricing hasn't been announced, but the Xperia Z4 tablet is scheduled for release in June. Versions will be available with and without 4G connectivity, and the keyboard will be sold separately.

37 comments — Last by MadManOriginal at 8:53 AM on 03/03/15

Samsung's Galaxy S6 is ready for battle at the high end
— 9:41 AM on March 2, 2015

Ahead of Mobile World Congress 2015, Samsung announced the latest updates to its Galaxy S line of high-end handsets: the Galaxy S6 and Galaxy S6 Edge. The new Galaxies feature a striking new design that incorporates metal and glass materials, a marked change from the more plastic-y bodies of previous Galaxy S phones.  

Common to both phones is a 5.1", 2560x1440 AMOLED display whose pixel density works out to an eye-popping 577 ppi. On the regular S6, the screen is flat—no surprises there. The S6 Edge, however, has a curved front panel that arcs over the screen. Why? I have no idea, but Samsung did demonstrate a notification feature wherein a call from a contact caused the edges of the screen to spill colored light onto a table or desk if the phone was face-down. That's somewhat useful, I suppose. Mostly, the curves just look cool, and they're a welcome flourish.
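That 577-ppi figure checks out, give or take: pixel density is simply the diagonal resolution in pixels over the diagonal size in inches. A quick sanity check, assuming a nominal 5.1" diagonal:

```python
import math

# ppi = diagonal pixel count / diagonal size in inches
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

density = ppi(2560, 1440, 5.1)
print(f"{density:.0f} ppi")
```

This works out to roughly 576; Samsung's quoted 577 suggests the diagonal is a hair under 5.1".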

More exciting for me are the improvements Samsung has made to the S6's cameras. The lenses on the front and rear of the S6 now feature a wide f/1.9 aperture for better photon-gathering in low light, versus the S5's (and the iPhone 6's) slower f/2.2 optics. The front-side shooter now packs 5MP for high-quality selfies, while the 16MP rear camera adds optical image stabilization for extra help when shooting in low light. I've yet to see any sample images from the S6, but the camera does sound promising.
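The jump from f/2.2 to f/1.9 is worth quantifying: light gathered scales with aperture area, which goes as the inverse square of the f-number. A small sketch:

```python
# Relative light-gathering of two lenses at the same focal length:
# area is proportional to (1 / f-number)^2, so the ratio is
# (f_old / f_new)^2.
def light_ratio(f_old, f_new):
    return (f_old / f_new) ** 2

ratio = light_ratio(2.2, 1.9)
print(f"f/1.9 gathers about {ratio:.2f}x the light of f/2.2")
```

That's roughly a third more light per exposure, a bit under half a photographic stop.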

Under the hood, there's an eight-core Samsung SoC that's presumably part of the Exynos family, though Samsung didn't describe it as such in its press materials. Samsung says this SoC is 64-bit-capable and that it's fabbed on a 14-nm process. Four of the SoC's cores are clocked at 2.1GHz, while the other four clock in at 1.5GHz. Sounds a lot like an improved version of the chip Scott looked at in the Galaxy Note 4. The Verge couldn't get Samsung to confirm that this is the SoC we'll see in the U.S. version of the S6, however. The SoC is backed up with a beefy 3GB of RAM, plus 32, 64, or 128GB of onboard storage.

The higher-end materials and construction do come at a cost. The S6 abandons the removable battery and expandable storage slots that were mainstays of the Galaxy line until now. While there's nothing to be done about the missing microSD slot, Samsung did claim major improvements to battery charging speeds for the S6. The device also supports wireless charging out of the box, so heavy users should still be able to keep the S6 juiced up.

The Galaxy S6 and S6 Edge will be available on April 10 in a range of colors, running Android 5.0 Lollipop. Samsung didn't talk about prices during its presentation, but it's probably safe to say the S6 won't be cheap.

119 comments — Last by trackerben at 9:51 AM on 03/04/15

Cherry Trail debuts as the Atom x5 and x7 series
— 8:00 AM on March 2, 2015

With the Atom x3 series, Intel has made certain concessions to cost and time-to-market concerns. The company has made fewer compromises with the Atom x5 and x7 series, formerly known as Cherry Trail. These are the first Atoms built using Intel's 14-nm fab process, and they boast Intel Gen8 integrated graphics—the very same found in Broadwell.

Where the Atom x3 lineup targets entry-level phones and tablets, Intel expects the Atom x5 and x7 family to power everything from 7" tablets to 10.1" two-in-ones with prices ranging from $119 to $499.

Source: Intel.

The Atom x5 and x7 series includes three chips. The specs of the Atom x5-8300 are outlined below:

Specifications: Intel Atom x5-8300
CPU: quad-core 64-bit Atom x5, up to 1.84GHz
Process: 14 nm
Graphics: Gen8 with 12 EUs, up to 500MHz; DX11.1, OpenGL ES 3.1, OpenCL 1.2, OpenGL 4.3, RS Compute
Media (encode/decode): HEVC (decode), H.264, VP8
Memory: 1x32 or 1x64 DDR3L-RS 1600, 1-2GB
Display resolution: internal 1920x1200 (MIPI-DSI or LVDS); external 1920x1080 (HDMI)
Modem (discrete): Intel XMM 7260/62 LTE Cat-6 (up to 300Mbps DL); M.2 only for the x5-8300
Connectivity: Intel WLAN, Intel WWAN (M.2 modules), Intel NFC
Input/output: 6x I2C, 2x HSUART, 1x SDIO, 3x I2S, SPI, PCIe 2.0 x1, 1x I2C (ISH), 1x I2C (NFC)
Storage: eMMC 4.51
Camera: up to 8MP, Intel RealSense Snapshot

Note that the Atom x5-8300's peak speed is limited to 1.6GHz when more than two cores are active. Also, Intel says its customers can pair that processor with LPDDR3 memory "if needed."

The company hasn't given us a full architectural run-down of Cherry Trail yet, but its presentation did include this block diagram of the new silicon and its accompanying platform:

Source: Intel.

The Airmont CPU cores are die-shrunk versions of the Silvermont cores that debuted inside Bay Trail. Accordingly, Intel says Cherry Trail offers "very similar" CPU performance. Power consumption and battery life, too, are supposed to be similar to last year's offerings.

What has changed since last year is in the graphics department. The new Gen8 IGP makes Cherry Trail quite a bit faster than Bay Trail, by Intel's account. Compared to the Atom Z3795, Intel says the new Atom x7 delivers gains of up to 2X in GFXBench 2.7's T-Rex HD scene and up to 50% in 3DMark's Ice Storm Unlimited test.

On top of that, Cherry Trail's platform has been spruced up with "new user experiences." Those include RealSense, Intel's 3D camera technology; True Key, its facial recognition system; and Pro Wireless Display, which enables secure wireless projection to conference-room screens.

Devices based on Atom x5 and x7 processors are due out in the first half of this year. Intel's partners for the launch include Asus, Acer, Dell, HP, Lenovo, and Toshiba.

Some of those systems will feature LTE connectivity. Initially, Intel says its XMM 7260 LTE modem will be used. In the second half of the year, however, the company plans to release the XMM 7360, its third-gen LTE modem. The 7360 will support 3x carrier aggregation and speeds of up to 450 Mbps.

41 comments — Last by fredsnotdead at 11:53 PM on 03/03/15

Atom x3 chips target cheap phones and tablets, feature ARM graphics
— 8:00 AM on March 2, 2015

At the Mobile World Congress today, Intel unwrapped a new line of Atom processors for entry-level phones and tablets. Known as the Atom x3 series, this lineup comprises chips formerly known under the SoFIA code name. These chips combine x86 Intel cores with integrated baseband modems and, oddly enough, graphics based on ARM's Mali IP.

Source: Intel.

Three Atom x3 processors are launching today. Each one features a built-in cellular baseband modem and comes with companion silicon that adds extra connectivity. The table below offers a more complete overview of the entry-level model's specs:

Specifications: Intel Atom x3-C3130 (3G)
CPU: dual-core 64-bit Atom x3, up to 1.0GHz
Process: 28 nm
Graphics: Mali-400 MP2, OpenGL ES 2.0
Media (encode/decode): encode H.264 @ 720p30; decode H.264, VP8 @ 1080p30
Memory: 1x32 LPDDR2-800
Display resolution: 1280x800 @ 60fps
Modem (integrated): GSM/GPRS/EDGE, HSPA+ 21/5.8, DSDS, eDvP
Connectivity: Wi-Fi 802.11bgn, BT 4.0 LE
Input/output: UART/SPI, I2C, I2S, SDIO
Storage: eMMC 4.41
Camera: up to 13MP/5MP

According to Aicha Evans, Corporate VP and General Manager of Intel's Communications and Devices Group, the quad-core Atom x3 variants were developed in collaboration with Rockchip, a fabless SoC vendor headquartered in China. Intel first announced the collaboration last May.

All three Atom x3 products are fabbed on a 28-nm process, and Evans suggested pretty strongly that Intel is contracting out manufacturing to third-party foundries. When pressed for specifics, she said, "I'm not going to comment on exactly which TSMC process—or UMC, or whoever it is, right, because that's not appropriate. I don't usually comment on partners and other companies." All Evans would confirm unequivocally was that Intel prioritized "low power, ease of integration, [time to market], and pragmatism."

Some of those same priorities pushed Intel to use ARM Mali graphics instead of a home-brewed IGP. Evans said the Atom x3 line was born out of the SoC portfolio of Infineon's Wireless Solutions business, which Intel acquired four years ago. "In that device, there's already . . . ARM Mali graphics," she explained. "And we decided that, instead of ripping that apart, we were going to focus on fast time to market [and] be practical."

For reference, the fastest Atom x3 of the brood features Mali-T720 graphics, which ARM launched last year and described as a "cost-optimized solution . . . derived from the market-leading Mali GPU found in the Samsung Galaxy Note 3." The dual-core Atom x3's Mali-400 graphics, meanwhile, were previously featured in the Galaxy S2.

Intel didn't share any graphics performance benchmark numbers, but its presentation included some CPU performance estimates. Those estimates suggest the Atom x3 family could be quite competitive with ARM-based SoCs from Qualcomm and MediaTek:

Intel says the dual-core Atom x3-C3130 is shipping now, while the quad-core Atom x3-C3230RK is coming later in the first half of the year. The LTE-infused Atom x3-C3440 will follow in the second half. In all, the chipmaker names 19 partners on board with the Atom x3 rollout, including Asus, Compal, Foxconn, Pegatron, Weibu, and Wistron.

The Atom x3 family will complement the Atom x5 and x7 series, which feature Intel Gen8 graphics and are manufactured using the company's 14-nm fab process. Check out our coverage of the Atom x5 and x7 processors here.

34 comments — Last by willmore at 8:35 AM on 03/03/15