Most of the world gets a variant of the Note 4 based on the familiar Qualcomm Snapdragon 805 system-on-a-chip (SoC), a fine chip that’s getting a little long in the tooth. In Samsung’s home country of Korea, though, the firm ships a different variant of the Note 4 based on the Exynos 5433 SoC from Samsung’s System LSI. This brand-spanking-new SoC is manufactured using the latest 20-nm HKMG fabrication process from Samsung’s chip-making operation, and it’s a showcase for the newest CPU and graphics technology from ARM. With eight 64-bit CPU cores and a 64-bit Mali-T760 GPU, the Exynos 5433 could make this version the fastest and most capable Note 4—and it gives us some quality time with the Cortex-A53 and A57 CPU cores that will likely dominate the Android market in 2015.
Wrapped around this intriguing SoC is an astounding, premium phone-tablet hybrid that goes toe to toe with the iPhone 6 Plus. It includes one of the best displays we’ve ever seen—anywhere. Naturally, we’ve spent some time looking at most aspects of the Note 4.
Galaxy Note 4
|Display size & resolution||5.7″ 2560×1440|
|System RAM||3GB LPDDR3|
|Flash storage capacity||32GB|
|Primary camera resolution||16 megapixels|
|Cellular||2G GSM, 3G UMTS|
|Wi-Fi||2.4GHz + 5GHz|
|Other connectivity||Bluetooth 4.1, NFC, MHL, ANT+|
|Operating system||Android 4.4.4 with TouchWiz UI|
That said, you can’t buy this model of the Note 4 on these shores, so I won’t claim this is a complete product review. For instance, I couldn’t muddle through the menus for all of the features, some of which seem to rely on software supplied by Korean carrier Olleh. They don’t switch to English text when the rest of the phone does. All indications are that this version of the Note 4 is essentially the same hardware as the Qualcomm-based version wrapped around a different motherboard. Still, some features won’t carry over. For example, I don’t believe this version of the phone supports quick-charge battery tech, which is a Qualcomm exclusive.
We’ll look at this version of the Note 4 closely, but keep in mind that you may not get the same experience out of the variant of the Note 4 sold in North America.
Design and build quality
Samsung has built the Note 4 with premium everything, from the Corning Gorilla Glass on the device’s front to the metal chassis and high-grade textured plastic on its back. The in-hand feel is excellent, and all of the buttons are well-placed, with solid, clicky feedback when pressed.
|Samsung Galaxy Note 4|
|Height x Width x Depth||153.5 x 78.6 x 8.5 mm|
|Weight||6.21 oz/176 g|
|I/O ports||microUSB connector, battery bay, Micro-SIM slot|
|Available colors||Charcoal black, bronze gold, blossom pink|
This is, of course, a Really Large Phone. Heck, it’s four grams heavier and a couple of millimeters thicker than the ridiculously large iPhone 6 Plus, with a slightly larger 5.7″ display. The one dimension in which the Note 4 is smaller is important: vertically, where the 6 Plus is a handful of millimeters taller. Neither phone is gonna fit easily into every front pocket, though.
The Note 4 knows one trick that will confound the competition. Slide a fingernail into a small slit on the device’s back cover, run it around the perimeter of the device, and the entire plastic back of the phone pops off with relative ease. Doing so reveals a swappable battery, the Note 4’s Micro-SIM slot, and a microSD slot that can host up to 128GB of additional flash storage. This sort of flexibility is appealing, but I’d rather not have to pull off the back to swap batteries on a daily basis. The process is a little too tedious, and the plastic rear panel feels a little too fragile for everyday access.
The Note 4’s home button houses a fingerprint reader for biometric user authentication. Unlike Apple’s Touch ID sensor, though, this one requires a swipe of your digit across the surface of the button. I set it up with my thumbprint and found out that I’m evidently a clumsy swiper. I’d usually have to swipe several times in order to log in successfully. I can’t imagine using the fingerprint reader in place of a numeric PIN or Android’s trace-a-pattern unlock mechanism in everyday use. Those other methods just seem quicker.
Combined with the Note 4’s NFC capability, the fingerprint reader should be capable of supporting payment systems like Google Wallet. I think you’d want to practice your swipe technique before uncorking a payment attempt in line at Target, though.
Notice the stylus peeking out of its cradle there, as well. Pen-based input has always been one of the Note series’ calling cards. I’ll admit I didn’t spend much time with pen input on this version of the Note, in part because I couldn’t read some of the Korean-language software prompts. I’m intrigued by pen input, though, and wish I had something similar to use for taking notes at trade shows.
The Exynos 5433 SoC
We know the major components of the Exynos 5433 SoC thanks to the Note 4’s public specifications, but quite a few of the details are still mysterious. When asked, Samsung’s System LSI confirmed that the chip is manufactured on a 20-nm fabrication process, but the company declined to answer our other questions about this SoC’s specifics. At least this TechInsights teardown gives us a look at the Note 4’s motherboard. Happily, ARM has been fairly forthcoming about some of the major components used in this chip, as well.
Galaxy Note 4
|Manufacturing process||Samsung 20 nm|
|Die size||113 mm²|
|CPU cores||4 Cortex-A57 + 4 Cortex-A53|
|A53 quad die area||4.6 mm²|
|A57 quad die area||15.1 mm²|
|Max core frequency||1.9GHz (A57) / 1.3GHz (A53)|
|System memory||3GB LPDDR3|
|Memory config||2 x 32-bit channels at 825 MHz|
The Exynos 5433 hosts a pair of quad-core CPU clusters, one composed of Cortex-A53 cores and the other of Cortex-A57s. These CPU cores are compatible with the 64-bit ARMv8 instruction set architecture. Samsung has paired them with a 32-bit version of Android, however, so the Note 4 can’t reap all of the benefits of ARM’s new instruction set (which can add up to a ~6% performance improvement independent of other factors). Still, the Note 4 does make use of ARM’s updates to the 32-bit AArch32 instructions, so it shares in some of the improvements, including AES encryption acceleration.
The presence of eight CPU cores may seem like overkill for a phone—and it probably is—but Samsung’s engineers have made the SoC more efficient by implementing ARM’s big.LITTLE scheme for power-efficient performance. To understand how big.LITTLE works, we first need to understand the differences between the CPU cores in question.
The Cortex-A53 is the latest iteration of ARM’s small, ultra-efficient CPU core for application processors, the successor to the Cortex-A5 and A7. This core has a pretty small footprint. Four A53s situated together in a quad occupy about the same die area as a single Cortex-A57.
The A53’s microarchitecture borrows heavily from the Cortex-A7 before it. The A53 can issue two instructions per clock cycle, and instructions execute in program order. The main execution pipeline is just five stages long, while the floating-point/SIMD side has seven stages. ARM thinks the A53 has taken this simple structure about as far as possible in terms of instruction throughput and power efficiency. Thanks to a host of tweaks—including better branch prediction, higher internal throughput, and power reductions that can be converted back into performance—the A53 is over 40% faster than the Cortex-A7, according to ARM’s own estimates. (In fact, ARM tells us the A53 is roughly 15% faster than the mid-sized Cortex-A9 rev4.) Crucially, the Cortex-A53 is fully ARMv8 and 64-bit compliant.
The Cortex-A57, meanwhile, is ARM’s largest core. Derived from the Cortex-A15 used in a number of today’s phones and tablets, the A57 adds ARMv8 support and incorporates a number of changes meant to increase instruction throughput. ARM intends to see the A57 used in servers, not just mobile devices, so it’s pretty beefy. This core can fetch, decode, and dispatch three instructions per clock cycle. The engine gets wider after that, as illustrated in the block diagram above, and it executes instructions out of program order to improve throughput. The A57 is quad-issue into the integer execution units, dual-issue to the floating-point/SIMD units, and dual-issue to the load/store unit. ARM estimates the A57 outperforms the A15 by 25% or better.
A single A57 cluster can host up to four CPU cores, and those cores use a single, shared L2 cache up to 2MB in size.
The idea behind ARM’s big.LITTLE is to extend the dynamic operating range of a chip’s CPU cores beyond what’s possible with a single CPU architecture. big.LITTLE operates in conjunction with traditional SoC power-saving measures. The CPU cores still operate at a range of clock speeds and voltages, depending on how busy they are. The CPU cores can still gate off clock signals to inactive units. Idle CPU cores or clusters can still be powered down temporarily when they’re not needed. The difference with big.LITTLE is that threads can also be shifted from a large core to a small one, or vice versa, depending on which type of core provides the optimal operating point for the thread’s current demands.
For instance, a simple thread that polls a phone’s GPS sensor periodically might never need anything more than a Cortex-A53 in order to do its thing. Running that thread on a small core might be the most energy-efficient arrangement. Meanwhile, a big, branchy thread for rendering a webpage might fare best when shifted to a Cortex-A57 for quick completion. Since both of these core types support the full ARMv8 instruction set, transitions between them should be seamless.
Earlier SoCs have deployed big.LITTLE in relatively simple fashion, swapping threads between a pair of quad-core clusters or migrating directly between big and little cores as needed. More recently, ARM and its partners have moved toward an arrangement known as global task scheduling in order to extract the most efficiency out of big.LITTLE operation. Global task scheduling is a form of asymmetrical multiprocessing in which all cores are active. The OS scheduler—in this case, a modified version of the Android kernel—chooses where to place threads. Newer Exynos SoCs, including the 5433, have been widely reported to use global task scheduling.
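At its heart, the global task scheduling placement decision boils down to matching a thread's tracked load to a core type. Here's a toy sketch of that logic—the threshold and load figures are invented for illustration; the real kernel scheduler tracks per-task load over time and weighs many more factors:

```python
# A toy sketch of the big.LITTLE placement decision under global task
# scheduling: light threads stay on an efficient little core, demanding
# ones migrate to a big core. The 40% threshold is invented for
# illustration, not taken from Samsung's actual kernel.
def pick_cluster(task_load_pct, little_threshold=40):
    """Return the cluster a thread should run on, given its recent load."""
    return "Cortex-A53" if task_load_pct < little_threshold else "Cortex-A57"

print(pick_cluster(5))   # GPS-polling thread -> Cortex-A53
print(pick_cluster(85))  # page-rendering thread -> Cortex-A57
```

Because the kernel sees all eight cores at once, migrations in either direction are just ordinary scheduling decisions rather than wholesale cluster swaps.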
In theory, the most efficient hardware configuration for a mobile device with big.LITTLE would likely involve two big cores and four small ones, for reasons illustrated above. Even relatively intensive workloads like games don’t spend much time executing on the large cores—and the amount of time spent in the highest power state on the big cores is vanishingly small. Two big cores should be more than sufficient to keep performance from dropping in cases where an especially difficult code sequence must be executed. Either the Exynos 5433 was originally conceived with eight cores for use with CPU migration or, perhaps more likely, the octa-core config was chosen for marketing rather than power-performance reasons.
“Eight cores” does have a nice ring to it, I suppose.
One thing to keep in mind as we look at the benchmarks below is that the Note 4’s measured CPU performance should largely be defined by its Cortex-A57s. When you’re running benchmarks that really push the limits, the big cores will be the ones doing the lion’s share of the work. The A53s might chip in a little during multithreaded tests in a global task scheduling scheme, which is an interesting prospect, but they’re not going to be the main attraction.
The Exynos 5433 has another major piece of ARM tech inside: the CoreLink CCI-400 north bridge, which glues together the CPU clusters, the Mali graphics processor, and everything else. ARM’s north bridges support the proper interfaces and provide hardware cache coherency, so they should work seamlessly with big.LITTLE thread migration.
We have several devices of, er, note to compare against the Galaxy Note 4 with the Exynos 5433 SoC. The iPhone 6 Plus only has two cores, but those are relatively fast CPU cores of Apple’s own custom design. Aside from the Note 4, the iPhone 5S, 6, and 6 Plus have the only other ARMv8-compatible cores in this comparison. The LG G3 and OnePlus One are both based on the Qualcomm Snapdragon 801 with quad Krait cores. The version of the Note 4 based on the Snapdragon 805 should be up to 20% faster than these devices. Finally, one of the more intriguing comparisons is Nvidia’s Shield Tablet, which packs four Cortex-A15s at up to 2.2GHz. The Cortex-A15 is the direct architectural predecessor to the Cortex-A57, but aboard the Shield Tablet, those cores are operating within the larger power envelope of an 8″ device.
The Note 4 with Exynos isn’t off to a terribly impressive start in this synthetic test of memory bandwidth. The Exynos 5433 has been reported to have dual-channel 32-bit memory running at 825 MHz, or 1650 MT/s, which would yield 13.2 GB/s of peak bandwidth. The Note 4 doesn’t come close to reaching that mark in Stream’s copy test (and it’s no faster in scale, add, or triad, which I’ve not reported above).
The Note 4’s relatively weak showing could be the result of bumping up against a power management limit—directed tests sometimes do that—or it could be something else. Most modern CPUs max out their memory bandwidth by pairing a relatively large cache with a predictive pre-fetch mechanism that analyzes access patterns and pulls data from memory before it’s needed. The Cortex-A57 cluster in the Note 4 may not be tuned as aggressively as its competition for whatever reason.
Geekbench runs natively on both iOS and Android, and it offers us a look at both single- and multi-threaded performance. You can click on the buttons below to toggle between the two sets of results.
The Note 4 performs well in the single-threaded tests, turning in the best score overall among the Android-based devices. Only Apple’s ARMv8 custom cores are faster—and that depends on the test. In integer math, the Note 4 outperforms even the Cyclone core in the iPhone 5S. Only the iPhone 6 and 6 Plus are faster.
Interestingly, in these same integer tests, the quad Cortex-A57s in the Exynos 5433 substantially outperform the four Cortex-A15s in the Shield Tablet’s Tegra K1 SoC, despite spotting them a 300MHz clock speed advantage. The gap closes almost entirely in the floating-point math tests, though.
Switch over to the multithreaded results, and the Exynos 5433 looks even stronger. The Note 4 takes the top spot in Geekbench overall and outright dominates the multi-core integer scores. With only two cores on tap, the 6-series iPhones fall well behind.
The AES encryption test illustrates the impact of tailored acceleration instructions built into the ARMv8 instruction set. The Exynos 5433 benefits from these instructions, as do Apple’s newer custom cores.
Since the Exynos 5433 uses the global task scheduling version of big.LITTLE, we might be able to find some evidence of its four A53 cores assisting the four A57s. One indicator would be cases where the Note 4’s multithreaded performance is more than four times its single-threaded performance.
|Geekbench sub-test||Single-threaded||Multithreaded||Scaling|
|Geekbench floating point||1019||3841||3.8x|
|Geekbench AES encryption||818||4069||5.0x|
If the Note 4 is using more than four cores, the effect is fairly subtle. Only in certain sub-tests, like AES encryption, does the Note 4 achieve a speed-up of more than 4x with multiple threads. Hmm.
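The scaling column in the table above is simply the ratio of the multithreaded score to the single-threaded one—a quick sanity check:

```python
# Multithreaded scaling from the Geekbench scores in the table above.
def speedup(single_thread_score, multi_thread_score):
    """Ratio of multithreaded to single-threaded score."""
    return multi_thread_score / single_thread_score

print(round(speedup(818, 4069), 1))   # AES encryption    -> 5.0
print(round(speedup(1019, 3841), 1))  # floating point    -> 3.8
```

Anything meaningfully above 4.0x on a four-big-core chip hints that the little cores, caching effects, or both are contributing.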
We might get some more insight by comparing Note 4’s scaling to that of the quad-core Shield Tablet. Here are the overall results along with some hand-picked sub-tests that show the best scaling from one to many threads.
|Geekbench sub-test||Single-threaded||Multithreaded||Scaling|
|Shield Tablet integer overall||1197||4303||3.6x|
|Note 4 integer overall||1573||5935||3.8x|
|Shield Tablet AES encryption||67||305||4.6x|
|Note 4 AES encryption||818||4069||5.0x|
|Shield Tablet SHA1 encryption||2108||8259||3.9x|
|Note 4 SHA1 encryption||4393||21792||5.0x|
|Shield Tablet JPEG compress||1322||5384||4.1x|
|Note 4 JPEG compress||1383||5866||4.2x|
The Note 4 does scale slightly better than the Shield Tablet in Geekbench’s overall index, but both are less than four times as fast. In certain sub-tests, the Note’s multithreaded performance improves by more than 4x, but the Shield Tablet’s does, too. Some other factor, like good locality in the tests’ access patterns keeping the caches warm, could account for scaling beyond 4x.
If the Cortex-A53s in the Exynos 5433 are contributing to higher overall performance, it’s awfully hard to tell by looking at these benchmark scores. The reality may be that the Exynos 5433 and its quad A57 cores are too power-constrained to allow those Cortex-A53s any meaningful thermal headroom. Devoting all of the juice to the A57s instead may be the best use of the SoC’s power budget, anyhow.
Notice that the two top spots in SunSpider and Kraken are occupied by desktop CPUs. I sneaked those in just for fun. They have the benefit of much larger power envelopes.
BaseMark OS II
The rest of the CPU benchmark results are strangely inconclusive, in a way. The Note 4 easily outperforms the Snapdragon 801-based LG G3 in Basemark, just like it does in Geekbench, but the balance changes in WebXPRT. There’s no question the Exynos 5433 is a solid performer overall, but its position relative to the competition depends quite a bit on the workload in question.
That said, the contest with the iPhone 6 Plus looks to be pretty clear-cut. The iPhone is at or near the top of the stack in nearly every case. Apple’s custom CPU core still gives it an edge, even though the Cortex-A57 helps to close the gap.
The Mali-T760 GPU in the Exynos 5433 is a full-featured graphics processor based on ARM’s Midgard architecture. The Note 4 has given us our first chance to spend significant time with a Mali-equipped device, and our impressions are generally quite positive.
|Samsung Galaxy Note 4|
|GPU||ARM Mali-T760 MP6|
|GPU die area||30.9 mm²|
|Est. clock speed||700 MHz|
|Texture filtering||6 texels/clock|
|Pixel fill||6 pixels/clock|
|System memory||3GB LPDDR3|
Midgard is a bit of an unconventional GPU architecture in terms of its execution model, but it has a robust feature set, especially for the mobile space. The Mali-T760 supports 64-bit addressing and can handle IEEE-compliant floating-point datatypes, including 64-bit double precision. In fact, the Mali-T760 has a nearly desktop-class feature set, with support for DirectX 11.1 (feature level 11_1, or the real thing), OpenGL ES 3.1, and OpenCL 1.2.
Thanks to this combination of high mathematical precision and standards compliance, the T760 is perhaps better suited for GPU computing than most of its competition in the mobile GPU market. ARM was one of the first companies to join the HSA consortium, along with AMD, and has publicly supported that effort.
I’m tempted to call the Mali-T760 a “tiler,” but I expect there are folks in the PowerVR camp who would take umbrage with that wording. ARM’s Midgard architecture doesn’t use fully deferred tile-based rendering like Imagination Tech’s PowerVR GPUs. Midgard uses early Z detection like conventional immediate-mode renderers in order to avoid drawing some pixels that would be occluded by other polygons in the final scene, but it doesn’t reorder the graphics pipeline in order to eliminate overdraw entirely. Instead, Midgard renders all pixels into on-chip buffers representing 16 x 16-pixel tiles, so blending and overdraw happens on the chip. Midgard can conserve bandwidth and save energy by using a tile buffer in this fashion, since DRAM transactions tend to burn a lot of power.
Divining the structure of Midgard’s “shader cores” can be a little confusing. ARM’s public documentation appears to divvy things up according to the number of flops the hardware can process rather than its likely underlying organization. Each Midgard “shader core” has two arithmetic pipelines. My sense is that those pipelines break down into two stages: the first stage includes a 128-bit-wide vector unit plus a scalar ALU. The second stage has a special function unit capable of a four-wide vector dot product, and there’s a scalar ALU in this stage, too.
Midgard’s shader pipelines are unusually flexible in their support for different datatypes. That 128-bit-wide vector unit can be subdivided in various ways. A single vector ALU can process two 64-bit operations, four 32-bit ops, or eight 16-bit operations in a clock cycle.
If you’re counting flops at home, the most relevant tally involves 32-bit operations. That big vector unit in the first stage can process four multiply + add operations (eight flops) and the scalar ALU can contribute a multiply (one flop). Stage two’s SFU then contributes seven more flops, and the scalar ALU adds another, for a total of 17 from the pipeline in each clock cycle. Thanks to the dual pipes, that’s a total of 34 flops per cycle from each “shader core.”
ARM offers versions of the T760 that scale up to 16 shader cores, as depicted in the diagram above. The Exynos 5433 hosts a six-cluster version of the GPU known as the Mali-T760 MP6. Reports have suggested this GPU runs at 700MHz in the Note 4.
If that clock speed is correct, then on paper, the Note 4’s key graphics rates should look awfully similar to those of the PowerVR GX6450 in the iPhone 6 Plus. The two would have roughly similar pixel fill (~4 gigapixels/second) and bilinear texture filtering (~4 gigatexels/second) rates, and both devices’ fp32 arithmetic throughput should peak at about 140 gigaflops. I hesitate to put those numbers into a table, since they’re both theoretical and speculative in nature, but the story here should be rough parity.
Of course, theoretical peaks are one thing, and delivered performance is another. We know the PowerVR GPUs tend to be very efficient in real applications with their given resources. The Midgard architecture also contains some provisions meant to boost efficiency, including a nifty transaction elimination feature that only updates the contents of framebuffer pixel blocks when they have changed from the prior frame.
The story of rough parity we outlined above plays out as expected in the fill rate test, but the Note 4 trails the 6 Plus by quite a bit in GFXBench’s ALU test. I’m not sure exactly what this directed ALU test does or whether it takes advantage of the dot-product unit in the Midgard shader core. I wouldn’t read too much into these results, though.
The Note 4 also trails the 6 Plus in this graphics-specific test of alpha blending capacity. Then again, so does the Tegra K1 chip in the Shield Tablet, and we know it’s one of the faster mobile GPUs on the planet.
As I understand it, this benchmark attempts to measure driver overhead by issuing a draw call, changing state, and doing it again, over and over. Performance in this test may end up being gated by CPU throughput as much as anything else.
All three of these tests are rendered off-screen at a common resolution, so they’re our best bet for cross-device GPU comparisons. They’re also more complete benchmarks than the directed tests above, since they involve rendering a real scene that could plausibly come from a mobile 3D game. The older iPhones can’t run GFXBench’s “Manhattan” test, because it requires OpenGL ES 3.0 compliance.
The iPhone 6 Plus leads all of the phones in the two GFXBench benchmark scenes, while the Note 4 trades blows with the LG G3 and the OnePlus One, both of which are based on Qualcomm’s Snapdragon 801 SoC with Adreno graphics. The Note 4 with Exynos comes roaring back to take the lead in 3DMark’s Ice Storm test. Only the Shield Tablet, which has a larger power envelope, is faster.
Native device resolution gaming
The Note 4 has a higher display resolution than most of its competitors, including the iPhone 6 Plus, so it has to work harder to paint all of those pixels. As a result, the Note 4 ends up shadowing the LG G3 in these native-resolution gaming tests. Like the Note 4, the G3 has a 2560×1440 display.
The iOS version of Basemark X runs on-screen and off-screen tests and then spits out a single, composite score. I wish we could break out the component tests, especially since this benchmark walks through a nice-looking scene rendered using the popular Unity game engine. As you can see, the Note 4’s Mali GPU fares well in this benchmark.
One other feature of Basemark X is an intriguing quantitative test of graphics image quality.
Real-time graphics is strange because there’s not always one “right” answer about the color of a rendered pixel. Filtering methods and degrees of mathematical precision vary, since GPU makers take different shortcuts to trade off image quality against performance.
Basemark X attempts to quantify the fidelity of a GPU’s output by comparing it to some ideal—in this case, I believe the reference renderer is a desktop-class graphics chip. That’s a fair standard, since desktop chips these days produce something close to ideal imagery. The higher the signal-to-noise ratio reported, the closer these mobile GPUs come to matching the reference image. (Frustratingly, a couple of the devices refused to run the quality test with “out of memory” errors.)
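Basemark doesn't document its exact metric, but the basic idea of a signal-to-noise comparison against a reference render works roughly like this—a sketch, not Basemark's actual implementation:

```python
import math

# PSNR-style image comparison: the closer the rendered output tracks the
# reference, the smaller the mean squared error and the higher the dB score.
def psnr_db(reference, rendered, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel lists."""
    mse = sum((r - p) ** 2 for r, p in zip(reference, rendered)) / len(reference)
    if mse == 0:
        return math.inf  # identical images: infinite SNR
    return 20 * math.log10(max_value / math.sqrt(mse))

# Every pixel off by one gray level against an 8-bit reference:
print(round(psnr_db([128] * 16, [129] * 16), 1))  # -> 48.1
```

Real comparisons would operate on full RGB framebuffers, but the principle is the same: higher dB means the mobile GPU's shortcuts cost less fidelity.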
The Mali-T760 fares well in these tests, scoring better than the rest of the mobile GPUs with the exception of the desktop-derived Tegra K1. This solid showing is likely a consequence of ARM’s decision to build in support for IEEE standard datatypes. It’s also an indicator that ARM hasn’t cut too many corners on things like texture filtering techniques in order to boost performance.
To give you a further sense of the Mali-T760’s image quality, I’ve embedded screenshots below from the Note 4 with Exynos and the Nexus 7, which has a Qualcomm Adreno 320 GPU. Move your mouse over each image to see a close-up version of it, or click to open the full-resolution screenshot.
There are more pixels in the Note 4’s image thanks to the device’s higher display resolution, but otherwise, I think the output produced by these two GPUs is fairly comparable.
Especially in the brightest areas of the image near the upper middle of the screen, the Note 4’s imagery tends to look softer. I believe the Mali GPU is applying a post-process bloom filter that the Adreno GPU in the Nexus 7 isn’t using—likely because this application knows the Mali GPU is more capable.
In terms of both image quality and performance, the T760 appears to be a credible alternative to the competition from Qualcomm and Imagination Tech.
In several ways, the Mali-T760 MP6 built into the Exynos 5433 isn’t as efficient as the PowerVR GPU in the iPhones. Although the Note 4 and 6 Plus would seem to be equals on paper, the iPhone generally outperforms the Note 4 in offscreen tests at the same resolution. Meanwhile, the PowerVR GPU on Apple’s A8 SoC occupies about two-thirds of the die space that the Mali-T760 MP6 does in the Exynos 5433. (We’ll consider the question of power efficiency momentarily in our battery life tests.) The counterbalance here is that ARM’s Midgard architecture offers full IEEE compliance. Midgard produces measurably higher fidelity images and is better suited for use in GPU computing applications.
This version of the Note 4 ships with 32GB of onboard MLC flash memory. About 25GB of that space is available after Samsung installs the operating system and its own suite of software.
The Note 4 writes data into its flash storage array nearly as quickly as it reads it, according to these simple tests. As a result, the Note 4’s storage subsystem is one of the fastest ones we’ve tested overall. Still, its 204MB/s average read speed isn’t anything to write home about.
We tested battery life in four different scenarios. In each case, the phones’ display brightness was set to 180 cd/m² before starting, and display auto-brightness features were disabled. Our workload for the web surfing tests was TR Browserbench. The video test involved looped playback of a 1080p video recorded on one of the phones, and our gaming workload was the Unreal Engine-based Epic Citadel demo.
I’d say the iPhone 6 Plus has a slight edge in our battery life results, overall, versus the Note 4. Both of these “phablets” have exceptionally long run times, though.
Keep in mind that the 6 Plus has a 2915-mAh battery, while the Note 4’s battery capacity is a tad higher, at 3220 mAh. Then again, the Note 4 has to drive a slightly larger display with substantially more pixels. The fact that these two devices are so closely matched in our run time tests speaks well of the Exynos 5433 SoC.
The fact that the Note 4 reaches nearly six hours of run time in our gaming test, with Epic Citadel, is an indicator that its Mali-T760 GPU is reasonably power-efficient, too.
The display on the Note 4 will change your view of what’s possible. That’s probably the most succinct way I can sum it up. Samsung’s AMOLED technology is heart-stoppingly bright and crisp, with color and clarity that you may not find anywhere else. Periodically throughout my time with the Note 4, I’d just spend a moment staring at the display in stupefied wonder. I’m constantly surrounded by outstanding desktop displays with IPS panels, but the Note 4 triggers a response in me like almost nothing else.
Galaxy Note 4
|Display type||Super AMOLED|
|Other||S Pen support|
I say “almost” only because the Note 4’s top competitor, the iPhone 6 Plus, also has a display that’s edge-of-the-bell-curve excellent. The two are natural rivals, although they’re strong in different ways.
Like many of Samsung’s past OLED panels, the Note 4’s display does not use a conventional RGB subpixel arrangement. Instead, the OLED panel has twice as many green subpixels as it does red and blue. When discussing Samsung’s older OLEDs that use PenTile subpixel layouts, I might have complained about the fact that the display doesn’t really offer its “true” advertised resolution. I might have worried about the difficulty of making vertical or horizontal edges look perfectly straight. Those are criticisms one could offer about this panel, too, and they might not be wrong.
But those objections will melt away as soon as your rods and cones feast on the reality of this display. The effective resolution is razor-sharp, clearly higher than the iPhone 6 Plus’s. I’ve taken some close-up screenshots of the two displays side by side that you can look at below. These shots frankly show more detail than you’re likely to notice with the naked eye, even if you’re just a handful of inches from the screen.
You can move your mouse over the thumbnail to see a pop-up window with a close-up photograph. Mobile users, just tap the thumbnail to load the full image and pinch-zoom to your heart’s content.
According to DisplayMate’s informative analysis, the Note 4 has several user-selectable modes for display calibration and color gamut targets. By default, the Note 4 ships in Adaptive mode, which offers a wide color gamut. Images look vibrant and colors heavily saturated in this mode, a trait that has kind of become a hallmark of Samsung’s consumer devices. Virtually no content is authored to take advantage of this color mode, so it’s not exactly true to life. The Note also offers a Photo mode calibrated to the Adobe RGB standard and a Basic mode meant to match the widely used sRGB standard. To my eyes, the Basic mode’s colors look most correct—and therefore pleasing—most of the time.
We tested the Note 4 in both Adaptive and Basic modes to generate the results below.
I’d like to show you contrast ratios, but the pixels of the Note 4’s AMOLED display only glow when they need to, so black images on the Note 4 are perfectly black. That leads to a contrast ratio of DIVIDE BY ZERO ERROR. Whoops.
Still, one can see that the Note 4’s combination of white and black levels offers exceptional contrast and “perfect” blacks. None of the competing displays based on IPS LCD technology can replicate that latter trait. That said, the iPhone 6 and 6 Plus offer nearly comparable perceived contrast by combining reasonably dark black levels with substantially brighter white levels than the Note 4.
In Adaptive mode, the Note 4’s OLED is capable of displaying a broader spectrum of colors than anything else we’ve measured. When content authoring standards catch up, OLEDs like this one should allow higher-fidelity color reproduction all around. Switch the Note 4 into Basic mode, and it constrains itself almost perfectly to the sRGB color gamut.
The Note 4’s excellence continues in our color tests. In Basic mode, the display tracks closely with the 6500K white point standard, and its overall color accuracy rivals that of the two new iPhones. The Adaptive mode’s gamut-stretching hyper-saturation is reflected in higher levels of delta-E, especially in the primary and secondary colors. Even that outcome is the result of an intentional choice by Samsung. Still, I wish Adaptive mode wasn’t the default setting for the device.
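For readers unfamiliar with delta-E: it's a measure of the distance between a measured color and its target in the CIELAB color space, where a value near 1 is roughly the threshold of a just-noticeable difference. Published display measurements typically use newer formulas such as CIEDE2000; the sketch below uses the original CIE76 variant, and its Lab values are hypothetical examples rather than anything we measured:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 delta-E: Euclidean distance between two CIELAB colors.

    lab1 and lab2 are (L*, a*, b*) triples. Larger values mean the
    two colors are further apart perceptually.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical target vs. measured values for a saturated red patch:
target = (53.2, 80.1, 67.2)    # the red the standard calls for
measured = (55.0, 88.0, 70.0)  # an oversaturated rendering of it
print(round(delta_e_cie76(target, measured), 1))  # -> 8.6
```

An oversaturated primary lands far from its target in Lab space, which is exactly why a gamut-stretching mode like Adaptive posts higher delta-E numbers even though the shift is deliberate.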
The Note 4 would have a clear lead over the iPhone 6 Plus in the display department were it not for a single obvious weakness: color shift at indirect viewing angles. I first noticed this quirk of the Note 4’s display while reading a book on the Kindle app at the dinner table. The Note 4 was sitting flat on the table next to me, and the white background had taken on a blue-green cast. I think I’ve captured the shift in the images above. The top image shows that the 6 Plus and Note 4 are comparable when viewed straight on. As you can see in the image on the lower left, the angle doesn’t have to be terribly extreme for the Note 4’s colors to begin to shift. The display on the 6 Plus exhibits almost no color shift at these angles thanks to the dual-domain pixels on its IPS panel.
This version of the Note 4 isn’t sold here, and the operation of its camera may be substantially different than what you’d find in the U.S. version. As a result, our coverage of the camera won’t be terribly extensive. Still, the results we were able to get when shooting with this device are certainly worth sharing.
|Samsung Galaxy Note 4|
|Primary camera resolution||16 megapixels|
|Front-facing camera resolution||3.7 megapixels (2560×1440)|
|Max video resolution||UHD 4K (3840×2160) at 30 FPS|
The Note 4’s primary camera has double the pixel count of the cameras on the latest iPhones. Chasing more megapixels isn’t always the best choice, but the Note 4 can sometimes capture more detail than the iPhones, as we’ll see in at least one sample shot. More notably, this version of the Note 4 has optical image stabilization, which should make it easier to achieve sharp focus in low-light shooting.
Also, several of Samsung’s phone cameras have included an advanced autofocus technique known as phase detection. Imported from the world of DSLR cameras, this mechanism can greatly enhance the camera’s reflexes, cutting the time required to bring a subject into focus. Apple adopted this method in the new iPhones to good effect. Samsung Korea’s spec sheets aren’t terribly helpful on this front, but I believe this Note 4 may be blessed with phase-detection autofocus, too.
What I can tell you is that the camera in the Note 4 with Exynos rivals the one in the iPhone 6 Plus as one of the best we’ve seen in any mobile device. Focus times are exceptionally quick, as is the overall process of capturing images. The dual image signal processors in the Exynos 5433 avoid injecting any tangible delays into normal shooting.
The sample shots
I’m reasonably satisfied that the shots below do most of the work we need, but this setup does have some limitations. For one, I tried to select the best exposures from each camera for the samples below, and I used burst mode to shoot them when possible. As a result, you can’t know from looking at the examples how much easier it was to get clear, sharply focused shots on the phones with optical IS. One can usually achieve similar results on the iPhone 6 without optical stabilization, but not nearly as consistently.
To get a sense of what the Note 4’s 16-megapixel sensor buys you in terms of clarity, have a look at the satellite dish on the left of the scene, attached to my neighbors’ house. You can’t read the text in the DirecTV logo in the shot from the iPhone 6 Plus, but on the Note 4, it’s clear as day.
That said, the Note 4’s image does have its weaknesses. I was surprised to notice that some colors look a little washed out, most obviously the blue of the sky and the putty-colored paint on the house. Also, Samsung’s default sharpening routines are very aggressive. The heavy sharpening produces images that really pop on the Note 4’s amazing display, but when you zoom in and look at the pictures closely, the resulting artifacts aren’t hard to spot. Check out how extensively the individual blades of grass and the tree branches have been sharpened on the Note 4. Personally, I prefer the less intrusive approach Apple has taken on this front.
Sample scene: Santa
This is my favorite sample scene, since it’s a challenging low-light setting with a complex subject. In this case, even with twice as many megapixels and heavier post-process sharpening, I don’t think the Note 4’s results look as clear as those from the iPhone 6 Plus. The image from the 6 Plus does seem to have a little more of the noise that comes with shooting in high-ISO modes, though.
Be aware that I’m splitting hairs. These two premium phablets both take fantastic pictures, and they seriously outclass cheaper contenders like the OnePlus One.
Sample scene: Edinburgh
One other thing you’ll notice if you click on the sample shots: the Note 4’s pictures aren’t the conventional shape. Samsung has chosen a wider 16:9 aspect ratio for still photos on this product, just as with video. I suppose that’s heresy, but I kind of like it.
The Note 4 will shoot video at resolutions up to 4K (3840×2160) at 30 FPS, and the Exynos 5433 supports H.265 compression to facilitate 4K capture. That resolution is easily a step above the 6 Plus, which tops out at 1080p. Unfortunately, I didn’t find any slow-motion video modes in the Note 4 to rival the 240 FPS capture capability Apple has implemented.
Although we didn’t shoot in 4K, we do have some examples of recorded video from each phone below. YouTube’s compression has done some violence to the quality of the video produced by each device, but these examples should serve to illustrate a couple of things.
iPhone 6 Plus:
Even through all of the compression, I think the individual frames of video look sharper on the Note 4 than on the 6 Plus. However, for this particular scenario where the phone is handheld and I’m walking along, Apple’s algorithmic video stabilization is more effective than lens-based optical image stabilization alone.
Even though this version of the Note 4 isn’t available here, the total package is good enough to make it worth our time. The Exynos 5433 chip is particularly intriguing—especially now that Qualcomm has lost a major design win for the Snapdragon 810. (It’s widely expected that Samsung is the customer Qualcomm lost.) One way or another, the Cortex-A57 and A53 pairings running in a big.LITTLE configuration will be common in 2015’s premium Android devices.
With its quad Cortex-A57 cores doing the heavy lifting, the Exynos 5433 posted the top score of any mobile SoC we tested in Geekbench’s multithreaded performance index. Most phone applications are limited by the performance of one or two threads, and in those cases, the A57 doesn’t quite keep pace with Apple’s beefy custom CPU core. The Cortex-A57 does close the gap with Apple substantially, though, while adding 64-bit compatibility.
Our first look at Mali graphics left us with a positive impression, too. The Mali-T760 MP6 GPU in the Exynos 5433 offers decent performance and better-than-average image quality for the mobile market. Thanks to its robust support of high-precision datatypes, the Mali Midgard architecture has the potential to encourage wider use of GPU computing in mobile devices, as well. As we’ve noted, however, this version of Mali doesn’t appear to be as efficient as the PowerVR GPU in the iPhone 6 Plus. The T760 doesn’t extract as much delivered performance from its theoretical peak specs or from its die area.
Speaking of efficiency, our battery life results suggest that both the Cortex A57/A53 big.LITTLE CPU config and the Mali-T760 MP6 graphics processor in the Exynos 5433 are reasonably power efficient when paired with Samsung’s 20-nm fab process. At the very least, Samsung has taken the appropriate measures to ensure nice, long battery run times from the Note 4 in the scenarios we tested.
Aside from the whole “you can’t buy it here” thing and some Korean-only menus, my main beef with this version of the Note 4 is, perhaps predictably, the clumsy TouchWiz skin that Samsung drapes over the lithe body of the stock Android user interface. TouchWiz and I are sworn enemies going way back to the Gingerbread days. I was able to replace some of the TouchWiz elements by installing Google alternatives from the Play Store, but there are definite limits to that approach.
You know what’s impressive about the Note 4? The hardware is good enough to tempt me to relax my stance on the TouchWiz issue. I’m not talking just about the Exynos SoC but the whole shebang—the display, the enclosure, the display, the camera… and did I mention the display? The Note 4 is fast enough that it doesn’t feel particularly slow in absolute terms, even when saddled with TouchWiz. But it does feel slower than it would with stock KitKat, given its hardware spec and measured performance.
Don’t tell anyone, but I would probably do terribly questionable things in order to get a version of the Note 4 with stock Android Lollipop pre-installed.
TouchWiz fanboys can vent their rage at me on Twitter.