Medfield performance, power numbers leak out

New details have leaked out on Medfield, the 32-nm Atom system-on-chip Intel is targeting at tablets. VR-Zone has the scoop on the upcoming processor’s development platform, which is said to feature a 1.6GHz CPU backed by 1GB of DDR2 RAM. The system sports a 10.1" screen with a 1280×800 display resolution and runs the Honeycomb variant of Android, though Ice Cream Sandwich will purportedly be the OS of choice for shipping products. Intel worked with Google to add x86 optimizations to the latest version of the tablet operating system.

The VR-Zone story also has performance scores for Caffeinemark 3, a browser-based Java benchmark:

Intel Medfield 1.6GHz currently scores around 10,500 in Caffeinemark 3. For comparison, NVIDIA Tegra 2 scores around 7500, while Qualcomm Snapdragon MSM8260 scores 8000. Samsung Exynos is the current king of the crop, scoring 8500.

Alas, the site doesn’t provide numbers for Nvidia’s new Tegra 3 processor. There’s no sense of how well the Intel chip’s graphics component, long a weak point of the Atom, compares to the integrated GPUs offered by the competition. VR-Zone does have a sense of Medfield’s power consumption, though. The prototype reportedly consumes 2.6-3.6W, somewhat above Intel’s 2-2.6W target for the finished product.

While it’s highly unlikely that Medfield-based systems will break Apple’s dominance of the tablet market, it will be interesting to see whether the processor proves more popular in slates than previous Atoms. I’m intrigued to see how well Medfield runs Windows 8, which seems perfectly suited to take advantage of the chip’s x86 architecture and tablet focus. The ARM-tailored version of Windows 8 is looking quite limited, but one will presumably be free to run the full-fat flavor of the OS on x86-equipped Atom tablets.

Comments closed
    • Chrispy_
    • 8 years ago

    I am no expert when it comes to battery tech, but 3.6W of power draw is pretty high when you consider that the largest smartphone batteries to date are around 5 or 6 Wh. Effectively, we’re looking at a phone with this tech being stone-dead outta juice after watching one film. I don’t know how much bigger tablet batteries are (4x larger, perhaps?), but I imagine the much larger screens negate a lot of the extra battery capacity.

    Someone who already knows help me out here please 🙂
    Googling for battery specs is high on my list of boring things to avoid doing.
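
    For rough numbers, battery life is simply capacity in watt-hours divided by average draw in watts. Here is a minimal back-of-the-envelope sketch using the figures floating around this thread and the article; the ~5-6 Wh phone and ~25 Wh tablet batteries and the 2.6-3.6W platform draw are rough, unofficial numbers, not specs:

        # Rough battery-life estimate: hours = capacity (Wh) / average draw (W)
        def runtime_hours(capacity_wh, draw_w):
            return capacity_wh / draw_w

        phone_wh = 5.5      # "around 5 or 6 Wh" smartphone battery
        tablet_wh = 25.0    # iPad-class tablet battery
        for draw_w in (2.6, 3.6):   # Medfield prototype platform figures (idle, load)
            print(f"{draw_w} W: phone ~{runtime_hours(phone_wh, draw_w):.1f} h, "
                  f"tablet ~{runtime_hours(tablet_wh, draw_w):.1f} h")

    At those platform-level draws, a phone-sized battery lasts roughly 1.5-2 hours while an iPad-class battery lands in the 7-10 hour range, which matches the intuition above.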

      • sschaem
      • 8 years ago

      iPad: 25 Wh; iPhone: 5 Wh.

      Both the iPad 2 and iPhone 4S use the same SoC.

      The current Medfield 10″ tablet prototype seems to manage about 6 hours of Flash video playback, but that could jump to ~10 hours by release time.
      That is roughly what Apple’s A5 SoC delivers,
      and in general you can expect the same tablet-to-phone scaling for Medfield as for the A5.

      Not to forget: both the A5 and Medfield use dedicated logic (not the CPU) to decode video, so ARM vs. x86 is moot there.
      The same is true for audio playback; that's how even previous Atom SoCs could play MP3s continuously for 48 hours on a single charge.

      The true test will be web surfing and apps.

    • chuckula
    • 8 years ago

    To all the ARM fanboys who were quoting Semi-Accurate as the be-all end-all source of information showing that Intel will never, ever, be able to compete with ARM, here’s a little quote published today:

    "At the same time performance figures for the complete Medfield platform have started to appear. These figures show a complete platform power consumption of 3.6W under full load and 2.6W when idling, more on this when the embargo lifts. These figures are comparable to chips using the ARM design, but it is worth noting that the ARM chips already have been on the market for about one year, whereas the Intel offering is brand new. Nevertheless it seems that Intel may finally have a chance in the tablet market. How Medfield will perform in real life will greatly depend on how good a job Intel does of optimizing the latest version of Android for the Atom processor, but if I was holding shares in ARM then I would start feeling a little uneasy."

    http://semiaccurate.com/2011/12/28/intel-releases-cedar-trail/

    I'm not saying that Semi-Accurate is necessarily trustworthy (being either pro- or anti-Intel or anyone else), but this doesn't seem to jibe with the ARM propaganda we've been hearing recently.

    • Hattig
    • 8 years ago

    Is this single core? Hyperthreading? 64-bit?

    It will be interesting to see this go up against the 32nm/28nm ARM SoCs in 2012. ARM will have quad-core designs on its side, and/or higher clock speeds than currently (1.8GHz Exynos, 1.3GHz Tegra 3, etc.) to make up for the per-core performance disadvantage. It looks like it will have power consumption on its side too. However it will, in the main, still be Cortex A9, as A15 won’t be appearing until late in the year if we’re lucky.

    It remains to be seen how the graphics and video capabilities of Medfield compare with the next generation graphics on the ARM side, e.g., ARM Mali T604, PowerVR SGX 600, etc. The point of a SoC is that it is the sum of its components (something the Phoronix article completely avoided) that makes it desirable to have in a device.

    • bcronce
    • 8 years ago

    If it’s good for multimedia on a tablet, it’s good for HTPCs. I approve extra competition in this area.

    • jdaven
    • 8 years ago

    If Intel is barely going to be able to fit Medfield in the design parameters of tablets by the end of 2012, how will they deliver on their promise to bring Medfield to smartphones by year’s end?

      • Farting Bob
      • 8 years ago

      I agree. It would have to be downclocked so much to achieve the kind of power consumption used by current (and near-future) phones that it will be too slow. x86 is simply not the best design for such low-power devices, and if they want it to run any x86 program, there is a limit to how much they can shave off while still keeping it functional. If Intel licensed ARM and used its big advantage in process manufacturing, it would be a beast, but I can't see that ever happening. x86 or nothing, it would seem.

        • NeelyCam
        • 8 years ago

        This "x86 is horribly inefficient" myth is still being perpetuated without a shred of proof. ARM will have trouble scaling up in performance without giving up efficiency; once it's at the same performance level as x86, its power efficiency will be surprisingly similar to x86 (at which point Intel's process advantage will make x86 the winner).

        Check out how badly ARM scales in power efficiency:

        http://www.anandtech.com/show/4991/arms-cortex-a7-bringing-cheaper-dualcore-more-power-efficient-highend-devices/2

        In the last figure, "Rich Web Services", A15 is only about 10% more power efficient than A9, and that's with the benefit of a 40nm->28nm shrink. It should've been at least 30% more efficient from the shrink alone. That extra performance from architectural changes comes with a major power efficiency hit.

          • jdaven
          • 8 years ago

          How can the reality that no one has a shipping, low-enough-power x86 chip for smartphones/tablets not be considered proof? That IS the proof. Or are you saying that Intel/AMD/VIA have just chosen not to release a sub-2W chip? And they will continue to choose not to ship a sub-2W chip just for s**ts and giggles until, well, they do. I don't think so.

          x86 chips were not designed for these power envelopes. Regardless of whether it is the x86 instructions or some other reason, the FACT of the matter is that no x86 chip ships with low enough power to go into a smartphone or the vast majority of the tablets out there. If THAT were not true, then what are we talking about?

          From my research, I found that x86 chips need fundamental changes at the architectural level to achieve the power envelopes that ARM chips hit at a given process node. Intel will only finally have a competing chip at the 32 nm node, whereas ARM licensees have been shipping chips for god knows how many years on much larger nodes.

          Atom is a stripped-down, low-core-count, low-frequency, in-order architecture that to this day cannot be found in any shipping smartphone and in very few tablets, if any. On top of that, Intel needs to add a GPU, memory controller, I/O hub, etc. to make it a SoC. They think they finally have it at 32 nm by the end of 2012. We'll see.

            • NeelyCam
            • 8 years ago

            Could you give me a link to your “research”?

            Note that ARM needs fundamental changes on the architectural level to achieve the performance envelopes that x86 chips have at a given process node. Guess what happens to that famed power efficiency then…?

            People learn to want more performance when more performance is available. On a daily basis I’m annoyed with how slow my dual-core A9 based phone is when I’m web browsing.

            • jdaven
            • 8 years ago

            We both have to give proof of our claims. You can’t dismiss the “x86 power myth” just because you say it is a myth. You have to send me a link where the claim is proven false. Likewise I need to send you a link which I did some months ago:

            http://semiaccurate.com/2011/02/15/atom-dead-strangled-slowly-intel/

            "Medfield will raise the bar, it’s just that no one will care. Intel could come out with the greatest silicon ever, but it will have the one stigma that the company can not overcome, x86. At this point, Intel’s greatest strength is now it’s greatest weakness. All x86 brings to phones is a decode power penalty and a lack of software, no one wants an x86 phone any more. Even Microsoft has cast their mobile lot in with ARM, something quite unthinkable a year ago."

            Believe Charlie or not, but someone else out there agrees with me. But let's think about this logically. Intel's x86 architecture (case in point: Atom) has, let's say, two times the IPC of the best ARM chip. So Intel can downclock the chip quite a bit and still be competitive from a performance perspective. To compete against a 1 GHz ARM, Intel can release a 500 MHz Atom. That should work, right? Because if a 1.6 GHz dual-core Atom draws around 2-3W, then a 500 MHz part should draw only a third of that. That would be right if power scaled linearly with just clock speed, but it doesn't. I'm willing to bet that after a certain point, lower clocks yield no more power reductions. This leaves the architecture as the bottleneck. Otherwise, Intel would have released a 500 MHz Atom already and started raking in the smartphone money.

            • NeelyCam
            • 8 years ago

            So, your “research” is based on reading a 20-page CharLIE rant? Just for your convenience, let me summarize that rant for you:

            * I’m Charlie. I knew that Penwell is a Nokia-designed Intel chip before Intel did. I’m awesome.
            * Somebody told me over beers that x86 ‘has baggage’. I will spread it to my loyal disciples as gospel, but I won’t offer them beer.
            * Intel’s manufacturing is great and IvyBridge has in-package LPDDR2 memory for graphics. Trust me – my sources know their sh*t.
            * Sandy Bridge is the biggest disappointment of the year. Why? Because 99% of people use Linux.
            * Intel IGP sucks. Thus, Medfield IGP sucks.
            * Medfield drivers are crap. Why? Because 965G drivers were crap, and I can’t get my SB rig to work.
            * Atom is too slow to play with the ‘big boys’. Brazos is better than Atom. Thus, Medfield sucks.

            Half the article is Charlie whining about the lack of Linux graphics drivers on older Atoms, and somehow concluding that all Atoms are dead forever. The only comment about x86 vs. ARM was that unsubstantiated, unreferenced quip you quoted – no link or logical argument for exactly why the "decode penalty" is significant, or how significant. This is the same stuff parroted by everyone without even the tiniest attempt to explain why this would be.

            "That would be right if power scaled linearly with just clock speed but it doesn't. I'm willing to bet that after a certain point, lower clocks yield no more power reductions."

            I'm not sure you know what you're talking about... Active power consumption of logic scales linearly with clock speed as long as the supply voltage doesn't change. Moreover, reducing clock speed allows one to reduce the supply voltage (within some limits), reducing power consumption even more. But I'd be happy to take that bet - I'll consider it a late Christmas present.

            Some circuits consume DC power, such as DDR I/O circuits, but those are unrelated to the CPU architecture; ARM chips suffer the same I/O power consumption penalties as x86.

            • madmilk
            • 8 years ago

            You are correct that power doesn't scale linearly with clock speed. It actually scales with the cube: P = C*V^2*F, and in general V is proportional to F, so P is proportional to F^3. Not to mention a 500MHz Atom would likely be produced on a low-power process as well, dropping idle power consumption by an order of magnitude or so.
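
            A toy numeric illustration of that relationship (assuming, purely for illustration, that supply voltage tracks frequency linearly; the capacitance and voltage values below are made up, not Atom figures):

                # Dynamic power: P = C * V^2 * f. If V tracks f, halving the clock
                # cuts dynamic power by roughly a factor of eight.
                def dynamic_power_w(c_farads, v_volts, f_hz):
                    return c_farads * v_volts**2 * f_hz

                C = 1e-9                      # effective switched capacitance (illustrative)
                base_v, base_f = 1.1, 1.6e9   # hypothetical 1.6GHz part at 1.1V

                for scale in (1.0, 0.5):
                    v, f = base_v * scale, base_f * scale   # voltage assumed to track frequency
                    print(f"{f/1e9:.1f} GHz: {dynamic_power_w(C, v, f):.2f} W")

            In practice the voltage can't drop below a minimum, so the savings flatten out at low clocks, which is where leakage and the choice of a low-power process come in.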

            However, a 500MHz Atom is useless now — far too slow for Windows and not competitive with faster ARM SoCs. Two years ago it probably would have been OK, but two years ago the iPad was just a rumor and netbooks were the bomb.

            • tay
            • 8 years ago

            Phoronix has Atom vs. Cortex-A9 benchmarks. Not too badly skewed toward Atom, considering the A9 idles at an order of magnitude lower power draw than Medfield. I'm not sure what the deal is with that.

            • jdaven
            • 8 years ago

            I think this is the link you are talking about.

            http://www.phoronix.com/scan.php?page=article&item=pandaboard_es&num=1

            But I did not see any power numbers. The results are interesting, but remember that 32 nm, dual-core Medfield processors (1.6 GHz) will go up against quad-core Cortex-A15 processors on a 28 nm process (1.5 GHz). That will be the really interesting comparison.

            • sschaem
            • 8 years ago

            By your logic, ARM can't be used in high-performance CPUs because none have been made yet, so that's proof!

            silly, silly logic you have…

            And Medfield is a SoC. Also, unless you are living under a rock, Intel does design GPUs, analog logic (Wi-Fi on 32nm), memory controllers (and not crap, some of the best ever designed), bus technology, etc., etc.
            Intel could build a SoC 100% in house, from blank page to final silicon. What do all those ARM SoC companies do?
            License the CPU from ARM, license the GPU, license the Wi-Fi module, etc., and then go fab somewhere.

            Thinking that just because the ARM ISA is simpler to decode it will keep an edge in 2012… think again. Times have changed.

            …ARM is the new AMD…

        • tfp
        • 8 years ago

        Intel has both made ARM chips in the past and still has a license.
        http://en.wikipedia.org/wiki/XScale
        http://www.zdnet.com/blog/computers/intel-we-have-arm-license-no-plans-to-use-it/5845

      • sschaem
      • 8 years ago

      With an iPad-class battery, Medfield tablets will have 10+ hours of continuous HD video playback.
      Those tablets will also have near-identical idle power usage to ARM-based tablets.

      OK, Medfield might use more power at peak, but it also finishes tasks faster, so it spends more time in idle mode.
      So peak wattage is not that clear-cut.

      All I can say is, don't bet the farm on ARM…

        • Hattig
        • 8 years ago

        An ARM based 10″ tablet gets 10 hours of battery life from a 25WHr battery. That includes the display and other peripheral chips.

        Tell me how a chip that idles at 2W and currently uses 3.6W decoding video can last 10 hours on the same battery if the display and other components are using an additional 1.5W to 2W.

          • sschaem
          • 8 years ago

          It's the whole platform reference design that idles at 2 watts: backlight, Wi-Fi, 1GB of DDR2, SSD, etc.

          "The ***Medfield Tablet Platform*** currently has a 2.6-watt TDP when idling"

          Intel's final target is ~2 watts. But rest assured, Intel won't find any power savings on the CPU or GPU side,
          as those already fall below 20mW at idle.

          The 3.6W quoted is for 100% peak usage, as the 720p H.264 stream is decoded on the CPU in this version of x86 Flash on Android for Medfield.

          The final product will use a tweaked 32nm H.264 hardware decoder that is more efficient than the one used in the iPad 2.
          Video playback is expected to fall below ~2.5 watts for the entire device: 25 Wh / 2.5 W = 10 hours.

            • NeelyCam
            • 8 years ago

            That looks like pretty detailed info there… are you guessing, or do you have inside information?

    • adisor19
    • 8 years ago

    “but one will presumably be free to run the full-fat flavor of the OS on x86-equipped Atom tablets.”

    I doubt this. Medfield doesn’t have PCI bus support, and as such it cannot run the “full fat” version of Windows.

    Adi

      • Deanjo
      • 8 years ago

      I agree. Chances are you will see limitations introduced into the operating system just as you do today with Windows and netbooks (and probably locked down with their secure boot).

    • chuckula
    • 8 years ago

    Assumption: The Medfield SoC will have about the same performance as a single-core Atom from 2008, such as the N270 or Z530, but will bring lower power consumption to the table.

    Go over to Phoronix and look at the shootout (http://www.phoronix.com/scan.php?page=article&item=pandaboard_es&num=1) between a brand spanking new TI OMAP 4460 with a dual-core Cortex A9 setup and aging Atoms from 2008 that use the 45nm process. The TI OMAP 4460 is the same chip powering Google's Ice Cream Sandwich reference phone and will be in a bunch of higher-end phones and tablets coming up in 2012.

    Here's a brief recap of the results: The ARM chip wins one real benchmark (C-Ray rendering) by about 10%. It also wins a couple of synthetic cache benchmarks. *That's it.* The Atom chips tie or win the other benchmarks, often by considerable margins. Oh, and BTW, the large majority of the benchmarks are multi-threaded, so both cores of the ARM chip are being used.

    Conclusion: Anybody who says that Medfield can't compete in the mobile space is smoking something. Here's the moral of the story: Intel *can* compete in the tablet and smartphone space. Note to the irrational fanbois: I did not say Intel would *dominate* this space or *wipe out* ARM... I just said that they can *compete*. I fully expect the A15 chips to be able to beat a single-core Atom in most benchmarks. However, these chips are only just taping out now and you ain't going to be seeing them in many devices until well into 2012 or even 2013.

    Despite some of the rants we've heard from the ARM fanboys, it's entirely possible for Intel to make improvements to Atom that will boost performance and drop power consumption. The move to 22nm is part of the improvement, but Intel can also make some architectural improvements too. Keep in mind that the Atom core hasn't really changed since 2008, but there's nothing that says that Intel can't improve it going forward.

      • dragosmp
      • 8 years ago

      This is a good analysis, but you may have missed the point that nobody questions whether Atom is a fast CPU; it's obvious it is very fast when compared to A9-class ARM CPUs. There are two unknowns: power consumption and graphics.

      Power consumption looks a tad high (from VR-Zone) even for a tablet:
      “As it stands right now, the prototype version is consuming 2.6W in idle with the target being 2W, while the worst case scenarios are video playback: watching the video at 720p in Adobe Flash format will consume 3.6W, while the target for shipping parts should be 1W less (2.6W).”

      And the GPU is a complete unknown. Apple's A5 is "only" a dual-core Cortex A9, but it is paired with a beefy PowerVR 543MP2 GPU; compared to Tegra 2, the A5 has slower CPU cores and a much faster GPU, so the A5 wins most tests hands-down (check Anandtech's tablet reviews). By this rationale, it's easy to see that in mobile chips, GPU development needs to be accelerated more than CPU development. That brings us back to Atom: there is no information on the performance of its GPU. The only clear fact in the article is the high(ish) power consumption.

        • DancinJack
        • 8 years ago

        I highly doubt the A5 CPU cores are slower than the Tegra2 CPU cores.

          • dragosmp
          • 8 years ago

          There's no way of knowing, really, since one can't test with the same GPU, but they're both dual-core Cortex A9s, and the iPad's cores are 200MHz slower. Both support NEON and otherwise seem to be identical bar the 20% clock-speed difference, hence Tegra 2's cores should be faster. Tegra 2's GPU, on the other hand, is obviously slower, which is kind of ironic considering Nvidia's core competency…

            • Deanjo
            • 8 years ago

            You are forgetting a huge factor as well, the operating system.

            • Goty
            • 8 years ago

            … which has absolutely nothing to do with the fact that the two CPUs are nearly identical except for clockspeed.

            • Deanjo
            • 8 years ago

            Sorry but software and their optimizations have EVERYTHING to do with the performance of a cpu and real world returns.

            • NeelyCam
            • 8 years ago

            Still doesn’t change the fact that A5 is slower than Tegra2. OS is a separate issue.

            • Deanjo
            • 8 years ago

            How can it be a separate issue when both are required to give you a device that functions? Hardware won't do jack on its own, and software doesn't do jack on its own either. Unless you take the sum of the parts that give you the end experience, any comparison is completely useless when put to real-world application.

            • NeelyCam
            • 8 years ago

            **Whoosh.** dragosmp was pointing out that Tegra2 is faster than A5 – not that the Moto Atrix is faster than the iPhone 4.

            One could argue that if you used Tegra2 with iOS5, it would be faster than an iPhone 4 because *Tegra2 is faster than A5*.

            • Deanjo
            • 8 years ago

            But that is the thing: it is NOT faster than the A5 in real-world use. Arguing that it could be faster under iOS is a useless argument that carries no merit, since no real-world application of that exists. You might as well be saying that the '80s Leafs would have been better than the '80s Oilers if TO had #99.

            • NeelyCam
            • 8 years ago

            You’re completely unable to understand the point, so I give up. Go eat an apple.

            • Deanjo
            • 8 years ago

            That is because you are not making any point. A CPU is not faster if it is bottlenecked or suffers from poor optimization; it is slower. 50 gallons of water going through a 1/2″ funnel isn't going to have any faster a flow rate than 5 gallons going through a 1/2″ funnel.

            • clone
            • 8 years ago

            Software optimizations are fluid; they can be done anytime, but the hardware is forever tied to the product.

            I’d go with better hardware every time so long as the advantages are within reach, no hesitation.

            Your position is to buy junk now because the current software is better optimized for it. While that's not a terrible position, software shifts rapidly in response to better hardware as it arrives, which immediately leaves the junk behind.

            • nafhan
            • 8 years ago

            I was under the impression that Tegra 2 does NOT have NEON (a problem that Tegra 3 remedies).
            The GPU was pretty decent when Tegra 2 came out, and it’s still in the same ballpark as everything except for the A5 (which completely blows it away from a GPU perspective).

            • adisor19
            • 8 years ago

            Ah yes, I had completely missed that point. Tegra 2 lacked NEON instruction support, which can be quite useful for some tasks, whereas both the A4 and A5 have it.

            Adi

            • adisor19
            • 8 years ago

            OK, so you've corrected yourself on that one. Indeed, the A5 in the iPad is clocked at 1GHz, and in the iPhone 4S it's clocked at around 800MHz. Tegra 2 was originally clocked at 1GHz, but it now varies and goes up to 1.2GHz.

            Adi

            • Hattig
            • 8 years ago

            If we're talking Tegra 2 vs. A5, they both run at 1GHz (iPad A5), but the A5 includes NEON, which the Tegra 2 lacks.

            Tegra 3 is faster (1.3GHz) and has Neon and quad-cores. We don’t know what A6 includes, but it presumably will make a showing in the iPad 3 early in 2012.

            The operating system makes a difference too. iOS apps are compiled Objective-C; Android apps are bytecode running in a VM, although a lot of the OS libraries are native, and there is a native SDK for games too.

        • chuckula
        • 8 years ago

        The GPU in Medfield is not a complete unknown. It is using a PowerVR 545-series core. The exact performance is still not known, but you can make relatively sane estimates about what it will be capable of. Medfield is *not* using an Intel-created graphics solution.

          • nico1982
          • 8 years ago

          … but it will run graphics drivers written by Intel 😛 It is not the first time Intel has resorted to ImgTec graphics solutions for Atom, but the software side has never been up to the task (just look at the DX10 de-certification of the latest Atoms).

          • adisor19
          • 8 years ago

          Thank goodness for that one! It has to be said that a tile-based GPU renderer is more power-efficient than a traditional renderer, and this matters greatly in SoCs. For PCs it's not a big deal, as power usage was never really very important back in the nascent days of the late '90s, when PowerVR dipped its toe into PC graphics with the Kyro. ATI and Nvidia smacked them around with brute-force rendering.

          In tablets and smartphones it's completely different, and both Intel and ATI/Nvidia are having a hard time adapting. Heck, ATI hasn't even tried.

          Adi

            • sschaem
            • 8 years ago

            Are you sure?

            http://developer.amd.com/gpu_assets/gdc2008_ribble_maurice_TileBasedGpus.pdf

            AMD, all tile-based designs:
            • Imageon 2380 (OpenGL ES 1.x)
            • Xenos (Xbox 360)
            • Z430 and Z460 (targets OpenGL ES 2.0)

            Imageon (and the Z series) was sold by AMD ex-CEO Dirk Meyer to Qualcomm for a few million... now known as Adreno. He also sold all the other mobile IP, tech, and expertise to other companies that are making a killing with it now. Qualcomm is the most visible example, but in 2009 AMD was washing its hands of the "dead mobile market to focus on x86 processors."

            But saying AMD didn't try mobile GPUs and doesn't know about tile-based rendering is wrong. (ATI was a pioneer in mobile GPUs. ATI had GPU-accelerated smartphone prototypes like a decade ago.)

        • adisor19
        • 8 years ago

        “compared to Tegra2 the A5 has slower CPU”

        FALSE. The A5 has the same standard A9 cores that Tegra 2 has. It is NOT a custom design like Qualcomm's.

        Adi

          • NeelyCam
          • 8 years ago

          Not false. A5 runs at lower clock speeds (to save power), so it’s slower than Tegra2.

            • adisor19
            • 8 years ago

            Depends on the device. On the iPad 2, the A5 runs at 1GHz, just like the Tegra 2 in most tablets, where battery life is longer. Technically speaking, it's the exact same unmodified reference Cortex A9 design from ARM, and that was the point I was trying to make.

            Adi

        • NeelyCam
        • 8 years ago

        The 2.6W idle power is too high – my guess is that power management isn’t on. The only clear fact in the article is the ARM-beating performance.

        Like chuckula said, the GPU isn’t unknown. Intel is licensing PowerVR cores as well – I don’t know if Medfield is using the same one as A5 or not, but it’s definitely not an “Intel IGP” we all know and “love”.

      • stmok
      • 8 years ago

      The Phoronix article only looks at performance under Linux, which isn't surprising when ARM has been about power efficiency, low cost of implementation, and flexibility for the licensed implementer of the design.

      The article doesn't bother going into power, nor is there any evaluation of the economics and other factors of going with ARM.

      When you obtain an ARM license and wish to modify the base design of, say the Cortex A9 or A15 to suit your needs, you get the full support of ARM Holdings. They don’t care if you decide to use another GPU-based technology or modify the whole design itself. They’ll have their engineers assist you in implementation. (Just look at that Tegra 3 with its quad-core with low power single core approach.)

      With Intel, you have to go with their implementation that you can’t really change to meet your needs. You also have to pay a higher per chip cost. (Why do you think many don’t bother with the Atom when it comes to phones, tablets, etc? Intel has tried to curb the cost by paying implementors on a per chip basis to close the price gap. They haven’t been successful in this regard.)

      The Medfield (32nm) prototype at 1.6Ghz consumes 2.6-3.6W.

      A Cortex A9 (40nm TSMC) at 2Ghz does 1.9W.

      (The Phoronix article’s Cortex A9 embedded solution was running at 1.2GHz)

      The only thing Intel has going for it is their brute force engineering and resources pool. They lead in manufacturing capabilities. (Where Samsung is their closest competitor in the 2nd spot.)

      There's also the presumption that ARM-based producers are sitting still. They aren't: they're running test samples on 28nm and 20nm (TSMC and GlobalFoundries) as we speak, with both Cortex A9 and A15 designs. E.g., GlobalFoundries demoed a dual-core Cortex A9 on 28nm running at 2.5GHz at 0.85V, while the 20nm version has been taped out. Maybe this is why AMD's Llano and Bulldozer have been hampered on 32nm SOI? GF is dedicating their engineering talent to future process nodes.

      The biggest obstacle for the Atom is that Intel can't make it too powerful for fear of colliding with their other processor product lines. ARM doesn't have this ceiling to deal with. So it's no surprise Intel has to stick to a certain speed range, while ARM producers can clock upwards based on the power-consumption constraints of the implementation.

      What ARM is doing is bringing something different to the table. They’ll license to anyone with far fewer conditions. That’s what will make them popular with implementers. This is especially true with fabless companies in China. (Quite a number want to make their mark in tablets on a local level via the ARM path.)

        • dpaus
        • 8 years ago

        "The biggest obstacle for the Atom is that Intel can't make it too powerful in fear of colliding with their other processor product lines"

        Oh, I think there's still plenty of room between a 1.6 GHz Atom and a low-end Core i3. Other than that, though, great analysis.

          • nico1982
          • 8 years ago

          The balance between a few Xeons and a hundred Atom nodes might be trickier.

            • dpaus
            • 8 years ago

            Agreed, but that says far more about the pricing of Xeons than it does about the market positioning of Atoms.

        • adisor19
        • 8 years ago

        I agree with most of your reasoning. However, I don't think Intel is hampered by the performance ceiling of their more expensive products, but rather by the limitations of the x86 architecture. They have kept Atom at about the same level of performance because they can't add more features to it without increasing power consumption. Already they are having difficulty getting it to work in tablets and smartphones.

        Adi

        • NeelyCam
        • 8 years ago

        You are underestimating Intel’s manufacturing lead and its impact on these chips. Also, people should focus on energy consumption instead of power consumption.

        Medfield beats current 40nm ARM chips in performance, and consumes just a little bit more power. However, tasks get finished faster so the chip can go idle quicker. So, total energy consumption is pretty much comparable but Medfield has better performance – this is a net win.
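
        A toy race-to-idle comparison of that argument (all numbers are invented for illustration; they are not measured figures for Medfield or any ARM chip):

            # Energy for a fixed task over a fixed window: active power while working,
            # idle power for the rest of the window.
            def task_energy_j(active_w, idle_w, active_s, window_s):
                return active_w * active_s + idle_w * (window_s - active_s)

            window_s = 10.0
            fast = task_energy_j(active_w=3.0, idle_w=0.05, active_s=4.0, window_s=window_s)
            slow = task_energy_j(active_w=2.0, idle_w=0.05, active_s=7.0, window_s=window_s)
            print(f"faster, hungrier chip: {fast:.1f} J; slower, leaner chip: {slow:.1f} J")

        With these made-up numbers the higher-peak-power chip finishes sooner and ends up using less energy for the task; whether that holds in reality depends entirely on the actual active and idle figures.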

        Now, a switch to 28nm will help ARM chips to reduce power consumption, but 28nm is late and still has yield issues. Also, as the Anandtech article showed, switching to A15 will improve performance but reduces efficiency. It’s not the magic bullet everybody thinks it is.

        So, Medfield is roughly equal to A15 chips in power and efficiency. Clearly this is not enough to take significant market share – a new entrant has to be much better than the entrenched competitors to manage that. But this will happen in 2013 with Intel’s 22nm chip which uses trigate to improve efficiency in a major way. At that point, none of the ARM chips available then can touch Intel in efficiency, and Intel will grab a lot of market share. When ARM switches to (non-trigate) 20nm in 2014, Intel moves to trigate 15nm, keeping the major efficiency advantage.

        OEMs know this is happening, and will slowly migrate to Intel chips. The one thing they were waiting for was to see if x86 is able to compete with ARM, and Medfield shows it can – *that's* why Medfield is so significant.

          • Hattig
          • 8 years ago

          We’re all bored of Intel’s “next year Atom will win” storyline now, or Intel’s comparison of their as-yet unreleased Atom tech with year-old competitor tech.

          32nm Medfield is going to be compared to 32nm Exynos, 32nm A6 and 40nm Tegra 3 (with low-power co-processor). Later on it will be compared to dual-A15 + dual-A7 products.

          Intel hasn't got an option like the low-power companion core in Tegra 3 or the A7 low-power cores, as Atom is already the lowest-power, smallest-die option Intel has. It's not as big an issue for tablets, where a little more power consumption isn't as significant as in a phone. But Intel Android tablets won't run games which use the Android NDK, and performance is only similar. And are Intel going to sell Medfields for $25 or less?

            • NeelyCam
            • 8 years ago

            "And are Intel going to sell Medfields for $25 or less?"

            Of course they will - otherwise they can't compete. Intel is in this for the long run - short-term margins can be sacrificed to break into the new market. I find it funny how people think Intel's stuff is bound to cost so much more than, say, TSMC's. Silicon is silicon, and looking at the profit margins I'm sure Intel has a solid focus on cost. Intel CPUs are priced high only because they outperform the competition... In the low-margin, high-volume cell-phone chip market, Intel will adjust the pricing to match the others.

            As for you getting bored with the "next year Atom will win" thingy... well, the wait is soon over. The Nokia debacle certainly delayed Intel's cell-phone aspirations, but the chips are coming out now.

            Yes, I keep forgetting Samsung's Exynos - that's certain to be the top ARM chip in 2012 in performance (I'm sure the A6 will again be tweaked for lower power instead), and it will likely beat Medfield in power efficiency (not performance, though). But you still keep missing the point: even though everyone (including you) has said that x86 is awful and inherently power-inefficient, Medfield shows that it's in the same ballpark as ARM, and Intel's manufacturing lead will help it beat ARM chips - even Samsung-manufactured ones.

            • sschaem
            • 8 years ago

            Most ARM SoC providers have to pay a license to ARM (CPU and modules), a license for the GPU… and then the fab cost.

            Intel keeps all that money to itself… Not only that, but it seems Medfield will include auxiliary functions not normally part of a design like Tegra, so for device makers it's a saving that Intel can charge for.
            Medfield also seems to be significantly better than dual-core Cortex A9 SoCs,
            so this might be worth a dollar or two for high-end tablets and phones.
            We might see ARM become the AMD of mobile devices (bottom feeder).

            Intel is also the only one so far with 32nm analog design, and that might be a win in terms of fab cost.

            Intel's margin on a SoC compared to Nvidia's might be night and day: no need to pay TSMC, no need to pay ARM, no need to license third-party modules, direct integration of Wi-Fi…

            I wouldn't be surprised if Intel goes from 0 to 20% of the mobile market in the 24 months after Medfield's release, mostly grabbing the high end.

          • cegras
          • 8 years ago

          "Also, people should focus on energy consumption instead of power consumption."

          On a smartphone, that's a crap metric. A smartphone spends more time idling than anything else.

          • raddude9
          • 8 years ago

          Really?

          "Medfield beats current 40nm ARM chips in performance"

          You get one leaked/biased/unverified benchmark and you state this as fact. Aren't you the one who usually says "wait for the benchmarks"?

          "OEMs know this is happening, and will slowly migrate to Intel chips."

          OEMs care about cost above all else.

            • NeelyCam
            • 8 years ago

            This is a third-party benchie, although they botched the idle power.

            And as I already said, ARM has no cost advantage if Intel doesn’t want it to have one. In fact, Intel has a cost advantage – if Intel were to sell at cost, ARM licensees would be dead since TSMC still expects the same profit margin as always.

            • raddude9
            • 8 years ago

            An unverified benchie, you mean, and we have very few details on the Medfield chip itself. And of course they are not going to deliberately leak a benchmark that shows their chip in a bad light; they are going to cherry-pick the most favourable.

            "In fact, Intel has a cost advantage"

            Because all of their fancy fabs were cheap, right? Living on the bleeding edge of silicon fabs is very expensive, and they have to pay that back by selling high-cost chips. I don't believe Intel as a company is capable of selling low-cost chips. Selling chips at ARM-competitive prices is the real challenge for Intel here, and in this respect their history as a company is working against them.

            • NeelyCam
            • 8 years ago

            You are so wrong about cost. The only way ARM would have a cost advantage is if the chips were made on 55nm, but then they would have zero chance of competing on performance or performance/mW.

            • raddude9
            • 8 years ago

            Evidence, please. If I'm wrong, then Intel would have cheaper chips than ARM, but they don't, not by a very wide margin.

            • NeelyCam
            • 8 years ago

            How do you know how much Intel charges phone OEMs for Medfield chips? How do you know what NVidia/Qualcomm charges?

            List prices are worth the money they are printed on.

            • raddude9
            • 8 years ago

            And you would know how exactly?

            I'm basing my assertion on history (Intel has never made cheap chips; ARM has always made cheap chips); you are basing yours on absolutely nothing.

            • NeelyCam
            • 8 years ago

            Fair enough. We’ll let the future decide. I’ll bet on my version.

            • sschaem
            • 8 years ago

            Looking at the plethora of Intel LGA 775 DDR3 motherboards (with IGPs) selling at under $50 with free shipping, I'm guessing Intel doesn't charge much for their chipsets. (Some, interestingly, have RISC processors in them; I would love to find the link stating this… anyone?)
            So Intel prices at whatever they can get for their product… And if they want, they can undercut EVERYONE and still make more profit.

            So Intel can sell Medfield for less than the competition and still make more profit than ALL of them combined.
            (Talking about ARMH, TSMC, and Nvidia when selling a Tegra 3, compared to Intel selling a Medfield SoC.)

            It seems Intel has vast 32nm production capacity that can be switched to 32nm SoCs, already paid for with SB, since IB is going 22nm trigate.
            What costs more to fab: a SoC at TSMC 28nm, or Medfield at 32nm at Intel's facilities? I can only guess…

            Intel timed Medfield to match its fab capacity.

            • sschaem
            • 8 years ago

            Nothing official, but it's more than one benchmark.
            http://www.anandtech.com/show/5262/intel-shows-off-competitive-medfield-x86-android-power-performance

            Intel will reduce the number of chips required for a given design. And Intel collects all the licensing fees normally associated with making a tablet/smartphone SoC. The cut that goes to TSMC, they collect. The cut that goes to ARMH, they collect. The cut that goes to TI or Qualcomm or whoever for the modem, they collect. The Wi-Fi? They have it built in at 32nm in the SoC. NAND flash and controller, etc., etc.

            And look at Intel's 32nm fab capacity that can be tweaked over time to SoC production.

      • MadManOriginal
      • 8 years ago

      I don’t think people are as much ARM fanboys as they are Intel haters.

      • blastdoor
      • 8 years ago

      Good analysis… unless the iPad 3 launches in March with an A15-based SOC. Then it’s back to the drawing board once more for Intel, with promises that they’ll get it right next year.

      If Apple were to decide to spend the $$ to put their A# chips on the best fab tech available, Intel would never be able to catch up. Intel's only advantage is process tech. Admittedly that's a huge advantage, but if they lose it, they're f-d, because design-wise they've got nothing.

        • Hattig
        • 8 years ago

        Intel's problem is that they're making Atom and SoCs on 32nm. Samsung now has a 32nm process line that is surely being used for Apple's A6 chip, as well as Samsung's 1.8GHz Exynos.

        I don't know if the A6 will be A15-based, however; it is quite early still. Quad-A9 or fast dual-A9 is my guess. If they do manage a fast dual-A15, then it's just going to make things harder for Medfield.

      • jensend
      • 8 years ago

      Hey look! Even better than what you cited, Sandy Bridge-E beats ARM in benchmarks *all across the board*! Surely this means Intel is TEH W1N for the mobile space!!!

      There's this little problem called power consumption. An Atom SoC which idles at over 2W is SOL in a market where nobody else consumes that much *at load* and where competitive figures for idle power tend to be around 30mW (less than 2% of the Atom SoC's idle power requirement).

      "it's entirely possible for Intel to make improvements to Atom that will boost performance and drop power consumption"

      Then why don't they? It's seen very, very little improvement in three years, and it's not as though there hasn't been market demand for such improvements. ARM makers aren't standing still either.

      Before any benchmarks of the original Atom showed up, people hyped it as having Pentium M performance in an ARM power budget. It failed spectacularly in fulfilling those expectations and is still a long way from hitting either of those claims.

        • Hattig
        • 8 years ago

        I’m presuming that this Atom can also enter deep sleep states … however 2W idle is underachieving in this market. “Typical” power consumption (of the entire SoC when display and CPU are active) I could understand.

        • sschaem
        • 8 years ago

        You really believe Intel requires 2W to have the Medfield chip sit idle?!

        The entire tablet (Wi-Fi, 10″ display backlight, SSD, CPU, GPU, etc.) will consume 2.6 watts at peak power usage, and at the same time beat ARM Cortex A9 performance.

        Medfield also uses milliwatts at idle.

        ARM will still have an edge, but it won't be enough to keep x86 out of the mobile market anymore. The more cores, the more shaders, the more cache, etc., the more Intel will be in control. Where Intel failed is the 'pocket watch' CPU market.
        And that market is done. People want true power in their smartphones and tablets.

        Expect ARM stock to do an NFLX in the next 6 months! Because this English king is butt naked.

          • NeelyCam
          • 8 years ago

          "Expect ARM stock to do an NFLX in the next 6 months! Because this English king is butt naked."

          Top comment of the day. +1

          • jensend
          • 8 years ago

          Yes, I more than just "*believe* Intel requires 2W to have the Medfield *SoC* sit idle" - I know it. You apparently didn't read the article or have poor reading comprehension. The figure I gave for ARM wasn't just a CPU either; an entire ARM-based platform idles below 30mW. I couldn't care less whether you can quote me some sub-1W figure for just the CPU, as that's irrelevant to the comparison I'm making. There's no way on earth the figures on that page include the display; just the backlight alone of a 10.1" screen requires ~2.5W. The power figures are for the SoC.

          Low power is not "done" as a market. If you personally want to carry around a two-pound battery for your phone, that's your prerogative; most people don't. Battery technology, despite all the billions in resources being poured into the field, only progresses at a slow linear rate; there's no Moore's Law-type exponential gain in future power delivery to allow high-wattage devices in your pocket down the road.

            • sschaem
            • 8 years ago

            Do you even realize how outlandish it is to believe that a Medfield SoC idles at 2.6 watts?

            Intel publicly disclosed to investors that the Medfield reference smartphone design's idle power usage already beats the majority of shipping ARM-based smartphones.
            In short, an x86 tablet or phone with the same battery as an ARM-based model will last longer.
            Like you mentioned, a smaller battery is a plus, and Intel will deliver that over ARM SoC designs.

            And take a peek at an ARM tablet teardown like the Asus Transformer… talk about a mess.
            Chip after chip after chip… So far it looks like Medfield will actually deliver a better SoC solution than Tegra.

            You also have to look at Intel's 32nm Wi-Fi technology, NAND controller, power regulator, etc…
            to understand why Intel is about to enter the SoC market with a bang.

            I never shorted a stock in my life… but ARMH is beckoning, as they have everything to lose.

            • jensend
            • 8 years ago

            Sorry to burst your bubble, but there is a real world out there outside of Intel press releases. I guess that world is what you call the outlands, since you label the facts outlandish?

            Real-world testing, not Intel marketing-speak, tells us that Medfield's SoC currently idles at 2.6W and takes 3.6W at load. That's an improvement over previous generations of Atom (it's not much more power than Lincroft's CPU+GPU alone required), but it's not a miracle. Medfield, being based on 32nm, isn't going to pull any miracles and isn't suddenly going to make Intel competitive in smartphones, etc. Note that even in the performance comparison, discounting power completely, Intel only wins against previous-generation 40nm chips (and not even against Tegra 3), while Samsung and others are already close to market with 32nm designs.

            A Medfield-based phone with a normal battery would last roughly two hours on a charge *when completely idle* (http://www.dailytech.com/article.aspx?newsid=23611). Your claim that "Intel will deliver (superior battery life) over ARM SoC design" must be based on some mythical or fanciful future delivery, because they've never delivered a single smartphone SoC before and Medfield sure isn't one either. It's a very small incremental improvement on the previous generations of Atom, and you're expecting it to be the be-all and end-all, the most miraculous technical achievement of the ages.

            Give it up. Maybe if Intel actually starts throwing engineering resources at Atom it'll start to be competitive on 22nm tri-gate.

            • DavidC1
            • 8 years ago

            I hate to burst YOUR bubble, but let's look at this again.

            2.6W and 3.6W are for the ENTIRE device. If they reach the 2.0W/2.6W targets with the final device, it would mean a battery life of 10 hours of Flash HD playback from a 25Wh battery.

            10 hours is what iPad does for video.

            It would be awesome if comments worked like stocks and could be shorted.

            • sschaem
            • 8 years ago

            Seriously? That link (Jason Mick) is as uneducated as you are.

            Beating ARM is not 'miraculous'; it just happens when a company like Intel decides to make a SoC for tablets/smartphones.

            So far it even seems that Intel has the edge. Intel's 32nm technology seems to include analog RF:
            http://www.intel.com/content/www/us/en/pdf-pages/32nm-soc-platform-technology-paper.html

            That explains the statement that the 32nm Medfield SoC has built-in Wi-Fi.

            Stop being stupid. Atom idle power is not even measured in milliwatts but in microwatts... Phone idle time is not measured in hours but in days. You are gullible to even think Medfield would only deliver 2 hours of idle time for a smartphone.

            • NeelyCam
            • 8 years ago

            "Atom idle power is not even measured in milliwatts but in microwatts"

            I think you're exaggerating a little bit...

            • sschaem
            • 8 years ago

            This is a result of the S0i3 state.

            “Building on the C6 state in the original Intel Atom processor design, the SoC incorporates new ultra-low-power states (S0i1 and S0i3), which take the SoC to 100 micro-watts” : Intel measured data at S0i3 using an Intel second gen Atom 45nm “Moorestown” beta test platform.

            • jensend
            • 8 years ago

            Again: you're parroting an Intel press release with erroneous figures (http://www.intel.com/pressroom/archive/releases/2010/20100504comp.htm). Intel themselves tested Moorestown's S0i3 at 21 milliwatts for the SoC (https://techreport.com/articles.x/18866/4). Further, S0i3 is not an idle state, it's a standby/sleep state: the processor is completely turned off and state is stored in SRAM instead of the functional units of the CPU. Quoting standby figures when we're talking about idle power is apples-to-oranges. I'm having a hard time finding recent figures for ARM smartphone standby power, but BeagleBoards are at 8 mW in standby, and there are non-phone-related ARM SoCs with standby power all the way down to <2 microwatts.

            It is good to see you admit that your talking points come from old Lincroft/Moorestown yakety yak. Medfield really is only an incremental improvement on Moorestown. If Moorestown's power consumption was already so great, why didn't it see widespread adoption? LG did one prototype (GW990), with heavy Intel encouragement, and quickly gave up and canceled it. (It was really too big to be pocketable anyway, and most people decided to call it a MID rather than a smartphone because of the vast size difference. Its battery was about 3/2 the size of any shipping phone's.) No other manufacturer was willing to even touch it.

            • DavidC1
            • 8 years ago

            Before I continue with my post, I’m going to address this to couple of people, not just you.

            100uW figure is correct, but that’s just for the SoC. The 21mW is for the test phone platform.

            If the figures carry over to the final device, Medfield uses about half the power of Moorestown. The problems with LG and Moorestown were numerous. Not only was the power consumption at the high end of the range, but Intel had been flip-flopping between multiple OSes: Moblin, MeeGo, Android. The GW990 was a Moblin device, and one of the reasons it was canceled was the switch to MeeGo.

            They were also late on OS development with MeeGo and Android. They needed the latest, cutting-edge versions to be competitive, but those were at least a few months away from being optimized to the point of being useful.

            For example, the figure from the Oak Trail Tablet:
            http://translate.google.com/translate?js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&sl=nl&tl=en&u=http%3A%2F%2Ftweakers.net%2Fnieuws%2F74844%2Fcomputex-eerste-benchmarks-tonen-trage-atom-versie-honeycomb.html&act=url

            Those figures are absolutely terrible, except for SunSpider. They note that optimizations to Honeycomb have improved them tremendously; figures of 5-7x are quoted for Linpack and Caffeinemark.

            About the power figures: the only real Moorestown-based device is the Cisco Cius.
            http://www.cisco.com/en/US/prod/collateral/voicesw/ps6789/ps7290/ps11156/data_sheet_c78-609507.html

            They used to claim 8 hours of battery life. Now, on that same page, they claim a max of 6 hours. From a few reviews, that's only achievable when idle. Intel's own presentations show 6 hours as well, on a 25Wh battery similar to the Cius's. From the few reviews out there, battery life quickly drops to 3-4 hours. Even the unoptimized 3.6W figure for this tablet is almost half that, and that's while playing a Flash video. Let's hope they can achieve their target of 2.6W; that would be less than 2/5, and something that will get them competitive.

            • NeelyCam
            • 8 years ago

            You seem familiar… where have I seen you..? Oh, that’s right:

            http://www.youtube.com/watch?v=FMEe7JqBgvg

            • jensend
            • 8 years ago

            I looked at other places' figures for Moorestown, and in one way you guys are right: the 2.6W/3.6W figures really are too high to be for just the Medfield SoC. But they really are way too low to be for the entire system unless the backlight has been turned down so low that the screen is practically black; as I said, a 10.1″ LCD's backlight alone takes close to 2.5W at normal brightness. So the figures in the article are practically useless.

    • Arclight
    • 8 years ago

    I must be among the few who still don't feel the need for a tablet… is anyone else here with me?

      • chuckula
      • 8 years ago

      Me… although once a tablet with good performance that is also hackable becomes available I may give in.

      • madmanmarz
      • 8 years ago

      Phone+Netbook+Desktop for me
      Tablets are still simply big phones, so they don’t really bring much to the table. I really don’t see any real productivity anytime soon out of anything smaller than 10″.

      My phone gets used for: casual gaming, communication, GPS, pictures, WIFI, etc. Plus it’s waterproof so I can put it anywhere and not have to worry.

      The netbook serves as a replacement for an aging laptop for work/school/emails as well as casual games/movies on the road. It also triples as my HTPC for the living room.

      And of course the PC lets me play BF3 and Skyrim, as well as multitask like a boss =)

        • cygnus1
        • 8 years ago

        Like a boss!

        • madmilk
        • 8 years ago

        Just like me!

        Well, my “netbook” is really a CULV laptop (Acer Aspire 1430z), but it was $380 so I call it a netbook. It handily beats netbooks in everything except battery life. I threw in 8GB of RAM for $30 and it even does virtualization pretty well. No VT-x though, so no 64-bit VMs.

          • NeelyCam
          • 8 years ago

          I put an SSD in a similar CULV. Speedy bugger.

      • Deanjo
      • 8 years ago

      Ya, there are others like you. I think they are all at the floppy disk swap meet this week however.

        • srg86
        • 8 years ago

          No, it's because we like to actually do things with our computers (even mobile ones). For me a tablet is utterly useless. I'd rather have a netbook.

          • Deanjo
          • 8 years ago

          See, that is where you are thinking about a tablet wrong. It is not there to replace a computer, it is there to complement it. As far as being able to "actually do things," there are plenty of things that you can do on a tablet. For example, I regularly use mine for IT work such as maintaining/migrating/installing servers. A tablet works fine for these, as most of the time a person is remotely administrating the servers in the first place. We also have several clients of ours that are using tablets in a corporate environment, operating their software through HTML5 interfaces. The uses for tablets are many, and if you can't see a use for them, then really you have a narrow view of what can be done with them, without any real practical experience or plan of deployment. Seeing examples like "using my netbook as a HTPC" doesn't really impress me much, as tablets can do the same as well (and wirelessly to boot).

            • Arclight
            • 8 years ago

            "For example, I regularly use mine for IT work such as maintaining/migrating/installing servers. A tablet works fine for these, as most of the time a person is remotely administrating the servers in the first place. We also have several clients of ours that are using tablets in a corporate environment, operating their software through HTML5 interfaces."

            If that's not niche, I don't know what niche is...

            • Deanjo
            • 8 years ago

            You obviously do not know what niche is. That was just one example of literally thousands of uses for a tablet. Heck, by the time you fire up your favorite copy of Photoshop on that anemic netbook, a tablet would have launched and fixed Aunt May's red-eye in your favorite photo.

            • Arclight
            • 8 years ago

            "niche = a situation or activity specially suited to a person's interests, abilities"

            If managing IT for a company is not niche... then why aren't we all doing that? Heck, let me call my grandparents to see if they finished work on that e-mail server that was causing problems... You've got a thousand uses for a tablet? Well, I've got every use ever invented for the desktop PC before tablets ever appeared. Bet your arse it beats your tablets...

            • Deanjo
            • 8 years ago

            I would hardly consider IT a niche market, considering there are more IT workers employed out there than in many traditional jobs, and it has been the trade with the largest annual growth for two decades now.

            BTW, any application would fall into a “niche” with your definition as they all pretty much have one particular use.

            • Bensam123
            • 8 years ago

            Nah, you pretty much described a niche scenario that fits your life perfectly. IT administrators aren’t all over the place. Fluke makes customized laptops specifically for that sort of job for instance…

            • Deanjo
            • 8 years ago

            No, I used the example to point out that pretty much anything can be done with a tablet, including specialized tasks. Apple is right when they say "there is an app for that" for almost any task one could think of. It is especially true when comparing a netbook to a tablet. There is a good reason why netbook sales have been tanking since tablets came out en masse.

      • GreatGooglyMoogly
      • 8 years ago

      Hear hear. And I actually have one (a Galaxy Tab 8.9). I don’t use it.

        • adisor19
        • 8 years ago

        That’s because it’s not an iPad.

        😀

        Ok ok i’ve hit my trolling quota for today. 😛

        Adi

      • dragosmp
      • 8 years ago

      There is no need for a tablet, just as there is no need for motion-sensor-controlled sliding doors – but they are useful. Take an iPad or a CyanogenMod-powered Android tablet for a test drive (not a bloated/limited stock Android install) and you may find them pretty interesting. The caveat with Android tablets is the same as with Windows laptops – pick the one that either suits you at stock or is best supported by the modders. Tablets have three advantages over similarly priced laptops (~$200-500): far better screen quality (the Kindle, Nook, and TP have superb IPS displays), lower weight, and longer battery life. All these make them good media-consumption devices around the house and great travel companions.

      • khands
      • 8 years ago

      Tablets, to me, are pure entertainment-consumption vehicles minus the clutter. Movies, books, most of the internet, and simple games all work well, because the idea of the tablet is to get the weight and awkwardness of a laptop out of the way. If you have someone for whom that's 99% of what they do, well, then they'll probably find use for a tablet. They're not working machines; they were never designed to be.

      • Alexko
      • 8 years ago

      Very few people actually *need* tablets, they just want them.

      • Cuhulin
      • 8 years ago

      I think the question is about tablet accessories. Once a Windows 8 tablet has Transformer Prime-style convertibility, I think it becomes a more useful netbook. There are times, like in meetings, where a tablet is a less intrusive form factor than opening a notebook or netbook, and times, like on planes, where it fits better. Having the keyboard will give it netbook usefulness, and x86 compatibility will allow use of standard PC software in a lot of cases.

      • Bensam123
      • 8 years ago

      Yup, there really is no need for a tablet unless you have an inflated wallet, money is burning a hole in your pocket, or you can’t afford a laptop or even a cheap netbook.

      From what I've seen, people seem to use them as ornaments on tables and such. I guess it makes them feel sort of like Jean-Luc sipping his Earl Grey and reading up on the latest Klingon invasions.

        • cygnus1
        • 8 years ago

        damn klingons
