AMD AM4-based system spotted in the wild

AMD announced the consolidation of its current trio of desktop APU/CPU sockets into a unified AM4 socket all the way back in January at the Consumer Electronics Show. At the time, AMD disclosed that future "Summit Ridge" and "Bristol Ridge" chips would arrive to fill this socket, but did not offer any additional information. 

In the intervening months, AMD has clarified that Summit Ridge represents the upcoming 14-nm Zen architecture-based processors and Bristol Ridge refers to seventh-generation APUs based on the Excavator CPU core and GCN 3.0 graphics architecture. 

reddit user starlightmica spotted this AM4-based HP desktop PC for sale at Costco

PC enthusiasts will have to wait a while longer for Zen, but reddit user starlightmica posted that those wishing to explore the performance of AMD APUs in conjunction with DDR4 memory can head to Costco and pick up the evocatively-named HP 510-P127C Pavilion desktop computer, based on the HP "Willow" motherboard.

The HP Willow microATX motherboard is built around AMD's Promontory Fusion Controller Hub, bringing with it support for PCIe 3.0 and DDR4 memory. HP's motherboard support page details support for Bristol Ridge A8-9600, A10-9700, and A12-9800 APUs, all of which are quad-core 28-nm chips with 65W TDPs. Some future Summit Ridge CPUs are expected to have TDPs of up to 95W, though this board appears to be limited to 65W parts.

The board offers a single PCIe x16 slot and an A-keyed M.2 slot, which seems to imply support for PCIe x2 SSDs. DDR4 memory as fast as 2133 MT/s is supported in configurations up to 16GB spread across two slots. The Willow motherboard conspicuously lacks USB 3.1 ports, Type-C or otherwise. Whether this is related to mid-summer rumors that AMD and ASMedia were dealing with signal integrity issues in USB 3.1-related circuitry remains unknown.
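
For a rough sense of what that x2 link would mean for storage, here's a back-of-the-envelope sketch in Python (theoretical ceilings using the usual ~985 MB/s-per-lane figure for PCIe 3.0 after 128b/130b encoding, not measured numbers):

    # Theoretical ceilings for the M.2 slot's presumed PCIe 3.0 x2 link,
    # compared against an x4 link and against SATA 6 Gb/s.
    PCIE3_LANE_MBPS = 985        # ~8 GT/s per lane after 128b/130b encoding overhead
    SATA3_MBPS = 600             # 6 Gb/s after 8b/10b encoding overhead

    m2_x2 = 2 * PCIE3_LANE_MBPS  # the A-keyed slot's presumed x2 link
    m2_x4 = 4 * PCIE3_LANE_MBPS  # what an M-keyed x4 slot would offer

    print(f"PCIe 3.0 x2 M.2: ~{m2_x2} MB/s")   # ~1970 MB/s
    print(f"PCIe 3.0 x4 M.2: ~{m2_x4} MB/s")   # ~3940 MB/s
    print(f"SATA 6 Gb/s:     ~{SATA3_MBPS} MB/s")

Even at x2, such a slot would leave SATA well behind, though it would cap the fastest NVMe drives below what an x4 link allows.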

The specific PC available at Costco includes a 1TB magnetic hard disk, 16GB of DDR4 memory at an unspecified clock speed, a top-of-the-line A12-9800 APU, an 802.11ac Wi-Fi adapter, and a DVD recorder for just under $600.
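
The board's DDR4-2133 ceiling also means that, on paper, this machine has no more memory bandwidth than a DDR3-2133 system of the sort Kaveri already supported, a point several commenters make below. A minimal sketch of the peak-bandwidth arithmetic, assuming the usual two 64-bit channels:

    # Theoretical peak bandwidth for a dual-channel, 64-bit-per-channel memory setup.
    # At the same transfer rate, DDR3-2133 and DDR4-2133 land on the same number.
    def peak_bandwidth_gb_s(transfer_rate_mt_s, channels=2, bus_width_bits=64):
        bytes_per_transfer = bus_width_bits // 8
        return transfer_rate_mt_s * bytes_per_transfer * channels / 1000

    print(peak_bandwidth_gb_s(2133))  # ~34.1 GB/s for DDR4-2133 (or DDR3-2133)
    print(peak_bandwidth_gb_s(2400))  # ~38.4 GB/s, if faster DIMMs were supported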

Comments closed
    • Shobai
    • 3 years ago

    A little late to the party, but another thing that intrigues me is the heatsink mount. I know that the HPs and Dells of the world like to utilise weird and wacky heatsink mounts, but this specimen looks very much like your common garden variety Intel square mount – such as you’d see on a [url=https://techreport.com/review/30030/msi-z170a-sli-plus-motherboard-reviewed<]Z170 board[/url<], for instance. Did HP do this with inventory management in mind? Do we know what to expect for AM4?

    • deruberhanyok
    • 3 years ago

    [i<]those wishing to explore the performance of AMD APUs in conjunction with DDR4 memory can head to Costco and pick up the evocatively-named HP 510-P127C Pavilion desktop computer based on the HP "Willow" motherboard [...] DDR4 memory as fast as 2133 MT/s is supported in configurations up to 16GB spread across two slots.[/i<] Not going to see any performance exploration there. Prediction: as fast as it would be with DDR3-2133, which Kaveri already supported. APUs will benefit from the much higher memory speeds possible with DDR4, but I don't think the Kaveri -> Excavator change is going to provide a big advantage with the same amount of system bandwidth available.

    • raddude9
    • 3 years ago

    So, any chance of a review of these chips any time soon, or do I have to rely on leaks from questionable Korean web sites:
    [url<]http://www.bodnara.co.kr/bbs/article.html?num=134612&mn=4[/url<] The leak looks plausible, CPU performance up 5%, iGPU up 20% and power consumption down... a lot. I'll be taking it with a pinch of salt until I see a reputable review though.

      • ET3D
      • 3 years ago

      Thanks for the link. I imagine that yes, we’ll have to rely on foreign sites until the DIY version is released. Hopefully not too much longer to wait.

    • jihadjoe
    • 3 years ago

    Feels so strange to look at a motherboard and not see any oversized heatsinks, bright, high-contrast ‘gamer’ colors or RGB LEDs.

      • Anonymous Coward
      • 3 years ago

      It can’t possibly work.

    • NTMBK
    • 3 years ago

    Wow, I’d forgotten just how junky OEM boards are. I guess with a modern SoC integrating so much, you don’t need an awful lot on the board any more.

      • Anonymous Coward
      • 3 years ago

      You should clarify “junky” in more detail. I’m curious to hear what features would impress you.

        • BurntMyBacon
        • 3 years ago

        What would impress NTMBK? How about a built-in thermal system that converts CPU waste heat into PCB “lava” that is guided to the edge of the motherboard where it expands board space at a rate dependent on how often it is used. Also, auto generating circuits to develop the new board space into usable features. This would allow for a computer that grows with your needs. Finally, a side window to watch it in action. I would find this impressive (you didn’t ask for realistic ;’) ).

        • Voldenuit
        • 3 years ago

        [quote<]You should clarify "junky" in more detail. [/quote<] Only 2 SATA slots? Good luck adding an SSD drive. The motherboard looks to be unnecessarily large - it's longer than an ITX board, also has the added height of a DTX board but no second expansion slot. 24-pin port is poorly situated. My advice is to make sure no one you know buys this dud.

        • NTMBK
        • 3 years ago

        It only has 2 SATA ports, no M.2 or mSATA, only a single PCIe slot, completely minimal power circuitry with no cooling on it, no integrated wifi that I can see… it’s cut every corner that it can do, and leaves absolutely zero room to expand.

        That lava thing that BurntMyBacon suggested sounds pretty awesome though.

          • just brew it!
          • 3 years ago

          The article says it has an M.2 slot.

            • RAGEPRO
            • 3 years ago

            It does, and you can see it. Top right part of the board in the picture.

            • Shobai
            • 3 years ago

            It also says it’s an A-keyed slot; wookiepedia tells me that means PCIe x2 + USB + I2C + DP x4. Even if [and that’s a big if] HP has got the OEM [Pegatron, maybe?] to put PCIe lanes to the socket, I find it massively more likely that the slot is there for a WiFi card or similar than for an SSD – did you notice how short a card the slot can accept?

    • ronch
    • 3 years ago

    Every time I see these APU-powered systems I’m reminded of those cheap cars like the Nissan Versa or Chevrolet Sonic. Cheap, cheerful, gets you from Point A to Point B and not much else.

    • DrDominodog51
    • 3 years ago

    Those scumbags at HP blurred out the text on the chipset in the only picture zoomed in enough to read it. My guess is that it is the cheapest chipset AMD offers on the AM4 platform.

    • ronch
    • 3 years ago

    Has anyone noted the socket itself? So it seems AMD is finally ditching the fundamental Socket 939 lineage that saw a few pins removed or added over the years to prevent wrong CPUs from fitting in. It’s been what, 13 years! Is this a variation of the FM2+ socket?

      • Zizy
      • 3 years ago

      Umm, it is in the title. AM4. 1331 pins by someone that bothered to count, I didn’t.
      This one is probably the cheapest junk you could buy with AM4, but it might even kind of work.

      • Krogoth
      • 3 years ago

      AMD has ditched the Socket 754/940/939 ZIF socket layout since the move to the Phenom I, a.k.a. Barcelona.

      It looks like another ZIF socket with far more pins (support for PCIe 3.0 and DDR4). It is probably one of the reasons why AMD cannot beat Intel at power efficiency. LGA sockets are superior at power and thermal efficiency at the cost of fragile pins that motherboard vendors have to deal with rather than CPU manufacturers. AMD only has LGA sockets for their current multi-socket platforms. I’m kinda surprised that they didn’t start it on the desktop/laptop front.

        • ronch
        • 3 years ago

        AMD has stuck with the same basic socket layout from K8 all the way to Zambezi. You could put an Athlon 64, a Phenom, and an FX side by side and they’re all pretty much the same save for a few pins differences between them.

          • Krogoth
          • 3 years ago

          The packaging is the same, but pin layouts are vastly different.

          Intel has done the same thing with their desktop chips. There hasn't been that much change in the LGA11xx socket since Lynnfield came out. HSF solutions for LGA1366/LGA11xx are compatible with each other. The only major change has been LGA1366 to LGA2011 on the prosumer/professional side, due to the jump to quad-channel DDR3/DDR4 memory and having an on-die PCIe controller with 40 lanes. LGA2011 required a new mounting bracket for HSF solutions.

    • ronch
    • 3 years ago

    I think it’s interesting to note the modularity of computer hardware, which allowed the overlap between different AMD CPU generations and chipset generations or platforms since K8. The 700-series chipsets supported K8 as well as K10 (AM2+). K10 worked with AM2+ to AM3+ (admittedly both platforms shared chipsets and it’s just the socket that changed and product monikers from 800-series to 900-series). AM3+ supported K10 and FX. Now, FX CPUs won’t plug into an AM4 mobo, but the same fundamental core works with AM4, and AM4 also supports Zen.

      • BurntMyBacon
      • 3 years ago

      [quote<]K10 worked with AM2+ to AM3+ (admittedly both platforms shared chipsets and it's just the socket that changed and product monikers from 800-series to 900-series).[/quote<] It wasn't just the socket that changed. AM3/AM3+ also feature DDR3 memory, where AM2+ used DDR2. AM2 processors couldn't plug into AM3/AM3+ boards due to lack of a DDR3 controller. AM3 processors (K10), however, could be used on some AM2/AM2+ boards that received the appropriate BIOS update, due to the AM3 processors possessing both a DDR2 and DDR3 memory controller. AM4 precludes this possibility due to differing pin counts and layouts.

      The upside is that by breaking this compatibility, they have given their newer chips access to more data lanes (allocated to PCIe, I believe), which will become increasingly important as high-speed storage moves away from SATA.

      To your point, it is curious that while certain K10 models could be used in both AM2+ (DDR2) and AM3 (DDR3), the benefits of the increased memory speed were significantly lower than expected. This suggests that the modularity of the memory controller and its consequential independence from the core architecture may have had a negative impact on how fully and optimally it could be integrated. Latency of their integrated controller, if I recall correctly, roughly matched that of Intel's (not integrated) Core 2 memory controller.

      • just brew it!
      • 3 years ago

      700-series chipsets were even compatible with AM3+/FX, as evidenced by the slew of crappy budget AM3+ motherboards that were released with 760G chipsets on them.

        • Concupiscence
        • 3 years ago

        Yep, that pretty much [i<]is[/i<] the uATX market for AM3+, and it stinks.

          • just brew it!
          • 3 years ago

          Near as I can figure, it was a half-assed attempt at market segmentation. My guess is that AMD didn’t want low-end AM3+/FX systems to compete with the APUs, so they somehow convinced the mobo makers to produce only crippled uATX AM3+ boards.

    • ronch
    • 3 years ago

    The board looks real boring but that AMD chipset looks kinda cool. No heatsink plus that console-esque metal shroud on it makes it look modern.

      • Anonymous Coward
      • 3 years ago

      I sense a new trend in the making.

    • Pancake
    • 3 years ago

    Eww. Gross.

      • ronch
      • 3 years ago

      Guess you’re not used to seeing computer innards. Yeah, they’re really gross at first but you’ll get used to it.

        • BurntMyBacon
        • 3 years ago

        Some are prettier than others, though.

        • Pancake
        • 3 years ago

        You guess wrong. I haven’t designed a PC motherboard but used to design add-in cards (video frame grabbers) so I do know what I’m talking about. This is cheap, nasty garbage. Fer crying out loud. It’s a Costco special. You would have to be in a pretty bad way to get something like that.

        And no, you never get used to looking at rubbish.

          • Kretschmer
          • 3 years ago

          (It was a joke. You were commenting on the quality/choice of components; he was implying that you found electronics innards disgusting.)

    • JosiahBradley
    • 3 years ago

    $600 for a machine that will eat through web pages and office documents and has modern connections like Wi-Fi ac is a great thing. This thing beats my previous gaming rig already. Lots of RAM upfront without the user having to supply more and mismatch sticks, etc. Hard drives are actually faster these days due to platter density and caching, and are plenty fine for casual systems. People will always find a way to complain. I just want to see benchmarks against a similarly priced and specced Intel desktop.

      • Voldenuit
      • 3 years ago

      $600 for that piece of junk? 1 PCIE slot and 2 DIMM slots. I paid $399 for my hp Athlon II X4 system 5 years ago and it was more expandable. A friend recently asked for a budget gaming box to build from and I pointed her at a lenovo refurb desktop with an i5-6500 (6500!) for $379.

      $600 will buy many peanuts.

    • Kougar
    • 3 years ago

    An x2-limited M.2 slot would be very disappointing to see for a brand-new chipset; hopefully that is just down to HP’s typical OEM decision-making in laying out a low-cost board.

      • BurntMyBacon
      • 3 years ago

      Given that the chipset is blurred out in the picture, I’m guessing that this is the lowest end chipset AMD is making for AM4. If this turns out to be true, then an x2 limited M.2 slot isn’t so bad.

        • Shobai
        • 3 years ago

        Do we know that the socket even has PCIe lanes electrically? We have seen it before where OEMs only run USB to such slots, and use it for WiFi cards.

          • Kougar
          • 3 years ago

          Wiki claims 24 lanes are supported by the AM4 socket.

          The AM4 socket is meant to fit both APUs and high-end CPUs, so Zen will require AM4 to have 16 minimum. Wiki says it will have four different chipsets paired with it as well, so I’ll just assume this was a budget model.

            • Shobai
            • 3 years ago

            What the AM4 socket can interface, what the chipset on this board can interface, and what the OEM has connected electrically to the A keyed M.2 slot on this specific board are all very different things.

            It may very well be that higher spec’d boards than this will implement M keyed M.2 slots with 4x PCIe lanes, but that’s a question for another day. The questions for today can only be levelled at what we see in front of us.

            [edit: The position also makes me wonder, if the M.2 slot does have PCIe lanes, whether they’re connected to CPU or chipset]

            • Kougar
            • 3 years ago

            I agree they are; however, it’s the maximum lanes supported by the socket that is important here. If the source is correct, then 24 PCIe 3.0 lanes is plenty, far more than the 16 you get from Intel’s Skylake platform.

            I’m not remotely interested in budget AMD platforms, so I don’t care about specifics for budget chipsets or AM4 boards. I’m solely focusing on the max capabilities of the platform for when AMD’s best processor gets plugged into it next year.

            24 lanes is a promising number and could make for some interesting comparisons because on Intel platforms that crazy-fast M.2 960 PRO is going to route through the PCH’s DMI to the CPU. On an AMD Zen platform it will probably route directly into the CPU and bypass the latency and potential DMI bottleneck.

            • Shobai
            • 3 years ago

            Yep, half again as many is a nice selling point – hopefully they’re able to make them available! Kaby Lake should be releasing in the same sort of timeframe, so it’ll be interesting to see whether Intel up the number of lanes.

      • Shobai
      • 3 years ago

      According to Anandtech’s preview linked in the shortbread, the A12-9800 has only 8 lanes of PCIe 3.0.

    • Welch
    • 3 years ago

    Just start putting 128GB drives in machines and state them in KB instead… Consumers apparently don’t know the difference.

    • albundy
    • 3 years ago

    wow, a dvd recorder and two whole memory slots? amazing!

    • derFunkenstein
    • 3 years ago

    edit: someone else got to the typo first.

    Well this is kind of fun. That hard drive configuration is kind of dumb, but the M.2 slot means that you can shove your own in there without having to worry about drive bays. Hopefully the platform supports NVMe.

      • fullbodydenim
      • 3 years ago

      The platform is supposed to support bootable NVMe drives, but what is pictured is certainly not the high-end X300-series AMD Promontory chipset. It could be the B350, but more than likely, without any USB 3.1 ports, it's the low-end, entry-level AMD A320 chipset.

    • Chrispy_
    • 3 years ago

    AM4 by itself is as dull as watching paint dry, but what makes this more interesting is that people can review the [s<]southbridge[/s<] [i<]Promontory Fusion Controller Hub[/i<] ahead of the Zen launch. Either it's good, and will inspire confidence in people waiting hopefully for Zen to launch, or it's bad, and AMD has some time to address the criticism and feedback before Zen launches.

      • gerryg
      • 3 years ago

      By calling it “Promontory” does that mean they use water block cooling?

        • ronch
        • 3 years ago

        Should’ve been called ‘Purgatory’.

      • kalelovil
      • 3 years ago

      With the caveat that this is probably using the low-end and limited A320 ‘southbridge’, while decent Zen boards will be using the more extensive X370 ‘southbridge’.

    • just brew it!
    • 3 years ago

    [quote<]1TB magnetic hard disk[/quote<] Every time an OEM ships a system with a brain-dead configuration like a quad-core CPU, 16GB of RAM, and a [u<]mechanical HDD[/u<], somewhere a baby seal gets clubbed to death. C'mon HP, this is 2016!

      • travbrad
      • 3 years ago

      That way people will think their PC is slow and they need to buy a new one relatively soon.

        • Acidicheartburn
        • 3 years ago

        The sad part is that there is probably some truth to that, somewhere.

      • Concupiscence
      • 3 years ago

      It’s brain-dead, but it’s relatively cost-effective and allows HP to boast of reasonable storage capacity. Install too big an SSD and it veers outside the price point; install two drives, and you increase your support call volume because it’s confusing for users who resolutely fail to learn much about the PCs they use and rely on (and believe me, they are a HUGE percentage of the people still buying desktops and laptops). I wouldn’t burden a system I use with just one rotary hard drive, but you and I aren’t the target market for this kind of system 99% of the time.

        • just brew it!
        • 3 years ago

        Knock 8GB off of the base RAM configuration and swap the 1TB HDD for a 250GB SSD. Same price point, better performance, and plenty of storage for the typical cheap prebuilt PC use case.

        Most people store their photos in the Cloud and stream their music and video these days.

          • Firestarter
          • 3 years ago

          install GTA 5 and one other game, and now your SSD is already kinda full

            • gerryg
            • 3 years ago

            If you’re buying a gaming machine at Costco then there’s already a huge problem …

            • evilpaul
            • 3 years ago

            Although I’ve never done it personally, NTFS does allow you to mount a drive partition in an empty directory. So they could probably at least stick Documents/Downloads/Pictures/etc on the mechanical storage.

            • nanoflower
            • 3 years ago

            That assumes that most of the space is going to those elements. What if they install a number of games that fill up the SSD? There are too many possible uses that might fill up a drive for HP to figure out the right place to mount a partition that would work for everyone.

            • bhtooefr
            • 3 years ago

            Doesn’t work that way in practice – if you use junction points like that, major Windows updates will fail.

            Really, Microsoft needs to solve this by making storage spaces bootable, and enabling tiered storage spaces on workstation releases of Windows instead of just Server.

      • Chrispy_
      • 3 years ago

      [url=https://en.wikipedia.org/wiki/The_Seal_Cub_Clubbing_Club<]The Seal Cub Clubbing Club[/url<] will be registering their interest in these brain-dead configurations.

      • barich
      • 3 years ago

      The average person buying a computer like this will look at two otherwise identically priced and configured models, one with a smaller SSD and one with a larger HDD, and will have no idea of the performance difference between them. They’ll see that the one with the HDD has a bigger number and therefore must be better.

      • Meadows
      • 3 years ago

      With that line of thinking, you’re either going to disappoint the end user because they can barely fit a few games and movies in there, or you’re alienating the end user because it won’t cost $600 anymore.

      • ozzuneoj
      • 3 years ago

      Makes me think of the beyond-stupid HP Liquid Platinum Ci7 that Wal-Mart sells…

      [url<]https://www.walmart.com/ip/HP-Laptop-Pavilion-Gaming-15-au018wm-Silver-Touch/51397791[/url<] i7-6500U, 12GB of "memory" (who knows if it's DDR3 or DDR4), a mechanical 1TB hard drive, a totally redundant waste of a GPU in the GT940M (it has to be incredibly close to the performance of the HD 520 graphics of the 6500U), and *drumroll*... a 15" 1366x768 TN panel. For a whopping $800... A nearly $400 CPU (according to Intel) combined with more RAM than most people would need for YouTube and Facebook... but let's not make it TOO much and give them 16GB... and then shoehorn in a battery-draining GPU that won't make games any more playable and hook it up to the worst screen imaginable... probably the same one used on their $279 model with a 1.6GHz Celeron N.

        • dyrdak
        • 3 years ago

        This PoS has the U type CPU. Marketing this as i7 is an “honest” lie. Pure profit for Intel though.

      • ronch
      • 3 years ago

      Wait, quad core processor? Have we finally figured out whether or not the dual core modules in there are truly dual core processors?

        • just brew it!
        • 3 years ago

        They’re “dual-ish”. 😉

        IIRC the Steamroller/Excavator modules are closer to being a full-on dual than the Bulldozer/Piledriver ones (dedicated decoders per core, among other improvements). AFAIK unless you’re cramming a massive double-precision floating point workload down all of the cores’ throats at once, it should be effectively a “real” quad core.

          • Anonymous Coward
          • 3 years ago

          It’s a shame that resource-sharing between cores didn’t work out a little better. Really seems like an idea that could work in real-world situations where power efficiency is important and heavy FP is a bit rare.

            • just brew it!
            • 3 years ago

            I don’t think it was a *bad* design, per se; but AMD was at a massive disadvantage in terms of process technology. Consider that the Piledriver FX CPUs were manufactured on GloFo’s 32 nm SOI process, while Ivy Bridge was being manufactured on Intel’s 22 nm FinFET process. The comparatively poor performance/watt, and the inability to ramp clocks high enough to make up for the IPC deficit are not surprising. TBH the surprise is that they were in the game at all.

            • Anonymous Coward
            • 3 years ago

            Well, I note that they discontinued it completely, so that says something. Maybe shared FPU will show up in ARM some day, there just has to be life in that concept.

      • Krogoth
      • 3 years ago

      It is because OEMs get better bulk deals from HDD manufacturers than they do from SSD manufacturers.

      SSDs are still considered to be “premium”-tier options in the eyes of OEM vendors, despite the fact that SSDs are closing in on HDDs in terms of GB/$$$$ ratio while offering superior performance (yes, even *dog-slow*, cheap SSDs).

      • Anonymous Coward
      • 3 years ago

      A 16GB “hybrid drive” would fix that 95% of the way, and would be my choice.

      • ET3D
      • 3 years ago

      It’s a reasonable configuration. If it came with 4GB of RAM I would have said that an SSD is a must, but 16GB of RAM means that most users will not get any kind of virtual memory related disk thrashing, so the HDD will mostly affect boot time and software loading time.

      • kamikaziechameleon
      • 3 years ago

      I was amazed at how even my first cheap SSD changed the sensation of speed on my machine. Makes it feel like 10 times the PC when on the desktop and clicking around. It’s such a no-brainer.

      • Captain Ned
      • 3 years ago

      I’m forced to use HP ProBooks at work. The upcharge for an SSD compared to its actual cost is obscene.

        • just brew it!
        • 3 years ago

        There’s a bright side to all the corporations and government agencies using HP: Lots of nice deals on off-lease HP refurbs. I currently own two (a ProBook and an EliteBook). As with my desktops (where I tend to build based on hardware from a generation or two back), I really can’t see paying for bleeding edge laptops.

    • Lord.Blue
    • 3 years ago

    Small typo – “DDR4 memory as fast as 2311 MT/s is supported in configurations up to 16GB spread across two slots.” – I think you meant 2133 MT/s.

      • morphine
      • 3 years ago

      Thanks for the heads-up, fixed.

    • K-L-Waster
    • 3 years ago

    That photo looks more like a mITX mobo than a mATX. Only has 1 PCIE slot and 2 DIMM sockets.

      • Lord.Blue
      • 3 years ago

      HP, Dell, and many other OEMs use limited mainboards to lower costs – this is too large to be an ITX board, and there are several lower cost Micro-ATX boards with only 2 DIMM slots.

      • chuckula
      • 3 years ago

      Yeah, it’s got mITX-like memory/PCIe/expansion ports, but on a larger mATX-sized PCB.

      • Chrispy_
      • 3 years ago

      But you can see the four mounting holes that define the edge of mITX, and there’s loads of board to the right and below those holes.

        As Lord.Blue says, it’s because OEMs are cheaping out and making the smallest, most cut-down board they can get away with. To hell with giving the consumer upgrade options: if they only offer one slot in the configurations they want to sell, then screw you and your future desire to pop another card in there; they made an additional $0.18 by short-changing you the expected slots and ports.

        • just brew it!
        • 3 years ago

        The case is probably an oddball form factor as well.

      • Shobai
      • 3 years ago

      It looks to be much more like DTX than anything else that’s been mentioned – a 2 slot design that conforms to ATX standoff placement.

      [edit: DTX and mini DTX were first announced by AMD back in ’07; you only really see boards in the form factor from the OEMs]

    • smilingcrow
    • 3 years ago

    None seen in the Snowdonian National Park today although maybe they were hibernating.

      • LostCat
      • 3 years ago

      Don’t see any out my window either. Though I guess I usually have to go outside to see wild things.

      Meh.

        • SkittlesTheHamstar
        • 3 years ago

        Looked under this table. Still none there. : (

    • Yan
    • 3 years ago

    Also [url=http://h20386.www2.hp.com/CanadaStore/merch/Product.aspx?id=X6F69AA&opt=ABL&sel=DEF<]found[/url<] on HP's Canadian site, with an A12-9800 processor, but marked as "out of stock".

    • chuckula
    • 3 years ago

    Review sample received?

    Incidentally, while the sign says “2GB video card” that is likely rather misleading. The IGP in the A12 9800 is called a “Radeon R7” so the sign is mislabeling the integrated graphics as a “card”.

    See: [url<]http://www.pcper.com/category/tags/a12-9800[/url<] I'll check at my local Costco on next month's run to see if they have any of these in stock.

      • Concupiscence
      • 3 years ago

      I really wonder – apples to apples – how it’d compare with the i3 4170 I built last year.

        • chuckula
        • 3 years ago

        The onboard graphics of the A12 should be faster than the midrange IGP of an i3 (assuming you are using the IGP that is).

        Don’t hold your breath on the CPU side of things though.

          • Concupiscence
          • 3 years ago

          Tighter driver optimizations for AMD’s IGP + GCN 1.2(?) + DDR4 would smoke the HD 4400 part in the i3, I’ve got no doubt of that. That’d pull performance decent enough you could get away with calling it something like an R7 355 or its like – worse than a 360, but definitely better than the 7730-class performance of the fastest FM2+ APUs.

          Assuming Excavator has a late Intel-style performance nudge over Steamroller, which enjoyed a similar boost over Piledriver, you’d still wind up with mediocre per-clock performance. Ganging up all four cores on the same task wouldn’t be horrid, though.

      • Chrispy_
      • 3 years ago

      I don’t have an issue with them labelling the IGP as a card, since the concept of a graphics card is what consumers understand.

      What I have an issue with is labelling the system as 16GB RAM whilst simultaneously labelling 2GB graphics.

      The PC either has an R7 graphics card/doohicky/gizmo and 16GB of RAM or it has a “2GB graphics” and “14GB of RAM”.

      Selling one feature solely on the amount of dedicated RAM means that the 2GB cannot be included in the system RAM spec, otherwise it’s not dedicated RAM.

        • travbrad
        • 3 years ago

        I believe there was even a company with a Green logo recently that had a class action lawsuit against them for misrepresenting their memory configuration. 😉 And in that case they even had the full amount of memory they claimed, it just wasn’t all fast.

          • Tirk
          • 3 years ago

          Intel and AMD systems have listed the entire system RAM capacity for YEARS despite the system reserving some of it exclusively for the iGPU. It’s not just an AMD thing like Nvidia’s case was; the whole industry generally agrees with this viewpoint.

          In fact, system RAM is flexible and has been partitioned to exclusive tasks because that is how the system uses it. Would you like to sue Windows as well for partitioning system RAM exclusively for the operating system?

          The total system RAM is 16GB; they just decided to also indicate that by default 2GB has been reserved for the iGPU. When you use a discrete GPU and disable the iGPU, whether it’s an AMD or Intel chip, that 2GB doesn’t disappear; it goes right back to being part of the full-speed 16GB of non-allocated system RAM for the system to later allocate to something else. What Nvidia did and what Intel and AMD do are two completely different things.

          *edit*
          Here’s Intel explaining it if you don’t believe me:
          [url<]http://www.intel.com/content/www/us/en/support/graphics-drivers/000020962.html[/url<]

        • toastie
        • 3 years ago

        The specs are correct. A close look at the photo of the box shows it has an R7 card.

        • Anonymous Coward
        • 3 years ago

        Do integrated graphics reserve large memory pools these days? Seems primitive and wasteful if so. I would expect something dynamic.
