Make your graphics card weep with Philips’ 328P8K monitor

How was your Labor Day, gerbils? I spent the entire day in front of my PC enjoying my shiny new MSI GeForce GTX 1080 Ti Gaming X. It burns through the games I play in 4K with ease, so maybe it's time for me to start looking at 8K monitors. There's Dell's UP3218K, of course, but soon it won't be the only game in town. Philips just announced the 328P8K, a professional IPS monitor with a 7680×4320 resolution. This bad boy spreads its roughly 33 million pixels across a 32" diagonal.

According to the Blur Busters blog, Philips' display makes use of a 10-bit IPS panel that can reproduce 100% of the wide Adobe RGB color space. The monitor's backlight is nothing to scoff at either, as Philips says the display can shine at up to 400 cd/m². That's bright enough to qualify it for AMD's FreeSync 2 standard, although Philips didn't say anything about support for variable refresh rates.

The company does say the 328P8K supports an "HDR 400" specification, a term we've never heard before. The monitor's maximum brightness of 400 cd/m² falls short of what's usually expected for a convincing HDR10 experience, though. It's possible Philips means the 328P8K can accept an HDR10 signal and display it within its more limited brightness range.

Overall, Philips' new 8K monitor is pretty similar to Dell's previously-announced UP3218K. Like that display, the Philips 328P8K requires a pair of DisplayPort 1.3 connectors to work in 8K mode. There's also a USB hub with USB Type-A and Type-C connectors that supports charging of external devices. Finally, Philips packed in a pair of speakers and a webcam, too.
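
For a rough sense of why a single cable isn't enough, here's a quick back-of-envelope sketch (it ignores blanking overhead and assumes 8-bit-per-channel color, so the real requirement is somewhat higher; the figures are illustrative, not from Philips):

```python
# Back-of-envelope: uncompressed 8K60 payload vs. one DisplayPort 1.3/1.4 link.
# Assumes 8 bits per channel and ignores blanking overhead (illustrative only).

width, height, refresh_hz = 7680, 4320, 60
bits_per_pixel = 3 * 8                          # RGB at 8 bits per channel

payload_gbps = width * height * refresh_hz * bits_per_pixel / 1e9

# One HBR3 link (DP 1.3/1.4): 32.4 Gbps raw, ~25.92 Gbps after 8b/10b coding
link_gbps = 25.92

print(f"8K60 payload:   {payload_gbps:.1f} Gbps")          # ~47.8 Gbps
print(f"One HBR3 link:  {link_gbps:.2f} Gbps")
print(f"Links required: {payload_gbps / link_gbps:.1f}")    # ~1.8, hence two cables
```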

The company didn't say how much the display would cost, but we'd expect it to be in the same ballpark as the $3900 UP3218K. Philips says the 328P8K is coming in the first quarter of 2018.

Comments closed
    • NovusBogus
    • 2 years ago

    –“Professional monitor”
    –16:9 aspect ratio

    …not sure if serious? 16:10 or 1.90:1 please, this “HD” foolishness is best left to the silly consumers with their games and their not-cinematic movie editing. </smugbastard>

      • GrimDanfango
      • 2 years ago

      The “professional” market is far less concerned with the difference between 16:9 and 16:10 aspect than a lot of people seem to think. They’re far more interested in minimizing off-angle colour-shift, a full set of per-channel hardware calibration controls, and a 14-bit LUT.

      Last I checked, modern TV broadcast standards were all 16:9 aspect, and almost all movie formats are even wider, with 2.35:1 being the most common.

    • kamikaziechameleon
    • 2 years ago

    Cool panel density, but otherwise what a waste. The base is ugly, the webcam bump is ugly, the aspect ratio wastes this density… heck, the monitor size actually wastes the resolution. This should be a 43″ ultrawide monitor. Dump the webcam, dump the speakers.

    • Tumbleweed
    • 2 years ago

    Seriously, skip the 400 nit 8K display and get an HDR 4K display.

    • southrncomfortjm
    • 2 years ago

    Can an iGPU handle 8K well for desktop apps, or is a low-end graphics card needed? Just wondering, since 8K is 16x the resolution of 1080p and all.
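
    For a rough sense of the scale being asked about, a quick illustrative calculation (not a claim about any particular iGPU):

    ```python
    # 8K vs. 1080p pixel counts and the size of a single framebuffer.
    # Illustrative only; real desktop compositing overhead varies by OS and GPU.

    pixels_1080p = 1920 * 1080        # ~2.07 million
    pixels_8k    = 7680 * 4320        # ~33.2 million

    print(pixels_8k / pixels_1080p)   # 16.0 -- exactly 16x the pixels

    # One 32-bit (RGBA8) framebuffer at 8K:
    print(pixels_8k * 4 / 2**20)      # ~126.6 MiB per buffer
    ```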

    • JulianNott
    • 2 years ago

    Someone humor me and explain the benefit of this resolution across 32 inches?

    I have an HP Spectre x360 laptop with a beautiful screen but at 15.6 inches I can’t see the slightest difference between 4K and 1080. My main monitor is a 4K Philips BDM4350UC. It is 43 inches but I can’t envisage the image would be any better at higher resolution [nice monitor by the way].

    Enlighten me please!

      • kuraegomon
      • 2 years ago

      Re Philips BDM4350UC: Nice monitor – I owned one for an hour once (I realized after clicking “Purchase” that it didn’t support HDCP 2.2, but Best Buy wouldn’t let me cancel the purchase – so I had to pick it up from the courier, and drive it to a Best Buy to return it :-/ )

      • meerkt
      • 2 years ago

      Maybe now that they can produce 260dpi screens it’s not a problem, and they’re just letting the market decide?

      Anyway, maybe it doesn’t hurt as long as nearest-neighbor is supported for integer divisions, and there’s no extra latency.

        • JulianNott
        • 2 years ago

        I’m all in favor of letting the market decide. But my screen is about 100 dpi, and unless you put your nose against it, you can’t see the dots. Maybe it is like cars that will go 160 mph: people buy them even though they can never use the speed.

          • bronek
          • 2 years ago

          Graphic designers etc. sometimes DO put their noses against the display. I think that’s for them.

      • dme123
      • 2 years ago

      If you cannot see the difference between 4K and 1080P on a 15″ laptop screen then you might need to see an optician. I use two Dell Precision 5520 laptops with the two different screens and the difference is not subtle. Perhaps not as extreme as going from an old school iPhone 3G to a 4 was, but not far off it.

        • JulianNott
        • 2 years ago

        You might like to compare apples with apples. Try the 4k screen at full resolution and then at 1080.

      • jts888
      • 2 years ago

      Seconded, sort of. I’d really rather see 8k in the 40″-45″ range too.
      If you have average or better vision you should be able to pretty easily distinguish ~100 and ~200 dpi though.

    • derFunkenstein
    • 2 years ago

    [quote<] The company does say that the 328P8K supports the "HDR 400" specification, which is a term we've never heard before. The monitor's maximum brightness of 400 cd/m² isn't enough to qualify it for HDR10 support. It's possible that Philips' take means the 328P8K can accept and display an HDR10 signal and display it within its brightness range.[/quote<] well obviously HDR 400 is better than HDR 10. The number is like 40 times more. So this has to be an improvement.

      • smilingcrow
      • 2 years ago

      I’m going to hold out for HDR-11 as the colours have infinite ‘sustain’ even when the screen is powered off.

        • derFunkenstein
        • 2 years ago

        Why not just make 10 brighter?

          • smilingcrow
          • 2 years ago

          But this goes to 11!

        • meerkt
        • 2 years ago

        Infinite sustain? I wanna see that. Particularly the 12th pixel on the 4th line.

          • jihadjoe
          • 2 years ago

          But the conductor wants you to play it in pixellisimo

    • Chrispy_
    • 2 years ago

    Can the resolution be changed without interpolation filtering ruining the image?

    No matter how much better dpi-scaling is getting in Windows and applications, sometimes you still just can’t use it. When I have to fall back to 1080p on 4K displays or similar, everything looks utterly appalling thanks to the mandatory bilinear filtering.

    If this thing could disable filtering, it would be ideal because pixel doubling provides a 4K resolution, tripling it provides 1440p and quadrupling it provides 1080p without any need for hideous blur that nobody asked for and nobody wants.
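
    A minimal sketch of the integer scaling being described (assuming NumPy; this just models what a no-interpolation scaler would do, it isn't an existing driver or monitor feature):

    ```python
    import numpy as np

    def integer_scale(image: np.ndarray, factor: int) -> np.ndarray:
        """Nearest-neighbor upscale by an integer factor: every source pixel
        becomes a factor-by-factor block, with no interpolation blur."""
        return image.repeat(factor, axis=0).repeat(factor, axis=1)

    # A 1080p frame (height x width x RGB) quadrupled to fill the 8K panel:
    frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
    frame_8k = integer_scale(frame_1080p, 4)
    print(frame_8k.shape)   # (4320, 7680, 3)
    ```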

      • CuttinHobo
      • 2 years ago

      Excuse me, but *I* asked for hideous blur.

        • Chrispy_
        • 2 years ago

        My apologies, I did not mean to presume on your behalf!

        Could you please in future smear petroleum jelly over your eyeballs so that the rest of us can enjoy monitors without interpolation? That would be a good solution for everyone and the upshot is that then you get bilinear-filtered vision even when you look [i<]away[/i<] from the monitor! [i<]* side effects may include blindness, excruciating pain and inflamed eyes. If your eyes are literally on fire please consult your GP immediately[/i<]

          • CuttinHobo
          • 2 years ago

          I tried that but the constant need for reapplication was a challenge. It was easier to convince display manufacturers to save $0.04 per unit.

          I did the same thing like a decade ago so OEMs would stuff 1366×768 displays into nearly every mainstream laptop. And I’ll do it again with 1080p laptops until budget phones have 8K displays!

          [Maniacal laughter]

        • derFunkenstein
        • 2 years ago

        Leave it to the hobos.

          • Liron
          • 2 years ago

          They saved Middle Earth!

          • mcarson09
          • 2 years ago

          I wondered what that funky smell was…

      • psuedonymous
      • 2 years ago

      [quote<]Can the resolution be changed without interpolation filtering ruining the image?[/quote<]

      At high DPI, [b<]yes[/b<]. The 'resolution independence' of CRTs came from the phosphor dot pitch being much finer than the 'pixel' of the scanning electron beam, which could cover many physical dots. With very high-DPI LCD displays, you can get exactly the same effect regardless of integer or non-integer scaling. It is only when the pixel pitch and dot pitch are very close that you start to see aliasing artefacts.

      ::EDIT:: For example, the much-vaunted Sony GDM-FW900 has a DPI of ~110 (0.23mm dot pitch). The 328P8K has a PPI of ~263, beating it quite handily in both addressable and physical pitch.

        • RAGEPRO
        • 2 years ago

        You rather understated there, haha. CRT phosphors were absolutely minuscule, and they weren’t laid out in a grid pattern the way LCD subpixel clusters are. There’s no LCD on the planet, even those in smartphones, that comes close to having the kind of multisync capability that CRTs did.

          • SecretSquirrel
          • 2 years ago

          The phosphor particles were tiny, but the R, G, and B phosphor dots certainly were not. Might want to read up on the shadow masks and aperture grilles used in CRTs.

            • juzz86
            • 2 years ago

            I remember when I got my first Trinitron with an aperture grille. Pissed me off for ages, couldn’t not see the damn thing.

            Read too many reviews 🙁 Ended up a beautiful unit, still got it in the shed.

            • BurntMyBacon
            • 2 years ago

            I remember when I first got my hands on an NEC Diamondtron (aperture grille similar to Trinitron). I got it from a guy who, like you, read too many reviews and couldn’t not see the lines. Consequently, he decided not to tell me it was an aperture grille when he gave it to me.

            I remember thinking the screen was beautiful and wondered why he was getting rid of it. It took me about thirty minutes before I noticed the lines. I didn’t know a thing about aperture grille displays at the time, so I thought it was a defect in the screen. After spending a little time looking up what might cause such a problem and how to fix it, I came across some reviews and enlightened myself. Interestingly, I felt rather better about the monitor after that. While the lines never went away, I was no longer bothered by them. Still have it in fair working condition (though I’m pretty sure it isn’t quite as bright as it once was).

            • juzz86
            • 2 years ago

            Not telling you is a bit lousy, but I can believe it mate.

            I was lucky enough to make a sizeable resolution jump (1024×768 [or maybe even 800×600, unsure] to 1600×1200) as I came across from shadow mask, which lessened the pain somewhat – never mind the price I paid for the thing!

            Ended up serving me faithfully for a good period of time, similar to you I never couldn’t see them, but the benefits outweighed the cost by a long margin!

            It was also black, and I was migrating away from beige at the time – double-win!

          • psuedonymous
          • 2 years ago

          [quote<]CRT phosphors were absolutely minuscule, and they weren't laid out in a grid pattern the way LCD subpixel clusters are.[/quote<]

          The phosphor dots in a shadow-mask CRT (or stripes in an aperture-grille CRT) were not that small (generally 1/4mm pitch was fairly standard), and many LCD and OLED panels now have pixels smaller and more closely spaced than those dots/stripes. In arrangement, the aperture-grille phosphor layout is extremely similar to the parallel vertical bars of typical LCD panels. Shadow-mask triads are still arranged in a regular grid as with LCD or OLED pixels, but use more of an offset than most LCDs do (they are similar to PenTile OLED layouts, though PenTile OLED uses an RGBG sequence).

        • meerkt
        • 2 years ago

        Still, would be nice to have integer scaling left untouched.

        Though at 8K it probably matters little.

      • jihadjoe
      • 2 years ago

      IIRC, the option to do all scaling on the GPU instead of the monitor has been in Nvidia’s and AMD’s drivers for a long time.

        • Chrispy_
        • 2 years ago

        Indeed – I’ve bought monitors without scalers in the past knowing that I could rely on the GPU.

        Even then, there’s still no way to disable the bilinear filtering 🙁

          • rutra80
          • 2 years ago

          To make things even worse, most current systems (Windows 10 with an AMD or Intel GPU) don’t obey the display driver’s fullscreen / keep-aspect-ratio / centered scaling settings.

    • blahsaysblah
    • 2 years ago

    Something is off. Any old 1080p panel: 8-bit color, 256 shades per channel. A good 4K (edit: 2160p) panel: real 10-bit, 1024 shades. That’s an improvement.

    I would have thought you’d want more than 10-bit color (a.k.a. 1024 distinct shades) for 7680×4320 pixel dimensions. That’s the same shades-to-pixels ratio as the 8-bit panels from decades ago. Not what a professional monitor should be using.

    It sounds like they are cutting the 8K from current 4K tech, like how early on (and even now) there are still a lot of cheap/crappy 4K panels made from four tiles of old 8-bit/1080p tech.

      • chuckula
      • 2 years ago

      [quote<]I would have thought you'd want more than 10 bit colors (aka 1024 distinct shades) for a 7680 by 4320 pixel dimensions. That same ratio as the 8-bit panels from decades ago. Not what a professional monitor should be using.[/quote<] ???? There is no inherent dependency between the number of pixels on a screen and the color depth of the pixels. I'm not sure what you are trying to say.

        • blahsaysblah
        • 2 years ago

        Color banding… How do you have a smooth gradient across 7680 pixels with only a thousand distinct shades of each color?

          • chuckula
          • 2 years ago

          Actually color banding would be noticeably better with a higher-resolution monitor because it allows for far more effective color dithering when there are more pixels to work with.
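
          A toy sketch of that effect (assuming NumPy; plain random dithering stands in for whatever a real GPU or panel would actually use):

          ```python
          import numpy as np

          # Quantize a smooth ramp to 8 bits, with and without dithering.
          # Hard quantization leaves wide flat bands; dithering breaks them
          # up into fine noise, which more pixels make easier to hide.

          width = 7680
          ramp = np.linspace(0.0, 1.0, width)

          plain = np.round(ramp * 255) / 255
          noise = (np.random.rand(width) - 0.5) / 255   # +/- half a step
          dithered = np.round((ramp + noise) * 255) / 255

          def mean_run(x):
              """Average length of runs of identical values."""
              starts = np.flatnonzero(np.diff(x) != 0) + 1
              return np.diff(np.concatenate(([0], starts, [len(x)]))).mean()

          print(mean_run(plain))      # ~30 px per band at 8 bits
          print(mean_run(dithered))   # much shorter runs -- banding smeared out
          ```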

            • blahsaysblah
            • 2 years ago

            4K + HDR/10-bit color + Rec. 2020 is real 4K. A four-times cut from decade-old 8-bit TN panels does not make a 4K monitor.

            To me it’s not 8K unless it goes beyond 10-bit color/HDR and beyond Rec. 2020 colors…

            I guess I’m the only one that feels that way.

          • willmore
          • 2 years ago

          Because the human eye cannot distinguish between the 1024 brightness levels in a 10-bit panel. Heck, 8-bit is almost at the limit. Going to 10-bit was convenient and provided some extra margin for the people with ‘golden eyes’.

      • RAGEPRO
      • 2 years ago

      I’m with chuckula here. What?

        • blahsaysblah
        • 2 years ago

          What’s the point of 7680 pixels if you can’t have smoother color changes…

          • just brew it!
          • 2 years ago

          The main point of 8K is not to have smoother color changes (though that can certainly be one benefit, since higher resolution also facilitates finer dithering). The point is to be able to display finer detail without getting aliasing artifacts or blurring.

          Being able to display a perfectly smooth gradient across the entire width of the monitor (where each pixel is exactly 1.000125 times as bright as the previous one) seems like a ridiculously contrived case, and doesn’t seem like it would be terribly useful for predicting how good the monitor will look in practice. It’s effectively a visual synthetic benchmark that doesn’t correlate with real-world use cases in any meaningful way.

          • meerkt
          • 2 years ago

          Same as a 600dpi laser printer instead of 300dpi.

      • UberGerbil
      • 2 years ago

      We had 8bit panels going back as far as 640×480, so that meaningless ratio you’re concerned about has been in decline for decades… which makes sense, since the tech approached the human retina’s chroma resolution limits long before we got close to its spatial resolution limits (its tonal limits are higher, but unless you’re viewing everything in b&w that doesn’t really matter). So, to echo chuckula and RAGEPRO…. uhm, what?

      • derFunkenstein
      • 2 years ago

      8 bit is 8 bits per channel. 256 possible options for Red, Green, and Blue, which works out to 256x256x256 = 16.78M colors. 10 bits per channel is 1 billion colors.

      Most IPS displays these days, especially the cheaper ones, use 6-bit panels good for 262,000 colors and more with dithering.
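
      Spelled out as a quick sanity check (just the arithmetic behind those numbers):

      ```python
      # Total colors for common per-channel bit depths (levels cubed for RGB).
      for bits in (6, 8, 10):
          levels = 2 ** bits
          print(f"{bits:>2}-bit: {levels:>4} levels/channel = {levels**3:,} colors")

      #  6-bit:   64 levels/channel = 262,144 colors
      #  8-bit:  256 levels/channel = 16,777,216 colors
      # 10-bit: 1024 levels/channel = 1,073,741,824 colors
      ```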

      • Chrispy_
      • 2 years ago

      [url=http://www.hit-karlsruhe.de/aol2mime/images/VerlaufRot.png<]Here's a graphic showing 8-bit colour against 7-, 6-, 5-, 4-, and 3-bit colour[/url<]. Humans can distinguish around 4 million colours, so the 256 shades of 8-bit colour are more than the 160-ish shades per channel we can deal with.

      The only reason the industry moved to 10-bit is that when you work in video production and you're compositing images from multiple 8-bit sources, the rounding across multiple layers limited to 256 steps can add up to deviations larger than the steps we can actually detect. Typically it is safe to composite at least six 10-bit image sources and still have the result look "perfect" to the human eye.

      As a consumer simply viewing content on a screen, 8-bit is already adequate and 10-bit is simply overkill (but marketable to the gullible, and therefore profitable).
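
      A toy illustration of the accumulation being described (assuming NumPy; it only shows how per-layer rounding drift can stack up in the worst case, not how any particular compositing pipeline behaves):

      ```python
      import numpy as np

      # Sum six layers that were each rounded to 8 bits and compare against
      # summing the full-precision originals. The worst-case drift can exceed
      # a single 8-bit step, which is the banding risk being described.

      rng = np.random.default_rng(0)
      layers = rng.random((6, 100_000))       # six full-precision layers

      exact = layers.sum(axis=0)
      quantized = (np.round(layers * 255) / 255).sum(axis=0)

      drift_in_steps = np.abs(quantized - exact).max() * 255
      print(f"worst-case drift: {drift_in_steps:.2f} 8-bit steps")
      ```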

        • GrimDanfango
        • 2 years ago

        It’s entirely possible to edit higher precision data than your screen can reproduce… you certainly don’t need a 10-bit panel to edit 10-bit images. Compositors frequently work with 32-bit-per-channel floating point image data, and there definitely isn’t a screen on Earth that can display that. The results from compositing a bunch of 32-bit layers together can be displayed perfectly well on an 8-bit panel.

        10-bit panels exist only because we *can* perceive the differences between shades.

        Any time you read a source claiming “the human eye can only perceive #” – you can pretty much write that source off as nonsense.

        Within the limited range of a typical standard-dynamic-range monitor, it certainly could be argued that *most* people will have difficulty differentiating more than 256 shades, but even then, it may be easier to discern differences between darker shades than lighter ones, as our eyes have a massively non-linear response to light… plus in reality, there’s a far greater dynamic range of light our eyes can “see” than a typical monitor can display, even than the best of the best HDR monitor can currently display.
        In the future, as HDR gets more and more advanced and reproduces ever more of the range we can actually see, it might become entirely useful to have 12-bit, 14-bit, 16-bit per channel.

        As it stands currently, 8 bit is juuust about adequate for an SDR display, while 10 bit is juuust about adequate for a typical current HDR display.

          • brucethemoose
          • 2 years ago

          There are video encoding advantages too:

          [url<]https://gist.github.com/l4n9th4n9/4459997[/url<]
          [url<]http://x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf[/url<]

          • blahsaysblah
          • 2 years ago

          This. I thought that instead of just getting more pixels, you’d want to improve on all aspects of a display, not just make a 4K monitor by cutting a bigger piece from the same kind of sheet that was making monitors 10 years ago.

          Same with 8K: if it’s just a bigger cut from a 4K panel, that’s not real 8K. It should move to at least 12-bit color and even better HDR… and bump up from Rec. 2020 to something higher than 76% of human-perceivable colors.

          Of course, no display even supports Rec. 2020 output yet, only input (as far as I am aware).

            • GrimDanfango
            • 2 years ago

            As has been pointed out elsewhere – 8-bit has been around a LONG time at this point, and 10-bit panels have only just started to enter the upper-mainstream. Whatever the reasons may be, I get the feeling we won’t be seeing 12-bit panels for a few years yet. I’m not even aware of 12-bit making inroads in the professional end of the market yet.

            As much as it may have a measurable benefit, the move from 10 to 12 will still have diminishing returns over the move from 8 to 10… HDR will have to hit quite significant dynamic-range before any manufacturer considers it worth moving up to 12.

            • DPete27
            • 2 years ago

            You might be underestimating the technical challenges in producing a “sheet” of 33 million pixels at the scale of a monitor/TV without having a lot of dead ones….

            just sayin’

      • shank15217
      • 2 years ago

      Whatever, I used to roll with CGA graphics

    • ptsant
    • 2 years ago

    I’m not yet convinced that I need 4k, at least before having to mount the monitor to a wall. Much more interested in HDR.

    OLED screens are impressive…

      • tay
      • 2 years ago

      OLED is not coming to monitors anytime soon. QC and burn-in aren’t there yet.

        • southrncomfortjm
        • 2 years ago

        Burn-in isn’t an issue anymore far as I know.

          • Hallucinosis
          • 2 years ago

          I believe it’s still an issue, but as anecdotal evidence: I know someone with a poke-ball burned into their Samsung Galaxy S7 OLED display.

            • southrncomfortjm
            • 2 years ago

            Maybe for cell phones, but I think this has been solved for the TVs. Either way, it’s all academic, OLED isn’t coming to monitors anytime soon.

            • UberGerbil
            • 2 years ago

            Reduced, but not solved. From [url=http://www.rtings.com/tv/reviews/lg/c7-oled<]rtings[/url<]: [quote="rtings"<]The C7P OLED TV is prone to image retention just like previous models. Fortunately, the image retention is less strong than what we have measured on 2016 LG OLED TVs. If you find out that your TV has some image retention after playing video games over a long time for example, there is a function in the 'Picture settings' page, under 'OLED Panel Settings' named 'Pixel Refresher' that will 'recalibrate' the screen to get rid of any imprinted images that may still visible. This procedure lasts around one hour and the TV needs to be shut off for it to work. This can usually take care of any image retention. Another feature is also available on the same settings page named 'Screen Shift' that will 'move' the screen slightly (you can't really notice it) to make the image retention less problematic. For our test, this feature was turned on but there is still some image retention.[/quote<]

        • mcarson09
        • 2 years ago

        The Dell UP3017Q 30 UltraSharp OLED Monitor tells you that: You’re Wrong. ;D

        [url<]http://www.dell.com/en-us/shop/accessories/apd/210-aiei[/url<] A little bit more info here: [url<]https://pcmonitors.info/dell/dell-up3017q-4k-uhd-oled-monitor/[/url<]

    • willmore
    • 2 years ago

    Okay, I think I need to go find my old posts from a few years back where I lamented how displays were stuck at 1080p and were all crappy TN.

      • Wirko
      • 2 years ago

      Remember when TR had a [url=https://techreport.com/blog/21305/what-pc-makers-think-is-happening-at-retail<]comic section[/url<]?

        • derFunkenstein
        • 2 years ago

        “everybody” complained about them. Somewhere on this site I believe Cyril was revealed to be “Fred”

          • Mr Bill
          • 2 years ago

          Amusing comments there. I think we should start up the comic section again. Give folks something to, er, discuss between ticks and tocks.

          • LostCat
          • 2 years ago

          I only complained about <1080p screens. Silly laptops.

        • tipoo
        • 2 years ago

        I ‘member

        • mcarson09
        • 2 years ago

        I just long for TR to get their reviews out on time ;D

      • UberGerbil
      • 2 years ago

      While you’re at it, can you find the ones where folks like me complained that there were only like two decent mechanical keyboards on the market and we were all trying to get our old PS/2 Model Ms to work reliably on USB?

        • dpaus
        • 2 years ago

        <hangs head in guilt…

        …and realizes that hanging my head in guilt leaves me staring at the very-, very-well-worn spacebar of my Steelseries 310. And I smile>

      • Mr Bill
      • 2 years ago

      Remember when only TBSC sound cards would work in an SMP setup?

        • derFunkenstein
        • 2 years ago

        I don’t, but I’m guessing we’re talking about the S370 or Socket A days where VIA’s southbridges had PCI issues.

          • Mr Bill
          • 2 years ago

          Yeah it was the dual celeron 370’s (MOSFET BP6 days) and dual MP setups. Turtle Beach Santa Cruz had drivers that were perfectly happy in a dual socket SMP system.

            • derFunkenstein
            • 2 years ago

            Huh..would not have expected a 440BX motherboard to have issues with anything, but I guess that’s as much the fault of the audio vendor as anything.

        • MOSFET
        • 2 years ago

        I do. BP 6 days

      • GrimDanfango
      • 2 years ago

      I recall there were a few years during which every… single… laptop shared the exact same 1366×768 TN panel. I’m so happy to type this on a 1080p IPS laptop 😛

    • chuckula
    • 2 years ago

    Yawn. Wake me up when it’s 16K, which matches the kilobytes of RAM in my 1980’s TRASH-80.

      • Neutronbeam
      • 2 years ago

      My grad school roommate had a TRASH-80 and I had the original 128K Mac…I don’t miss those days; nostalgia is not what it used to be.

        • derFunkenstein
        • 2 years ago

        Playing old computer and console games on original hardware is fun, but they take up a lot of space. And for sure I wouldn’t want anything with moving parts or a CRT that could burn out. Probably better off using an emulator for the majority of this stuff these days.

      • trackerben
      • 2 years ago

      Ha, my 1996 NX-6000 256KB bests TandyAppleIBM’s launch units. Only Fortune’s beat it. Sad!

      • UberGerbil
      • 2 years ago

      8K? 6.40K is enough for anybody.

      Edit: (Fun fact: in 1982 I mowed lawns all summer to earn $200, enough to buy a 16KB upgrade to take my Apple ][+ from 48KB to 64KB. Yes, $200 for 16KB that ran at 1MHz. You think RAM prices are bad now. 😉 But it paid off: by the summer of 1983 I had a job writing games for the Commodore 64.)

      • dpaus
      • 2 years ago

      Sneer all you want; I wrote my first computer-aided dispatch system on a TRS-80 with 16K in 1980. It loaded its BASIC from a cassette drive. Depending upon your definition of ‘microcomputer’ (IMHO, a PDP-11 is NOT a ‘microcomputer’), it was the first microcomputer-based CAD system in the world.

      • USAFTW
      • 2 years ago

      I assume (and hope) that that moniker was not an official thing. If so, that would be a more overt mistake than AYDS weight loss candy. OTOH, it would be a fitting descriptor in a comparison with PCs of the modern era.

        • UberGerbil
        • 2 years ago

        It occupied the exact same conceptual/official space as “Windoze”

        But the candy was named before the disease was recognized (and quite probably before it started spreading outside Africa). So it’s a bit silly to label that trademark a mistake unless you think the people involved should have been capable of seeing into the future.
