Virtual Retinal Display beams images onto your eyeballs

Everyone seems to be pretty excited about the Oculus Rift VR headset. The folks at Avegant are working on an even more intriguing head-mounted display. Dubbed the Virtual Retinal Display, the headset eschews LCD screens in favor of a micro-mirror array that beams reflected light directly into the eyes.

Avegant CEO Ed Tang spoke to CNet about the technology, which purportedly eliminates eye strain while also producing more realistic images than traditional LCDs. He describes the difference as being akin to watching something through a window versus on a screen. You can see CNet’s Tim Stevens try out the headset in the video below.

The current prototype offers a resolution of 1280×800 per eye, but the perceived resolution is supposedly much higher. According to Tang, the micro-mirror tech eliminates the pixelization and screen door effect associated with traditional displays. Stevens says the resolution and color are "quite good" and notes that the picture is comfortable for his eyes to view.

Somewhat surprisingly, gaming isn’t Avegant’s focus. The company thinks people will use its Virtual Retinal Display to watch video content. There don’t appear to be any barriers to gaming on the thing, though. The prototype can already pump out images at 240 frames per second.

Although the existing versions of the Virtual Retinal Display look a little funky, a more polished, consumer-oriented product is set to debut in the first quarter of next year. Avegant is planning a crowd-funding drive to fuel production, and it promises a "consumer-friendly" price.

Comments closed
    • 0g1
    • 6 years ago

    This is crap compared to the Oculus Rift because the field of view is so tiny, probably about 10 degrees. This is because they have to project directly onto your eye: the individual pixels have to be cast from the display, much like a projector, but projected straight through your eyeball. After entering your eye, the image then expands to their stated 40-whatever degrees, but you first have to look through this tiny "objective" of around 10 degrees. The eye's attention FOV is about 5 degrees (what you are currently looking at). While you can see about 180 degrees of FOV, only that 5 degrees is what you really process; the rest you have to make guesses on. You have very little freedom to move your eyes when looking through the 10-degree "objective" to take in the whole 40ish-degree scene with this device. So that's why it's only good for video: you're expected to keep your eyes pretty much stationary the whole time, looking at the center of the view.

    Most movies try to be centered, but they never really are. That's why you never sit in the front row at a movie unless you want fatigue from constantly moving your eyes/head to different parts of the view.

    Games are much less "centered" than movies. That's why everything is in focus and depth of field is rarely used: in games you can look wherever you want. Typically, looking at your monitor, it's about 45 degrees of FOV; the Oculus Rift is 90 degrees. The difference with the Oculus is like sitting really close to your monitor (or in the front row at the cinema) — you basically have to move your eyes a whole lot more to look at whatever detail in the game you want to see. That's why competitive gamers will never replace monitors with the Rift — sitting back at a 45-degree FOV allows a much quicker view of the action, let alone being able to glance at your keyboard (and the real world) quickly.

    Still, the Oculus Rift would be well served by a 3-chip LCoS running at least 240 Hz, but it would be a bit too bulky, projecting onto a screen, and requiring pixel alignment. Hopefully Carmack will insist on OLED being used at 120 Hz with minimal pixel persistence.

    • CityEater
    • 6 years ago

    Is the display technology something similar to a DLP system? Kind of a pair of LED pico DLPs projecting into either eye?
    It would be great if Avegant could partner with someone like NaturalPoint to expand the gaming capabilities. As a TrackIR user I like the way they implement 6DOF in the rare games that support it. I think Oculus faces an uphill battle for game support once it's commercialized, and practically speaking I'm not going to be as interested in 1:1 headtracking in most games. I haven't gotten to try the OR though, so I don't know what it's like.
    If the price is right and the technology is up to scratch I would expect them to sell a ton of these.
    It's exciting to see science fiction become consumer hardware.

      • 0g1
      • 6 years ago

      Yes, it is similar to DLP in that it is reflective. But due to the high resolution needed in such a small size, I think they would be using LCoS, which is reflective too. They would probably be using a single LCoS panel with sequential color provided by three color lasers. A 3-chip LCoS would take too much space, so while they say 240 Hz, it's effectively 80 Hz, which is quite low for sequential color. To completely get rid of the "rainbow effect", you'd want about 540 Hz.
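
      For anyone wondering where those numbers come from, here is the field-sequential-color arithmetic as a quick sketch. The 240 Hz field rate, three color fields, and ~540 Hz target are the commenter's assumptions, not published Avegant specs.

      [code<]
# Field-sequential color: one panel shows R, G and B fields in turn,
# so the effective full-color rate is the field rate divided by the
# number of color fields. All figures are the commenter's assumptions.
field_rate_hz = 240       # claimed refresh/field rate
color_fields = 3          # R, G, B shown one after another

effective_color_rate = field_rate_hz / color_fields
print(f"Effective full-color rate: {effective_color_rate:.0f} Hz")  # 80 Hz

# Per-color rate implied by the commenter's ~540 Hz suggestion:
print(f"Per-color rate at 540 Hz: {540 / color_fields:.0f} Hz")     # 180 Hz
      [/code<]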

    • tanker27
    • 6 years ago

    as opposed to Retina Display……. >.>

    • Khali
    • 6 years ago

    What about us poor folks that have to wear glasses? Will this device be adjustable to take into account our vision problems or will it work with our glasses still on our heads?

      • avegrant
      • 6 years ago

      There will be adjustments for myopic vision.

    • Arclight
    • 6 years ago

    How safe exactly are direct beams of light, as opposed to natural incidental (or w/e it's called, I'm no physicist) light coming from objects in real life?

    I'm just a layman, but to me the unknown that comes with new ways of doing things worries me a bit. In this instance I'm questioning the long-term safety of prolonged use of such a "display".

    We have used projectors for centuries but we never set them up to project onto our eyes.

    I suppose that the company has collaborated and tested extensively with the help of doctors, but I'm still curious to see the evidence and maybe some contraindications or warnings (not in the promotional material/advertisements ofc, but in a lengthy PDF or w/e easily located on their site).

      • kravo
      • 6 years ago

      +1

      Imagine a system failure with these things on. A BSOD right into your eyes.

      • GrimDanfango
      • 6 years ago

      It makes me wonder just what qualifies something as “beaming light directly onto your retina”… seeing as all forms of display emit light, which travels as light rays, passes into your eye, and lands on your retina. That’s kind of how vision works.

      Lasers are dangerous because they focus otherwise unremarkable light rays along a parallel path, which means their intensity is barely diminished when they reach their target, whereas ordinary diffuse light falls off with the square of the distance.
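
      To put rough numbers on that falloff, here is a toy comparison of a small diffuse source (inverse-square falloff) with an ideally collimated beam. The 1 mW power and 1 mm beam radius are arbitrary illustrative values, not anything from Avegant.

      [code<]
import math

# Light from a small diffuse source spreads over a sphere, so its
# irradiance falls off with the square of the distance; an ideally
# collimated beam keeps its power in a fixed spot. Toy numbers only.
power_w = 0.001                                  # 1 mW source
beam_area_m2 = math.pi * 0.001 ** 2              # ~1 mm radius beam

for dist_m in (0.1, 1.0, 10.0):
    diffuse = power_w / (4 * math.pi * dist_m ** 2)  # W/m^2
    collimated = power_w / beam_area_m2              # constant W/m^2
    print(f"{dist_m:5.1f} m: diffuse {diffuse:.1e} W/m^2, beam {collimated:.1e} W/m^2")
      [/code<]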

      If they're projecting parallel beams of light (aka lasers), there's no reason they couldn't calibrate them in such a controlled environment to carry only the tiny amount of light they'd need to replicate the effect of ordinary vision… but you'd certainly want a hardware-imposed maximum limit, or the idea of a system crash burning your retina doesn't sound that far-fetched.

      If they’re not using something analogous to laser light, then I’m not sure what would differentiate it from a normal emissive screen technology like in the Rift.

        • avegrant
        • 6 years ago

        We’re using ultra-low power LEDs and a fancy micromirror array. It’s 100% safe.

        Really the crux of the invention is in the combination of the micromirror and LED along with some proprietary optics. The display is truly unlike looking at an LED/LCD/OLED display — much brighter and more vivid. Hopefully you'll be able to try it out soon.

          • GrimDanfango
          • 6 years ago

          Interesting, thanks for taking the time to fill in some details.

          Edit: Err, I was asking about what formed the image, but I realized you already mentioned it was an array of micro-mirrors.
          I can’t quite imagine the principle though – does it have single R/G/B LEDs, and the micro-mirrors are literally in a per-pixel/per-subpixel array? Do they rotate to modulate the amount of light reflected, or am I way off? 🙂

          • Arclight
          • 6 years ago

          Thanks for replying. LEDs do indeed sound safe and have been used in displays for years. That said, how was it determined that the light bounced off the micromirrors is safe and suitable for such an application? The article says the device had/has medical applications, but I doubt those were used 14 hours a day or more (I sure know I would use it that much).

    • Aphasia
    • 6 years ago

    So alongside the Oculus Rift comes the Virtual Retinal Display, and yes, I love that it is reminiscent of the implementation in Snow Crash, but I would love a gaming implementation of it with a slightly larger FOV.

    The bigger question though is when do we get MORE 😉
    [url<]http://www.youtube.com/watch?v=cCeeTfsm8bk[/url<]

    • Bensam123
    • 6 years ago

    I actually like the look of that existing prototype, with the plexiglass over the electronics. Definitely needs a headband though.

    I wonder how this would work with glasses, if at all.

    This definitely sounds like a better take on VR than the Rift. Beaming images onto your eye is definitely different than looking at an essentially flat screen, and more reminiscent of the way things work in real life. In real life you absorb light from around you instead of looking at something that emits light (although they sound the same). I look forward to seeing where this goes.

      • Pwnstar
      • 6 years ago

      If I understand the tech correctly, there would be no need for glasses with this. The headset IS your glasses.

        • faramir
        • 6 years ago

        Precisely.

        “The company thinks people will use its Virtual Retinal Display to watch video content. ”

        The company is stupid. People spend gajillions every year to offset their physical deficiencies (such as impaired vision). A high-resolution display system combined with a good-quality camera and image processing would alleviate even problems that today's glasses cannot (such as nystagmus), given that the refresh rate of this system is already extremely high (a 240 fps rate was mentioned). This thing will print money once they get it approved for medical use…

          • faramir
          • 6 years ago

          Forgot to add other uses for such a product:

          1: night vision, IR vision, etc. (anything we can make a detector for)

          2: magnification (optical and digital, provided that the image sensor resolution is high enough)

          3: input normalization (= dimming in cases of extreme flashes which would damage your eyes if you looked at them directly)

          4: information overlay (a replacement for HUD)

          5: teleoperation

          and these guys envision people only watching movies using one of these?

            • avegrant
            • 6 years ago

            NB, we're not trying to target just video consumption; we're trying to be media-agnostic in the first series of prototypes, meaning you can plug any signal you want (games, endoscope, drone, and yes, MOVIES) into the headset. The FOV was just chosen to accommodate the broad market instead of locking us into VR.

            Later we will be able to add cameras to the front of the device and it’s definitely on our radar. One step at a time though. 🙂

        • Bensam123
        • 6 years ago

        I can't imagine that being right. They'd need some pretty ridiculous technology to compensate for light and how it enters your eye. It would have to calibrate to everyone's eyes separately, essentially the same thing as those monstrosities you see in Walmart, for instance, that calculate starting points for the optometrist.

        There is no way for it to bypass your cornea and lens and beam the images directly onto the back of your eye. Everyone's is different, especially if you have glasses. Astigmatism would make it even harder to compensate for.

        Maybe there is calibration software when you first set it up, but that wouldn't be automatic. You'd have to go through everything your optometrist does to find the right settings, or you'd have to input the correction from your glasses.

          • Diplomacy42
          • 6 years ago

          You are forgetting one key difference between you and your optometrist: you know whether the thing you are looking at is a clear image or not.

          Consider a microscope. You just keep turning the knob until it looks right.

            • Bensam123
            • 6 years ago

            But the device does not. The device needs to know what sort of an image to project in order to make it look right inside of your eye.

            • Diplomacy42
            • 6 years ago

            The "device" doesn't need to know anything about your eye. It uses mirrors and lenses, just like, now follow me here, a microscope.

            Does the microscope need to "know" what your vision is? No. You tune it yourself, and then the focal point remains fixed unless you tune it again.

          • liquidsquid
          • 6 years ago

          I don’t see why it cannot work.

          Think of the continuously correcting mirrors used in astronomy: send up a reference point (a laser) and see what bounces back. The sensor should always see a single dot, and the mirror adjusts as best it can to maintain that.

          I would venture a guess that the technology does something similar, where it can see the reflection of what it is sending and make adjustments as needed. Maybe not perfectly accurate, but it could be far better than your eye is naturally, as it could optically correct as it goes. If so, that is very cool tech!
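
          As a very loose illustration of that closed-loop idea, here is a minimal feedback sketch: measure where the reflected reference spot lands, nudge the mirror, repeat. The function names, gain, and numbers are all made up for illustration; this is not Avegant's actual method.

          [code<]
# Toy closed-loop correction: the sensor reports how far the reflected
# reference spot is from center, and the mirror tilt is nudged until
# the error shrinks. Purely illustrative.

def spot_error(mirror_tilt, eye_aberration):
    # How far the reference spot lands from the sensor center.
    return eye_aberration - mirror_tilt

def correct(eye_aberration, gain=0.5, steps=20):
    tilt = 0.0
    for _ in range(steps):
        tilt += gain * spot_error(tilt, eye_aberration)  # proportional step
    return tilt

print(correct(eye_aberration=1.8))  # converges toward 1.8
          [/code<]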

          I am thinking more along the lines of the headset used by the Predator. Check out the world around you in UV, IR, etc. Night vision being the coolest!

          Also, being in the electronics industry and working with very small parts, something like this would be amazing for electronics assembly and rework if it had accurate positioning. Especially if it is truly wearable for 8+ hours of a work day.

            • Bensam123
            • 6 years ago

            Yeah, this would be pretty cool if it could do all of this without glasses. I just don’t see it being able to do it very well, especially with astigmatism.

            • avegrant
            • 6 years ago

            Hey Bensam123,

            Yes, you can use it without glasses — the diopters adjust for a variety of prescriptions, and all you have to do after that is adjust the pupillary distance. On the prototype it's a bit clunky, but the final version will have a clean design.

            Astigmatisms will have to be corrected with special lenses. It’s doable, but not in this first round design.

            Cheers,

            Grant

            • Bensam123
            • 6 years ago

            Neat… Does that mean this will work with contact lenses then, if you have an astigmatism?

            Are you part of the development team for this? If you are, you should consider identifying yourself as such (for a variety of reasons).

            I didn't even correlate the name until I read another post stating that it was the company name. Perhaps add a CS-rep title to the posts you put up or something.

            • avegrant
            • 6 years ago

            Yes, I work on the development team. I mentioned it somewhere down in the threads.

            If your contacts correct the astigmatism then you should be able to see fine with the display.

        • WaltC
        • 6 years ago

        I would think so, too. Look at the difference between glasses and soft contacts, for instance. You may of necessity have to wear the proverbial "coke-bottle" glasses, with "inch-thick" lenses, but a soft contact designed to the same degree of correction can be thin enough to float transparently on top of the eye…;) Your focal point with the contacts also changes to more realistically resemble what natural, correct vision sends to the brain. E.g., the closer the focal source is to the retina, the less is required in the way of correction. Eh?…;)

      • indeego
      • 6 years ago

      The way things work in real life is our eyes see things one way (upside down), and our brains interpret that data. The eyes and brain frequently get it wrong.

    • jessterman21
    • 6 years ago

    Wake me when I can plug AV straight into an implanted port at the base of my skull. 😉

    • GrimDanfango
    • 6 years ago

    This looks interesting, but the claim that it increases perceived resolution turns it into a bit of a snake-oil pitch, for what could otherwise be quite a promising technology.

    There simply isn’t any way you can “perceive” more resolution than there is… the core definition of “resolution” is the smallest measurable detail that can be resolved. If an image is made of pixels, those pixels are implicitly the size of the smallest detail that can be resolved.

    The most that can be done is to blur it, to reduce perception of jagged edges… but that absolutely will not increase the perceived resolution, it will just make the lack of resolution less conspicuous.

    To claim otherwise is nothing more than hollow advertising nonsense.

      • Pwnstar
      • 6 years ago

      “Perceive” refers to the psychology of vision, where it is possible to trick the brain into thinking it is seeing a higher resolution image than it really is. That’s what he means. You are right in one sense but wrong in the other.

        • GrimDanfango
        • 6 years ago

        It might have to do with the psychology of jargon, insofar as people will describe the effect as perceiving a higher resolution, because they're not correctly identifying it as perceiving less aliasing at identical resolution.

        Resolution isn’t really up for debate, it’s a way of quantifying the finest measurable detail in an image. You can’t increase resolution in an image that doesn’t have it. Any “perception” that this is the case is a misunderstanding of what resolution means. You aren’t going to perceive extra detail that doesn’t exist.

        There is lots that can be done to cause people to perceive less aliasing, or increased vibrancy, or sharpness, or any one of numerous other entirely ambiguous qualities, but resolution isn’t ambiguous.

    • UberGerbil
    • 6 years ago

    People have been [url=http://www.hitl.washington.edu/projects/ivrd/<]trying to do this[/url<] for over twenty years. [url=http://www.hitl.washington.edu/publications/p-95-1/<]Here's a paper[/url<] on one of the early ones (and you can tell because the webpage looks like it's right out of 1995, which it is). After repeated failures the company created to commercialize it, MicroVision, eventually adapted it into a "pico projector" instead. But technology marches on (even if the optics and biology do not) and maybe these guys will have better luck.

      • Billstevens
      • 6 years ago

      Proving that the physics of optics hasn't changed much in the last 20 years. Also, the path they are taking is an obvious but difficult one. Physics nerds of course want a true VR display to mimic the way our eyes receive light in real life. Like ray tracing, it's good to know people are still working on bringing this tech to consumers and tinkerers.

      • avegrant
      • 6 years ago

      We were able to solve the issues that the initial technology pioneers faced. 🙂

        • blanchjd
        • 6 years ago

        I was wondering when someone would bring up Microvision. Having worked there a number of years ago, I'm curious about how Avegant solved their issues. Their biggest issue of all was marketing and price point. As far as creating mind-blowing prototypes and technology, Microvision was world class, but it's hard to keep a business afloat when you can't make much money.

        I’m assuming this is using a different scanning method and light source. Microvision used a single large MEMS mirror to beam laser light through a microlens array.

        • Scrappy1850
        • 6 years ago

        What were those issues you had to tackle, and how did you overcome them?

    • Laykun
    • 6 years ago

    Wouldn't mind knowing what the input latency is like on this device. Also wouldn't mind knowing how a 3D-rendered image looks compared to the Oculus. Although I get the feeling that's WHY they are targeting multimedia applications and not games.

      • avegrant
      • 6 years ago

      Extremely low and extremely realistic. 🙂

      It’s hard to explain what the image looks like in words… the analogy that we use is the difference between looking at your television screen and out a window.

      Check avegant.com soon though, we may be rolling out some consumer demos later this year. Very soon.

      Cheers,
      Grant

        • Laykun
        • 6 years ago

        Thanks for the reply. Do you have any numbers you could quote for:

        * Field of view — any plans for a full 100+ degree field of view?
        * Input Latency
        * Screen response time (akin to LCD response time)
        * Latency for round trip (time from leaving graphics card to appearing as a pixel)
        * Contrast ratio

        I understand you've effectively eliminated the black pixel borders/separations, which sounds great and I'm really keen to see it. But with any product like this, clarity/detail is still dependent on the resolution of the device (particularly with rasterised vector content like in video games). As I understand it, since you have a 40 (?) degree field of view it's much easier to have a high PPI, but can the device's optics scale up the field of view with resolution to back it?
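
        For a rough feel of the trade-off being asked about here: angular pixel density is just panel pixels divided by field of view, so stretching the same panel over a wider FOV thins the detail out. A quick sketch using the 1280-pixel width from the article and the 40 and 90 degree FOVs discussed in this thread (assumed figures, not official specs).

        [code<]
# Angular pixel density = horizontal pixels / horizontal FOV (degrees).
# 1280 px comes from the article; the FOVs are the ones discussed in
# the thread. These are assumptions, not official Avegant specs.
h_pixels = 1280

for fov_deg in (40, 90):
    print(f"{fov_deg} deg FOV -> {h_pixels / fov_deg:.0f} pixels per degree")
# 40 deg -> 32 px/deg; 90 deg -> ~14 px/deg, i.e. the same panel looks
# much coarser when spread over a wider field of view.
        [/code<]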

    • jstern
    • 6 years ago

    Makes me wonder how old content from the '80s and '90s would look, since the perceived resolution is higher. That's what has my interest.

    • Billstevens
    • 6 years ago

    The OR has a clear target market and also clear goals on what they want to do better than past attempts to popularize VR: better head tracking and FOV. Though the screen-door effect is currently awful on the demo kit, the experience is still impressive given the FOV and head-tracking quality.

    I just don't see the market for video content generating enough hype to sell Avegant's device, especially if it's over $500. I don't imagine a lot of people want to put on a headset at home to watch video that simulates a big screen. Watching movies is a shared experience, and putting on eye-covering glasses kills that. Maybe for in-flight movies, but most people want to be aware of their surroundings while consuming media, or at least optionally aware. I can see why Google abandoned this kind of tech in favor of something like Google Glass. Full vision-encompassing VR makes sense for computer gaming, which tends to be an individual experience or based on virtual interaction. But that paradigm doesn't have a foothold in any other market.

    I wish them luck, but the pessimist in me sees them being bought out by a Sony or Microsoft if they have any marginal success, so the buyer has some proprietary vision technology to go after companies like OR if it pans out well in the gaming market. Or fading away until we exhaust simpler screen-based options.

    • Entroper
    • 6 years ago

    [quote<]Somewhat surprisingly, gaming isn't Avegant's focus. The company thinks people will use its Virtual Retinal Display to watch video content. There don't appear to be any barriers to gaming on the thing, though. The prototype can already pump out images at 240 frames per second.[/quote<]

    The reason for this is that the display has a small FOV. The company has described the view as being like an 80-inch display at a distance of 8 feet, which is less than half the FOV of the Rift. This also contributes to the clarity of the image relative to the Rift, since the same number of pixels is crammed into a smaller area on your retina.

      • mcnabney
      • 6 years ago

      Small FOV? An 80″ screen @ 8′ would be about a 40-degree viewing angle – which is pretty big. If the OR is going to provide 80 degrees, they are really going to need to crank the resolution way, way up. In order to hide the pixels they will need the equivalent of 4,800 lines (human vision can resolve about 1/60 of a degree of detail), which is going to require better than 4K panels in the headset. If not, prepare for jaggie city.
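
      The geometry behind those figures, sketched out below. The 80-inch 16:9 screen, 8-foot distance, 80-degree Rift FOV, and 1-arcminute acuity are the commenter's assumptions.

      [code<]
import math

# Horizontal viewing angle of an 80" 16:9 screen at 8 feet, plus the
# pixel count needed for ~1 arcminute (1/60 degree) per pixel.
# All figures are the commenter's assumptions.
width_in = 80 * 16 / math.hypot(16, 9)                 # ~69.7 in wide
distance_in = 8 * 12                                   # 96 in

fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
print(f"Viewing angle: {fov_deg:.0f} degrees")         # ~40 degrees

arcmin_deg = 1 / 60                                    # resolvable detail
for target_fov in (fov_deg, 80):
    print(f"{target_fov:.0f} deg needs ~{target_fov / arcmin_deg:.0f} pixels across")
# ~40 deg -> ~2400 px; 80 deg -> 4800 px, hence "better than 4K panels"
      [/code<]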

      • avegrant
      • 6 years ago

      Clarity comes as a function of the VRD technology in this device; it is the same irrespective of the FOV. We intentionally chose that 40 degree FOV based on our target market (not virtual reality for this prototype) and headset size.

        • Entroper
        • 6 years ago

        Yeah, my comment kind of implies the cause and effect of the FOV are reversed. That wasn’t my intent. 🙂

    • PopcornMachine
    • 6 years ago

    Sorry, but the image in the YouTube link looks like "hey, I found a new way to get brain cancer!"

    • cjava2
    • 6 years ago

    The castAR Kickstarter launched today:

    [url<]http://www.kickstarter.com/projects/technicalillusions/castar-the-most-versatile-ar-and-vr-system[/url<]

    There's now a "VR clip-on" that allows for full immersion.

    • Parallax
    • 6 years ago

    I'm curious about how using micro-mirrors eliminates pixelization. AFAIK the mirrors are always laid out in a grid pattern, leading to the same sort of blocking artifacts as an LCD, unless they severely blur the image. Micro-mirrors are not the future for viewing comfort in any type of display unless the flickering can be pushed into the 10+ kHz range, since the approach relies on RGB color cycling, FRC for bit depth, etc.

    There are a bunch of little inconsistencies in the article too, but I’m not sure if these are due just to CNet or not.

    • glacius555
    • 6 years ago

    Do you know what they’ll call their first game for this display?

    Evolver! 😀

    • eitje
    • 6 years ago

    David Brin’s Earth: Tru-Vu Lenses.

    • Meadows
    • 6 years ago

    I hope the final product will also be made out of transparent plastic. That prototype looks smashing.

    • JohnC
    • 6 years ago

    Such a crude, low-quality contraption… Wake me up when these will be available:
    [url<]http://planetarbitrary.com/wp-content/uploads/2011/08/sarif1.jpg[/url<]

    P.S.: Tim Stevens finally got kicked out of Engadget? Didn't even notice that, but "good riddance" anyway.

    • Tubby
    • 6 years ago

    Better Than Life, anyone?

      • drfish
      • 6 years ago

      Well, that's a question of content, isn't it?

    • nico1982
    • 6 years ago

    Any chance it fixes myopia, too?

      • mcnabney
      • 6 years ago

      In theory each image could be fed by a high-res camera on the other side. If the aperture is tight enough it could keep everything in focus. Your eyes just need to be able to focus on the micromirror array.

      • Wirko
      • 6 years ago

      Sure, if you’re farsighted.

      • avegrant
      • 6 years ago

      Yes.

    • Captain Ned
    • 6 years ago

    [url<]https://en.wikipedia.org/wiki/The_Game_%28Star_Trek:_The_Next_Generation%29[/url<]

      • Krogoth
      • 6 years ago

      Shut up, Wesley!

    • DeadOfKnight
    • 6 years ago

    This is the first exciting competitor to the Oculus Rift, but I still wonder whether the proprietary manufacturing of a device like this will truly be affordable enough to be competitive. The Oculus Rift is targeting $300 and aims to bring the price even lower while focusing most of its attention on responsiveness and partnerships with game developers, something I don't think this will have anytime soon. In these markets, timing is everything.

      • superjawes
      • 6 years ago

      I don’t think timing is going to be as key as usability is. The main issue I see with the Oculus Rift is where the weight is and how long you can watch/play without getting tired. This VRD appears to be a bit lighter, but off balance. It also seems to require a fair bit of adjustment for eye spacing and focus.

        • DeadOfKnight
        • 6 years ago

        Well one big pro for the Oculus Rift is that it appears to be quite usable while wearing glasses. I dunno if the same is true for this new headset.

          • superjawes
          • 6 years ago

          That’s one of the first things I thought of, too.

          I think this one is ok because you can focus each display. I would imagine that someone who needed glasses could get a clear image that would look blurry for someone with good vision.

          So I guess what really interests me is a different take on the same idea. The end result should be similar (VR display), but having different solutions could offer alternatives to consumers or even open up to new ideas and applications.

        • avegrant
        • 6 years ago

        The demo above is just a prototype. The consumer-facing version will be much lighter than this and will have better optical adjustments. To answer the question below, yes, the diopters do adjust the focal distance to accommodate a range of myopic situations.

        Source: I work for Avegant 🙂

          • DeadOfKnight
          • 6 years ago

          That makes sense. I did a double-take when I saw your name.

          However, astigmatism might cause some problems there?

            • avegrant
            • 6 years ago

            You’d have to get a custom lens, but it’s doable.

    • drfish
    • 6 years ago

    Now we’re talking Snow Crash…

      • CityEater
      • 6 years ago

      I’m sharpening my glass swords as I type.

        • Grigory
        • 6 years ago

        “In doing so, he put great lengthwise rents into both of T-Bone’s femoral arteries, and his entire blood supply dropped out of him. Like slicing the bottom off a styrofoam cup.”

    • Stochastic
    • 6 years ago

    This just makes me wish Avegant and Oculus teamed up.

      • Thrashdog
      • 6 years ago

      The magic of the Rift really is in the positional tracking. There’s no reason they couldn’t team up for a future product with Avegant’s display tech and Oculus’ low-latency tracking mojo combined.

        • Namarrgon
        • 6 years ago

        The other major advantage of the Rift (for immersive VR) is the much larger field of view. Can Avegant’s DMD projection tech be applied to FoVs of >90 degrees, or is it inherently limited to smaller views?

      • gigafinger
      • 6 years ago

      Avocculunt?

        • Grigory
        • 6 years ago

        Ocavegaluscunt?

        Edit: Awww, I had to make it one word! Stupid filter! 😉

    • HisDivineOrder
    • 6 years ago

    I can’t wait until we’re all wearing these things 24/7. We’ll never see the real world “untouched.” We’ll call it, “impure.” Our governments will encourage the use of them to keep us all “even.” Doctors will use these to filter the world in such a way as to remove problem stimuli. When we want to dump someone, we’ll just login to a new version of Facebook called Faceblock and just remove ourselves from the person’s registry. Suddenly, we really don’t exist to that person.

    If we’re not friends with someone, we can turn them into an anonymous person that has no features and has little more than a generic outline. We won’t sit down to watch TV or a movie. We won’t sit down to read a book or have a conversation with a friend, what few we have. We’ll do that wherever. In the shower, in the bedroom while having what little real sex we have, and on the commode. It won’t make no nevermind.

    The world will be as we perceive it and we will perceive it whatever way we (and our government, corporate masters, and hackers) will it.

    Why just see things when you can SEE (Sight Edited & Enhanced) instead?

      • albundy
      • 6 years ago

      It wouldn't surprise me, since our corporatist government is powerful enough to force people to do anything! And "governments will encourage use" ALWAYS means dictatorship. But if you think people won't revolt against this, you're sadly mistaken.

        • Liron
        • 6 years ago

        You can't "force" a monkey-marmot to do anything! They've got to want to do it themselves.
        -Varrick

          • l33t-g4m3r
          • 6 years ago

          Which is why today's fascist society isn't maintained through force, but through brainwashing. Thank Edward Bernays for that.
