Basilisk mouse aims to be the king of Razer’s serpents

Razer isn't shy about throwing around the word "ultimate." The company is reaching for the superlative again to describe its Basilisk wired gaming mouse. With a 16,000 DPI optical sensor, a dial for adjusting the resistance of the scroll wheel, and a sniper button, the new pointer might at least be the serpent king of Razer's bundle of venomous snakes.

The optical sensor can track at up to 450 IPS, and it should stay accurate at up to 50 G of acceleration. The outside of the Basilisk is scaled with eight buttons and highlighted with RGB LED illumination. The light show and the array of buttons are controlled by Razer's Synapse 3 software utility. The Basilisk measures 4.9" long, 2.9" wide, and 1.7" tall at its highest point (12 cm x 7.5 cm x 4.3 cm), and weighs in at 3.8 oz. (107 g), not including the cord.

The removable side sniper button (which Razer calls a "DPI clutch") changes the sensor's sensitivity while held down. Razer says the shift in sensitivity is configurable: gamers can raise the DPI setting for faster turns or drop it for more precise aiming. A dial on the serpent's underbelly lets the user tune the resistance of the scroll wheel to taste.

Razer's Basilisk is available now for preorder at Amazon for $70. The manufacturer backs the mouse with a two-year warranty.

Comments closed
    • Quiet Sun
    • 2 years ago

    I have owned two Razer mice. On both, the left buttons wore out much too soon with normal use. I’m not likely to give them another chance, although it seems to be a recurring problem with other brands as well.

    • DeadOfKnight
    • 2 years ago

    Clearly a rip-off of the Corsair M65, although it’s about time somebody made one. I have an M65 myself and I didn’t think I could ever recommend another mouse. This might actually be a worthy alternative, except for the fact that Razer drivers are notoriously divisive among consumers.

    • Ifalna
    • 2 years ago

    16K dpi? I really don’t get it. What is the point?

    I use 1.6K DPI with no acceleration and need ~3 cm of movement to get from one side of the screen to the other. With 16K that would be 3 mm. What human being is capable of reliably controlling that sensitivity, especially on smaller movements?

      • Chrispy_
      • 2 years ago

      There is no point. Actual usable DPI for human beings (taking professional eSports champions as a sample) is between 200 and about 1200 DPI. I quite like 1600 DPI, but it’s not accurate enough for highly competitive stuff because a single millimeter of error in your wrist flick translates to a missed shot. The higher-DPI users tend to be RTS players, because they’re constantly making rapid whole-screen drag-boxes and don’t need the headshot precision of an FPS player.

      Most of the CS:GO pros stick with 400 DPI for this reason, and I use 800 because I’m old and lazy.

        • synthtel2
        • 2 years ago

        It looks either bimodal or like the distribution has an exceptionally long tail on the high-CPI end. Most people seem best at 20-30 cm/360, but I’ve known guys who were great at under 5 cm (in CS:GO of all games), and reports on the internet confirm that players like that aren’t so rare.

        At 20 cm/360, an HFOV of 90, and 1920×1080, you need around 1000 CPI to get one count to equal one pixel moved in-game. At 5 cm/360 and 2560×1440, that’s over 5000 CPI. One pixel per count isn’t as useful a goal as complete linearity and lack of jitter, but numbers that high could reasonably be useful to someone somewhere.
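For anyone who wants to check those figures, the arithmetic can be sketched in a few lines (hypothetical Python; it assumes a linear mouse-to-view mapping and approximates degrees-per-pixel as HFOV divided by horizontal resolution, which is only exact near the screen center):

```python
# Required CPI so that one mouse count moves the view by one on-screen pixel.
# Assumes a linear count-to-angle mapping; real perspective projection is not
# quite linear across the screen, so treat these as ballpark figures.
CM_PER_INCH = 2.54

def cpi_for_one_pixel(cm_per_360, hfov_deg, horizontal_px):
    deg_per_pixel = hfov_deg / horizontal_px   # approximation at screen center
    deg_per_cm = 360.0 / cm_per_360            # turn rate per cm of mouse travel
    # One count moves the mouse (2.54 / cpi) cm; solve for the cpi where that
    # one count rotates the view by exactly deg_per_pixel.
    return deg_per_cm * CM_PER_INCH / deg_per_pixel

print(round(cpi_for_one_pixel(20, 90, 1920)))  # ~975, i.e. "around 1000 CPI"
print(round(cpi_for_one_pixel(5, 90, 2560)))   # ~5202, i.e. "over 5000 CPI"
```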

        I can measure out 1-pixel movements at 3.2k CPI if I really try, and my wrist biomechanics are far from ideal for that sort of thing. With a heavier mouse, longer sarcomeres, more idle tension, and better local feedback for position-holding (that doesn’t have to travel to the brain and back), it should be easy enough. 16k CPI remains entirely ridiculous, though.

          • Cannonaire
          • 2 years ago

          I use 36 cm/360, HFOV 103 (Overwatch and Killing Floor 2) at 1920×1080. In order to get the precision I want, I need to use >800 CPI. I use 950 currently, which is by no means high. In order to get the same per-pixel precision at 4k, I would need ~1950 CPI IIRC.

          I used to use a Razer Copperhead (best mouse I ever owned, and for 8 years). I had no trouble using 2000CPI when I used that mouse, and I didn’t have any trouble with jitter, even though it used a laser sensor.

          16k CPI seems like overkill for [i]anyone[/i]. Maybe if you're playing at 8k with very low sensitivity? I don't know.

            • Chrispy_
            • 2 years ago

            This “horizontal resolution affects the number of pixels/mouse counts your game requires to perform a 360 spin” nonsense has to stop.

            Seriously, no. This is a total fallacy that needs to be shut down for good. I’ve worked on game engines and that’s just not how games deal with mouse input [i]at all[/i].

            Don't believe me? Change from 4K to 640x480 in your FPS of choice. Your sensitivity would jump by a factor of 6, which would make the game feel suddenly unplayable, but (of course) it has no effect.

            Here's the thing: it didn't even change things back in Quake 1, because FPS game input has never been written based on the desktop resolution. Back in those days, games ran in DOS and there [i]was no desktop resolution[/i].

            • Cannonaire
            • 2 years ago

            [quote="Chrispy_"]This "horizontal resolution affects the number of pixels/mouse counts your game requires to perform a 360 spin" nonsense has to stop.[/quote]

            Thank you for the clarification in your post. That's not exactly what I meant, though. From my understanding and experience, games coded properly to use raw input will have an arbitrary (but consistent) in-game sensitivity number which will determine the rotation amount per count received from the input device. Lowering the in-game number will lower how much the camera rotates per count, giving your input device finer control over how much is rotated. This is an objective fact. Keeping the in-game sensitivity constant, increasing the CPI on your mouse will allow you to rotate more degrees in-game per physical unit moved. Whether or not humans are even capable of taking advantage of much finer control is less objective - but it is a fact that higher CPI will allow finer control at lower sensitivity.

            Further thoughts: You can set an in-game sensitivity high enough that the steps are so big you won't be able to put your crosshair over something you want - just like in old DOS games using PageUp and PageDown to look up and down, in which a tap would pitch the camera several tens of degrees.

            Final (subjective) thought: It's important to note that [b]I don't dispute your information about the distances, or capabilities and limitations of human input.[/b] That said, I have found that having higher CPI with lower in-game sensitivity makes turning feel less jumpy and more precise. This could be placebo, but placebo [i]can[/i] affect how a [i]human[/i] functions.

            *Edit* - Fixed BBCode.
            *Edit* - I absolutely agree - 16,000 is a ridiculous, jittery mess. I use between 800 and 2100.
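The count-to-angle model described here can be sketched in a few lines (hypothetical Python with made-up numbers, not any engine's actual code): two CPI/sensitivity pairs tuned to the same cm/360 produce identical turns for the same swipe, but the higher-CPI pair quantizes the turn into finer steps.

```python
CM_PER_INCH = 2.54

def yaw_degrees(counts, sens_deg_per_count):
    # Raw-input model: every count rotates the camera by a fixed amount.
    return counts * sens_deg_per_count

# Two setups tuned to the same overall sensitivity (same cm/360):
# 800 CPI at 0.05 deg/count vs. 3200 CPI at 0.0125 deg/count.
for cpi, sens in [(800, 0.05), (3200, 0.0125)]:
    counts = round(5 / CM_PER_INCH * cpi)     # counts from a 5 cm swipe
    print(cpi, yaw_degrees(counts, sens))     # both swipes turn ~78.7 degrees
# The smallest possible camera step, though, is 0.05 deg at 800 CPI but
# 0.0125 deg at 3200 CPI: the higher-CPI setup is 4x finer-grained.
```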

            • Cannonaire
            • 2 years ago

            @Chrispy_
            I realized I replied more to your other post than your response to me. To address your post above, having a higher screen resolution will allow you to see the rotation increments more clearly, [i]most noticeably at a long distance.[/i] But no, [b]I agree with you that having a higher render resolution does not affect how many degrees you turn per count.[/b]

            *Edit* - Grammar. I should really proofread my posts better before posting.

            • Chrispy_
            • 2 years ago

            [quote]Lowering the in-game number will lower how much the camera rotates per count, giving your input device finer control over how much is rotated. This is an objective fact. Keeping the in-game sensitivity constant, increasing the CPI on your mouse will allow you to rotate more degrees in-game per physical unit moved.[/quote]

            Yep. I work in Unreal and Unity, and normally those engines demand raw input be mapped to Euler angle increments or, worse, floating quaternions. In an ideal world, the game would have very tiny angle increments for totally step-free movement at any FOV, and you'd adjust your hypothetical high-CPI mouse to the desired CPI value to get the sensitivity you prefer.

            Sadly, camera angle isn't the only use of mouse input in games - there are menus, inventories, maps, etc. that require cursor or two-dimensional plane scrolling. In these instances, you absolutely don't want a high-CPI mouse, because not only is the cursor/pan behaviour crazily amplified by the high count rate, it also breaks the muscle memory of the user's desktop mouse cursor speed.

            There is no perfect system, but beyond reasonable game programming, anything above about 2500 DPI starts to cause huge problems simply because it's unusable in cursor mode, even by people who love crazy high CPI settings 🙂

            • Cannonaire
            • 2 years ago

            Interestingly, I’ve heard (sorry I can’t name a specific source) that Quake used a much better system for handling view rotation than Unreal. It could be worth looking into.

            For the last bit – one could conceivably want a higher CPI with higher resolution displays in 2D use cases. It could take an absurd amount of movement to get the cursor from one corner of an 8K display to the other at 400 CPI.

            Not that 8K is standard right now, but dual-4K is probably fairly common (relatively).

          • Chrispy_
          • 2 years ago

          Please see my reply to Cannonaire; The screen resolution argument isn’t valid reasoning.

          3200 CPI is one count per 8µm. That’s the width of a tiny red blood cell. If you moved the mouse to within the accuracy of a human hair (~100µm) you’d still be working to an error level of +/-12 counts, or 25x less accurate than the 3200 CPI mouse.
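Those micron figures check out (a quick hypothetical Python sanity check):

```python
# Mouse travel per count at 3200 CPI, in microns (1 inch = 25,400 µm),
# and how many counts fit inside ~100 µm (roughly the width of a human hair).
MICRONS_PER_INCH = 25_400

um_per_count = MICRONS_PER_INCH / 3200
print(um_per_count)        # 7.9375 -> "one count per ~8 µm"
print(100 / um_per_count)  # ~12.6  -> hair-width hand accuracy is +/- 12 counts
```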

          No, what is really happening is that when you move a mouse very, very slowly across a surface by flexing your grip ever so slightly, the combined forces increase the compression of the soft tissues of your hand, letting a very slow hand-eye feedback loop select one specific pixel. This is of no value in a game, though - because you cannot do it at any speed, since it requires conscious, reflex-time-limited hand-eye feedback.

          No, gamers who are amazing and far better than I ever was can noscope-360 flick and stop their mouse to within an accuracy of maybe the thickness of a human hair. No matter how impressive that sounds, it’s what youth and years of fine muscle memory development permit. Even then, their accuracy is only 250 CPI. That’s why some pro gamers often use old 400 CPI mice, because the sensor is linear and has no jitter, scaling or other issues associated with modern high-CPI mice.

          What’s really happening in games is that engines are overcompensating for silly-high mouse CPI counts, adjusting with logarithmic/exponential expressions in their mousing code. I know because I’ve written the code myself more than once. Set your mouse to 400 CPI, which used to be the standard (and still is for many ordinary office/non-gaming mice), set the slider to the middle in Windows, and disable any acceleration. Every game you own will work at 400 CPI the same way it works at 3200 CPI, only the in-game increments will be 8x larger, so you’ll need to increase your in-game sensitivity to compensate.

          The only games where you’ll even notice a low-CPI mouse are bad implementations of sniper games where the mouse input is not scaled to match the FOV when you zoom in. If you’re changing your FOV from 90 to 18 via a scope and the game doesn’t compensate, you’re suddenly going to notice the 5x reduction in sensitivity: your 400 DPI mouse now feels like an 80 DPI mouse, which is way less than you want, and you can only seem to move in steps, caused by the engine’s scaling of each mouse count by the in-game sensitivity factor. For these crazy scenarios, the obvious solution is a game patch, but the clunky workaround would be an on-the-fly sensitivity toggle to temporarily increase your mouse CPI by whatever the scope’s zoom factor is. Still, if I play at 800 CPI and pick up a hypothetical 8x scope in a game with really clunky input code, I’d be trying to use a 6400 CPI mouse at that point, and it would be awkward for the conscious hand-eye feedback loop we’ve both mentioned. At 6400 CPI I’d need to move the mouse 4 microns per step of the in-game view, and that’s so fine a step that it’s a farce to even talk about it.

            • synthtel2
            • 2 years ago

            Games don’t and shouldn’t work that way, but when I set things up such that one count equals about 2.5 in-game pixels, the jumpiness is definitely annoying. (That’s ~20 cm/360, 400 CPI, 1920×1080 - now I’m running 800 CPI and 2560×1440, which is imperfect but better.) If all games had good sensitivity adjustments and sensitivity being all wrong in 2D weren’t an issue, 1600 CPI would be a noticeably better input experience due to this factor alone.

            If cm/360, linearity, lack of jitter, and so on were all perfect and stayed the same, and you didn’t have to worry about sensitivity being weird in 2D, would you rather mouse at 1600 CPI with low in-game sensitivity and make mouse-to-view mapping look like a continuous function, or use 400 CPI and high in-game sensitivity and deal with the discrete steps in mouse-to-view mapping? All else equal, I know which one I prefer.

            To be actually pixel-perfect at 3200 CPI, I need to bring eyesight into the loop, yes. The point was that the output side of the loop isn’t actually that far off. I need to rest my thumb on the mousepad as a brake for that, but I need to rest my thumb on the mousepad as a brake at any CPI (I can’t use mice with thumb shelves). Local feedback (proprioception only) is relatively easy to improve by amounts that just can’t be done for other properties of muscle. Also, I can’t back this one up at all, but I think some people do have longer sarcomeres than normal in forearm muscles, which pretty much acts as a direct multiplier for usable CPI.

            What real-world distance you can stop within depends what method you’re using to move the mouse. If the motion is from your elbow and shoulder, as it is if you’re using typical pro-gamer cm/360 values, it looks very different than if the motion is at your wrist. Elbow/shoulder probably still has better ultimate performance in this case, but most people’s forearm muscles are terrifically weak per their mass and generally under-performing.

            Engines don’t compensate nearly enough for silly-high CPI values. In way too many games, I can’t even do 20 cm/360 at 800 CPI because the scale doesn’t go low enough. What’s log or exponential there, the sensitivity slider? Hardly anything else even arguably should be. I know game tech and would prefer full explanations over explanations that never make it more than halfway to the point.

            If the mouse input isn’t scaled for FOV when you zoom in, that results in what feels like extremely high sensitivity, not low. We’re setting a camera angle and pixels have nothing to do with it internally, as you yourself were saying in your reply to Cannonaire. A one-degree movement looks and feels like a whole lot more at an 18-degree FOV than a 90-degree FOV. I don’t know how to respond to the rest of your last paragraph because it’s all backwards.
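The standard fix for scopes can be sketched quickly (hypothetical Python, not any particular engine's code): scale the per-count rotation by the ratio of current FOV to base FOV, so the apparent on-screen speed stays roughly constant whether or not you're zoomed in.

```python
def scaled_sens(base_sens_deg_per_count, base_fov_deg, current_fov_deg):
    # Without scaling, a fixed deg/count feels much faster at a narrow FOV,
    # because each degree of rotation covers more of the screen. Scaling
    # sensitivity in proportion to FOV keeps on-screen speed roughly constant.
    return base_sens_deg_per_count * (current_fov_deg / base_fov_deg)

print(scaled_sens(0.022, 90, 90))  # 0.022 deg/count unscoped
print(scaled_sens(0.022, 90, 18))  # 0.0044 deg/count through a 5x zoom scope
```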

        • Ifalna
        • 2 years ago

        Yup, I mainly play slow games (MMOs, a bit of strategy) and use Windows. Absolute precision is not needed.

        • Bensam123
        • 2 years ago

        Source? I play at 8200 myself along with pretty high sensitivity, 40 in Overwatch.

        400/800 is the norm because it’s what they know, what people think is ‘right’ because the pros are using it, and what most mice operate well at. A lot of mice don’t operate well at higher DPI and suffer from stutter, jitter, skipping, and inaccurate movement. I largely believe mouse manufacturers don’t even test their mice properly at these levels, as they assume no one is going to use them.

        Any sort of problem with the sensor gets magnified a hundredfold at higher resolutions, whereas when you play at lower DPI, any problem with the sensor is largely covered up by the distance you cover to make up for it. I’ve tried tons of mice, including some of the newer-generation ones from both Logitech and Razer; neither was stable last I checked. I’m still using a Taipan, which has two sensors - one to make up for the inaccuracies of the laser sensor.

        Once again… source? I’m sure, as someone who has worked in the mouse industry such as yourself, you have something for that; otherwise you’re just someone who’s trying to rationalize a very subjective topic with your version of the ‘truth’. Peripherals are totally different strokes for different folks.

          • synthtel2
          • 2 years ago

          If you’re somehow actually better at that sensitivity than lower ones, you’re an extreme genetic outlier. If so, good for you, but you can’t really generalize from there.

            • Bensam123
            • 2 years ago

            What does it have to do with genetics as much as practice and what you’re comfortable at? Assuming you can play at pretty much any DPI and get good.

            And yes, you can generalize from there, I’m not the only one that plays at high DPI, I’ve had plenty of people stop by my channel and tell me as much, even if it isn’t the norm. People play at low DPIs because they’re told that’s the best and that the ‘pros’ use it, who grew up on low DPI mice and never tried higher DPI ones as that’s what worked for them. Getting used to new and higher DPIs is a chore as well, who would want to do that? Conversely, going to a low DPI setting I would have the same problems, who would want to do that?

            Not saying someone who is performing well on low dpi will perform well on high dpi or vice versa, but rather those people do exist and pretending they don’t doesn’t change that they exist and play at higher dpi… which high dpi mice cater to. However, as I mentioned a lot of high DPI mice suck and they have all sorts of issues as they aren’t all that well tested. The lack of good quantifiable mouse testing also doesn’t help.

            • synthtel2
            • 2 years ago

            It has to do with genetics because 99.99%+ of people don’t and never will have such fine-grained control over their wrist position. 99th percentile sarcomere lengths just can’t do precision even approaching that; it’s almost exactly like you’ve got a higher lever ratio on your wrist than anyone else does. It probably isn’t purely genetic, but current sports science etc doesn’t know how to train that, and if you try to train that by just using a mouse at excessive sensitivity you’re going to hold unreasonable amounts of idle tension and likely screw up your wrist.

            I’m not talking about CPI here directly, I’m talking about cm/360, which sounds extremely low by your description. 8200 CPI and 5 cm/360 would be fine (though probably not advantageous) if many games could run a count -> angle mapping that low. If you’d like to make this more precise, what cm/360 do your settings result in?

            I was saying myself elsewhere in the thread that high-sensitivity players don’t get enough credit most places. It’s a difference of magnitude – plenty of people (not everyone) can use 3200 CPI and ~3 cm/360, but adjust those by another factor of 2-3 and you’re on the indistinguishable-from-zero part of the bell curve. Most people who are running 8200 CPI aren’t running it because it’s ideal biomechanically, they’re running it because it’s the highest value and “higher is better amirite?” For 8200 in particular, it may also be to get around ADNS-9800 laser issues at high surface velocity.

            I used to play at 2000 (highest my old mouse would run without jitter), went straight to 400 when I got a non-laser mouse that didn’t decelerate at high velocity, and was instantly a better gamer with no adaptation time. Being used to a certain sensitivity has got nothing on knowing and catering to your own biomechanics, most people who are paying attention will be able to tell pretty quickly if they’re not in the right ballpark, and pros wouldn’t use what they do if it weren’t actually better for them.

    • Alexko
    • 2 years ago

    It seems dumb to decrease DPI; it would make a lot more sense to adjust the gain function: you’d move slower but even more precisely.

      • DPete27
      • 2 years ago

      Sounds like you can use the DPI clutch to adjust DPI up OR down. I think it’s a neat idea actually.

      • Wall Street
      • 2 years ago

      Most mice at their highest DPI levels have a lot of smoothing and interpolation. Usually, the actual sensor inside the mouse is ~1600 DPI native, and they use sub-pixel precision algorithms and scaling to get to the higher DPIs. The smoothing can even cause additional input lag at the highest DPIs, because the input needs to be smoothed over many frames to keep the output from jittering.
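A toy filter illustrates why that smoothing costs latency (hypothetical Python; real mouse firmware filters are proprietary and certainly more sophisticated): averaging counts over several report frames makes the output trail the raw motion by roughly half the window length.

```python
from collections import deque

def smooth(raw_counts, window):
    # Moving-average stand-in for the interpolation/smoothing a mouse might
    # apply at its highest (non-native) CPI settings.
    buf, out = deque(maxlen=window), []
    for c in raw_counts:
        buf.append(c)
        out.append(sum(buf) / len(buf))
    return out

raw = [0, 0, 10, 10, 10, 10]      # motion starts abruptly at frame 2
print(smooth(raw, 4))
# The smoothed output only reaches the full 10 counts/frame at frame 5,
# three frames after the raw input did: smoothing trades jitter for lag.
```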

        • Alexko
        • 2 years ago

        I didn’t know that, thanks! But in any case, going below the sensor’s native resolution seems like a sub-optimal solution that should be avoided if possible.

          • synthtel2
          • 2 years ago

          Below native resolution is far better than above it, and it’s sometimes far from obvious what the native resolution actually is.

    • TwoEars
    • 2 years ago

    I personally hate thumb rest designs, but each to his own.

      • curtisb
      • 2 years ago

      You know, I was like you, but I ended up with [url=https://www.microsoft.com/accessories/en-us/products/mice/sculpt-ergonomic-mouse/l6v-00001]one of these[/url] for my work laptop. I thought I would absolutely hate it, but it was what was available to me at the time I acquired it. Turns out the opposite was true. It's one of the most comfortable mice I've ever used.

        • lem18
        • 2 years ago

        I’ve been using a Roccat Tyon and Leadr (basically the same mouse, Leadr being newer/better with its optical sensor – but reverted to the Tyon for now because of proper Linux support). These mice not only have a thumb rest, but a button under the thumb for Roccat’s EasyShift feature (basically lets you set an alternate function for every mouse button, effectively doubling the number of buttons on the mouse). Very comfortable, but also extremely versatile.
