Cyberpunk 2077 System Requirements: Spec out your Decks

Cyberpunk 2077 is one of those games. The games people watch and wait on. The ones we need the sysreqs for before we can upgrade our rigs. CD Projekt Red has finally published its minimum and recommended requirements for the game and they’re… not terrible. But they’re not the whole story, either.

Minimum Requirements

  • Windows 7 or Windows 10 with DX12 (64-bit)
  • Intel Core i5-3570K or AMD FX-8310
  • Nvidia GeForce GTX 780 3 GB or AMD Radeon RX 470
  • 8 GB RAM
  • 70 GB storage (SSD recommended)

Recommended Specs

  • Windows 10 with DX12 (64-bit)
  • Intel Core i7-4790 or AMD Ryzen 3 3200G
  • Nvidia GeForce GTX 1060 6 GB or AMD Radeon R9 Fury
  • 12 GB RAM
  • 70 GB SSD storage

Even pretty old computers should be able to run Cyberpunk 2077 at its most basic settings, and you can get what CD Projekt Red considers the best general experience from the game with what is still a pretty inexpensive rig. As Tom's Hardware notes, though, CDPR is a little light on information here. The first set of specs is for "Low settings and 1080p gaming," while the second is for "High and 1080p." Neither says anything about frame rate, for example.
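As a rough illustration of how these tiers stack up, here's a toy sketch that expresses each tier as data and checks a rig against it. Only the RAM and storage figures come from the lists above; the GPU/CPU "tier" numbers and the example rig are invented for the sketch, since real GPU and CPU comparisons are far more nuanced than a single number.

```python
# Hypothetical spec-check sketch. RAM and storage values match CDPR's
# published tiers; the gpu_tier/cpu_tier numbers are a made-up shorthand
# (1 = minimum-class part, 2 = recommended-class part).

MINIMUM = {"ram_gb": 8, "storage_gb": 70, "gpu_tier": 1, "cpu_tier": 1}
RECOMMENDED = {"ram_gb": 12, "storage_gb": 70, "gpu_tier": 2, "cpu_tier": 2}

def meets(rig: dict, spec: dict) -> bool:
    """True if every value in the rig meets or exceeds the spec."""
    return all(rig.get(key, 0) >= value for key, value in spec.items())

# A made-up example rig: 16 GB RAM, 100 GB free, GTX 1060-class parts.
rig = {"ram_gb": 16, "storage_gb": 100, "gpu_tier": 2, "cpu_tier": 2}

print(meets(rig, MINIMUM))      # True
print(meets(rig, RECOMMENDED))  # True
```

A rig built to the minimum list would pass the `MINIMUM` check but fail `RECOMMENDED` on RAM and tier alone, which is all CDPR's published numbers really tell you.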

But Raytracing

There’s also the fact that CDPR and Nvidia have touted Cyberpunk 2077 as a poster child for ray tracing for quite some time now. It seems strange, then, that the developer would recommend playing the game with a GTX card. CDPR could come out later with a set of “Best Quality” requirements that lay out what CPU and GPU you’ll want for the premium experience. That seems like something the company would release after AMD reveals its next generation of GPUs, which will feature ray tracing for the first time.

The real recommended specs for the developer’s intended premium experience will likely involve an RTX 30-series card or AMD RX 6000-series card, a 10th-Gen Intel Core i7 or AMD Ryzen 9 Zen 3 CPU, at least 16 GB of RAM, and maybe even an NVMe SSD if you want to play it safe.

We’ll have a better idea of what Cyberpunk 2077 can actually do with current and older hardware when it releases for PC, as well as the current and upcoming Xbox and PlayStation platforms, on November 19, 2020.

4 months ago

The most useless game of 2020.

3 months ago
Reply to Borg

Sorry, got this confused with something entirely different. The game is good and, despite the bugs, has a lot to offer. I’d gladly recommend it.

6 months ago

The game is built for the old generation of consoles. Of course three-year-old PCs will exceed the recommended spec. I plan on playing it on an i5 with a 1080 at 1080p60, and the creators appear to agree that I don’t need to drop $1500 on hardware to experience it at that resolution.

6 months ago

The truth is that it is no longer economically viable to create a PC exclusive that pushes the envelope. It isn’t a surprise that this game gets by on Maxwell/GCN 1.0-era hardware. You just miss out on the pseudo-ray-tracing effects (RTX mode), which run only on Turing/Ampere hardware (support on RDNA2 remains to be seen).

6 months ago

The fact the game is coming to PS4/Xbone S likely means 1080p/30 for the minimum and recommended spec. Console-level graphics on ancient hardware. That’s also why the CPU requirements are so light.

To get 60 fps you’ll need more. My guess is that an RTX 2060 will play 1080p/60 without ray tracing or 1080p/30 with it. You’ll need DLSS to get 60 fps on those cheap ray tracing cards.

6 months ago
Reply to derFunkenstein

And I’m equally sure that Ryzen 5 2600 or Core i5-8400 will be needed to get 60fps out of the CPU.

6 months ago

Recommending specs without ray-tracing hardware is the only way they can hope to sell as many units as possible within the first month. If they recommend ray tracing, people may delay buying the game until they can save up for the required card.

6 months ago
Reply to Adnan

Recommending proprietary pseudo-ray tracing standards is fiscal suicide. It’ll remain that way for a long while. This isn’t the 1990s.

