Personal computing discussed
Prestige Worldwide wrote:Looks promising, but worthless until frametimes are tested by TR.
Who knows how many of these could be fake / dropped / runt frames.
I am nonetheless excited to see something faster coming out.
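The "fake / dropped / runt frames" above refer to FCAT-style frame-time analysis, where each rendered frame's on-screen height (in scanlines) determines whether it counted as a real frame. As a rough illustration only -- the threshold and data format here are assumptions, not TR's actual methodology -- a classifier might look like this:

```python
# Hedged sketch: bucketing FCAT-style capture data into full, runt, and
# dropped frames. The 21-scanline runt threshold and the input format
# (scanlines occupied per frame, 0 = never displayed) are illustrative
# assumptions, not the actual FCAT tooling.

RUNT_SCANLINES = 21  # frames occupying fewer scanlines than this count as runts

def classify_frames(scanline_heights):
    """scanline_heights: scanlines each rendered frame occupied on screen."""
    stats = {"full": 0, "runt": 0, "dropped": 0}
    for h in scanline_heights:
        if h == 0:
            stats["dropped"] += 1   # frame was rendered but never shown
        elif h < RUNT_SCANLINES:
            stats["runt"] += 1      # shown, but too briefly to matter
        else:
            stats["full"] += 1
    return stats

print(classify_frames([1080, 540, 12, 0, 1080]))
# → {'full': 3, 'runt': 1, 'dropped': 1}
```

The point of the post stands: a raw FPS counter would credit all five frames above, while a frame-time analysis credits only three.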
JustAnEngineer wrote: ...spread by Nvidia's team of compensated shills, not just by their army of loyal fanboys.
clone wrote: I don't really get it -- are you implying that Nvidia DOESN'T have better drivers than AMD, and that AMD's only failing is in terms of PR? Because this is objectively and provably false.
Airmantharp wrote: Nvidia's never "gotten it right" for years; what they have gotten right is the PR battle, which AMD lost a decade ago when they decided to include fixes in their release notes. I just want them to fix the stuff Nvidia's gotten right for years, apparently. If they can't get 4K down, well.
clone wrote: I don't see how you can blame anyone but AMD for this. When TR mentions that Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with their competition -- which then led to TR releasing an article titled "Here's why the CrossFire Eyefinity/4K story matters" -- well, AMD's going to have a hard time improving their reputation.
Krogoth wrote: It looks like Hawaii is just Tahiti on roids.
Chrispy_ wrote: My concern with this card is that it's brute-forcing the lead with an expensive 512-bit design and physically huge GPU, rather than architectural or process improvements that increase performance/cost.
I'm not really concerned about the top SKU, that's for the 1% of people who are interested in performance, no matter the cost.
I'm interested in the sweet spot where you get 80% of the performance for 50% of the cost, and that's unlikely to be cheap given how large these dies are, and how 512-bit PCBs are likely to be more expensive to produce than 256-bit or 384-bit.
CityEater wrote: Don't some of the CrossFire issues in the recent press point to something more fundamental architecturally than mere driver issues, though? Or have I been reading it wrong? Presumably AMD are aware of the problem; what would it take to revamp CrossFire at the hardware level, and is there any reason they wouldn't?
Looking forward to the announcement, though. I wonder if $599 gets you a copy of BF4 as well...
I kind of wish they would bring a new feature set to the table, like HDMI 2.0 or game streaming to a TV dongle or something, but that's probably wishful thinking. This is a genuine question: what makes these cards any more expensive to produce than Kepler (if the leaks are true)?
Waco wrote:
Chrispy_ wrote: My concern with this card is that it's brute-forcing the lead with an expensive 512-bit design and physically huge GPU, rather than architectural or process improvements that increase performance/cost.
I'm not really concerned about the top SKU, that's for the 1% of people who are interested in performance, no matter the cost.
I'm interested in the sweet spot where you get 80% of the performance for 50% of the cost, and that's unlikely to be cheap given how large these dies are, and how 512-bit PCBs are likely to be more expensive to produce than 256-bit or 384-bit.
I'm having flashbacks to buying my 1 GB GDDR4 2900XT...for $550.