Personal computing discussed
Duct Tape Dude wrote:I found this today: http://www.guru3d.com/news-story/amd-r3 ... marks.html
Not sure how accurate the performance is, but the power consumption seems right! HAHAHA... haha... ha.
killadark wrote:Why do people even care so much about power consumption? :S
So it draws 100 W more or less, like that's gonna make a huge difference.
f0d wrote:killadark wrote:Why do people even care so much about power consumption? :S
So it draws 100 W more or less, like that's gonna make a huge difference.
It doesn't make much of a difference in America because power is so cheap, but the price of power is double or even quadruple (or more) elsewhere in the world.
Jigar wrote:f0d wrote:killadark wrote:Why do people even care so much about power consumption? :S
So it draws 100 W more or less, like that's gonna make a huge difference.
It doesn't make much of a difference in America because power is so cheap, but the price of power is double or even quadruple (or more) elsewhere in the world.
You are exaggerating. I live in Gujarat, India. Our state electricity department charges us the most compared to other states in India, but if I am in the market for a graphics card and I can afford something like this card, I don't mind such issues. After all, there is a reason these cards are called enthusiast cards. Also, it's not like we play games 24/7; it's hardly a 2-4 hour session a day for a normal gamer. I only play during weekends, and the max would be a 2-hour session.
There are other appliances in the house that draw far more power than a graphics card, for example a microwave, iron, geyser, air conditioner, or hair dryer. All of these draw more than 1000 watts, compared to the 300 watts of this card.
EDIT: Also note that while you are browsing or watching movies this card will not draw 300 watts, so the only time you will see this card suck power is during gaming sessions.
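The nameplate-wattage argument above really comes down to watt-hours, not watts: a 1000 W appliance used for minutes can consume less energy per day than a 300 W card used for hours. A quick sketch of that arithmetic (the wattages and usage times are the thread's rough figures, not measurements):

```python
# Daily energy use in kWh: power (W) x hours on / 1000.
def kwh_per_day(watts, hours_on):
    return watts * hours_on / 1000

# Rough figures from the thread, not measurements.
usage = {
    "microwave (1000 W, 10 min)": kwh_per_day(1000, 10 / 60),
    "hair dryer (1200 W, 5 min)": kwh_per_day(1200, 5 / 60),
    "graphics card gaming (300 W, 2 h)": kwh_per_day(300, 2),
}

for name, kwh in usage.items():
    print(f"{name}: {kwh:.3f} kWh/day")
```

With these numbers the card's 2-hour session (0.6 kWh) actually outweighs a brief burst from a higher-wattage appliance, which is the crux of both sides of the argument.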
sschaem wrote:The US average is 12 cents; Germany, one of the highest in the 'developed' world, is 36 cents.
(Note: the leak shows the 380X to be almost equal to the GTX 980 in terms of power efficiency.)
But let's compare a 290X to a GTX 980, just ballpark numbers to get a sense of the cost involved.
Anand shows a 60 W difference playing Crysis 3 and 15 W at idle (desktop).
a) If you use your computer 6 hours a day and also game 10 h a week, the saving for a year is 32.85 kWh + 31.2 kWh.
Your yearly saving would be $7.69 if you live in the US, or $23 in Germany.
b) If you use your computer 12 hours a day and also game 20 h a week, the saving for a year is 65.7 kWh + 62.4 kWh.
That's $15.37 in the US and $46.1 in Germany.
It comes down to about a ~$10 saving a year in the US, ~$35 in Germany (by using a GTX 980 vs an R9 290X).
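The ballpark arithmetic above can be sketched as a small script; the 15 W idle and 60 W load deltas and the per-kWh rates are the thread's assumptions, not measurements:

```python
# Estimated yearly electricity cost difference between two graphics cards,
# following the thread's ballpark numbers: 60 W delta under load, 15 W at idle.
def yearly_saving_kwh(idle_hours_per_day, gaming_hours_per_week,
                      idle_delta_w=15, load_delta_w=60):
    """Annual energy difference in kWh from idle and gaming usage."""
    idle_kwh = idle_delta_w * idle_hours_per_day * 365 / 1000
    gaming_kwh = load_delta_w * gaming_hours_per_week * 52 / 1000
    return idle_kwh + gaming_kwh

rates = {"US": 0.12, "Germany": 0.36}  # $/kWh, from the thread

kwh = yearly_saving_kwh(idle_hours_per_day=6, gaming_hours_per_week=10)
for country, rate in rates.items():
    print(f"{country}: ${kwh * rate:.2f}/year")
```

Scenario (a) works out to 32.85 + 31.2 = 64.05 kWh, i.e. about $7.69/year in the US and $23/year in Germany, matching the figures above.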
f0d wrote:
2-4 hours is nothing really. I play for 6-8 (or rarely even more) hours a day on weekends (about 4 hours on weeknights).
A microwave, hair dryer (lol, really?), geyser (what the?), iron, etc. are only on for like 5 minutes a day max. Sure, I have the aircon on a lot, and it's expensive enough as it is, so I try not to use it if possible.
I'm not exaggerating about the price of power. I pay 26c per kWh, and that's pretty cheap where I'm from in Australia; Americans pay as low as 8c per kWh.
Also, if you look, none of my comments were complaining about the power usage, but I can definitely understand the people that do, which is why I mentioned the huge difference in power rates.
f0d wrote:Also, as I previously said, I wouldn't ever get the highest-end card anyway (980 or 290X); I usually get a notch or two lower.
Jigar wrote:f0d wrote:
2-4 hours is nothing really. I play for 6-8 (or rarely even more) hours a day on weekends (about 4 hours on weeknights).
A microwave, hair dryer (lol, really?), geyser (what the?), iron, etc. are only on for like 5 minutes a day max. Sure, I have the aircon on a lot, and it's expensive enough as it is, so I try not to use it if possible.
I'm not exaggerating about the price of power. I pay 26c per kWh, and that's pretty cheap where I'm from in Australia; Americans pay as low as 8c per kWh.
Also, if you look, none of my comments were complaining about the power usage, but I can definitely understand the people that do, which is why I mentioned the huge difference in power rates.
I just checked my power bill; I am paying around 12c per kWh. I understand you are paying twice as much for power, but this doesn't change the scenario much even if you are playing 6 hours every day.
f0d wrote:Also, as I previously said, I wouldn't ever get the highest-end card anyway (980 or 290X); I usually get a notch or two lower.
Then why have a discussion on power consumption when you don't fall into the enthusiast category?
f0d wrote:
So I have to be an enthusiast (which I believe I still am; I don't need the latest highest-end card to be one) just to have a discussion?
f0d wrote:And I completely disagree that it "doesn't change the scenario much". It certainly does if the difference is close to $50 a year, going by sschaem's maths and my gaming habits.
f0d wrote:edit: I just checked your signature, and you don't even have the latest high-end card, so why are YOU having this discussion if you don't fall into the enthusiast category?
Pez wrote:JustAnEngineer wrote:Reviewers have generally reported on power consumption in the past because it was relevant to how much cooling and power-supply capacity were needed when building a new enthusiast PC. Since large power supplies and capable cooling solutions are readily available, power consumption has never been a show-stopper. In the past twelve months, NVidia's evil marketing geniuses have hyped power consumption as the most important thing to talk about because it is an area where their Maxwell architecture succeeds very well.
But yeah, power consumption is really not something enthusiasts will be deliberating over, I would have thought. The performance levels look well worth it, imo!
Jigar wrote:I didn't mean that. What I wanted to tell you is that you are arguing a point which doesn't concern you, as you will never go for a top-tier product. It's like arguing about the mileage of a sports car that you will never buy.
Jigar wrote:$50? I couldn't care less about $50; new games cost more than that when they launch, and yes, people do buy them.
Jigar wrote:Heh, I bought that card when it was launched at $500 (it cost me more like $800 in India).
Jigar wrote:Also, it's not like we play games 24/7; it's hardly a 2-4 hour session a day for a normal gamer. I only play during weekends, and the max would be a 2-hour session.
Jigar wrote:(;・∀・)2 - 4 hour session a day for a normal gamer
Duct Tape Dude wrote:Guys, we aren't honoring the AMD gods by arguing over international electricity costs. Can we get back to the card? Or at least argue over whether its rumored performance will vary a lot based on how much a game benefits from memory bandwidth (i.e., HBM)?
ultima_trev wrote:This would be a grand return to form for AMD if true. However, I'll reserve judgment on these "leaks" until TR and Anand acquire this mythical beast of a card and run their own benchmark suites. If the R9 390X really is 50% faster than the R9 290X, that would be a generational leap not seen in several years, and doing it while drawing only 4% more power would make this a Maxwell killer for sure. I would be stoked to see what the R9 380/R9 370 series brings! Looks like I'm going to put a pause on purchasing that GTX 960 for now...
JustAnEngineer wrote:In the past twelve months, NVidia's evil marketing geniuses have hyped power consumption as the most important thing to talk about because it is an area where their Maxwell architecture succeeds very well.