
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

What graphic card to view HD video

Fri Mar 29, 2013 7:34 pm

I have been using a machine with a Phenom II X4 970 and an HD 5770 video card. I primarily use it to view moving or still images. The monitors are 27" and 24" IPS panels, at 2560x1440 and 1920x1200.

Recently I built a new machine using an FX-6300. To complete setting up the machine I used, temporarily I thought, an HD 6450 and an old 19" IPS panel.

Then I replaced the 19" monitor with the two monitors above.

I tried a couple of HD videos, but I could hardly see any difference from the older machine, which uses a much more expensive video card. Maybe I do not know what to look for to see the difference.
Comments are appreciated.
 
chuckula
Minister of Gerbil Affairs
Posts: 2109
Joined: Wed Jan 23, 2008 9:18 pm
Location: Probably where I don't belong.

Re: What graphic card to view HD video

Fri Mar 29, 2013 8:27 pm

Just HD video? You won't see a difference. Lemme put it to you this way: my 2008-era Core 2 Duo notebook with integrated Intel X4500 graphics can pipe HD video just fine over an HDMI link... if that thing can play videos, then both of your machines are more than capable of handling it.
4770K @ 4.7 GHz; 32GB DDR3-2133; Officially RX-560... that's right AMD you shills!; 512GB 840 Pro (2x); Fractal Define XL-R2; NZXT Kraken-X60
--Many thanks to the TR Forum for advice in getting it built.
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

Re: What graphic card to view HD video

Fri Mar 29, 2013 8:55 pm

OK, then what should I view to see the difference? Please exclude PC games. The purpose of this discussion is to determine how to pick a video card.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: What graphic card to view HD video

Fri Mar 29, 2013 9:13 pm

Pick a video card for what purpose? If you plan to use it mostly to watch HD video, I think you've already answered your own question -- the 5770 is already perfectly capable of doing that.

More expensive cards will get you better gaming performance. If you don't care about gaming performance, then there's no need to upgrade your video card.
Nostalgia isn't what it used to be.
 
kc77
Gerbil Team Leader
Posts: 242
Joined: Sat Jul 02, 2005 2:25 am

Re: What graphic card to view HD video

Fri Mar 29, 2013 9:27 pm

Pretty much any card aside from the bottom-of-the-barrel DDR3 ones will work just fine. Newer cards for the most part add more formats that can be accelerated, or improvements in audio delivery (bitstreaming, etc.), but not much else when it comes to video output.
Core i7 920 @stock - 6GB OCZ Mem - Adaptec 5805 - 2 x Intel X25-M in RAID1 - 5 x Western Digital RE4 WD1003FBYX 1TB in RAID 6 - Nvidia GTX 460
 
chuckula
Minister of Gerbil Affairs
Posts: 2109
Joined: Wed Jan 23, 2008 9:18 pm
Location: Probably where I don't belong.

Re: What graphic card to view HD video

Fri Mar 29, 2013 9:43 pm

churin wrote:
OK, then what should I view to see the difference? Please exclude PC games. The purpose of this discussion is to determine how to pick a video card.



You keep using that word "difference." What does that mean? Do you want to do a pixel-by-pixel delta between two videos? A video is just a stream of data that is decoded and converted into pixels that are displayed on your screen. Almost any modern hardware, including the parts you describe above, will do that just fine. The pixels and frame rates won't be different in any perceptible way. If you don't like what you are seeing, then it is likely one or both of the following: 1) the video itself isn't encoded at high quality, and no video card will fix that; or 2) your monitor/TV/display isn't very good. You might be able to tweak color/brightness/contrast settings in the video card's driver to help the display a little, but any modern video card can do that.
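For what it's worth, that pixel-by-pixel delta is easy to try. A minimal sketch in Python, assuming Pillow and NumPy are installed and using hypothetical file names for two screenshots of the same paused frame (one captured on each card):

# Compare two screenshots of the same paused frame, pixel by pixel.
# File names below are hypothetical examples.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("frame_hd5770.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("frame_hd6450.png").convert("RGB"), dtype=np.int16)
assert a.shape == b.shape, "screenshots must have the same resolution"

diff = np.abs(a - b)                           # per-channel absolute difference
print("max difference per channel:", int(diff.max()))
print("mean difference per channel:", float(diff.mean()))
print("pixels that differ at all:", int((diff.sum(axis=2) > 0).sum()))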
4770K @ 4.7 GHz; 32GB DDR3-2133; Officially RX-560... that's right AMD you shills!; 512GB 840 Pro (2x); Fractal Define XL-R2; NZXT Kraken-X60
--Many thanks to the TR Forum for advice in getting it built.
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

Re: What graphic card to view HD video

Fri Mar 29, 2013 10:06 pm

I have an HD 6450 and an HD 5770. I am asking what kind of graphics I should view to tell the difference between them.
 
chuckula
Minister of Gerbil Affairs
Posts: 2109
Joined: Wed Jan 23, 2008 9:18 pm
Location: Probably where I don't belong.

Re: What graphic card to view HD video

Fri Mar 29, 2013 10:17 pm

churin wrote:
I have an HD 6450 and an HD 5770. I am asking what kind of graphics I should view to tell the difference between them.


Repeating the word "difference" does not mean you have defined what that word means. There is no difference in displaying video! A video will look the same on either card!
4770K @ 4.7 GHz; 32GB DDR3-2133; Officially RX-560... that's right AMD you shills!; 512GB 840 Pro (2x); Fractal Define XL-R2; NZXT Kraken-X60
--Many thanks to the TR Forum for advice in getting it built.
 
JohnC
Gerbil Jedi
Posts: 1924
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: What graphic card to view HD video

Fri Mar 29, 2013 10:26 pm

You won't see any difference in video quality between these two cards, no matter what video format/bitrate/resolution you use or which monitor... Not sure why you don't understand... All you might see is a difference in CPU utilization: the HD 6450 has the integrated UVD3 video accelerator (the HD 5770 has UVD2), which adds hardware acceleration for the MVC and MPEG-4 Part 2 codecs (in addition to all of the codecs which UVD2 supports).
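If you want to see that CPU-utilization difference for yourself, here is a rough sketch, assuming ffmpeg is installed, "clip.mkv" is a hypothetical test file, and DXVA2 is the hardware decode path on Windows:

# Decode the same clip twice: once in software, once with DXVA2 hardware
# acceleration. Watch Task Manager during each run to see the CPU-utilization
# difference; the decoded picture is the same either way.
import subprocess, time

def decode_time(extra_args):
    start = time.time()
    subprocess.run(["ffmpeg", "-v", "error", *extra_args,
                    "-i", "clip.mkv", "-f", "null", "-"], check=True)
    return time.time() - start

print("software decode:", round(decode_time([]), 1), "s")
print("dxva2 decode:   ", round(decode_time(["-hwaccel", "dxva2"]), 1), "s")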
Last edited by JohnC on Fri Mar 29, 2013 10:30 pm, edited 1 time in total.
Gifter of Nvidia Titans and countless Twitch donation extraordinaire, nothing makes me more happy in life than randomly helping random people
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: What graphic card to view HD video

Fri Mar 29, 2013 10:27 pm

The main difference between those two cards is gaming performance. But you specifically said to exclude games. So I'm not even sure what you're looking for here; your question does not make any sense.
Nostalgia isn't what it used to be.
 
Voldenuit
Minister of Gerbil Affairs
Posts: 2888
Joined: Sat Sep 03, 2005 11:10 pm

Re: What graphic card to view HD video

Fri Mar 29, 2013 11:13 pm

Most modern graphics cards use a dedicated video block to decode video. With few exceptions, even the slowest card will accelerate video playback as well as a high-end card (in reality, some of the ultra-low-end cards have reduced acceleration capability, but in general, everything from a low-end to a midrange to a high-end card will have the same performance playing back supported formats).

From an anecdotal viewpoint, I've had fewer compatibility problems with my Nvidia GTX 660 than with my old Radeon 5770, though some people never run into any problems at all.

From a video encoding standpoint, Intel's QuickSync typically encodes faster than Radeons, which encode faster than GeForces. The Radeons and GeForces aren't faster than a decent CPU (sometimes they're even slower), and most hardware-accelerated encoders are still less flexible and lower quality than a good CPU encoder.

My advice would be not to expect any difference in video playback, and not to expect miracles with encoding on GPUs.
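If you ever want to put numbers on the encode side, a rough sketch along the same lines, assuming ffmpeg is installed, "clip.mkv" is a hypothetical input, and you swap "libx264" for a hardware encoder name (for example "h264_qsv" on an Intel chip with QuickSync, where the available options differ) to compare:

# Time a software (x264) encode of a short clip; run it again with a hardware
# encoder name to compare speed. Output quality needs a separate comparison.
import subprocess, time

def encode_time(codec):
    start = time.time()
    subprocess.run(["ffmpeg", "-y", "-v", "error", "-i", "clip.mkv",
                    "-c:v", codec, "out_" + codec + ".mp4"], check=True)
    return time.time() - start

print("libx264:", round(encode_time("libx264"), 1), "s")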
Wind, Sand and Stars.
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

Re: What graphic card to view HD video

Sat Mar 30, 2013 7:26 am

Thanks for all those responses.

Let me go slightly off topic, but continue with a question which is still very much relevant:

A video card is for displaying images on a monitor (GPU processing for non-display purposes is excluded for the moment). Assume the monitor in this argument is a good one. What difference can the eye perceive between an inexpensive video card and an expensive one, and when viewing what kinds of images? Is the difference only visible in PC games? Why would one buy a $500 video card rather than a $30 one?
 
chuckula
Minister of Gerbil Affairs
Posts: 2109
Joined: Wed Jan 23, 2008 9:18 pm
Location: Probably where I don't belong.

Re: What graphic card to view HD video

Sat Mar 30, 2013 8:00 am

churin wrote:
Thanks for all those responses.

Let me go slightly off topic, but continue with a question which is still very much relevant:

A video card is for displaying images on a monitor (GPU processing for non-display purposes is excluded for the moment). Assume the monitor in this argument is a good one. What difference can the eye perceive between an inexpensive video card and an expensive one, and when viewing what kinds of images? Is the difference only visible in PC games? Why would one buy a $500 video card rather than a $30 one?


NONE. Please go to a good online reference source to understand how graphics work. A 2D graphical image, be it a video, a picture, a desktop GUI, etc., is simply composed of a bunch of pixels that are encoded as ones and zeros in binary data sitting on (for example) your hard drive. As long as your video card isn't a relic from before modern video standards were introduced, you won't see any difference! Modern video encoding standards like H.264 are well supported by both modern CPUs and GPUs, and you will not see a difference when you look at the same video (or static image) with two different GPUs.

Look, it's as if you came on here and asked how you can tell the difference between two different CPUs when you want to compute 1 + 1. THERE IS NO DIFFERENCE, IT DOES NOT MATTER: all CPUs, be they fast, slow, 2/4/8 cores, AMD/Intel/ARM/etc., will say that 1 + 1 = 2. If I calculate 1 + 1 on a sheet of paper, the answer is exactly the same as if I used the world's fastest supercomputer to do the same calculation. No matter how much money you spend, the CPU will tell you that 1 + 1 = 2, and there is no difference in the answer to 1 + 1 between processors.

The SECOND part of your question is related to 3D graphics. That is a completely different area, and one that I and the other posters here didn't address because you specifically said that you weren't asking about it. In 3D graphics, your computer is responsible for taking basic bits of data like 3D polygon models, textures, light sources, etc. and generating a virtual environment where you see and interact with all of these objects. Unlike a 2D picture or video, the 3D environment is not fixed, which means that instead of taking the same set of bits and displaying the corresponding images over & over, your CPU and GPU are responsible for doing huge amounts of processing to generate a display of the 3D objects on your screen and typically animate the rendering at a high enough framerate to show motion. The difference between a cheap graphics card and an expensive graphics card is mostly in the computational power that the expensive graphics card has to render the 3D scene efficiently (i.e. at a high frame rate) and with greater graphical detail. That's why the video quality of higher-end graphics cards often looks better than lower-end cards: the high-end cards have enough hardware to apply high-quality graphical effects to the scene and do it quickly enough for smooth animation. Most of the low-end cards can probably generate 3D rendered images with the same quality if you give them enough time *but* in the process the frame rate drops so low that you really can't play the 3D game anymore, so people reduce the resolution or image quality settings to get more acceptable frame rates when playing 3D games. That is where the big difference is between a low-end card and a high-end card today.

Think about it this way: when you are watching a movie, can you use your mouse to move the camera around the scene and view what is going on from any angle? Of course not; the movie is a static series of 2D images that are compressed and decompressed on your computer, and as long as your computer is relatively modern, the output is the same no matter what video card is used to display it. In a 3D game, however, you are free to run around, change the view, move objects, etc. in the 3D environment. Since this can't be precomputed ahead of time, your graphics card and CPU are responsible for crunching a huge amount of data to update the display in real time. The faster and more expensive video cards can generate the geometry and apply textures, lighting/shadows, particle effects, anisotropic filtering, anti-aliasing, etc., much faster than low-end graphics cards. Thus, the high-end expensive video cards are much better at applying all the graphical effects that make 3D games look good *and* do so fast enough that you can play the game with smooth animation.
4770K @ 4.7 GHz; 32GB DDR3-2133; Officially RX-560... that's right AMD you shills!; 512GB 840 Pro (2x); Fractal Define XL-R2; NZXT Kraken-X60
--Many thanks to the TR Forum for advice in getting it built.
 
Aranarth
Graphmaster Gerbil
Posts: 1435
Joined: Tue Jan 17, 2006 6:56 am
Location: Big Rapids, Mich. (Est Time Zone)
Contact:

Re: What graphic card to view HD video

Sat Mar 30, 2013 8:38 am

chuckula wrote:

NONE


Absolutely 100% correct.

Let me expand on the answer and tell you why.

HD is typically 1920x1080 at 32-bit depth, at 24 frames per second.
This is also a 2D image.

The level of processing power required to display such an image was surpassed around 8 years ago (10 might be pushing it).
Eight years is a heck of a long time in the computer industry; so long, in fact, that tablets with low-power processors could do it two years ago (the iPad 2 or 3, I can't remember which).

The fact that the two cards in question both have "HD" in the name should have told you this as well.

Now, computer games render a 3D image at 60+ FPS at 1920x1080 and higher, and even my Radeon 4870 can do that if I turn down some of the quality settings.

HD video is really not that difficult to do. We have even gotten to the point where 2x HD and 4K video are not that hard either.
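A quick back-of-the-envelope calculation (plain arithmetic, Python only for convenience) shows why: even fully uncompressed 1080p24 is only about 190 MB/s, trivial next to the memory bandwidth of any modern card.

# Uncompressed data rate for the 1080p24 figures quoted above.
width, height = 1920, 1080
bytes_per_pixel = 4            # 32-bit colour
fps = 24

frame_bytes = width * height * bytes_per_pixel          # ~7.9 MB per frame
per_second = frame_bytes * fps                          # ~190 MB/s total
print(round(frame_bytes / 2**20, 1), "MB per frame")
print(round(per_second / 2**20, 1), "MB/s uncompressed")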
Main machine: Core I7 -2600K @ 4.0Ghz / 16 gig ram / Radeon RX 580 8gb / 500gb toshiba ssd / 5tb hd
Old machine: Core 2 quad Q6600 @ 3ghz / 8 gig ram / Radeon 7870 / 240 gb PNY ssd / 1tb HD
 
bthylafh
Maximum Gerbil
Posts: 4320
Joined: Mon Dec 29, 2003 11:55 pm
Location: Southwest Missouri, USA

Re: What graphic card to view HD video

Sat Mar 30, 2013 8:50 am

You're not even really asking the right question, but there are slight differences between the two cards' video decoders:

https://en.wikipedia.org/wiki/Unified_Video_Decoder

5770 has UVD 2.2, while the 6400 series has UVD 3.0. Unless you're playing videos encoded in the new formats that UVD 3 supports you're not going to see much if any difference.
Hakkaa päälle!
i7-8700K|Asus Z-370 Pro|32GB DDR4|Asus Radeon RX-580|Samsung 960 EVO 1TB|1988 Model M||Logitech MX 518 & F310|Samsung C24FG70|Dell 2209WA|ATH-M50x
 
My Johnson
Gerbil Elite
Posts: 679
Joined: Fri Jan 24, 2003 3:00 pm
Location: Dystopia, AZ

Re: What graphic card to view HD video

Sat Mar 30, 2013 9:23 am

Actually, the 6450 doesn't have enough shader power to implement a lot of post-processing effects. The 5770 does. On the other hand, the 6450 has Blu-ray 3D (MVC) support, which the 5770 doesn't have.

Keep in mind that you are viewing a stream of compressed images, with artifacts from the compression present. Post-processing can clean that up. See AnandTech; they scrutinize this stuff pretty well.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: What graphic card to view HD video

Sat Mar 30, 2013 9:42 am

The main difference between a $30 card and a $500 card is the quantity and complexity of the processing pipelines inside the GPU. More (and more powerful) pipelines = higher 3D frame rate and support for more realistic 3D rendering effects. If we ignore professional workstation products like FireGL/Quadro, once you get above about $100 or so it is *all* about gaming performance. (And for the workstation products it is about having drivers that have been tested and certified by the vendors of the various major content creation and CAD/CAE software suites.)

If you're dipping all the way down to $30 there will likely also be some corners cut on passive component quality (e.g. capacitors), but the quality of the image (assuming you're using the digital out, not analog VGA) will still be essentially the same when viewing 2D content.
Nostalgia isn't what it used to be.
 
Voldenuit
Minister of Gerbil Affairs
Posts: 2888
Joined: Sat Sep 03, 2005 11:10 pm

Re: What graphic card to view HD video

Sat Mar 30, 2013 12:38 pm

bthylafh wrote:
You're not even really asking the right question, but there are slight differences between the two cards' video decoders:

https://en.wikipedia.org/wiki/Unified_Video_Decoder

5770 has UVD 2.2, while the 6400 series has UVD 3.0. Unless you're playing videos encoded in the new formats that UVD 3 supports you're not going to see much if any difference.



bthylafh and My Johnson have a point; the 6450's video engine isn't powerful enough for some post-processing effects (which are often handled in shaders rather than the decode block). So if you start using post-processing effects like noise reduction, deinterlacing, demosaicing, etc., then you might notice a difference between a 6450 and a more powerful card (hence my caveat about ultra-low-end cards, which is what the 6450 is).

Also, different video decoders produce slightly different images; it's not as absolute as 1 + 1 = 2. ffdshow vs. LAV vs. the Intel decoder vs. Radeon UVD2 vs. UVD3 vs. the GeForce decoders will each produce a slightly different image.

But yeah, the OP should NOT* expect any drastic differences between the 5770 and the 6450, especially not in favor of the 6450.

EDIT: Oops. missed the 'NOT' :P
Last edited by Voldenuit on Sat Mar 30, 2013 3:25 pm, edited 1 time in total.
Wind, Sand and Stars.
 
Flying Fox
Gerbil God
Posts: 25690
Joined: Mon May 24, 2004 2:19 am
Contact:

Re: What graphic card to view HD video

Sat Mar 30, 2013 12:42 pm

OP, I think you may still be thinking of the old days of analog outputs on video cards. In the past, the RAMDAC (the digital-to-analog component) on the video card was important, since VGA displays were prevalent, so back then there were differences from one card to the next. Matrox was regarded as having the best RAMDACs at the time, with then-ATI, Nvidia, and the now-defunct 3dfx trading blows in analog output quality after Matrox. Since you are using digital-capable monitors, the digital signal from either chip should be the same: a pixel is a pixel, and it is going to be value "X". DAC conversion could make a difference through interference, signal attenuation, and other factors; in the digital world those don't exist anymore, because the pixel data are sent as-is rather than as an analog signal that the monitor has to reinterpret.

Now, if you do post-processing using GPU/CPU hardware, then as long as the GPU/CPU instructions are supported you will be able to do the same post-processing, and the digital output of frames should still be the same. The "difference" is going to be in GPU/CPU utilization and temperatures, not in anything you can see on the monitor itself.

OP: I hope you are using DVI/HDMI/DP and not VGA. ;)
The Model M is not for the faint of heart. You either like them or hate them.

Gerbils unite! Fold for UnitedGerbilNation, team 2630.
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

Re: What graphic card to view HD video

Sat Mar 30, 2013 2:02 pm

Let me summarize my understanding so far.

There are two different types of graphical images, 2D and 3D. 2D images are those created by a camera, whether still or moving. A 2D image is something the viewer just passively watches, while the progress of a 3D image on the monitor can be controlled by the viewer in real time, using an input device connected to the computer. 3D graphics are created in software by computer programming. PC games use 3D graphics.

For 2D images, there is no difference in the image reproduced on a monitor regardless of what video card is used. Exceptions follow:

Video encoder: The video card comes with a video encoder, and the version of the encoder may vary from card to card. The reproduced image may suffer if the video decoder is older than the version used to create the image.

Shader power: The ability to apply post-processing effects, which relates to the reproduced video quality, tends to be greater with a more expensive video card. Post-processing cleans up artifacts left over from reproducing a compressed image.
Blu-ray support: Some video cards support Blu-ray and some do not. If two cards are picked at random, the inexpensive one may happen to support it while the more expensive one does not.

For 3D, more processing power is required than for 2D. What separates a $500 card from a $30 one is this processing power for the 3D image. The greater the processing power, the better the 3D image, depending on the individual PC game. If one does not play PC games, there is no need for a video card costing more than about $100. A really inexpensive card may use lower-quality passive components, which tends to mean shorter hardware longevity.

There is another category of video card -- video cards for professional workstations. The drivers for this type of card are optimized for many major CAD/CAE software suites (AutoCAD and 3ds Max, for example, I assume). This type of card is generally more expensive.

Please correct me if there is any misunderstanding.

Edit: The above does not reflect the posts by Voldenuit and Flying Fox, which were posted while I was writing it.
 
Flying Fox
Gerbil God
Posts: 25690
Joined: Mon May 24, 2004 2:19 am
Contact:

Re: What graphic card to view HD video

Sat Mar 30, 2013 2:38 pm

You are thinking way too much on this.

churin wrote:
There are two different types of graphical images, 2D and 3D. 2D images are those created by a camera, whether still or moving. A 2D image is something the viewer just passively watches, while the progress of a 3D image on the monitor can be controlled by the viewer in real time, using an input device connected to the computer. 3D graphics are created in software by computer programming. PC games use 3D graphics.
In the end, on a typical monitor you are seeing 2D anyways.

churin wrote:
Video encoder: The video card comes with a video encoder, and the version of the encoder may vary from card to card. The reproduced image may suffer if the video decoder is older than the version used to create the image.
An encoder has nothing to do with the output image that you see on screen, so it is not really related to your question. Older decoders just won't be able to use hardware acceleration for certain features, but modern player software will automatically fall back to software decoding, which produces the same output, just a little slower (so perhaps not all frames are delivered within the smoothness threshold) or at higher CPU/GPU utilization because of the "roundabout" work it has to do.

churin wrote:
Shader power: The ability to apply post-processing effects, which relates to the reproduced video quality, tends to be greater with a more expensive video card. Post-processing cleans up artifacts left over from reproducing a compressed image.
Blu-ray support: Some video cards support Blu-ray and some do not. If two cards are picked at random, the inexpensive one may happen to support it while the more expensive one does not.
Post-processing is usually a feature you enable in software (the exception being hardwired post-processing done at the driver level, where we don't have control). It specifies certain algorithms to apply to the output image before sending it out to the monitor. Things like deinterlacing and smarter interpolation should result in the same image on different cards if you run them in pure "software" mode. Remember, there is always a software fallback. So even if the shaders/decoder on a video card do not support Blu-ray, a powerful enough CPU can handle it, giving you the same image.

churin wrote:
For 3D, more processing power is required than for 2D. What separates a $500 card from a $30 one is this processing power for the 3D image. The greater the processing power, the better the 3D image, depending on the individual PC game. If one does not play PC games, there is no need for a video card costing more than about $100. A really inexpensive card may use lower-quality passive components, which tends to mean shorter hardware longevity.
"Better" in this case is not based on the same video data as in an MKV/AVI file. There is AA, AF, occlusion, and higher-quality (read: larger) textures, which means the "input data" the GPU has to deal with is a lot larger. For video playback, the input data is the same no matter what card you use. When you are playing a video file, you can't simply say "drop all green colours" in the name of optimization; the data need to be reproduced somewhat faithfully. In post-processing you could say data is "invented/derived", so processing power will make a difference; but then again, you can always do that stuff in software, given enough CPU power.

Unless you are still outputting over analog VGA, in terms of playing 1080p or lower-resolution video there should be no discernible difference (OK, gamma, colour mapping, and other such variables aside) between cards, especially from the same GPU vendor, if you are using the same monitor, the same playback software with the same playback settings, the same video file, and the same set of post-processing effects.
The Model M is not for the faint of heart. You either like them or hate them.

Gerbils unite! Fold for UnitedGerbilNation, team 2630.
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

Re: What graphic card to view HD video

Sat Mar 30, 2013 9:29 pm

Flying Fox wrote:
In the end, on a typical monitor you are seeing 2D anyways.

I understand this: since the image is shown on a flat panel, it cannot be a three-dimensional view. It looks like the type of graphics used for PC games is just customarily called 3D graphics.

Flying Fox wrote:
Unless you are still outputting over analog VGA, in terms of playing 1080p or lower-resolution video there should be no discernible difference (OK, gamma, colour mapping, and other such variables aside) between cards, especially from the same GPU vendor, if you are using the same monitor, the same playback software with the same playback settings, the same video file, and the same set of post-processing effects.

This is very strong conclusive statement.

Can I assume that the FX-6300 is powerful enough to take over the video processing chores and keep things under control if the processing load goes beyond the GPU's limits?
 
auxy
Graphmaster Gerbil
Posts: 1300
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: What graphic card to view HD video

Sat Mar 30, 2013 10:41 pm

churin wrote:
Can I assume that the FX-6300 is powerful enough to take over the video processing chores and keep things under control if the processing load goes beyond the GPU's limits?
It's powerful enough, but unfortunately it doesn't really work that way. The software will use the GPU or CPU; generally, it can't divide the work between the two dynamically.

Kinda silly, when you think about it, that nobody's done that with any success.
 
My Johnson
Gerbil Elite
Posts: 679
Joined: Fri Jan 24, 2003 3:00 pm
Location: Dystopia, AZ

Re: What graphic card to view HD video

Sun Mar 31, 2013 2:04 am

BTW, if you end up settling on the 6450, be sure to reduce the noise reduction from its default. My friend has that card, and I was incredulous at first when he told me it was skipping during 1080p playback. The default noise-reduction setting is way too high.
 
Flying Fox
Gerbil God
Posts: 25690
Joined: Mon May 24, 2004 2:19 am
Contact:

Re: What graphic card to view HD video

Sun Mar 31, 2013 3:26 am

churin wrote:
Flying Fox wrote:
In the end, on a typical monitor you are seeing 2D anyways.

I understand this: since the image is shown on a flat panel, it cannot be a three-dimensional view. It looks like the type of graphics used for PC games is just customarily called 3D graphics.
Holographic projectors are not quite here yet, so there is really no "true" 3D graphics output. Pure-software "3D graphics processing", from the days of Wolfenstein 3D to the latest game engines, has been a collection of mathematics problems: mapping data in 3D coordinates (call them "scenes" if you like) onto a 2D "plane" so that it can be sent to typical monitors. Still, calling modern GPU "3D" processing a fraud would be a little harsh. The game/app engine sends 3D coordinates of polygons, light sources, viewing angle and perspective, depth information, textures, and other data to the GPU to calculate the "scene" -- all of it "data in 3 dimensions". The GPU's job is then to take all of that data, plus special instructions that operate on specific sets of it (those are your shaders), and "render the scene". In the end, there is a transformation into an array of RGB/RGBA pixels that can be dumped onto the monitor as a "frame". The goal, of course, is to calculate, "render", and send at least 60 (or 24, or 30, depending on who you talk to) of these frames to the monitor every second, hopefully with the frames not too far apart, for smoothness' sake. (I have just described the big debate we are having over FPS vs. frame times here. :)) And then to do the same for every second while the game/app runs.
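As a toy illustration of that "3D coordinates onto a 2D plane" step, here is a minimal pinhole-style projection in Python; the focal length and screen size are made-up numbers, and a real GPU does this with 4x4 matrices and a full pipeline rather than one bare formula:

# Pinhole-style perspective projection: camera-space (x, y, z) -> pixel (sx, sy).
def project(x, y, z, focal=800.0, width=1920, height=1080):
    sx = width / 2 + focal * x / z
    sy = height / 2 - focal * y / z   # flip y: screen y grows downward
    return sx, sy

# Three corners of a hypothetical triangle sitting in front of the camera.
for point in [(-1.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.5, 4.0)]:
    print(point, "->", project(*point))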

auxy wrote:
churin wrote:
Can I assume that the FX-6300 is powerful enough to take over the video processing chores and keep things under control if the processing load goes beyond the GPU's limits?
It's powerful enough, but unfortunately it doesn't really work that way. The software will use the GPU or CPU; generally, it can't divide the work between the two dynamically.

The OP really likes to stretch and dream stuff up. :P

The choice of using the GPU or "software" (aka the CPU) to decode video is usually a setting in the player software. GPUs released in the last 2-3 years (yes, even the relatively "weak" Intel ones) should have support for most video formats like WMV, H.264, and VC-1. It is the oddball ones like RMVB or VP8 that modern GPUs may not have "native" capabilities for. Your "older" card is a 5770, which is new enough as long as you deal with mainstream formats. Otherwise, the FX-6300 should be good enough to play those videos. Still don't know why you are trying to insist there are differences. Buyer's remorse for your 5770 now that you have a much cheaper 6450?
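If you want to check which codec a particular file actually uses (and therefore whether the card's decode block covers it), here is a quick sketch, assuming ffprobe is installed and using a hypothetical file name:

# Print the codec of the first video stream, e.g. h264, vc1, mpeg4, vp8.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name",
     "-of", "default=noprint_wrappers=1:nokey=1", "movie.mkv"],
    capture_output=True, text=True, check=True)
print("video codec:", result.stdout.strip())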
The Model M is not for the faint of heart. You either like them or hate them.

Gerbils unite! Fold for UnitedGerbilNation, team 2630.
 
churin
Gerbil Elite
Topic Author
Posts: 738
Joined: Wed Nov 28, 2007 4:38 pm
Location: CA

Re: What graphic card to view HD video

Sun Mar 31, 2013 8:32 am

Flying Fox wrote:
churin wrote:
Flying Fox wrote:
In the end, on a typical monitor you are seeing 2D anyways.

I understand this: since the image is shown on a flat panel, it cannot be a three-dimensional view. It looks like the type of graphics used for PC games is just customarily called 3D graphics.
Holographic projectors are not quite here yet, so there is really no "true" 3D graphics output. Pure-software "3D graphics processing", from the days of Wolfenstein 3D to the latest game engines, has been a collection of mathematics problems: mapping data in 3D coordinates (call them "scenes" if you like) onto a 2D "plane" so that it can be sent to typical monitors. Still, calling modern GPU "3D" processing a fraud would be a little harsh. The game/app engine sends 3D coordinates of polygons, light sources, viewing angle and perspective, depth information, textures, and other data to the GPU to calculate the "scene" -- all of it "data in 3 dimensions". The GPU's job is then to take all of that data, plus special instructions that operate on specific sets of it (those are your shaders), and "render the scene". In the end, there is a transformation into an array of RGB/RGBA pixels that can be dumped onto the monitor as a "frame". The goal, of course, is to calculate, "render", and send at least 60 (or 24, or 30, depending on who you talk to) of these frames to the monitor every second, hopefully with the frames not too far apart, for smoothness' sake. (I have just described the big debate we are having over FPS vs. frame times here. :)) And then to do the same for every second while the game/app runs.

Thanks for the details. BTW: is "computer animation" an alternative term for 3D graphics?

Flying Fox wrote:
auxy wrote:
churin wrote:
Can I assume that the FX-6300 is powerful enough to take over the video processing chores and keep things under control if the processing load goes beyond the GPU's limits?
It's powerful enough, but unfortunately it doesn't really work that way. The software will use the GPU or CPU; generally, it can't divide the work between the two dynamically.

The OP really likes to stretch and dream stuff up. :P

The choice of using the GPU or "software" (aka the CPU) to decode video is usually a setting in the player software. GPUs released in the last 2-3 years (yes, even the relatively "weak" Intel ones) should have support for most video formats like WMV, H.264, and VC-1. It is the oddball ones like RMVB or VP8 that modern GPUs may not have "native" capabilities for. Your "older" card is a 5770, which is new enough as long as you deal with mainstream formats. Otherwise, the FX-6300 should be good enough to play those videos.

OK, so the workload cannot be divided dynamically between the GPU and the CPU: it has to be either the GPU or the CPU doing the job.
Does the need to switch between GPU and CPU arise only for 3D? I use Windows Media Player and I have never needed to change such a setting as mentioned above. I have never played a computer game, but I sometimes watch non-interactive computer animation.
Flying Fox wrote:
Still don't know why you are trying to insist there are differences. Buyer's remorse for your 5770 now that you have a much cheaper 6450?

I started this thread to ask why I do NOT see a difference. Originally I planned to get an HD 7750 for the new FX-6300 machine. The HD 6450 was taken out of my older secondary machine and used on the new machine only temporarily, I thought. But it looks like the HD 7750 would be a waste of money for me.
 
Flying Fox
Gerbil God
Posts: 25690
Joined: Mon May 24, 2004 2:19 am
Contact:

Re: What graphic card to view HD video

Sun Mar 31, 2013 2:49 pm

churin wrote:
Thanks for the details. BTW: is "computer animation" an alternative term for 3D graphics?
Any "moving" images on a computer screen can be loosely classified as "animation". So don't really know what you are getting at. If you follow this line of thinking, this is all about sending at least 60 (or 24, or 30, or whatever) images/frames to the screen, while making sure the interval between each image/frame is small enough and more or less the same in order to achieve "smoothness".

churin wrote:
OK, so the workload cannot be divided dynamically between the GPU and the CPU: it has to be either the GPU or the CPU doing the job.
For now, there is no software in the realm of video players that can dynamically adjust and dispatch workloads between the GPU and the CPU well enough (someone correct me if my info is outdated). This was supposed to be the promise of AMD's Fusion concept when they acquired ATI. The main problem is that GPU processing of video data is way faster than using the CPU, and that drastic difference makes it hard to guarantee that any work handed to the CPU will finish in time (remember, there is a time constraint on when a "frame" needs to be sent to the monitor). As CPUs get more powerful, they may eventually be capable enough to make this possible.

churin wrote:
Does the need to switch between GPU and CPU arise only for 3D? I use Windows Media Player and I have never needed to change such a setting as mentioned above. I have never played a computer game, but I sometimes watch non-interactive computer animation.
This is the part that may be confusing for you. In the past, GPUs were used exclusively to solve that "3D data to 2D images" problem. As GPUs got more powerful and their designs more generalized, vendors started to include "instructions" that can be used for processing video data. There is nothing special about "3D" or "2D"; in the end it is just silicon doing calculations. So GPUs became able to decode videos, and in the past few years more instructions have been introduced that can be used for "general purpose" problems (still mainly math problems, though), which is where the term GPGPU comes from. People now throw problems like encryption and specialized industry-specific calculations at GPUs because of the performance they bring compared to mainstream CPUs. Essentially, GPUs have become like the co-processors of old.

churin wrote:
Originally I planned to get an HD 7750 for the new FX-6300 machine. The HD 6450 was taken out of my older secondary machine and used on the new machine only temporarily, I thought. But it looks like the HD 7750 would be a waste of money for me.
The same old rule applies: if you see slowdowns, think about upgrading. If you are fine, then save the money for other things in life. ;)
The Model M is not for the faint of heart. You either like them or hate them.

Gerbils unite! Fold for UnitedGerbilNation, team 2630.
 
morphine
TR Staff
Posts: 11600
Joined: Fri Dec 27, 2002 8:51 pm
Location: Portugal (that's next to Spain)

Re: What graphic card to view HD video

Sun Mar 31, 2013 4:13 pm

Why do I get this feeling that we're collectively doing someone's homework/assignment?
There is a fixed amount of intelligence on the planet, and the population keeps growing :(
 
bthylafh
Maximum Gerbil
Posts: 4320
Joined: Mon Dec 29, 2003 11:55 pm
Location: Southwest Missouri, USA

Re: What graphic card to view HD video

Sun Mar 31, 2013 4:26 pm

Nah, OP posts a lot of clueless questions but I don't recall any of them having an obvious "homework" flavor.
Hakkaa päälle!
i7-8700K|Asus Z-370 Pro|32GB DDR4|Asus Radeon RX-580|Samsung 960 EVO 1TB|1988 Model M||Logitech MX 518 & F310|Samsung C24FG70|Dell 2209WA|ATH-M50x
 
auxy
Graphmaster Gerbil
Posts: 1300
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: What graphic card to view HD video

Sun Mar 31, 2013 4:30 pm

morphine wrote:
Why do I get this feeling that we're collectively doing someone's homework/assignment?
Ehh.

Friend churin isn't a new poster and doesn't seem like a child, more a genuinely inquisitive person. Personally I think people have been a bit hard on him/her in this thread. ┐(‘~`;)┌ Seems like an English-as-a-second-language poster trying to understand the basics of computer display.
