I think it’s funny, Fred. Don’t let the haters get you down.
I think the haters are the Green Team bois. The comic is funny, but not to them.
So a person can only like it or hate it? All polarization, no middle ground? Personally, I thought the comic was bland. I understand the point the author is trying to make, but the comedic timing is lacking and doesn’t carry the weight of the concept.
These things continue to be painfully unfunny. TR, I beg of you, please drop the comic strips, or at least get someone that can write.
Aw…someone has trouble reading acute humor.
Well go watch reruns of the Tom Green Show. Don’t hurt yourself.
Maybe it is time for a candid discussion with the guy holding the gun to your head making you read the comic.
I think “gaming is all about the money” is, or at least should be, considered a truism by now.
Eh, I got it, but it wasn’t very funny. Maybe try poking fun at some more recent stuff, like the quarrels between EA and Steam?
i don’t get it…
hahahaha this sux, did Intel pay you for this too?
But the good news is, you finally found a way to express all that brooding hostility. I think we’re ready to move forward to the Rorschach Logos! I have here a sequence of cards that feature the corporate graphics for several tech industry players. I want you to gaze at each one in succession, and tell me: What kind of feelings do you have?
You have to admit, having it as nVidia technology is way better than the stand-alone Ageia cards.
Once the lucid hydra type stuff is standard issue, you can have both brands working together to improve your gaming experience.
Normally, I dig Fred’s comics. Today, meh. Still, takes balls to put yourself out in front of everyone with the comment thread open.
seconded! Great work on the comics, keep them coming Fred =)
Yep, you (Fred) get my vote!
It’s to be expected that a company would try to keep an advantage like PhysX to themselves. NVIDIA can’t rely on holding a performance gap over AMD all the time, or on a strong low-cost position. It’s normal for a company to look for differentiation that adds value to its products. It’s Microsoft with its DirectX, OpenGL, or someone else who should be required to develop an easy computing layer for GPU physics effects. NVIDIA may play the good-guy game to boost their reputation or keep their differentiation. It’s just business as usual.
Bet a buck that the technologies in question are protected by patent. Thus no one else can duplicate them.
But the patent system is established specifically so that a technology can be SHARED without threatening the owner’s advantage. To take the protection, then not license, is contrary to the purpose of the law.
[quote<]But the patent system is established specifically so that a technology can be SHARED without threatening the owner's advantage. To take the protection, then not license, is contrary to the purpose of the law.[/quote<]
Incorrect. The patent is established as a temporary monopoly to protect the holder's interest in the invention for a limited period of time. Licensing is one possible outcome, but not required either in spirit or letter -- if the assignee wants to retain the invention entirely for their own commercial use, they are welcome to do so.
The "disclosure" part you speak of, is the patent filing itself. Once the patent protection is expired, the invention remains on record, and anyone may then examine the filing to see how it was constructed.
Someone else *did* develop an easy computing layer for physics. Sure, it wasn’t on the GPU at the time, but that didn’t prevent it from working with any particular brand of video card.
Then nVidia bought them, and now it only works with their cards.
[Edit for proper English]
Agreed, that’s why I say it should be Microsoft with DirectX, or OpenGL, that develops an open physics standard that nVidia isn’t able to buy. However, they still seem to think there isn’t strong market demand for higher-quality physics effects, generally speaking.
Where’s the Khronos Group? Guys!
I got it, but it still wasn’t anywhere near funny.
I get it; it made me chuckle because of the inane fanboy wars. The fanboys always make the absurd claim that their company is the greatest thing since sliced bread, when they couldn’t be any further from the truth.
Miracle of miracles, “innate” works in this sentence (fanboys naturally argue) but I think you actually meant “inane” – silly, stupid, and insignificant.
Join me, derFunk, and we shall dominate [i<]together[/i<].
Oh, I don’t mean to do it EVERY time – there aren’t enough hours in the day – but in this case I thought it was worth pointing out that even the blind squirrel found a nut.
what about me bros? we can have a three way domination!
We’re just making sure that we don’t introduce any extraneous variables into video game development. Introducing a completely new concept like physics would detract from how much manpower can be put on more important things… like corny dialogue, cliché stories, shitty consolized graphics, and lining our investors’ pockets by not having proper QA.
This ensures everyone is happy.
I don’t like to be mean…but this comic strip is painfully bad. And not even in a so-bad-it’s-good way, like Bruce Campbell in Army of Darkness; this is just bad-bad, like John Travolta in Battlefield Earth.
Evil Dead sucks. Not good in ANY way. It’s 90 minutes of agony.
VOTE ME DOWN! IT’S CALLED TASTE! SOME OF YOU BARBARIANS COULD USE SOME! ;p
If a man cutting off his own hellspawn-infected hand and replacing it with a chainsaw is wrong, I don’t want to be right.
then my friend, enjoy living in wrongyland.
I laughed out loud at this one
I don’t get it; I mean, I don’t get the punch line. The left side I get, but what’s the punch line on the right side? Who is George T. Gamers?
The nVidia rep says they want to ensure the best experience for gamers, so you think they’re talking about people who game. What he’s really talking about is an nVidia stockholder named Gamers and the “best experience” they want to ensure is a financial one.
OMG ^^ that is hilarious!!!
Monday, eh? 😛
Hehe, true true. I need to wake up, but I don’t want to ()(0_o)()
His name should have been Pete C. Gamers and his wife Connie Sole-Gamers.
Actually a good idea.
Yes, because consoles get all the PhysX. You’re taking a pretty bad comic and making it worse.
derFunkenstein, you need to re-read that.
how so? It’s a comic about PhysX.
FYI, PS3 is a console 😉
Oh, I didn’t realize the PS3 has PhysX support. The GPU is a GF 7800 relative, so I assumed (wrongly, apparently) that it couldn’t accelerate physics.
It’s probably running on the cell processor, not the GPU.
nVidia: We just want to ensure that AMD-ATI is crushed with our CUDA and PhysX stuff.
ATI: Go ahead with your Fermis and Keplers and delay your product line for another year. We will enjoy the better market share until then :).
It’ll be interesting to see if nVidia really does wind up months behind AMD due to the next cycle. nVidia’s mind seems to be on Tegra, Tesla, and Quadro when the bulk of their money is still coming from Geforce, but I’d be surprised to see them be as delayed as Fermi was.
If AMD shows up in October-November and nVidia previews in November only to show up in January-Feb, I doubt there will be much impact due to the release (by then) of Sandy Bridge-E and the imminent release of Ivy Bridge. Most people will wait. Those that don’t will be going for the bragging rights, the high end or the highest of the high end, which are the ones most likely to jump ship the second one company has a leg up on the other, so nothing lost there. Meanwhile, the rest of the public, if they’re even buying video cards, will probably 1) want a laptop where nVidia’s superior OEM support will help them make up any difference (or Llano will make the point moot) and 2) be as likely to buy a pre-built computer that includes whatever it includes and if they did happen to be savvy enough to install a new video card, they’d go for the best $200 value, whichever that is (and they won’t care who showed up first, just who showed up at the right price first).
If nVidia’s Kepler suffers a fate similar to Fermi’s, though, I think nVidia will have shown the world a serious problem in their design process, since it’ll have happened the last two times they’ve made major changes to their architecture. Meanwhile, AMD is supposed to be making massive changes to its own video cards. If AMD pulls it all off without a hitch, I’ll be pleasantly surprised. After all, this is the same company that made the Phenom, the 2600, and the Rage MAXX.
I’m not sure I follow your logic. Are you saying that people will want to build whole new systems instead of updating components as better parts become available? I’d agree with that to some extent. There is some value to doing all of one’s upgrading at once. But the opposite is also true: if you upgrade one component at a time (as they become available), then there’s much less at any one time to debug, and problems with the system will more strongly point to one likely culprit, which can greatly speed and simplify debugging.
So true; the sad part is that the fanboys haven’t gotten the memo. The primary concern for any entrepreneur and corporation is to make a profit for their shareholders.
How about making an AMD version? (Eyefinity, OpenCL, persistent issues with certain drivers/GPUs [HD 5xxx]) 😉
Considering the green team has their own version of Eyefinity, and it’s not as well implemented (you need two cards in SLI to run more than two monitors, yet you can run at least four off a single AMD card), I don’t think making fun of Eyefinity would work, lol. I’m not too familiar with the issues behind OpenCL or the bad drivers, though, so I dunno if that’d make a funny comic or not.
Bad drivers and other quirks are sufficient.
It can go with this pitch: some things change, yet some things remain the same. The comic compares driver quality from 2000 to the present day. The common theme between the two periods is raging customers and BSODs 😉
Very small correction…you can run 3 monitors provided at least 1 is native DisplayPort; otherwise you’re into DP-to-DVI active adapter (~$30) territory. And up to 6 displays off one monster card.
Sunburst, or Sunspot, or whatever its code name was, was shaky on the 5xxx cards, but my experience has been quite solid with my 6xxx card.
About the cartoon…worthwhile effort, it was cute. Humor is difficult.
With an Eyefinity card and DP monitors, you don’t need any adapters.
AMD has put some effort into Bullet physics, which does rigid-body stuff well, and I believe they bought a company that did fluid/cloth-type physics to integrate into Bullet, but I forget the details.
The problem is, Nvidia likes to give game companies a bunch of money to use only PhysX in their games.
Maybe you don’t understand what it takes to make a profit. It’s called being successful and competing on the merits of the product. When you do underhanded crap, it’s a loss of value for the shareholders.
You don’t get the punchline. 😉
I would imagine some of us fanbois are also investors.
They aren’t fanboys. They are “shills”, there’s a difference. 😉
The comic corner needs its own section. BTW, ^ is funny.
I second that lol
I second the “needs its own section” but not the rest of that; this is unfunny, and I think the rest of the ones I’ve seen have likewise made me wish I had back the few seconds I spent reading them. I see in the RSS feed that something new has shown up at TR, but then find it’s only an unfunny comic. Give it its own section, give it its own RSS feed, etc. Just get it so those who think this stuff is funny can have their own corner to themselves, and the rest of us can get our news minus the groans.
All Rights Reserved. Copyright Tech Report.