I presume there's a logical answer to this question (and I have my own theories) but: Why is the fin density on water cooling rads so much greater than on "air" coolers?
I envision air/tower coolers' fin density as being pretty well optimized: enough fins to maximize surface area without getting so dense that even a high static pressure fan can't push air through them. Water cooling rads, on the other hand, all seem to have their fins packed so densely that even a high static pressure fan gets roughly half its airflow bounced back. Both types of coolers operate on the same premise: use a fan to blow air across some fins. So surely there's an optimal fin density regardless of how you get the heat into the fins in the first place... right? If so, what's the reason for the existing density difference?
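To make the tradeoff I'm imagining a bit more concrete, here's a toy calculation (all numbers are made up, just to illustrate the idea that for a fixed fan there should be a sweet spot: more fins add surface area but choke airflow):

```python
# Toy model (made-up numbers, not real radiator data): relative heat rejection
# vs. fin density for a fixed fan. Airflow falls as fins get denser (more
# pressure drop), surface area rises, so their product peaks in the middle.

def airflow_cfm(fins_per_inch, max_cfm=60.0, choke_fpi=40.0):
    # Crude assumption: airflow drops linearly to zero as fin density
    # approaches the point where the fan can't push through at all.
    return max(0.0, max_cfm * (1.0 - fins_per_inch / choke_fpi))

def surface_area(fins_per_inch, area_per_fin=1.0):
    # Surface area grows roughly linearly with the number of fins.
    return fins_per_inch * area_per_fin

def relative_heat_transfer(fins_per_inch):
    # Heat rejection scales (very roughly) with area times airflow.
    return surface_area(fins_per_inch) * airflow_cfm(fins_per_inch)

if __name__ == "__main__":
    for fpi in range(5, 40, 5):
        print(f"{fpi:2d} FPI -> relative heat transfer {relative_heat_transfer(fpi):6.1f}")
```

With these made-up numbers the curve peaks around 20 FPI, which is roughly where the question comes from: if there's one sweet spot for a given fan, why do rads sit so far past where tower coolers do?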
Heatsink fin density on graphics cards tends to vary more. I presume that's a direct result of each manufacturer's choice of fan and that fan's static pressure properties. In that case, it would seem that [at least some] GPU manufacturers are actually designing their heatsinks around the specific fan that will [more often than not] be forever strapped to them.
Main: i5-3570K, ASRock Z77 Pro4-M, MSI RX480 8G, 500GB Crucial BX100, 2 TB Samsung EcoGreen F4, 16GB 1600MHz G.Skill @1.25V, EVGA 550-G2, Silverstone PS07B
HTPC: A8-5600K, MSI FM2-A75IA-E53, 4TB Seagate SSHD, 8GB 1866MHz G.Skill, Crosley D-25 Case Mod