Personal computing discussed
Bashiba wrote:Phenom 9850 Quad Core OC'ed to 2.8

Probably the source of your current problem.
Bashiba wrote:So I'm leaning towards a 3570K

Good!
Bashiba wrote:probably 8 gb's ram

Bad. If you're buying a Core i5 and a big MicroATX or ATX board with 4 slots, then buy 16GB, because A) it's still pretty cheap and B) if you buy 4GB sticks then you'll be limited at 16GB without replacing sticks, which leaves you with 4GB sticks laying around.
Bashiba wrote:Here is my current rig
MSI K9A2 Platinum Board
8gbs Ram
Phenom 9850 Quad Core OC'ed to 2.8
Radeon 7770
Twin 640GB Western Digital Black hard drives
along with a 1.5 Terabyte Green Drive
Antec P-180B Case
Corsair 520 Watt Modular PS
auxy wrote:THAT SAID, a 7770 is pretty underpowered for a Core i5 system. You'll be GPU-limited in everything, especially at 1080p. I wouldn't bother with a Core i5 if you're only going to have that GPU; a Core i3-3220 will serve you fine and still be miles faster than your current Phenom -- even in heavily threaded apps. It's not overclockable, but you should examine whether you really need to OC anyway -- and how many hours have you spent fiddling with a dodgy OC and then being annoyed when a game crashes right in the middle? Sure, I still do some OCing myself, but based on what you said ("a bit"), I'm probably "a bit" more hardcore with it than you.
flip-mode wrote:auxy wrote:THAT SAID, a 7770 is pretty underpowered for a Core i5 system. (....) I'm probably "a bit" more hardcore with it than you.
I agree with what you said about OCing, which I think isn't often worth it anymore. But my typical advice is always to get the fastest CPU you can afford. This is a staggered upgrade; Bashiba already has the 7770. If he/she is just doing the CPU purchase now, get the fastest CPU that can be afforded, and when it comes time for a GPU upgrade a year from now the CPU will still be totally up to the task. Additionally, CPUs are useful for much more than gaming. Bashiba is still riding a Phenom 9850 -- that's riding the same CPU for about 4-5 years now. Someone who rides a CPU that long should definitely get the fastest CPU they can. And again, I think the 8 threads of the E3-1230 V2 are potentially a big benefit here as games and other apps take more and more advantage of multiple threads.
flip-mode wrote:The 3570K is fantastic, but consider skipping it and getting the Xeon E3 1230 v2 for nearly the same price.

I don't ... really ... agree with this. The Xeon E3 is pretty great if you want HT, but HT isn't all that useful if you aren't doing some really seriously multithreaded workloads, and you lose out on the iGPU; to get one in a Xeon you end up paying Core i7 prices (1245v2, since there is no 1235v2). A lot of folks probably think, "Ah, who cares; he has a discrete GPU!", but this is naive thinking -- Intel's QuickSync is the absolute fastest video transcoding tool around, and you can always use the iGPU for extra displays. I use mine to hook up two monitors, leaving the third on USB and only my single gaming display hooked up to my GPU.
flip-mode wrote:The 3570K is still great, but I've noticed that overclocking isn't what it used to be. These days it doesn't make my computing experience any better; all it does is boost benchmark scores.

I do mostly agree with this. I noticed a marked difference in a few games going from 3.4GHz to 4.4GHz, and my boot time got a lot shorter, but as far as overall system usage goes, it really didn't make that much difference. Even with a Vertex 4 for my system drive, I'm still I/O-limited before I'm CPU-limited in a lot of things.
flip-mode wrote:I agree with auxy that 16 GB is a good idea. Use 8 GB DIMMs so you leave DIMM slots open for the future.

Thanks for the endorsement! Shame this board doesn't have a karma system.
flip-mode wrote:Also, I've had good experiences with ASRock motherboards in three builds I've done over the last two months.

Me too! I've built machines with the Z77 Extreme6 and the Z77 Pro4-M lately and they've both been rock solid. I've just ordered a Z77E-ITX for myself!
flip-mode wrote:Those are all mATX boards. ATX boards are unnecessary.

Agreed! I run SLI on -- well, I used to run SLI on MicroATX!
flip-mode wrote:I think the 8 threads of the E3-1230 V2 is potentially a big benefit here as games and other apps take more and more advantage of multiple threads.

I ... don't. Simply speaking, "lol". Even still, it is a minority (a large and growing minority, but still...) of games that will make full use of a quad-core CPU; you can see this in benchmarks where the 3570K matches or nearly matches the 3770K (and remember, the latter has more cache and a higher clock). Even the mighty 3960X with twelve threads usually bends its knee to the 3570K in game or single-threaded benchmarks. I do realize what you're saying about holding onto the CPU for 5 years, and I recognize that in 5 years the market may have changed; still, Intel isn't planning to mainstream hex-core or 8-core-plus systems with Haswell or Broadwell, and the 3570K is quite a lot faster than the 8-core Jaguar processors (which have lower IPC and a paltry 1.6GHz clock) going into the next-generation consoles. I think it's very safe to say that the extra threads will be of no use for gaming in the next 5 years.
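Since the thread keeps circling back to how much four versus eight threads actually buys you, here's a quick sketch of the scaling argument using Amdahl's law. This is a textbook model, not a benchmark of any CPU named above, and the 60% parallel fraction is purely an illustrative assumption:

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Amdahl's law: best-case speedup when only part of the
    workload can be spread across threads."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

# A hypothetical game where 60% of the frame time parallelizes:
# going from 4 to 8 threads adds only ~0.3x on top of ~1.8x.
for n in (1, 2, 4, 8):
    print(f"{n} threads: {amdahl_speedup(0.6, n):.2f}x")
```

The point of the model is that unless the parallel fraction is very high, doubling the thread count past four yields rapidly diminishing returns, which matches the benchmark behavior described above.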
Bashiba wrote:I'm more worried about the CPU handling Photoshop and some other niche software well and being able to multitask; I wouldn't even consider the dual core recommendation as I tend to be running a lot of things at once.

Photoshop just isn't demanding anymore. Processors are so fast now (starting with Nehalem, really, but especially the "Bridge" chips) that it annoys me a bit to continually hear it dragged out as a demanding task. The i3 would handle that just fine. I sell a lot of Trinity-based systems, and they handle "heavy" desktop loads like Photoshop and Premiere while multi-tasking just fine -- and the Core i3s are quite a lot faster than Trinity.
Bashiba wrote:The Xeon option looks pretty good. I'm also considering the 3470 just because it seems like pretty good bang for the buck, especially if overclocking the 3570K isn't producing much in terms of real-world performance increases.

The 3470 is not a terrible idea, but if you plan on messing with video at all, consider that the 3570K does have the fastest Intel HD 4000, which improves performance using QuickSync.
Bashiba wrote:I also haven't ruled out the AMD 8350 either; I have always had good luck with AMD chips and the 8 cores would probably serve me well.

Not if you want to play games! Or, really, do anything that requires good single-threaded or good floating-point performance. The FX chips aren't terrible, but given that you're apparently not averse to spending ~$240 on a CPU, I wouldn't even consider them. They're a joke compared to that Xeon, and even the Core i5s can utterly destroy them in most tasks.
Chrispy_ wrote:I would be tempted by the E3-1230 V2; otherwise stick with the i5. Like others have said, overclocking a CPU isn't anywhere near as useful as it used to be. I apparently have a 4.5GHz 2500K in this box, but I can't honestly say it's much benefit over the stock clock, which would turbo up to 3.7 in my 2500K's case.
Firestarter wrote:I disagree; it depends 100% on the kind of application or game that you run. That 3.7 figure is only for one core; for 4 cores it's limited to 3.5 by default. If we ever end up getting games that actually fully utilize 4 cores and still benefit from high clocks, that overclock from 3.5 to 4.5 is going to make quite a difference.
auxy wrote:I don't ... really ... agree with this. (....) Intel's QuickSync is the absolute fastest video transcoding tool around, and you can always use the iGPU for extra displays.
flip-mode wrote:The 3570K is fantastic, but consider skipping it and getting the Xeon E3 1230 v2 for nearly the same price.
http://www.newegg.com/Product/Product.a ... 6819117286
It works with socket 1155 motherboards. It is not an unlocked chip, but it is hyper-threaded, so you get 8 threads.
The 3570K is still great, but I've noticed that overclocking isn't what it used to be. These days it doesn't make my computing experience any better; all it does is boost benchmark scores. I have 3570K builds that are running overclocked and others that are running stock, and they feel exactly the same. I've stopped buying the 3570K and have been buying the regular 3570 lately, but now that I've found out about the E3-1230 V2 I won't consider anything else. If you do multithreaded stuff, the E3-1230 V2 will hold a definite advantage over the 3570K. If not, then it's a toss-up.
*Since Xeon CPU is not desktop CPU model, some feature may not able to work on this combination. For detail, refer to support.asus.com

Do you know what is missing?
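For anyone wanting to verify the "4 cores, 8 threads" claim on a running Linux box, here's a small sketch that parses /proc/cpuinfo-style text and separates logical CPUs (what hyper-threading exposes) from physical cores. The sample string is synthetic, standing in for what a hyper-threaded quad-core like the E3-1230 V2 would report:

```python
def count_cpus(cpuinfo_text):
    """Return (logical_cpus, physical_cores) from /proc/cpuinfo-style
    text: logical = number of 'processor' entries, physical = unique
    (physical id, core id) pairs."""
    logical, cores, phys = 0, set(), None
    for line in cpuinfo_text.splitlines():
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key == "processor":
            logical += 1
        elif key == "physical id":
            phys = value
        elif key == "core id":
            cores.add((phys, value))
    return logical, len(cores)

# Synthetic sample: one socket, 4 cores, 2 threads per core.
sample = "\n".join(
    f"processor : {i}\nphysical id : 0\ncore id : {i % 4}"
    for i in range(8)
)
print(count_cpus(sample))  # (8, 4) -> hyper-threaded quad-core
```

On a real system you'd feed it `open("/proc/cpuinfo").read()`; the same (physical id, core id) pair showing up twice is exactly what hyper-threading looks like to the OS.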
flip-mode wrote:No, Fox, I'm not sure. I'm going to look, though, as I'm probably ordering one for a workstation build for work tomorrow -- probably ASRock-based instead of Asus.
Flying Fox wrote:But a lot of consumer motherboards can enable VT-d anyways?

VT-d is a standard feature on the Core-i series. It's only the K-series that lack it, in a bit of shameless artificial market segmentation from Intel...
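On Linux, the CPU side of virtualization support (VT-x, exposed as the "vmx" flag) is easy to check by parsing /proc/cpuinfo; note that VT-d itself is a chipset/firmware feature reported via the ACPI DMAR tables in the kernel log, not as a cpuinfo flag. A minimal sketch, using a synthetic flags line rather than real hardware output:

```python
def has_cpu_flag(cpuinfo_text, flag):
    """True if `flag` appears in the 'flags' line of
    /proc/cpuinfo-style text (e.g. 'vmx' for Intel VT-x)."""
    for line in cpuinfo_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "flags":
            return flag in value.split()
    return False

# Synthetic flags line; a real one comes from open("/proc/cpuinfo").
sample = "flags : fpu vme de pse msr vmx sse2 ht"
print(has_cpu_flag(sample, "vmx"))
```

Splitting on whitespace (rather than substring matching) matters here, so that e.g. searching for "sse" doesn't falsely match "sse2".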
flip-mode wrote:Edit: I'm growing more and more certain that it's ECC memory. I think to get ECC memory support you need a C-series chipset....
Edit2: As it relates to the subject of this thread, I'm doubting Bashiba cares about ECC...

Yes, that's correct. I suspect you are also correct about whether Bashiba needs or wants ECC.
flip-mode wrote:I have two monitors connected to my video card :shrug: What's your setup?

Currently I have two ASUS VS229H-P (22" 1080p eIPS) wall-mounted, a 32" Toshiba HDTV (1080p) sitting nearby, and a 7" MIMO Monitors 720-F USB-attached and USB-powered 480p touchscreen that usually sits on my lap. The HDTV and one of the VS229s are hooked up to my iGPU (HD 4000), and the other VS229H-P is hooked up to my GTX 460.
auxy wrote:(....) The HDTV and one of the VS229s are hooked up to my iGPU (HD 4000) and the other VS229H-P is hooked up to my GTX460.
Displaying 2D takes an almost negligible amount of GPU power, but the GPU does still have to store that framebuffer in local memory, and I see no reason to use the quite limited 1GB on my GPU for my various desktops when I can use my 32GB system memory (by way of my iGPU) for that.
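To put rough numbers on the framebuffer point, here's a back-of-envelope sketch assuming uncompressed 32-bit color; real compositors allocate more than one surface per display, so treat this as a lower bound:

```python
def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Size of one uncompressed framebuffer in MiB, assuming
    32-bit (4 bytes/pixel) color by default."""
    return width * height * bytes_per_pixel / (1024 * 1024)

# Two extra 1080p desktops kept off a 1GB card:
print(round(2 * framebuffer_mib(1920, 1080), 1))  # ~15.8 MiB
```

So the raw framebuffers for two 1080p desktops are small in absolute terms, but they do otherwise come out of the card's limited local memory, which is the argument for hanging the extra displays off the iGPU.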
flip-mode wrote:I'm going to go out on a limb and assume Bashiba isn't going to be doing all that and probably will do fine with just the video outputs from the discrete video card. If Bashiba is doing a bunch of encoding, then the IGP point is valid. That's a question for Bashiba.

Hehe. You're probably right; I just hate having inactive/useless ports on my motherboard. (✿◠‿◠)
Bashiba wrote:At this point I guess I'm still leaning towards the 3570K; it seems like it's pretty hard to beat for the cash. But I'm still thinking it over and listening to any other ideas.

It's pretty much down to the 3570K vs. the Xeon E3-1230 v2. (The "v2" is important! The Xeon E3-1230 is Sandy Bridge, whereas the "v2" designates the Ivy Bridge version.) If you don't need the integrated graphics and you don't *need* to overclock, then the Xeon is fairly clearly the better choice; it'll generally be more stable (due to more aggressive QC on the Xeons and an inability to overclock meaningfully, removing the temptation), run cooler, and it has hyper-threading and more cache for handling heavier and "wider" loads better.
flip-mode wrote:ATX boards are unnecessary.