So I've been running the 5670 since I was on a tight budget when I built my computer this time last year. Needless to say, I've had a hard time maxing anything out, and now I'm having trouble running stuff even at the lowest settings (Sonic Generations lags on Chemical Plant, and Skyrim struggles too).
Well, the AMD 7xxx series is coming out (Nvidia is releasing a new lineup too, for that matter) and I can't make up my mind about what to do. It's basically:
A) Invest in a new card now (Probably the 6870) and regret it in 3-5 months when I could have saved money on the same card or gotten a new one
or
B) Not enjoy certain releases over the next 3-5 months, but be happier in the long run.
B seems like the obvious choice, but it's hard to imagine going that long. If anyone is experienced with GPU releases (this is my first go-around; I was a console gamer for 22 years prior), can you give me some insight? Will it really be worth waiting that long?
Before anyone suggests OCing my 5670: I've already pushed it to the maximum that AMD Overdrive will allow. It helps a lot, but it just doesn't cut it with this fall's releases. I've got a 750 watt PSU, so power isn't an issue (not looking to CrossFire anything, either).
Help! xD
That's an endless loop. It's not like they'll release a new card and announce, "this is it, the last card we will ever make." There will always be new cards announced one right after another.
The HD 6870 is a good card, and the GF 560 Ti is as well. They should be quite enough unless you're hell-bent on combining huge resolutions with multisampling. It's cheaper to get a decent card every two years or so than to invest a lot in one "dream card" that will serve you just a few months longer before the next game with killer graphics comes out. If anything, more powerful cards are often louder, larger, and need more watts to run.
I wonder if there's a mathematically stable point (an equilibrium) between two curves: the inconvenience caused by a gadget's age, and the inconvenience of knowing how much cheaper you could buy the same gadget new now than when you bought it.
Someone should study it.
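Here's a toy sketch of what such a study could look like. Both curves are completely made up (a quadratic "age pain" and an exponential price decay; every number is invented purely to illustrate the idea):

```python
# Inconvenience of the gadget's age: assume it grows quadratically
# with the months since purchase (arbitrary units).
def age_pain(months):
    return (months / 12) ** 2

# Inconvenience of knowing how much cheaper it got: assume prices
# decay ~30% per year, and regret is the fraction of the price lost.
def price_regret(months, yearly_drop=0.30):
    return 1 - (1 - yearly_drop) ** (months / 12)

# The "equilibrium": the first month where keeping the old gadget
# hurts more than the regret of having bought it does.
month = next(m for m in range(1, 121) if age_pain(m) > price_regret(m))
print(month)  # 5
```

With these invented curves the crossover lands absurdly early, which mostly shows that the interesting part of the study would be measuring the curves, not intersecting them.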
You know what's funny? Back in 2000, when I got my first computer, assembling a good, balanced gaming system cost around $1000 (sans the software). Hardware has changed, games have changed, but assembling a good, balanced gaming system still costs around $1000.
If there's equilibrium anywhere, there it is.
OTOH, the cost curve rises steeply, almost exponentially, toward the top end. In other words, if you are willing to compromise just slightly and not get the absolute top-of-the-line hardware, you can easily get an almost-top-of-the-line computer for a fraction of the price.
For example, if you buy a graphics card that is half as fast as the current best (but still way faster than the best from 5 years ago), it may well cost one tenth of the price of the best one.
Of course, in order to make such compromises one has to resist the urge to think, "it's only half as fast as the best one? That's nothing! It will barely be able to run any modern game!" In reality, even the slightly older, but quite cheap, cards can run modern games pretty decently. They might become obsolete faster, but OTOH you only invested one tenth of the money, so you can buy a better one for that same price in a year or two, and you are still way ahead in the benefit/cost ratio (compared to if you had bought the best card right away).
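To put numbers on that benefit/cost argument (the prices and benchmark scores here are invented, just mirroring the "half as fast at one tenth the price" example):

```python
best = {"price": 600, "perf": 100}   # hypothetical flagship
half = {"price": 60, "perf": 50}     # half the speed at a tenth of the price

# Performance per dollar for each (benchmark points per $):
print(best["perf"] / best["price"])  # ~0.167
print(half["perf"] / half["price"])  # ~0.833

# The cheap card is ahead by a factor of five in benefit/cost:
ratio = (half["perf"] * best["price"]) / (half["price"] * best["perf"])
print(ratio)  # 5.0
```

The exact factor obviously depends on where the price curve sits in a given season, but the shape of the argument stays the same.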
Sonic Generations lags on Chemical Plant with a 5870, so think nothing of it.
Realistically, hold off on upgrading until the 7xxx series if you can help it; you're more or less wasting money upgrading now.
Also, consider running games in windowed mode (for Sonic Generations you need an external program to window it, but it works fairly well) rather than trying fullscreen with maxed or high settings, as you'll get better performance.
Because if you use fullscreen but not at the maximum resolution, everything will look goofy. Try 640x480 fullscreen on a 16:9 screen that's supposed to run 1920x1080; it has nothing to do with physics, it will just look bad, stretched, and ugly.
[EDIT BY DARKKOBOLD]totally inappropriate comment removed.[EDIT]
For instance, on my laptop Portal 2 does not allow 640x480 fullscreen; it does allow it in windowed mode, though.
EDIT: For the record, I don't see any derailing whatsoever.
Oh, that's true, but I'm not talking top of the line, far from it. I'm only counting the best-bang-for-the-buck hardware that can run any modern game without giving you eye cancer. That's just how the two industries balance each other, which shouldn't come off as surprising in any way.
For instance, an average Sandy Bridge-based system here would cost:
a motherboard — ≈75$;
i3-2140 w/fan — ≈160$;
two 2 GB sticks of DDR3-1333 — ≈25$;
Radeon HD 6850 — ≈150$;
a couple HDDs (one for system and temp files, one for games and stuff) — ≈160$;
an 80+ standard 500+ watt PSU — ≈70$;
a DVDRW drive — ≈20$;
a roomy thick-walled tower case — ≈60$;
a set of minimally-comfortable peripherals — ≈30$;
a set of non-shitty headphones (because it's cheaper to get acceptable sound out of headphones) — ≈20$;
a TN-based 20"+ monitor — ≈150$.
Total: 920$.
That's a reasonably cheap rig made of above-average components that are expected not to fail before the next upgrade. It's not terribly comfortable, pretty, or silent, and the visual and sound quality are average at best, but it gets the job done.
Edit: Forgot an optical drive.
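As a quick sanity check, the list does add up (same approximate prices as above):

```python
# Price list from the post (approximate US$):
parts = {
    "motherboard": 75,
    "i3-2140 w/fan": 160,
    "2x 2 GB DDR3-1333": 25,
    "Radeon HD 6850": 150,
    "two HDDs": 160,
    "80+ 500W PSU": 70,
    "DVDRW drive": 20,
    "tower case": 60,
    "peripherals": 30,
    "headphones": 20,
    '20"+ TN monitor': 150,
}
total = sum(parts.values())
print(total)  # 920
```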
Looking at the Steam store page for Sonic Generations, I see they recommend a GTX 460 or a 5850 (1 GB), so it sounds relatively normal that it would run slowly on a 5670. You exceed the minimum requirements, but that means you're supposed to run it at minimum settings, like Atma pointed out.
According to several benchmarks that popped up, Skyrim should run close to 60 fps at medium details at 1680x1050. What other titles are you experiencing trouble in?
Your current GPU is stronger than those found inside the Xbox 360 or PS3, so any cross-platform title is going to work just fine on your GPU if you configure the detail levels similar to the console version. There shouldn't be any real need to upgrade until the next-gen consoles come along, unless you like playing PC-only interactive benchmarks like Crysis. (Though funnily enough, Crysis and even Crysis 2 will work on your GPU if configured accordingly.)
On the other hand, you could also get a new GPU now, to enjoy some higher resolutions - and then upgrade again in 1-2 years when next-gen arrives. Whether or not you skip this step entirely depends on your budget and your willingness to spend 100-200 bucks on a bit of eye-candy (and ~30 bucks over two years on the larger power consumption of highend GPUs).
Since at any given time, there's always a shiny new version right at the horizon, I've found it to be useful to set certain requirements for an upgrade, then buy as soon as a product satisfies those. For example, my current CPU is a dual-core, and I'll upgrade as soon as I find "a quad-core with greater per-core performance and less or equal power draw." I may or may not stick to that goal and go with a 125W octa-core at some point, but at least this goal keeps me from obsessing over every new CPU. "Meh, too much power draw, not interested".
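That "requirements first, shopping second" rule is easy to make mechanical. A sketch, with completely invented CPUs and numbers:

```python
# My current CPU and the bar any replacement has to clear
# (quad-core, faster per core, no extra power draw).
current = {"cores": 2, "per_core_perf": 100, "tdp_watts": 95}

def worth_upgrading(candidate):
    return (candidate["cores"] >= 4
            and candidate["per_core_perf"] > current["per_core_perf"]
            and candidate["tdp_watts"] <= current["tdp_watts"])

# Hypothetical candidates: the 125 W octa-core fails the power check,
# so it never even gets considered.
candidates = [
    {"name": "octa-core X", "cores": 8, "per_core_perf": 120, "tdp_watts": 125},
    {"name": "quad-core Y", "cores": 4, "per_core_perf": 130, "tdp_watts": 77},
]
print([c["name"] for c in candidates if worth_upgrading(c)])  # ['quad-core Y']
```

The point isn't the code, it's that a fixed predicate keeps every shiny new release from reopening the decision.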
There's no need to get a new monitor for each upgrade, nor any of the other things I quoted.
An average 100$ 5.1 sound system will last more than 10 years and is well worth the price. In 10 years, you'll have bitten through enough headphone cables and rubbed off enough of the ear-protecting foam that the proper sound system may still be cheaper in the long run.
The assumption seems logical, but, unfortunately, it's completely detached from practice. First of all, most demanding games run at around 30 fps, often dipping to 15-20 (I'm not kidding; this happens on both the X360 and PS3 even in very mundane scenarios). This sucks on a console you can't make faster in any way, but on PC, a platform where you have total control over performance, it's completely unacceptable. Then, a good deal of the demanding games don't run at 1080p natively and are, in fact, upscaled. Weren't you wondering why the performance difference between 640x480 and 1920x1080 spans at least an order of magnitude on a PC, but is barely noticeable on a console?
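Just to quantify the resolution gap being discussed (pixel counts only; real frame times don't scale exactly linearly with pixel count, since fillrate is only one of several costs):

```python
low = 640 * 480       # 307,200 pixels
high = 1920 * 1080    # 2,073,600 pixels
print(high / low)     # 6.75 -- the GPU pushes ~7x the pixels at 1080p
```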
Duh?.. I was talking about building a system from the ground up. My last upgrade, for one, consisted of swapping a GTX 460 in place of a 7600 GS. Instant upgrade!
Using modern consumer-grade hardware for over 4-5 years non-stop isn't advisable, anyway, as it has its own unavoidable wear. This is especially the case with PSUs and capacitor-heavy components, as well as mechanical ones such as HDDs and mice.
5.1 is a waste of space and money. A 2.0 system bought for the same price will sound much better and will have a broader focus. There are very objective reasons for that, as well as the fact that all professional sound equipment is based on stereo.
And for the record, during the last ten years I've gone through nine pairs of headphones, and not a single one of them died at home. :)
Well, heck. That takes me back to wanting to purchase it right now lol. Glad to hear that it'll hold up, but I've gotta hold out and not cave in yet xD. Even if it's only a difference of $20~30 when the 7xxx series comes out, or there's a better card, I don't wanna regret it later.
I know there are people out there who always want "the best", and that's why the 580/6990 are out there. But I'm not rich (or crazy) enough to spend $400~600 for something that'll be obsolete in 8-10 years. For that kind of money, I'd want it to last that long and I know it wouldn't xD.
Well hey, that's really good to know lol. I chalked it up to my GPU being "meh" rather than checking whether anyone else had the same problem. Took a quick look on Google and found quite a few other people experiencing the same thing.
I'm going to try and hold off. I updated my drivers today and got Skyrim to work well enough that it's playable, so I'm not in as much of a hurry now.
Gotta say, I feel like I have to hold my breath with each fall release that I purchase hahaha.
arukAdo wrote:
Because if you use fullscreen but not at the maximum resolution, everything will look goofy. Try 640x480 fullscreen on a 16:9 screen that's supposed to run 1920x1080; it has nothing to do with physics, it will just look bad, stretched, and ugly.
Lowering the resolution didn't help much. The sections that would lag, lagged smoother xD. We had the same idea, but I appreciate your input buddy.
Tub wrote:
Snip
Well, I'm only listing games that lag at the lowest settings. So far it's only been Sonic Generations, Skyrim, and the extremely poor port of Saints Row 2. But two out of those three are releases from this fall.
But hey, I like that idea a lot lol. I think I can wait it out until the next series, but I'll steal your mindset here and use it to keep from obsessing when other parts and cards get released afterward. I still get a little mad when I see a 2 GB SD card for $5 in the drug store (I paid $150 a little after they first came out in 2004/early 2005).
I know there are people out there who always want "the best", and that's why the 580/6990 are out there. But I'm not rich (or crazy) enough to spend $400~600 for something that'll be obsolete in 8-10 years. For that kind of money, I'd want it to last that long and I know it wouldn't xD.
8–10? Try 3–4 if you want to be realistic. I'm currently using a factory-overclocked GTX 460 (the 1 GB model) and a 1600x900 monitor; so far the 460 has been quite sufficient for the games I've played (which, admittedly, aren't numerous). If I were looking for a new card now, I would happily settle on a 560 Ti; I don't use multisampling, it's too inefficient.
I've found that the best time to buy any hardware is after a significant price drop, or at least at some point where the price doesn't considerably fluctuate from month to month. New lineups usually trigger that kind of price drop, so Atma's suggestion is reasonable. Maybe, however unlikely, there will even be something new that performs better than the 6870 at the same market price.
Also make sure to keep an eye on comprehensive tests that calculate per-$ efficiency; obviously the formulas will be different for every site, but generally they won't contradict each other. I use iXBT.com for the reference, but it's in Russian. The last table on the page is sorted by per-$ efficiency, with the rightmost column being the average price, and the one in front of it the performance index (all data from October '11).
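Those per-$ tables boil down to one division and a sort. A minimal mock-up (the cards and numbers below are invented, not taken from iXBT):

```python
cards = [
    {"name": "card A", "perf_index": 100, "price": 350},
    {"name": "card B", "perf_index": 70,  "price": 180},
    {"name": "card C", "perf_index": 85,  "price": 240},
]

# Per-$ efficiency: performance index divided by average price,
# best value first.
ranked = sorted(cards, key=lambda c: c["perf_index"] / c["price"], reverse=True)
print([c["name"] for c in ranked])  # ['card B', 'card C', 'card A']
```

The formulas differ per site mostly in how the performance index itself is weighted, not in this final step.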
Warp wrote:
Edit: I think I understand now: It's my avatar, isn't it? It makes me look angry.
Also make sure to keep an eye on comprehensive tests that calculate per-$ efficiency; obviously the formulas will be different for every site, but generally they won't contradict each other. I use iXBT.com for the reference, but it's in Russian. The last table on the page is sorted by per-$ efficiency, with the rightmost column being the average price, and the one in front of it the performance index (all data from October '11).
Hey, thanks a lot! I'm going to bookmark and check that out now.
Because if you use fullscreen but not at the maximum resolution, everything will look goofy. Try 640x480 fullscreen on a 16:9 screen that's supposed to run 1920x1080; it has nothing to do with physics, it will just look bad, stretched, and ugly.
You're comparing apples to oranges. You would use a low 16:9 resolution if you have a 16:9 monitor and the game supports 16:9, obviously. Also, windowed mode is more likely to result in a drop in performance compared to fullscreen mode, because your video card has to render the game as well as the desktop (and Vista and Windows 7 have all those hardware-accelerated desktop effects when Aero is enabled).
Lastly, you can configure video cards these days to upscale low resolutions instead of your monitor, so you still end up with a somewhat decent, sharp picture. You can even tell it to upscale low resolutions while retaining the aspect ratio (for those whose monitors lack the ability to switch to 4:3 mode, or vice versa).
Out of my own experience, I found the drop in performance when using windowed mode to be completely negligible. I always used it for Team Fortress 2 because it made alt-tabbing a lot easier, even at a point where I had a computer that was less powerful than a bag of porridge.
Because if you use fullscreen but not at the maximum resolution, everything will look goofy. Try 640x480 fullscreen on a 16:9 screen that's supposed to run 1920x1080; it has nothing to do with physics, it will just look bad, stretched, and ugly.
It may well be because I'm still using a CRT, but I don't have too much of a problem going as low as 800x600 fullscreen (I have a 19-inch screen). Small details get pixelated, but with most games that's not such a huge drawback, and the picture looks nice enough. Admittedly, 640x480 starts to be perhaps a bit too low. (OTOH, windowed mode doesn't change the problem of disappearing detail due to low resolution...)
Actually CRTs are awesome for low resolution, because instead of ugly blur (or having the fullscreen image reduced to a small patch in the center, if you choose 1:1 mapping) you get sharp graphics with unintrusive scanlines. Unfortunately, gaming is the only thing CRTs are good for nowadays. :\