The rumor is that the NVIDIA GeForce GTX 580 will arrive on November 9th to compete with AMD’s Cayman series of GPUs, which will become the Radeon HD 6900 series. Interestingly enough, this means the GTX 580 will launch before the 6900 series. There is no telling, however, whether this will be a soft launch or a hard launch with product actually in stores. The NVIDIA GeForce GTX 580 will have all 512 streaming processors enabled, 128 texture mapping units (TMUs), 2 GB of GDDR5 memory, the GF110 chip, and a thermal design power (TDP) of 244 Watts.
You can look at this card as Fermi done right and we can also confirm that the card is scheduled to launch before AMD’s Cayman-based Radeon HD 6950 and HD 6970 parts.
The fun in the graphics world is just about to start, as there are still a few more interesting cards to launch before the end of 2010.
via Geforce GTX580 launch on 9th Nov, sampling now.
Update 11/5/2010 9am: Here’s the ‘highly contested’ graph, from This Site.
And their description:
The benchmarks show an average improvement of ~17% for the GTX 580 over the GTX 480. The maximum increase is in 3D Mark Vantage – ~35%, while the minimum increase is in Resident Evil 5 ~5%. Against the Radeon HD 5870, the GTX 580 wins comfortably by an average of ~45%. The gaps are massive (~2x) in tessellation oriented benchmarks, building on GTX 480’s strengths, while the DX10 benchmarks narrow the gap considerably. Unfortunately, according to these benchmarks, the GTX 580 will end up slower than AMD’s previous-gen Radeon HD 5970, on average, let alone AMD’s upcoming flagship – Antilles / Radeon HD 6990. GTX 580 against Cayman / Radeon HD 6950/70 is the real battle here. Naturally, we would advise you to take any such leaks with a grain of salt.
What do you think?
Dear Randall,
My company is involved in marine simulation (both for training and research purposes). We will be upgrading to Presagis Vega Prime, and we are currently debating between a GTX 480/580 and a Quadro 5000 for the image generator PCs in our simulators. Our visual database development workstations run on the Quadro FX 4800.
I am curious to hear your advice!
Kind regards,
Martijn
@ Jeff
Ratios are blown out of proportion? That’s EXACTLY what we’re getting at. If something is 25% better, it’s 25% better. The FPS metric doesn’t matter because the comparison is meaningless to anyone not using that exact software on that exact configuration.
Let’s say I’m running the same application, but at 1024×768, while the benchmarks were run at 1920×1080. The FPS comparison becomes meaningless, because I expect to get much higher frame rates on my configuration. But the ratio between two cards, say 25%, is meaningful, because I can use it to predict performance on a different configuration.
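The ratio-based prediction described above amounts to a single multiplication. Here is a minimal sketch; all of the numbers are hypothetical, chosen only to illustrate the point:

```python
# If card B is 25% faster than card A in a published benchmark,
# that ratio is (roughly) what transfers to your own configuration,
# even when your resolution and absolute FPS differ.
# All numbers here are hypothetical.
benchmark_ratio = 1.25        # card B vs card A, benchmarked at 1920x1080
my_fps_on_card_a = 90.0       # measured on my own setup at 1024x768

# The ratio, not the benchmark's absolute FPS, is what carries over.
predicted_fps_on_card_b = my_fps_on_card_a * benchmark_ratio
print(predicted_fps_on_card_b)  # 112.5
```

The benchmark’s absolute FPS at 1920×1080 never enters the calculation, which is exactly why the ratio is the useful number.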
It’s worse with different software. How does FPS on Batman help me buy a video card to run visualization software? It’s a number I can’t easily translate. Most of the software used in visualization isn’t highly optimized for a specific video card, and it doesn’t try to hit 120+fps. You might be dealing with a render that runs at 2 or 3 Hz. So knowing this, you want to see the ratio, because you can easily see how much of a performance benefit you will see in your application.
Regarding the accuracy of the benchmarks, it’s only fair to comment on the ones published by this site. Throwing up a forum link doesn’t make your argument correct. Who is to say that THOSE numbers aren’t inaccurate? What if I posted a link to someone saying that the GTX 480 was a bazillion times faster than a 5870? What’s the point?
You misinterpreted a graph, then proceeded to spew forth a series of nonsensical rants with such gems as “your an idiot”.
On a site called vizworld.com, expect the readership to know a thing or two about graphical displays of quantitative information. If you want to discuss such topics, go ahead.
Looks like this discussion between Jeff and Chad is a rare gem. It showed a rarely seen second wind in Jeff, who initially looked like a simple troll but was later revealed to be quite intelligent.
The internet continues to amaze 🙂 that’s all I can say… and I’m just Mr Anonymous passing through this site via google – my search was simply “amd” and clicked on “News”. Such is life on the net. Cheers guys.
Ratios are often blown out of proportion too. If the 5870 runs a game at 15 FPS and the GTX 480 runs the same game at 20 FPS, then the graph is going to say there is a ~33% improvement over the Radeon, when really it’s a measly 5 FPS you won’t even notice.
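The arithmetic above is easy to check: 15 → 20 FPS is a ~33% relative gain, even though the absolute difference is only 5 FPS. A minimal sketch, using the numbers from the comment:

```python
# The two (hypothetical) measurements from the example above.
radeon_fps, geforce_fps = 15.0, 20.0

absolute_gain = geforce_fps - radeon_fps               # the "measly" 5 FPS
percent_gain = (geforce_fps / radeon_fps - 1.0) * 100  # the ratio the graph reports

print(f"absolute: {absolute_gain} FPS, relative: {percent_gain:.1f}%")
```

Both numbers describe the same measurement; which one matters depends on whether you care about playability on this exact setup (absolute FPS) or relative card performance (the ratio).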
The 5870 is faster than 480 in BFBC2, in Metro 2033, in RE5, in Batman, and there are more.
Here is a link you can see for yourself, because I know you won’t believe me if I don’t give you a source. And after you read the source you still won’t believe it, because you’re the type of person who can never be wrong.
http://forums.anandtech.com/showthread.php?t=2062934
I already said this before in one of my earlier posts, but I will tell you again since you are too lazy to read. No crap the graph is normalized to one card, and who knows if this graph is accurate, because you don’t know what values the creator used, nor do I. No one really knows what values were used to calculate the ratios. Again, the Y-axis should be FPS: actual values you can look at and relate to. Your games don’t return normalized information, and benchmark programs don’t return normalized information, because it is hard to relate to and not as appealing. People want to see the ACTUAL value or performance when comparing. Another thing is that the 5870 is normalized to an AVERAGE figure. Please note I said AVERAGE. Now everyone with half a brain knows that the GTX 480 is not faster than the 5870 in all the games listed in the graph. This is false and inaccurate data because of the method the creator chose to construct the graph. If you do not understand this then your an idiot.
The Y-Axis should be FPS. End of Discussion.
@ Jeff
What unit would you use for the Y axis that would yield a meaningful comparison based on the X axis? How does one create a common metric across varied software? Since we are comparing not the games themselves but the cards, it is pointless to base the Y axis on anything other than the cards. Hence the graph is normalized to one card. It very clearly shows the relative performance advantage of the NVIDIA cards, which is the entire point of the comparison. It’s designed to reduce confusion, which I think it does for 2/3 of the people we’ve sampled here.
One more thing I want to add, because I know CHAD will be back for round 2; I can just tell that is how he is. I should be more specific about why these graphs are bad: since it’s a ratio, no one can tell from the graph the “ACTUAL” performance values of each card. Who knows what numbers the person used to compile this graph? This could be totally BOGUS information. There is only one person who knows, and that is the creator. That is why they should have just shown the “ACTUAL” performance values for each card and each game.
I understand that completely; it is just the way Chad was an asshole that pissed me off. Maybe I shouldn’t have called the person who created this graph an idiot; I should have said that whoever compiled this graph used a horrible technique. That is not the best way to show data and compare video cards. There are many ways that are better and less ambiguous. I should have been more specific for people like CHAD. What I meant when I said bogus was that the graph is confusing for a lot of people who do not know much about this stuff. If they were to look at that graph they would think the 5870 sucks compared to the 480, which simply is not true.
CHAD says “which is the only reasonable comparison to make considering the x-axis”, and this doesn’t even make any sense. The person who compiled the graph could easily have put Vantage scores, or another benchmark score, for each game and left the x-axis as is.
@Chad
haha you are so dumb dude.
So explain to me, idiot, why the 5870 was equal to 1 in each game that was tested????? Just explain to me how that is the best way to present information like that????? People who don’t know anything, like you, will think the 5870 is just 1 in every game but the NVIDIA cards are better in every single game. Everyone knows the 480 does not beat the 5870 in all games, so I am waiting for you to tell me how that is a reasonable graph that doesn’t show any discrepancy. I can tell you are an idiot who fucks NVIDIA cards in the ass daily.
It’s a pretty common graphing technique. They chose the Radeon as their baseline and normalized everything else to it. Where the NVIDIA card scores >1, it did better; where it scores <1, the Radeon did better.
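The baseline-normalization technique described above can be sketched in a few lines. All of the FPS numbers below are invented purely for illustration, not taken from the contested graph:

```python
# Normalize each card's score to a baseline card (here, playing the role of
# the Radeon HD 5870, which therefore plots as 1.0 in every game).
# All FPS values are hypothetical.
baseline_fps  = {"Batman: AA": 60.0, "Metro 2033": 30.0, "3DMark Vantage": 45.0}
contender_fps = {"Batman: AA": 54.0, "Metro 2033": 36.0, "3DMark Vantage": 58.5}

# Ratio > 1: contender faster; ratio < 1: baseline faster.
normalized = {game: contender_fps[game] / baseline_fps[game]
              for game in baseline_fps}

for game, ratio in sorted(normalized.items()):
    winner = "contender" if ratio > 1 else "baseline"
    print(f"{game}: {ratio:.2f} ({winner} ahead)")
```

This also shows why the baseline appears as a flat line at 1 across every game: it is divided by itself, which is a property of the presentation, not a claim that the card performs identically everywhere.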
@ Jeff
Sigh.
1 = 5870. It’s a chart comparing relative performance (which is the only reasonable comparison to make considering the x-axis). There doesn’t need to be a title, since it’s pretty obvious. I doubt the person who put together the chart was an idiot, but I suppose there may be idiots reading it.
That graph comparing video cards is obviously bogus, and the person who made it obviously doesn’t like ATI/AMD cards. The 5870 is exactly the same speed throughout the whole graph for each game, which is bullshit. There isn’t even a title for the y-axis, so who knows what that information means. If anyone takes this graph seriously, you are an idiot, just like the person who compiled the graph.