Malevolence

AMD & Nvidia Desktop Graphics Cards Thread
The GeForce GTX 590 is really not worth it: it doesn't have 4GB of video memory, and parts of its specifications fall short of the AMD Radeon HD 6990, which still has full precision.

 

At first we thought the AMD Radeon HD 6990 was the world's fastest video card. Where are the benchmark results?


The HD 6990 and GTX 590 are evenly matched, but in many cases the HD 6990 has both higher FPS and higher power consumption.

 

AMD Radeon HD 6790 Launched

AMD today launched its latest graphics card targeting the US $150 sweet-spot, the Radeon HD 6790. The new GPU is intended to make 1080p gaming accessible to even more gamers. It is based on the 40 nm "Barts" silicon (on which HD 6800 series is also based), and features 800 VLIW5 stream processors, 40 TMUs, 16 ROPs, and a 256-bit wide GDDR5 memory interface holding 1 GB of memory. The core is clocked at 840 MHz, and memory at 1050 MHz (4.20 GHz effective).
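For anyone who wants to check the math, the quoted 4.20 GHz effective memory clock and 256-bit bus pin down the card's memory bandwidth. A quick back-of-the-envelope sketch (just arithmetic from the numbers above, not from the article):

```python
# Rough sanity check: HD 6790 memory bandwidth from the quoted clocks.
# GDDR5 is quad-pumped, so 1050 MHz actual -> 4.20 GHz effective.
actual_clock_hz = 1050e6
effective_transfers_per_s = actual_clock_hz * 4
bus_width_bytes = 256 // 8            # 256-bit interface = 32 bytes per transfer
bandwidth_gb_s = effective_transfers_per_s * bus_width_bytes / 1e9
print(bandwidth_gb_s)                 # ~134.4 GB/s
```

That works out to roughly 134 GB/s, the same class of bandwidth as the HD 6800 series the silicon comes from.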

 


 

The AMD Radeon HD 6790 will rarely be available in a reference design; instead, AMD's AIB partners are allowed to come up with their own board designs. The new GPU will be launched by various AIB partners, including Sapphire, HIS, XFX, PowerColor, VisionTek, Club3D, ASUS, Gigabyte, MSI, ColorFire, etc. The target price is US $149.99.

 

http://www.techpowerup.com/143551/AMD-Rade...0-Launched.html

http://www.amd.com/us/products/desktop/gra...overview.aspx#1

NVIDIA Slips in GeForce GT 520 Entry-Level Graphics Card

Even as AMD's Radeon HD 6450 ebbs and flows between paper launch and market launch, NVIDIA is ready with its competitor, launched by its AIC partners: the GeForce GT 520. The new GPU marks NVIDIA's current-generation entry into the very basic low-end discrete graphics card segment, which is intended for integrated-graphics replacement products. The GeForce GT 520 is based on the new 40 nm GF118 silicon; it packs 48 CUDA cores and a 64-bit wide GDDR3 memory interface, while being compact enough to fit on low-profile, single-slot board designs, provided it is backed by an active (fan) heatsink. Passively cooled versions may take up two slots.

 


 

The core is clocked at 810 MHz, and CUDA cores at 1620 MHz. The memory is clocked at 900 MHz (actual, 1.8 GHz effective), churning out memory bandwidth of 14.4 GB/s. The card is designed to have three kinds of outputs which will be available on most partner designs: DVI, D-Sub (usually detachable), and full-size HDMI 1.4a, with HDMI audio. The card relies entirely on slot power. Its maximum power draw is rated at 29W. The GT 520 should take up entry-level price points around the US $50 mark.

 

http://www.techpowerup.com/forums/showthread.php?t=143971

http://www.geforce.com/#/Hardware/GPUs/gef...gt-520/overview

 

 

AMD Unveils Radeon HD 6450 Entry-Level Graphics Card

AMD apparently unveiled its new entry-level GPU, the Radeon HD 6450 today, with some leading sites publishing mini performance reviews (check out Today's Reviews on the front page). The new GPU is intended to be an integrated graphics substitute, which gives desktops all the essential features that today's desktop environments demand, such as DirectX 11 support, Aero acceleration, various kinds of HD video hardware acceleration features, apart from the obvious benefit of discrete GPUs: not taxing the system memory as frame buffer.

 


 

The HD 6450 is based on the 40 nm Caicos GPU; it packs twice the shader compute power of the previous generation, with 160 VLIW5 stream processors, 8 TMUs, 4 ROPs, and a transistor count of 370 million. The 160 stream processors, at a core speed of 625~750 MHz (it differs between AIBs), churn out up to 240 GFLOPS of compute power. It packs a 64-bit wide memory interface that supports GDDR5 (clocks: 3.20 GHz to 3.60 GHz) and GDDR3 (1066 MHz to 1600 MHz). It has an idle board power of 9 W and a max board power of 27 W. Most implementations are low-profile single-slot, some even passive. The reference board features display outputs that include DVI, HDMI 1.4a, and D-Sub. Expect a $50~$60 price point. The official announcement, however, is slated for April 19, or at least that's what we were told.
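The 240 GFLOPS figure lines up with the shader count and clock, assuming the usual convention of counting a multiply-add as two FLOPs per stream processor per clock (a rough sketch, not from the article):

```python
# Rough check of the HD 6450's quoted ~240 GFLOPS peak compute.
# Each VLIW5 stream processor issues a multiply-add (2 FLOPs) per clock.
stream_processors = 160
flops_per_clock = 2
core_clock_hz = 750e6                 # top of the 625-750 MHz range
peak_gflops = stream_processors * flops_per_clock * core_clock_hz / 1e9
print(peak_gflops)                    # 240.0
```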

 

http://www.techpowerup.com/143670/AMD-Unve...phics-Card.html

The HD 6990 and GTX 590 are evenly matched, but in many cases the HD 6990 has both higher FPS and higher power consumption.

 

Believe it or not, the HD 6990 uses power much more efficiently overall. It actually draws about 90 W less than the GTX 590 while often getting higher FPS. I thought I would throw that in there.

 

Believe it or not, the HD 6990 uses power much more efficiently overall. It actually draws about 90 W less than the GTX 590 while often getting higher FPS. I thought I would throw that in there.

 

It's a very close fight between AMD's Radeon HD 6990 and Nvidia's GeForce GTX 590.

In terms of size, Nvidia's dual-GPU solution is shorter than AMD's 12-inch-long card. Nvidia wins.

In terms of fan noise, Nvidia wins.

In terms of overclockability, AMD wins. (The dual-BIOS switch is a bonus!)

Power consumption, both are relatively similar.

In benchmark tests, Nvidia wins some, but AMD wins most of the time.

Nvidia's GeForce GTX 590 was a bit of a letdown: it was severely downclocked, its overclockability is poor, and the VRM is lousy, much to my disappointment.

On the other hand, while I really think AMD's engineers should swap the stock blower fan for a quieter and more efficient one, the Radeon HD 6990 manages to surpass its rival in many performance tests. It has 4GB of GDDR5 memory on a single card (vs 3GB on Nvidia's GeForce GTX 590), a very useful dual-BIOS switch that lets users overclock beyond specifications, better overclockability than the GTX 590, and more display outputs than Nvidia (four mini-DisplayPorts and one DVI port vs one mini-DisplayPort and three DVI ports)...

 

Overall, AMD's Radeon HD 6990 takes the title of "World's Fastest Graphics Card" and "World's Best Graphics Card" at this point in time. :thumbsup: :thumbsup:


You know what's odd? How PC hardware still all has those big-ass fans, while the hardware in the iPad is just as powerful but simply efficient enough to barely produce any heat.

You know what's odd? How PC hardware still all has those big-ass fans, while the hardware in the iPad is just as powerful but simply efficient enough to barely produce any heat.

 

"PC hardware" is a lot more powerful than the ARM-based processors used in tablets such as the iPad or Galaxy Tab. Heck, even smartphones such as the Galaxy S, iPhone 4 and Desire HD use ARM processors. The performance of dual-core mobile processors is only a little higher than that of the original Xbox.

 

That's why highend PC hardware needs those big-ass fans :P

 

x86 CPU performance is sometimes measured in gigaFLOPS; mobile processors are measured in megaFLOPS. Mobile processors are catching up, though: Nvidia's Kal-El can reportedly beat a Core 2 Duo (I can't remember which model)...

 

Fun fact: Samsung makes Apple's A4/A5 Chips as well as their own Hummingbird/Exynos chips which in turn are all based on ARM chips. :P


That's not my point. My point is energy efficiency. If they used the same energy efficiency standards used in the Apple stuff, nothing would need fans. Not even CPUs. These things seriously don't heat up at all, despite not having fans.

 

Also, no matter what you claim, I've seen iPhones and iPads produce 3D stuff which looked like 3D movie quality, in real time rendering. Look at Infinity Blade.

Also, no matter what you claim, I've seen iPhones and iPads produce 3D stuff which looked like 3D movie quality, in real time rendering. Look at Infinity Blade.

 

So? Any good smartphone, tablet or computer can do that today and produce little to no heat. Modern PCs with "big-ass fans" can render a 10-minute Full HD 3D movie (1920 x 1080 + 1920 x 1080) in minutes, and they can play games in stereoscopic 3D at resolutions over three times that of Full HD.

 

Apple uses the same stuff as everyone else in the industry. ARM processors are a lot more energy efficient than the x86 processors used in PCs. When you look at Apple PCs you still see fans, or in the case of the MacBook Air, a mix of passive and active cooling. Infinity Blade could run just as well on an Android device. My Desire Z could run it, if Epic made an Android version.

 

I've overclocked my Desire Z from 800 MHz to 1.5 GHz and barely noticed any difference in heat even when running any kind of emulator or benchmark.

 

Midrange PCs can use passive cooling and still play the newest games just fine. Fans are not strictly needed, but they have stayed just to be safe.

 

ARM-based chips like the Apple A5, Samsung Exynos, or Qualcomm Snapdragon are all more energy efficient than x86 processors... unless we add AMD Fusion or the Intel Atom into the mix...

 

Another fun fact: Backlit displays consume more power than the average PC.

 

As far back as 2007, before the time of iPhones and iPads, people played HD video and 3D games on their smartphones without "burning their hands"; this is nothing new.

 

Edit: Whoopsie, after reading what I wrote this post became more of a "Why Apple isn't special" post :P


NVIDIA Introduces New GeForce GTX 560 GPU and Faster GeForce R275 Drivers

 

NVIDIA today introduced the GeForce GTX 560 GPU, the latest addition to the company's Fermi architecture-based product family, which brings amazing performance and enhanced features such as NVIDIA PhysX, 3D Vision, SLI and Surround technologies to this summer's hottest PC games.

 

Starting at $199 USD, the GeForce GTX 560 joins its big brother, the previously launched GTX 560 Ti GPU, in delivering an awesome gaming experience in its price class for games running at 1080p, the world's most popular gaming resolution, according to Valve's Steam Hardware and Software Survey.

 


 

NVIDIA today also released beta GeForce R275 drivers. They bring increased performance and enhanced functionality to a broad spectrum of PC games, including 3D Vision support to Duke Nukem Forever, PhysX support to Alice: Madness Returns, and Surround support to Dungeon Siege III.

 

Highlights of GeForce R275 Drivers:

 

Performance boost across a variety of games, including Crysis 2 (6%), Bulletstorm (15%), and Portal 2 (8%)

NVIDIA Update technology now includes SLI profiles

Improved desktop scaling experience with new user interface and features

Improved resizing experience for HDTVs

More than 525 3D Vision gaming profiles, including new additions for Portal 2, Duke Nukem Forever, Age of Empires Online, Assassin's Creed Brotherhood and Dungeon Siege III, among others

New 3D Vision Photo Viewer with Windowed Mode Support

Support for more than 65 3D Vision Ready displays, including desktop monitors, notebooks and projectors.

 

The GeForce GTX 560 GPU is available starting today from the world's leading add-in card partners, including ASL, Asus, Colorful, ECS, EVGA, Gainward, Galaxy, Gigabyte, Innovision 3D, Jetway, Leadtek, MSI, Palit, Point of View, PNY, Sparkle, Zotac and others. GeForce R275 drivers are available directly from www.geforce.com or from the driver download page on nvidia.com.

 


OMG monster cards are back!!!

ASUS MARS II Graphics Card Pictured

The Republic of Gamers MARS II, detailed earlier, is a new custom dual-GF110 graphics card in the works at ASUS. Here are some of its first pictures, revealing a monstrosity that's about as long as a Radeon HD 5970, a couple of inches taller, and three slots thick. Its cooler sticks to the black-and-red color scheme ASUS ROG products have used for a while now, with an intricate cutout design.

 

The shroud suspends two 120 mm high-sweep fans that blow air onto two heatsinks with highly dense aluminum fin arrays, fed heat by copper heat pipes. The card draws power from three 8-pin PCI-Express power connectors. It uses two NVIDIA GF110 GPUs with the same core configuration and clock profile as the GeForce GTX 580, effectively making the MARS II a dual-GTX 580, which also gives it the overclocking headroom of a GTX 580, something impossible on an NVIDIA GeForce GTX 590.



Holy ****.


Time to see the scale of the monstrosity when compared to her hands! :thumbsup:


 


Meanwhile, for the AMD side...

 

PowerColor to Challenge ASUS MARS II with Monstrous Dual-HD 6970 Graphics Card

While the GeForce GTX 580 is clearly faster than the Radeon HD 6970, the two share a disputed lead in their dual-GPU avatars, the GeForce GTX 590 and Radeon HD 6990, attributed to the HD 6990 sustaining clock speeds closer to those of its single-GPU implementation, and to a better electrical design. While NVIDIA is fixing the electricals on a revised PCB design scheduled for release in the weeks to come, companies like ASUS are wasting no time designing their own PCBs that let the two NVIDIA GF110 GPUs sustain clock speeds identical to those on the single-GPU GTX 580. This would pose serious competition to the HD 6990. To ward that off, PowerColor is working on a new Radeon HD 6970 X2 graphics card, which has two AMD Cayman GPUs clocked on par with the single-GPU HD 6970, with the same overclocking headroom.

 

The new card from PowerColor is not just an overclocked HD 6990; it also has the overclocking headroom of the HD 6970. Further, unlike the HD 6990, it uses Lucid Hydra technology. The PLX-made, AMD-branded PCI-Express bridge chip is replaced by a LucidLogix-made bridge chip that gives each GPU PCI-Express 2.0 x16 bandwidth. Users can run the two GPUs in either AMD CrossFire (with Hydra features disabled), or enable Lucid Hydra Engine features and let the two GPUs work in tandem with any other graphics card installed in the system, using GPUs of any make and generation.

 

 

The card draws power from three 8-pin PCI-E power connectors; power is conditioned by two sets of 6+2 phase VRMs. Each GPU has 2 GB of GDDR5 memory across a 256-bit wide memory interface. The GPUs are said to have clock speeds equal to or higher than those of the HD 6970, that's 880 MHz core and 5.50 GHz memory. The beast is cooled by a humongous triple-slot cooler that uses a 120 mm and a 140 mm fan to cool dense aluminum fin-array heatsinks. Display outputs are the same as the HD 6970's: two DVI, two mini-DP, and an HDMI.


 

One monstrosity after another. Scary hardware!

PowerColor Releases Radeon HD 6870 X2 2 GB Graphics Card

TUL Corporation, a leading manufacturer of AMD graphics cards, today announces the very first dual-GPU solution with the Barts XT graphics engine: the PowerColor HD6870X2. Powered by dual graphics engines, the PowerColor HD6870X2 delivers ground-breaking performance against its competitors and takes gaming to the next level, enabling an unprecedented gaming experience.

 

The PowerColor HD6870X2 has 2240 stream processing units and 4.03 teraFLOPS of computing power, with a 900 MHz core speed and 1050 MHz memory speed, easily tackling the most demanding game titles.
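The 4.03 teraFLOPS figure checks out against the shader count and clock, again assuming a multiply-add counts as two FLOPs per stream processor per clock (a rough sketch, not from the press release):

```python
# Rough check of the HD6870X2's quoted 4.03 teraFLOPS.
# 2 x 1120 VLIW5 stream processors, each doing a multiply-add (2 FLOPs) per clock.
stream_processors = 2240
flops_per_clock = 2
core_clock_hz = 900e6
peak_tflops = stream_processors * flops_per_clock * core_clock_hz / 1e12
print(peak_tflops)                    # ~4.03
```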

 

Furthermore, the HD6870X2 takes advantage of Heat Pipe Direct Touch (HDT) technology: six flattened heat pipes directly cover the GPU, which PowerColor claims allows 50 times better heat dissipation than a copper base, providing an extremely cool working environment.

 

The latest dual-GPU solution is also equipped with a "Platinum Power Kit", including a 13-phase PWM, ferrite core chokes and DrMOS; these superior components provide an ultra-stable platform and great power efficiency.


 

http://www.techpowerup.com/147860/PowerCol...phics-Card.html

http://www.powercolor.com/Global/products_...ures.asp?id=364


EVGA Ready with GeForce GTX 580 Classified Graphics Card

In the wake of ASUS' ROG MATRIX GeForce GTX 580 and MSI N580GTX Lightning, EVGA is ready with its own enthusiast-grade GeForce GTX 580 designed for extreme overclocking. The EVGA GTX 580 Classified combines a strong electrical circuitry with powerful air-cooling and a feature-set designed for overclockers. To begin with, the card uses a tall PCB that draws power from one 6-pin and two 8-pin PCI-E power connectors.

 

The card uses strong VRM circuitry consisting of solid-state chokes (which can't whine), direct FETs (low RDS(on)), and proadlizers (better power conditioning). The power connectors are fused to prevent surges from damaging anything; there are consolidated voltage measurement points; and LEDs indicate the power status of each power domain (NVVDD, FBVDD, and PEXVDD). The card also features two BIOS ROMs selectable by a switch: one stores an overclocked profile, the other a failsafe reference-speed profile.

 

The card is cooled by a large air cooler that covers the entire area of the PCB. It appears to follow the same principle as the NVIDIA reference cooler: a dense heatsink over most of the hot parts of the PCB, probably using vapor-chamber technology. The heatsink features air channels ventilated by the blower, which looks larger than the one on the NVIDIA reference cooler. More details are awaited.

 


 

http://www.evga.com/forums/tm.aspx?m=1103387

 

Damn, that is some serious hardware hardcore pr0n!


It's going to take years before someone realizes you can slim four powerful video cards down into one and save energy, space and money that way. If that latest card is a dual-GPU card, combining four of them into one would make an octa-GPU card, but that's just not possible yet.

 

 

 


Oh come on, that **** just gets more and MORE excessive! :o


That EVGA GTX 580 Classified is indeed a serious contender to this beautiful card by MSI, released a while back.

 

MSI N580GTX Lightning Xtreme Edition

 


 

I thought this was one of the truly more refined GTX 580 solutions by far. :thumbsup: :thumbsup:

 

One cool feature: heat-sensitive fans that change color when the temperature rises above 45 degrees Celsius.


 

Very tempted to grab two of those, should I, or should I not?

