PurpleGaga27 40 Posted March 24, 2011 The GeForce GTX 590 is simply not worth it: it does not have 4GB of video memory, and parts of its specification are slower than the AMD Radeon HD 6990, which still has full precision. At first we thought the AMD Radeon HD 6990 was the world's fastest video card. Where are the benchmark results?
Tore 33 Posted March 24, 2011 The HD6990 and GTX590 are evenly matched, but in many cases the HD6990 has both higher FPS and higher power consumption.
Doctor Destiny 41 Posted March 24, 2011 I'll keep my GTX460 and be happy. I never understood the point of spending $600+ on a video card.
Malevolence 6 Posted April 5, 2011 AMD Radeon HD 6790 Launched AMD today launched its latest graphics card targeting the US $150 sweet spot, the Radeon HD 6790. The new GPU is intended to make 1080p gaming accessible to even more gamers. It is based on the 40 nm "Barts" silicon (on which the HD 6800 series is also based), and features 800 VLIW5 stream processors, 40 TMUs, 16 ROPs, and a 256-bit wide GDDR5 memory interface holding 1 GB of memory. The core is clocked at 840 MHz, and the memory at 1050 MHz (4.20 GHz effective). The AMD Radeon HD 6790 will rarely be available in a reference design; instead, AMD's AIB partners are allowed to come up with their own board designs. The new GPU will be launched by various AIB partners, including Sapphire, HIS, XFX, PowerColor, VisionTek, Club3D, ASUS, Gigabyte, MSI, ColorFire, etc. The target price is US $149.99. http://www.techpowerup.com/143551/AMD-Rade...0-Launched.html http://www.amd.com/us/products/desktop/gra...overview.aspx#1
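For anyone wondering how the "4.20 GHz effective" figure relates to the 1050 MHz memory clock: GDDR5 transfers data at four times its command clock. A minimal sketch of the arithmetic (the bandwidth figure at the end is my own derivation, not from the announcement):

```python
# HD 6790 memory figures, derived from the quoted specs.
# GDDR5 is quad-pumped: effective data rate = 4 x the actual clock.
base_clock_mhz = 1050
effective_ghz = base_clock_mhz * 4 / 1000
assert effective_ghz == 4.2  # matches the "4.20 GHz effective" in the post

# Peak bandwidth = effective rate (GT/s) x bus width in bytes.
bus_width_bits = 256
bandwidth_gb_s = effective_ghz * bus_width_bits / 8
print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 134.4 GB/s
```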
Malevolence 6 Posted April 13, 2011 NVIDIA Slips in GeForce GT 520 Entry-Level Graphics Card Even as AMD's Radeon HD 6450 ebbs and flows between paper-launch and market-launch, NVIDIA is ready with its competitor, launched by its AIC partners: the GeForce GT 520. The new GPU marks NVIDIA's current-generation entry into the very basic low-end discrete graphics card segment, which is intended for integrated graphics replacement products. NVIDIA's GeForce GT 520 is based on the new 40 nm GF118 silicon. It packs 48 CUDA cores and a 64-bit wide GDDR3 memory interface, while being compact enough to fit on low-profile single-slot board designs, if it's backed by an active (fan) heatsink; passive heatsinks may take up two slots. The core is clocked at 810 MHz, and the CUDA cores at 1620 MHz. The memory is clocked at 900 MHz (actual; 1.8 GHz effective), churning out 14.4 GB/s of memory bandwidth. The card is designed to have three kinds of outputs, which will be available on most partner designs: DVI, D-Sub (usually detachable), and full-size HDMI 1.4a with HDMI audio. The card relies entirely on slot power; its maximum power draw is rated at 29 W. The GT 520 should take up entry-level price points around the US $50 mark. http://www.techpowerup.com/forums/showthread.php?t=143971 http://www.geforce.com/#/Hardware/GPUs/gef...gt-520/overview AMD Unveils Radeon HD 6450 Entry-Level Graphics Card AMD apparently unveiled its new entry-level GPU, the Radeon HD 6450, today, with some leading sites publishing mini performance reviews (check out Today's Reviews on the front page). The new GPU is intended as an integrated graphics substitute that gives desktops all the essential features today's desktop environments demand, such as DirectX 11 support, Aero acceleration, and various kinds of HD video hardware acceleration, apart from the obvious benefit of discrete GPUs: not taxing the system memory as frame buffer. 
The HD 6450 is based on the 40 nm Caicos GPU. It packs twice the shader compute power of the previous generation, with 160 VLIW5 stream processors, 8 TMUs, 4 ROPs, and a transistor count of 370 million. The 160 stream processors, at a core speed of 625~750 MHz (it differs between AIBs), churn out up to 240 GFLOPS of compute power. It packs a 64-bit wide memory interface that supports GDDR5 (clocks: 3.20 GHz to 3.60 GHz) and GDDR3 (1066 MHz to 1600 MHz). It has an idle board power of 9 W and a max board power of 27 W. Most implementations are low-profile single-slot, some even passive. The reference board features display outputs that include DVI, HDMI 1.4a, and D-Sub. Expect a $50~$60 price point. The official announcement, however, is slated for April 19, or at least that's what we were told. http://www.techpowerup.com/143670/AMD-Unve...phics-Card.html
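The two headline numbers quoted above (the GT 520's 14.4 GB/s and the HD 6450's 240 GFLOPS) both fall out of the listed specs. A quick sanity check, assuming the usual 2 FLOPs per stream processor per clock (one multiply-add); that per-clock figure is my assumption, not something stated in the posts:

```python
# GT 520: GDDR3 is double data rate, so 900 MHz actual -> 1.8 GHz effective.
# Bandwidth = effective rate (GT/s) x bus width in bytes.
gt520_bw_gb_s = 1.8 * 64 / 8
assert gt520_bw_gb_s == 14.4  # matches the quoted 14.4 GB/s

# HD 6450: peak GFLOPS = shaders x clock (GHz) x 2 FLOPs (multiply-add).
# Using the top 750 MHz clock from the quoted 625~750 MHz range.
hd6450_gflops = 160 * 0.750 * 2
assert hd6450_gflops == 240.0  # matches the quoted "up to 240 GFLOPS"
```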
Ferret 0 Posted April 13, 2011 The HD6990 and GTX590 are evenly matched, but in many cases the HD6990 has both higher FPS and higher power consumption. Believe it or not, the HD6990 actually uses power much more efficiently overall. It draws roughly 90 W less than the GTX590 while often delivering higher FPS. I thought I would throw that in there.
Doctor Destiny 41 Posted April 13, 2011 The new cards can go to hell. My GTX460 is tits.
Malevolence 6 Posted April 14, 2011 Believe it or not, the HD6990 uses power much more efficiently overall. It draws roughly 90 W less than the GTX590 while often delivering higher FPS. I thought I would throw that in there. It's a very big fight between AMD's Radeon HD 6990 and Nvidia's GeForce GTX 590. In terms of size, Nvidia's dual-GPU solution is shorter than AMD's 12-inch-long card. Nvidia wins. In terms of fan noise, Nvidia wins. In terms of overclockability, AMD wins. (The dual-BIOS switch is a bonus plus!) Power consumption: both are relatively similar. In the various benchmark tests, Nvidia wins some, but most of the time AMD wins. Nvidia's GeForce GTX 590 was a bit of a letdown: it was downclocked severely, its overclockability sucks, and the VRM is lousy, much to my disappointment. On the other hand, while I really recommend that the engineers at AMD seriously change their stock blower fan to a quieter and more efficient one, AMD's Radeon HD 6990 manages to surpass its rival in many performance tests. With 4GB of GDDR5 memory on a single card (vs 3GB of GDDR5 memory on Nvidia's GeForce GTX 590), a very useful dual-BIOS switch that allows users to overclock beyond specifications, better overclockability than Nvidia's GeForce GTX 590, and more display outputs than Nvidia (4 mini-DisplayPorts and 1 DVI port vs 1 mini-DisplayPort and 3 DVI ports)... Overall, AMD's Radeon HD 6990 has won the titles of "World's Fastest Graphics Card" and "World's Best Graphics Card" at this point in time. :thumbsup:
Nyerguds 100 Posted April 17, 2011 You know what's odd? How PC hardware all still has those big-ass fans, while the hardware in the iPad is just as powerful but simply efficient enough to barely produce any heat.
Tore 33 Posted April 17, 2011 (edited) You know what's odd? How PC hardware all still has those big-ass fans, while the hardware in the iPad is just as powerful but simply efficient enough to barely produce any heat. "PC hardware" is a lot more powerful than the ARM-based processors used in tablets such as the iPad or Galaxy Tab. Heck, even smartphones such as the Galaxy S, iPhone 4 and Desire HD use ARM processors. The power of dual-core mobile processors is only a little higher than that of the original Xbox. That's why high-end PC hardware needs those big-ass fans. x86 CPU performance is sometimes measured in gigaFLOPS; mobile processors are measured in megaFLOPS. Mobile processors are catching up, though, as the Nvidia Kal-El can beat a Core 2 Duo; I can't remember which model though... Fun fact: Samsung makes Apple's A4/A5 chips as well as its own Hummingbird/Exynos chips, which are all based on ARM designs. Edited April 17, 2011 by Tore
Nyerguds 100 Posted April 17, 2011 That's not my point. My point is energy efficiency. If they used the same energy efficiency standards used in the Apple stuff, nothing would need fans. Not even CPUs. These things seriously don't heat up at all, despite not having fans. Also, no matter what you claim, I've seen iPhones and iPads produce 3D stuff which looked like 3D movie quality, in real time rendering. Look at Infinity Blade.
Tore 33 Posted April 17, 2011 (edited) Also, no matter what you claim, I've seen iPhones and iPads produce 3D stuff which looked like 3D movie quality, in real time rendering. Look at Infinity Blade. So? Any good smartphone, tablet or computer can do that today and produce little to no heat. Modern PCs with "big-ass fans" can render a 10-minute-long FullHD 3D movie (1920 x 1080 + 1920 x 1080) in minutes, and they can play games in stereoscopic 3D at a resolution over three times that of FullHD. Apple uses the same stuff as everyone else in the industry. ARM processors are a lot more energy efficient than the x86 processors used in PCs. When you look at Apple PCs you still see fans, or in the case of the MacBook Air, a mix of passive and active cooling. Infinity Blade could run just as well on an Android device. My Desire Z could run it, if Epic made an Android version. I've overclocked my Desire Z from 800 MHz to 1.5 GHz and barely noticed any difference in heat, even when running any kind of emulator or benchmark. Midrange PCs can use passive cooling and still play the newest games just fine. Fans are not strictly needed, but they have stayed just to be sure. All ARM-based chips like the Apple A5, Samsung Exynos, or Qualcomm Snapdragon are more energy efficient than x86 processors... unless we add AMD Fusion or Intel Atom into the mix... Another fun fact: backlit displays consume more power than the average mobile processor. As far back as 2007, before the time of iPhones and iPads, people played HD video and 3D games on their smartphones without "burning their hands"; this is nothing new. Edit: Whoopsie, after reading what I wrote, this post became more of a "Why Apple isn't special" post Edited April 17, 2011 by Tore
Malevolence 6 Posted May 17, 2011 NVIDIA Introduces New GeForce GTX 560 GPU and Faster GeForce R275 Drivers NVIDIA today introduced the GeForce GTX 560 GPU, the latest addition to the company's Fermi architecture-based product family, which brings amazing performance and enhanced features such as NVIDIA PhysX, 3D Vision, SLI and Surround technologies to this summer's hottest PC games. Starting at $199 USD, the GeForce GTX 560 joins its big brother, the previously launched GTX 560 Ti GPU, in delivering an awesome gaming experience in its price class for games running at 1080p, the world's most popular gaming resolution according to Valve's Steam Hardware and Software Survey. NVIDIA today also released beta GeForce R275 drivers. They bring increased performance and enhanced functionality to a broad spectrum of PC games, including 3D Vision support for Duke Nukem Forever, PhysX support for Alice: Madness Returns, and Surround support for Dungeon Siege III. Highlights of the GeForce R275 drivers:
- Performance boost across a variety of games, including Crysis 2 (6%), Bulletstorm (15%), and Portal 2 (8%)
- NVIDIA Update technology now includes SLI profiles
- Improved desktop scaling experience with new user interface and features
- Improved resizing experience for HDTVs
- More than 525 3D Vision gaming profiles, including new additions for Portal 2, Duke Nukem Forever, Age of Empires Online, Assassin's Creed Brotherhood and Dungeon Siege III, among others
- New 3D Vision Photo Viewer with windowed mode support
- Support for more than 65 3D Vision Ready displays, including desktop monitors, notebooks and projectors
The GeForce GTX 560 GPU is available starting today from the world's leading add-in card partners, including ASL, Asus, Colorful, ECS, EVGA, Gainward, Galaxy, Gigabyte, Innovision 3D, Jetway, Leadtek, MSI, Palit, Point of View, PNY, Sparkle, Zotac and others. GeForce R275 drivers are available directly from www.geforce.com or from the driver download page on nvidia.com.
Malevolence 6 Posted May 30, 2011 (edited) OMG, monster cards are back!!! ASUS MARS II Graphics Card Pictured The Republic of Gamers MARS II, detailed earlier, is a new custom dual-GF110-based graphics card in the works at ASUS. Here are some of its first pictures, revealing a monstrosity that's about as long as a Radeon HD 5970, a couple of inches taller, and three slots thick. Its cooler sticks to the black+red color scheme in use with ASUS ROG products for a while now, and uses an intricate cutout design. The shroud suspends two 120 mm high-sweep fans that blow air onto two heatsinks with highly dense aluminum fin arrays, to which heat is fed by copper heat pipes. The card draws power from three 8-pin PCI-Express power connectors. The card uses two NVIDIA GF110 GPUs with the same core configuration and clock profile as the GeForce GTX 580, effectively making the MARS II a dual GTX 580, which also gives it the overclocking headroom of a GTX 580, something impossible on an NVIDIA GeForce GTX 590. Edited May 31, 2011 by Malevolence
Doctor Destiny 41 Posted May 30, 2011 Three 8-pin connectors. What a beast.
Malevolence 6 Posted May 31, 2011 (edited) Time to see the scale of the monstrosity when compared to her hands! Edited May 31, 2011 by Malevolence
Malevolence 6 Posted May 31, 2011 Meanwhile, on the AMD side... PowerColor to Challenge ASUS MARS II with Monstrous Dual-HD 6970 Graphics Card While between the GeForce GTX 580 and Radeon HD 6970 the former is clearly the faster graphics card, the two share a disputed lead over each other in their dual-GPU avatars, the GeForce GTX 590 and Radeon HD 6990, attributed to the HD 6990 sustaining clock speeds closer to those of its single-GPU implementation, and a better electrical design. While NVIDIA is fixing the electricals on a revised PCB design scheduled for release in the weeks to come, companies like ASUS are wasting no time in designing their own PCBs that can let the two NVIDIA GF110 GPUs sustain clock speeds identical to those of the single-GPU GTX 580. This would pose serious competition to the HD 6990. To ward that off, PowerColor is working on a new Radeon HD 6970 X2 graphics card, which has two AMD Cayman GPUs clocked on par with the single-GPU HD 6970, with the same overclocking headroom. The new card from PowerColor is not just an overclocked HD 6990; it also has the overclocking headroom of the HD 6970. Further, unlike the HD 6990, it uses Lucid Hydra technology. The PLX-made, AMD-branded PCI-Express bridge chip is replaced by a LucidLogix-made bridge chip that gives each GPU PCI-Express 2.0 x16 bandwidth. Users can run the two GPUs in either AMD CrossFire (with Hydra features disabled), or enable the Lucid Hydra Engine and let the two GPUs work in tandem with any other graphics card installed in the system, using GPUs of any make and generation. The card draws power from three 8-pin PCI-E power connectors; power is conditioned by two sets of 6+2-phase VRMs. Each GPU has 2 GB of GDDR5 memory across a 256-bit wide memory interface. The GPUs are said to have clock speeds equal to or higher than those of the HD 6970, that's 880 MHz core, 5.50 GHz memory. 
The beast is cooled by a humongous triple-slot cooler that uses a 120 mm and a 140 mm fan to cool dense aluminum fin-array heatsinks. Display outputs are the same as on the HD 6970: two DVI, two mini-DP, and an HDMI. One monstrosity after another. Scary hardware!
Malevolence 6 Posted June 23, 2011 PowerColor Releases Radeon HD 6870 X2 2 GB Graphics Card TUL Corporation, a leading manufacturer of AMD graphics cards, today announces the very first dual-GPU solution with the Barts XT graphics engine: the PowerColor HD6870X2. Powered by dual graphics engines, the PowerColor HD6870X2 delivers ground-breaking performance against the competition and takes gaming to the next level, enabling an unprecedented gaming experience. The PowerColor HD6870X2 has 2240 stream processing units and 4.03 teraFLOPS of computing power, easily accelerating gaming speed and maximizing gaming power with a 900 MHz core speed and 1050 MHz memory speed, fully tackling the most demanding game titles. Furthermore, the HD6870X2 takes advantage of Heat Pipe Direct Touch (HDT) technology; six flattened heat pipes directly cover the GPUs, allowing 50 times better heat dissipation than a copper base and providing an extremely cool working environment. The latest dual-GPU solution is also equipped with a "Platinum Power Kit", including 13-phase PWM, ferrite-core chokes and DrMOS; all these superior components provide an ultra-stable platform and great power efficiency. http://www.techpowerup.com/147860/PowerCol...phics-Card.html http://www.powercolor.com/Global/products_...ures.asp?id=364
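The press release's 4.03 teraFLOPS figure is consistent with its other quoted specs; a quick check, again assuming the standard 2 FLOPs (one multiply-add) per stream processor per clock, which the release itself doesn't state:

```python
# HD6870X2: 2240 stream processors at 900 MHz, 2 FLOPs per SP per clock.
shaders = 2240
clock_ghz = 0.900
tflops = shaders * clock_ghz * 2 / 1000
print(f"{tflops:.2f} TFLOPS")  # -> 4.03 TFLOPS
```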
Malevolence 6 Posted July 6, 2011 EVGA Ready with GeForce GTX 580 Classified Graphics Card In the wake of the ASUS ROG MATRIX GeForce GTX 580 and MSI N580GTX Lightning, EVGA is ready with its own enthusiast-grade GeForce GTX 580 designed for extreme overclocking. The EVGA GTX 580 Classified combines strong electrical circuitry with powerful air cooling and a feature set designed for overclockers. To begin with, the card uses a tall PCB that draws power from one 6-pin and two 8-pin PCI-E power connectors. The card uses strong VRM circuitry that consists of solid-state chokes (which can't whine), direct FETs (low on-resistance), and proadlizers (better power conditioning). The power connectors are fused to prevent surges from damaging anything, there are consolidated voltage measurement points, and LEDs indicate the power status of each power domain (NVVDD, FBVDD, and PEXVDD). The card also features two BIOS ROMs selectable by a switch: one stores an overclocked profile, and the other stores a failsafe reference-speed profile. The card is cooled by a large air cooler that covers the entire area of the PCB. It looks to work on the same principle as the NVIDIA reference cooler, with a dense heatsink over most of the hot parts of the PCB, and probably uses vapor-chamber technology. The heatsink features air channels which are ventilated by the blower; this card looks to have a larger blower than the one on the NVIDIA reference cooler. More details are awaited. http://www.evga.com/forums/tm.aspx?m=1103387 Damn, that is some serious hardcore hardware pr0n!
PurpleGaga27 40 Posted July 6, 2011 It's going to take years before someone realizes you can slim down four powerful video cards into one and save energy, space and money that way. If the latest card is a dual-GPU design, combining four of those cards into one would make an octa-GPU card, but that's just not possible yet.
Luk3us 63 Posted July 7, 2011 Oh come on, that **** just gets more and MORE excessive!
Malevolence 6 Posted July 7, 2011 (edited) That EVGA GTX 580 Classified is indeed a serious contender to this beautiful card by MSI, released a while back. MSI N580GTX Lightning Xtreme Edition I thought this was one of the truly more refined GTX580 solutions by far. :thumbsup: One cool factor: heat-sensitive fans that change color when the temperature rises above 45 degrees Celsius. Very tempted to grab two of those. Should I, or should I not? Edited July 7, 2011 by Malevolence