
News Nvidia Technology Update : NVIDIA FY 2016 Q2 Results - GPU Sales Are Strong But Write Downs Hurt Bottom Line

Today NVIDIA released its quarterly results for the second quarter of their fiscal year 2016 (yes, 2016), and they had excellent sales of their GeForce GPUs, but have decided to write down their Icera modem business, which hit their operating expenses to the tune of around $90 million. Revenue was up 5% compared to Q2 2015, coming in at $1.153 billion for the quarter. On a GAAP basis, gross margin was 55%, down 110 bps from last year and down 170 bps from last quarter. Net income was just $26 million, down 81% sequentially and 80% year-over-year. This resulted in diluted earnings per share of $0.05, down 77% from Q2 2015’s $0.22.
A big factor in this was the write down of their Icera modem division. NVIDIA had been looking for a buyer for their modem unit, but was unable to find a suitable buyer for the business and is therefore winding down operations in this unit. This caused a hit of $0.19 per diluted share. Also during the quarter, NVIDIA announced a recall of their SHIELD tablets due to overheating batteries, and there have been two cases of property damage due to this. This caused another hit of $0.02 per diluted share. They also had $24 million in expenses related to the Samsung and Qualcomm lawsuit.
NVIDIA’s non-GAAP results “exclude stock-based compensation, product warranty charge, acquisition-related costs, restructuring and other charges, gains and losses from non-affiliated investments, interest expense related to amortization of debt discount, and the associated tax impact of these items, where applicable”, which means they reflect neither the Icera write-down nor the tablet recall. On a non-GAAP basis, gross margin was up 20 bps to 56.6%, with net income up 10% to $190 million. Diluted earnings per share were $0.34, up 13% from Q2 2015’s $0.30 non-GAAP numbers. Despite a significant write-down and a recall, the core business is still doing very well.

For the quarter, NVIDIA paid out $52 million in dividends and repurchased $400 million in stock.

What is driving growth right now is the GPU business. Revenue for GeForce GPUs grew 51%, and NVIDIA has continued to see strength in the PC gaming sector. Fueled by the release of the GTX 980 and GTX 980 Ti, sales of high-end GTX GPUs “grew significantly” year-over-year. The Titan X would certainly fall in there as well, although likely not at as high a volume. Maxwell has been a very strong performer, and gamers tend to go where the performance is. Souring the results somewhat is a decline in Tesla GPU sales, as well as Quadro GPU sales. Overall, GPU revenue was up 9% year-over-year to $959 million. Even as NVIDIA has tried to diversify with SoCs, the GPU business is still almost 85% of the company.

NVIDIA has found a niche in the automotive infotainment world, and that area is still strong for them. Tegra has not taken off in the tablet or smartphone space in any meaningful way, but there was still growth in automotive sales for Tegra. Overall Tegra processor revenue was down 19% year-over-year, mainly due to Tegra OEM smartphones and tablets. NVIDIA’s own Tegra sales in the SHIELD helped offset this loss somewhat, but as the recall filings showed, they only sold 88,000 SHIELD tablets. Margins are likely helped by the fact that they run their own SoC in it, though.

NVIDIA’s “Other” segment is a fixed $66 million licensing payment from Intel, and as always, that is flat. This is from the 2011 settlement of a licensing dispute, and will end in 2017.
For Q3 2016, NVIDIA is expecting revenue to be $1.18 billion, plus or minus 2%, with margins of 56.2% to 56.5%.

NVIDIA is obviously a giant in the GPU space, and that is going very well for them. Sales are very strong, and PC gaming has been a strong point in an otherwise weakening PC market. They are attempting to diversify to mobile, but have found out just how difficult that can be, and had to write down their modem division completely. Without a good integrated modem, it will be difficult to gain traction in the smartphone space, but NVIDIA’s current SoC offerings don’t seem well suited to smartphones anyway. Their strength in GPU knowledge has certainly helped them with the GPU side of the equation, but their first attempt at CPU design has not been as strong. We shall see what their plans are for the SoC space going forward, but for now they are riding a wave of strong GPU sales, and that is a good thing for NVIDIA.

Source: NVIDIA Investor Relations

News Gaming Update : AMD Is Working On A New Linux Graphics Driver To Catch Up With Nvidia

There’s no doubt about it: AMD’s Linux graphics drivers are behind Nvidia’s, something that will start mattering a lot more when Valve’s first Linux-based Steam Machines start hitting the market this November.
AMD hasn’t turned the ship around yet, and big-name games are still only supporting Nvidia hardware when they launch on Linux. But AMD hasn’t been sitting on its hands. AMD’s developers are working on a new Linux driver architecture that will result in better open-source drivers, too—eventually.

How has AMD been doing?

Before we dive into that, though, let’s recap what’s happened since our last look at the subject of Linux graphics drivers.

Nvidia is still maintaining its lead over AMD on Linux, and new games still target Nvidia hardware. Middle-earth: Shadow of Mordor recently launched on Linux thanks to Feral Interactive, but it only officially supports Nvidia graphics cards. The official FAQ says you’ll experience poor performance if you attempt to run it on an AMD graphics card. 
Phoronix recently discovered you can boost the performance of Counter-Strike: Global Offensive on Linux when you’re using an AMD graphics card just by renaming the “csgo_linux” binary to “hl2_linux”. This will give you as much as a 40 percent graphics boost. The AMD Catalyst driver has application profiles designed for Source engine games, but AMD’s developers haven’t bothered adding csgo_linux to the application profiles—despite Counter-Strike: Global Offensive having been out for a year at this point.
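The workaround really is just giving the game binary a name that Catalyst’s Source-engine application profile matches on. A minimal sketch in Python (the helper name is ours, and the Steam library path in the comment is a typical default, not guaranteed):

```python
import os
import shutil

def apply_source_profile_rename(game_dir: str) -> str:
    """Copy the CS:GO binary to 'hl2_linux', the name Catalyst's
    Source-engine application profile matches on. Copying rather than
    renaming keeps the original so Steam's file validation still passes."""
    src = os.path.join(game_dir, "csgo_linux")
    dst = os.path.join(game_dir, "hl2_linux")
    shutil.copy2(src, dst)  # preserves permissions, so it stays executable
    return dst

# Typical (but not guaranteed) install location:
# apply_source_profile_rename(os.path.expanduser(
#     "~/.steam/steam/steamapps/common/Counter-Strike Global Offensive"))
```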

Application profiles are commonly used across operating systems and drivers, so this is normal. What looks bad for AMD here is how slow it’s been to maintain these application profiles when compared to Nvidia on Linux and even AMD’s own profiles on Windows.

It’s not all bad for AMD users. AMD released Catalyst 15.5 Linux in early June. Phoronix ran some benchmarks and put it bluntly: “Metro Last Light Redux and Metro 2033 Redux no longer run like garbage on AMD Linux.” That’s an improvement, but the AMD Catalyst graphics drivers are still behind Nvidia’s. And that was the only game that noticeably improved in performance with the new drivers.

AMD’s new graphics driver architecture :
Currently, there are two main AMD graphics drivers on Linux. There’s the open-source “Radeon” driver and the closed-source “Catalyst” driver. As with Nvidia’s drivers, the open-source driver is fine for just using a graphical desktop with AMD graphics cards, but you’ll want the closed-source driver to get maximum gaming performance.

AMD has now been pursuing a “unified” Linux driver strategy and writing an entirely new driver. This driver, known as “AMDGPU,” will have a single Linux kernel module, which will be open-source. The closed-source Catalyst code will continue to exist, but it will be a smaller “binary blob” that runs in userspace. Open-source fans who don’t need maximum gaming performance can skip the Catalyst blob and use an entirely open-source driver.

 This driver will only be used for new AMD graphics cards. It will only support the very latest GPUs and future AMD graphics hardware.
The new structure could help a lot. Rather than two entirely separate drivers with separate kernel modules, there will be a single open-source kernel driver. The closed-source Catalyst part of the driver becomes much smaller and confined to userspace. AMD won’t have to update the Catalyst driver whenever there’s a new Linux kernel or X.org X server release. It will automatically be compatible because the Catalyst driver is a smaller piece of code that hooks into the open-source AMD driver included in the projects themselves.
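As a sketch of what that split means in practice, you can tell which kernel driver a system is running from /proc/modules. The helper below just parses that text; the module names follow the plan above (amdgpu for the new unified kernel driver, radeon for the current open-source one, fglrx for today’s monolithic Catalyst module):

```python
def amd_kernel_driver(proc_modules_text: str) -> str:
    """Report which AMD kernel driver is loaded, given the contents of
    /proc/modules (one module per line, module name in the first column)."""
    loaded = {line.split()[0] for line in proc_modules_text.splitlines()
              if line.strip()}
    if "amdgpu" in loaded:
        return "amdgpu"   # the new unified open-source kernel driver
    if "fglrx" in loaded:
        return "fglrx"    # today's closed-source Catalyst kernel module
    if "radeon" in loaded:
        return "radeon"   # the existing open-source driver
    return "none"

# On a live system:
# print(amd_kernel_driver(open("/proc/modules").read()))
```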

While AMD isn’t going completely open-source as Intel did with its graphics drivers, it’s much more open-source-friendly than Nvidia’s strategy of going it alone. Linux developers have wanted closed-source kernel modules to go away for a long time, too. For more details, read Phoronix’s report on AMD’s new Linux driver strategy.

This driver should appear over the course of 2015, and the “AMDGPU” kernel driver is set to debut in Linux 4.2. (Yes, we’re already past Linux 4.0!) However, the driver is in a very early state and has a long way to go. Don’t expect to be using it any time soon.

In the long run, this could be what helps AMD close the gap with Nvidia when it comes to Linux graphics drivers. We should all hope so, anyway—it would be best for Steam Machines if AMD and Nvidia were competitive.

News Graphics Card Review Update : Nvidia GeForce GTX 980 Ti

The GTX 980 Ti is the undisputed top-end gaming champion and will play all current titles at 4K resolutions.

Nvidia might already have one of the fastest graphics cards on the planet in the GeForce Titan X, but it was very much the Bugatti Veyron of GPUs; insanely fast, overkill for almost everyone and ludicrously expensive. Unless you were in serious need of graphics memory, which the Titan X has in spades, it was difficult to justify the incredible £800+ price. Nvidia sensibly left enough of a price gap between the Titan X and the £400 GTX 980 to fit in a card that could compete with AMD's impending Fury X: the GTX 980 Ti.

THE GPU :

The 980 Ti is essentially a scaled-down GeForce Titan X, although there's hardly a massive gulf between the two cards. Both use the same GM200 GPU, which is based on Nvidia's energy-efficient Maxwell architecture and manufactured on a 28nm process. Both run at a 1GHz base clock and boost to 1,075MHz. Both have the same 250W TDP, and with effective cooling should prove to be monstrous overclockers.
There are differences, though. Nvidia has reduced the number of CUDA cores from 3,072 to 2,816, lowered the texture units from 192 to 176 and removed a pair of streaming multiprocessors (SMMs), leaving 22 rather than the 24 found in the Titan X. The GTX 980 Ti has 6GB of GDDR5 memory, compared to the Titan X's 12GB, although because both cards use a 384-bit memory bus and clock the RAM chips at an effective 7GHz, they have the same 336GB/sec peak memory bandwidth.
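That shared bandwidth figure follows directly from the bus width and effective clock, and is easy to verify:

```python
def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_hz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes transferred per clock
    (bus width / 8) times effective transfers per second."""
    return (bus_width_bits / 8) * effective_clock_hz / 1e9

# Both the GTX 980 Ti and Titan X: 384-bit bus, 7GHz effective GDDR5.
print(peak_bandwidth_gbs(384, 7e9))  # 336.0
```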

THE CARD :

At 267mm long, the 980 Ti is no larger than Nvidia's current top-end graphics cards, and as such should fit inside most ATX cases without needing to remove drive cages. The green backlit logo is a nice touch, illuminating the interior of your case and giving you something to look at if you have a windowed side panel. There's one six-pin and one eight-pin PCI-Express power socket on the front edge of the card as it sits in your case, which should help you keep cables in check. Two SLI connections will even let you run four cards in SLI, if your motherboard (and bank balance) will support it.

Naturally for a card designed to play games at 4K resolutions, the GTX 980 Ti has three DisplayPort 1.2 ports on the back for hooking up to Ultra HD monitors. It also has a single HDMI 2.0 port, meaning you can hook it up to a 4K TV, plus a dual-link DVI output.
The GTX 980 Ti is a fully DirectX 12-compliant card, meaning it can take advantage of more realistic smoke, fire and material effects once developers start using the DirectX 12 API in their games. It also works with Nvidia's G-Sync adaptive refresh technology, meaning you can eliminate screen tear in games when playing on a compatible G-Sync monitor. Add in optimisations for virtual reality gaming, including multi-resolution shading to only render the pixels visible through the spherical lenses of a VR headset, and the GTX 980 Ti is about as future-proof as it's possible to get.

PERFORMANCE:


Unlike the Titan X, which was only available as a reference design card, Nvidia is letting its board partners release GTX 980 Ti cards with custom coolers and out-of-the-box overclocks. We've looked at the reference design, complete with Nvidia's standard radial fan blower heatsink. It's surprisingly quiet in use, and managed to keep the GPU core below 60 degrees Celsius throughout our testing.

With the GTX 980 Ti installed in our reference PC, it quickly became clear that no games can trouble the card at 1,920x1,080. Dirt Showdown produced a silky 126.8fps with Ultra settings and 4x MSAA. Even with demanding super sampling anti-aliasing (SSAA) and Ultra detail enabled, we saw incredibly smooth frame rates in both Tomb Raider and Metro: Last Light Redux, at 156fps and 64fps respectively.

Stepping up to 2,560x1,440 wasn't enough to make the 980 Ti sweat, either. Dirt Showdown maintained a fantastic 115.2fps and Tomb Raider stayed strong at 78.1fps. Metro begins to drop below the perfectly playable 60 frames per second, producing 40.7fps, but disabling anti-aliasing boosted this back to 78.1fps.

It's only when playing at 4K resolutions that we begin to see the limits of the card. Dirt Showdown was still a perfectly playable 69.7fps, but Tomb Raider dropped to 30fps and Metro stumbled down to 17.7fps. However, SSAA anti-aliasing isn't realistic at this resolution; it renders the game at double your desired resolution before downscaling it, meaning at 4K games were effectively being rendered at 8K. Switching to the far less demanding FXAA resulted in a much smoother 51.2fps in Tomb Raider, and 37.2fps in Metro: Last Light Redux. Anti-aliasing isn't really required at such high resolutions, so we're confident that almost every game will be playable at 4K on this card.
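To make the SSAA cost concrete: 2x2 supersampling doubles each axis before downscaling, quadrupling the pixel count, which is why 4K with SSAA behaves like an 8K render (the helper below is just an illustration of that arithmetic):

```python
def ssaa_render_resolution(out_w: int, out_h: int, factor: int = 2):
    """Internal render resolution for NxN supersampling: each axis is
    scaled by the factor, then the image is downscaled to the output."""
    return out_w * factor, out_h * factor

w, h = ssaa_render_resolution(3840, 2160)   # 4K output with 2x2 SSAA
print(w, h)                                 # 7680 4320 -- i.e. 8K
print((w * h) // (3840 * 2160))             # 4x the pixels to shade
```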

We overclocked the 980 Ti using EVGA's Precision X utility, and were blown away with how much extra performance we were able to eke out of the card. After adding 250MHz to the core clock and 150MHz to the memory, we managed to get Metro: Last Light Redux to a much smoother 44fps at 4K resolution, and could even play Tomb Raider at 4K with SSAA and the AMD-specific TressFX hair rendering at 47.9fps.

CONCLUSION :

Right now, the GeForce GTX 980 Ti is the fastest single graphics card around. The newly announced AMD Fury X reportedly matches it for frame rates at 4K, but we’ll have to wait until we get one in for testing to see if the AMD card's high-bandwidth memory architecture puts it on a level playing field with Nvidia's card at 1,920x1,080 and 2,560x1,440 resolutions too. We'd be more than a little cross if we'd invested in a Titan X, as the GTX 980 Ti isn’t much slower but is almost £300 cheaper.

Like the Titan X, the GTX 980 Ti is overkill for 1080p resolutions. However, anyone with multiple 2,560x1,440 resolution displays or a 4K monitor will reap the benefits. It's a big investment, particularly if you opt for a custom-cooled, overclocked model from one of Nvidia's board partners, but you can rest assured that you'll be able to play the latest games at the highest frame rates for a long time to come.



News Gaming Gadget Review: Inno3D GeForce GTX 980 Ti iChill X3 Ultra

Something kind of weird is happening in the graphics world. The high-profile and eagerly anticipated launch of AMD's Radeon R9 Fury X has unwittingly sparked a surge in sales for Nvidia's rival GeForce GTX 980 Ti. Our conversations with retail partners suggest that keeping GTX 980 Ti on the shelves is proving to be a challenge, as said cards are now being snapped up by all those gamers who were, until recently, sat on the fence.

Buoyed by such news, and heartened by the fact that there are currently no custom R9 Fury X cards, Nvidia's partners are doubling down in their efforts to flood the market with an eclectic array of air- and liquid-cooled GTX 980 Ti solutions. Inno3D has both bases covered with five variants to choose from, and we have the mid-range iChill X3 Ultra in for review today.
Priced at around £575 - that's £50 more than the GTX 980 Ti entry point - this, clearly, isn't one for the faint of heart. Inno3D's unusual, almost Goth-like aesthetic is very much an acquired taste: you'll either love the shroud's intricacy or find it gives you nightmares.

It's a scary-looking beast, and it definitely doesn't shy away. Measuring 300mm in length and occupying the best part of three expansion slots, the iChill X3 Ultra is one of the meatiest GTX 980 Tis we've seen and tips the scales at 1.2kg. You could argue you're getting more card for your money, and if a compact form factor isn't a priority, you'll like the fact that Inno3D has made good use of the card's dimensions.
There's a full-size backplate, for starters, which shields the PCB and gives the card an extra feel of rigidity. The way in which the backplate extends well beyond the PCB reveals that the card is bigger than it needs to be, though the extra room does allow for three 90mm fans to be squeezed into the gigantic cooler.

Described as a modular design that's "easy to install, easy to clean," the iChill X3's metal cover can be detached using an Allen key, providing simple access to the three removable fans and a 118-fin aluminium heatsink interspersed with five heatpipes of varying widths. The removable parts are a handy way of keeping the card free of dust in the long run, though do be careful during the disassembly as the shroud does feel fragile and plasticky in parts.
A side-on view gives you a better idea of the iChill X3 Ultra's girth. Such bravado hints at a good dollop of factory overclocking, and Inno3D duly obliges by notching up base and boost frequencies from 1,000MHz and 1,076MHz to a much tastier 1,152MHz and 1,241MHz, respectively. That's on par with Gigabyte's G1 Gaming, but Inno3D goes a step further by elevating memory from a reference 7,012MHz to an effective 7,200MHz. A nice touch, as overclocked memory is something of a rarity on custom GTX 980 Tis.
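Expressed as percentages (using the clock figures quoted above), that factory overclock works out to a sizeable core bump and a modest memory one:

```python
def overclock_pct(reference_mhz: float, overclocked_mhz: float) -> float:
    """Factory overclock as a percentage over the reference clock."""
    return (overclocked_mhz / reference_mhz - 1) * 100

print(round(overclock_pct(1000, 1152), 1))  # 15.2  (base clock)
print(round(overclock_pct(1076, 1241), 1))  # 15.3  (boost clock)
print(round(overclock_pct(7012, 7200), 1))  # 2.7   (memory)
```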

Inno3D has made all the right tweaks and, just as importantly, knows better than to mess with what already works. Power continues to be sourced by 6+8-pin connectors, two SLI fingers allow for multi-GPU configurations and Nvidia's usual display outputs - dual-link DVI, HDMI 2.0 and a trio of DisplayPort 1.2 - are ever present.
The iChill X3 Ultra's overall aesthetic has divided the crowd here at HEXUS HQ, though one element we all agree on is the black I/O panel - it's a neat little addition that fits in well with most PC enclosures.

Knowing that it lacks the brand power to go toe-to-toe with the likes of Asus, EVGA, Gigabyte and MSI, Inno3D is relying on a balanced mix of price and factory overclock. The iChill X3 Ultra should be one of the fastest GTX 980 Tis available at the £575 price point - if not the fastest - so let's see if the benchmarks stack up.
Nvidia's partners were all holding their breath a few short weeks ago, but with AMD's Radeon R9 Fury X failing to beat GTX 980 Ti into submission, they can all now rest easy and flex their muscles with a revitalised retail push.

It's almost an unfair battle with a myriad of custom GTX 980 Tis going up against the single-flavoured Radeon, however such choice does provide would-be buyers with a challenge: if you've decided GTX 980 Ti is the right GPU for you, which card should you buy?

EVGA's well-rounded Superclocked+ ACX 2.0+ is a firm favourite around these here parts, and is followed closely by Gigabyte's G1 Gaming, but it's good to see a lesser-known brand throwing its hat into the ring and Inno3D has done exactly that with the GTX 980 Ti iChill X3 Ultra.

Built to look formidable, overclocked on core and memory, yet able to remain whisper quiet, Inno3D's £575 solution is full of promise and should be on your list of GTX 980 Tis to consider.

News Gaming Update : NVIDIA GTX 960M And 950M Processors Bring High-Powered Desktop Gaming To Notebooks

Some time ago, NVIDIA launched the GeForce GTX 980M and 970M processors for notebooks. Now, to turn up the heat on the competition a bit more, the chip maker has added the GeForce GTX 960M and 950M to the GTX 900M lineup. Nvidia is mainly aiming for notebooks from ASUS, Lenovo, Alienware, Acer and HP. Both processors feature the Maxwell architecture, include 640 CUDA cores, and are built on a 28nm process.

These new processors are very powerful and hence can be used for purposes other than gaming, for example video editing and creating 3D graphics. Acer has announced that a new batch of its existing line of Aspire V Nitro Black Edition laptops will offer the GeForce GTX 960M as an option. One of the features of the processor is BatteryBoost, which lets users enjoy longer playtime than previous gaming notebooks, extending battery life even when gaming unplugged.
Users can choose between the best graphics performance or the best battery life, according to the application they are using. Other features include ShadowPlay activation. Users can record full-resolution videos for sharing on YouTube or stream live to Twitch.

As of now, only the Acer V Nitro will be offering the GTX 960M. Alienware, ASUS, HP and Lenovo will be releasing upgraded models soon.

UPDATE :

For a limited period, customers who purchase the GeForce GTX 980, GTX 970, and GTX 960 graphics cards, or a notebook with a GTX 970M or above, will get a code for a digital copy of the game. Labelled the ‘Undeniably Epic’ bundle, it will be authorized for purchases from Flipkart, Snapdeal and computer stores that are a part of Nvidia’s retail network.

News Gaming Industry Update : NVIDIA Releases GeForce GTX 960M and GTX 950M Mobile Graphics

NVIDIA has announced new GPUs to round out their 900-series mobile lineup, and the new GTX 960M and GTX 950M are based on the same GM107 core as the previous 860M/850M parts.

Both GPUs feature 640 CUDA Cores and are separated by base clock speed, with the GTX 960M operating at 1096 MHz and the GTX 950M at 914 MHz. Both have unlisted maximum Boost frequencies that will likely vary based on thermal constraints. The memory interface is the other differentiator between the GPUs: the GTX 960M sports dedicated GDDR5 memory, while the GTX 950M can be implemented with either DDR3 or GDDR5 memory. Both the GTX 960M and 950M use the same 128-bit memory interface and support up to 4GB of memory.
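That DDR3-versus-GDDR5 choice matters more than the shared bus width suggests. With typical effective clocks (the values below are illustrative assumptions, not NVIDIA spec-sheet numbers), the GDDR5 configuration delivers several times the bandwidth on the same 128-bit bus:

```python
def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_hz: float) -> float:
    """Peak memory bandwidth in GB/s on a given bus width."""
    return (bus_width_bits / 8) * effective_clock_hz / 1e9

# Illustrative effective memory clocks for a 128-bit mobile part:
print(peak_bandwidth_gbs(128, 5e9))  # 80.0 GB/s with 5GHz-effective GDDR5
print(peak_bandwidth_gbs(128, 2e9))  # 32.0 GB/s with 2GHz-effective DDR3
```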
As reported by multiple sources, the core powering the 960M/950M is a GM107 Maxwell GPU, which means that we are essentially talking about rebadged 860M/850M products, though the unlisted Boost frequencies could potentially be higher with these parts given improved silicon on a mature 28nm process. In contrast, the previously announced GTX 965M is based on a cut-down Maxwell GM204 GPU, with its 1024 CUDA Cores representing half of the GPU core introduced with the GTX 980.

New notebooks featuring the GTX 960M have already been announced by NVIDIA's partners, so we will soon see if there is any performance improvement to these refreshed GM107 parts.

News Gaming Update : Nvidia Surprise Launches The 12GB GeForce Titan X, ‘The Most Advanced GPU Ever’

 Nvidia chief executive Jen-Hsun Huang strolled on stage at a Game Developer Conference presentation from Epic Games, casually announced the new flagship GeForce Titan X graphics card, autographed it, and left.

So that happened.

After spending the better part of two hours launching its Nvidia Shield gaming set-top box/console on Tuesday night, Huang took everyone by surprise when he launched the Titan X on Wednesday morning.

“We have launched the most advanced GPU ever, and [given] the first one to Tim Sweeney,” the founder of Epic Games, Huang said on stage. After announcing the Titan X and some of its specs, Huang autographed the massive GPU module with a flourish (“To Tim, with love, Jen-Hsun”), waved goodbye, and left.
The Titan X is apparently anything but vaporware; Huang said that it “will power GDC 2015,” and that showgoers would see “some amazing demonstrations this week”.

“I cherish the hardware and of course we’ll see what it can do,” Sweeney said.

 So what’s inside the Titan X? Eight billion transistors, making it the “most advanced GPU the world has ever seen,” Huang said. It will contain a massive 12GB frame buffer. And that’s about it: Huang didn’t reveal more details, and Nvidia’s official blog and press releases haven't mentioned it yet.

Given what Huang said about it, however, we can assume that it’s more powerful than the GeForce Titan Z, a $3,000 graphics card which Nvidia launched last March. That card included 5,760 CUDA cores across its two Kepler GPUs, 12GB of memory, and 8 teraflops of computing power. Since the new GPU has the same amount of memory, it’s likely that there are more cores. It’s also not clear what GPU is at the heart of the Titan X; Nvidia is preparing its next-gen Pascal processor, but that’s not due until 2016, Huang said at the time.
More details about the new GeForce Titan X will be shared at Nvidia’s GPU Technology Conference on March 17, a spokesman for the company said. He declined to comment further.

Why this matters: Although Nvidia still makes the bulk of its income from the PC, enterprise products command huge premiums. Nvidia would like to position products like its Iray VCA, a $50,000 virtual computing appliance for rendering images using modeled photons, as the tool for CAD and CGI specialists to produce their renderings. What's really interesting is that we should be able to see this on the show floor at the Game Developer Conference this week.

News Mobile Tech Updates : Nvidia Kills Mobile GPU Overclocking In Latest Driver Update, Irate Customers Up In Arms

Nvidia’s mobile Maxwell parts have won significant enthusiast acclaim since launch thanks to excellent performance and relatively low power consumption. Boutique builders and enthusiasts alike also tend to enjoy pushing the envelope, and Maxwell’s manufacturing characteristics make it eminently suited to overclocking. Now, however, Nvidia is cracking down with a driver update that removes the overclocking features that some vendors apparently sold to customers.
As DailyTech points out, part of what makes this driver update problematic is that system manufacturers actively advertise their hardware as having overclock support baked in to mobile products. Asus, MSI, Dell (Alienware) and Sager have apparently all sold models with overclocking as a core feature, as shown in their marketing copy.

Nvidia apparently cut off the overclocking feature with its 347.09 driver and kept it off with the 347.52 driver released last week. Mobile customers have been demanding answers in the company forums, with Nvidia finally weighing in to tell its users that this feature had previously only been available because of a “bug” and that its removal constituted a return to proper function rather than any removal of capability.

Under normal circumstances, I’d call this a simple case of Nvidia adjusting a capability whether users like it or not, but the fact that multiple vendors explicitly advertised and sold hardware based on overclocking complicates matters. It’s not clear if Asus or the other manufacturers charged extra for factory overclocked hardware or if they simply shipped the systems with higher stock speeds, but we know that OEMs typically do put a price premium on the feature.
To date, Nvidia has not responded formally or indicated if it will reconsider its stance on overclocking. The company isn’t currently under much competitive pressure to do so — it dominates the high-end GPU market, and while AMD is rumored to have a new set of cards coming in 2015, it’s not clear when those cards will launch or what the mobile flavors will look like. For now, mobile Maxwell has a lock on the enthusiast space. Some customers are claiming that they’re angry enough to quit using Team Green, but performance has a persuasive siren song all its own, and the performance impact of disabling overclocking is going to be in the 5-10% range for the majority of users. If customers can prove they paid extra for the feature, that could open the door to potential claims against the OEMs themselves.

For Nvidia, this surge of attention on their mobile overclocking is a likely-unwelcome follow-up to concerns about the GTX 970’s memory allocation and the confusion and allegations swarming around mobile G-Sync. While none of these are knock-out blows, they continue to rile segments of the enthusiast community.

Nvidia Launches GeForce GTX 660 Ti Desktop Graphics Card

Summary: Kepler graphics engine comes to the mid-range market, competes against AMD Radeon 7950.
While all the attention goes to the extreme high end when it comes to graphics cards -- and which is the fastest graphics card ever -- not as many gamers are willing to pay $500 or more for those boards. So it's arguably more interesting when new high-end technology trickles down to the mid-range -- that is, especially if you're in the market for one of those cards.

A very good example is the new Nvidia GeForce GTX 660 Ti, which is designed using the same GK104 GPU (a.k.a. Kepler) that powers the high-end -- and expensive -- GeForce GTX 680, but will cost around $299 instead of $499. Obviously, you won't get the same performance, but if you're upgrading from an older graphics card, the benefits will be easily noticeable. You get 1,344 CUDA cores, 2GB of video memory, a 192-bit memory bus, and a base clock rate of 915MHz. Those specifications are for the reference card, though customized and overclocked cards from Nvidia's board partners are available.

So how does the GTX 660 Ti's performance compare to the AMD Radeon HD 7870 and Radeon HD 7950, two competing cards that bookend the new Nvidia board in pricing? As is often the case these days, results are mixed from test to test and reviewer to reviewer (see Anandtech, HotHardware, and Tom's Hardware) -- there's no slam-dunk pick among these mid-range cards. AMD has made things even tougher by stating that it will offer a BIOS update for the Radeon HD 7950 that will enable its Boost overclocking feature, no doubt trying to blunt the impact of the GTX 660 Ti release.

Gaming PC makers have been quick to declare support for the new Nvidia graphics card, with everyone from CyberPower to Maingear to Velocity Micro now including the GTX 660 Ti as an option for their gaming PCs. Are you in the market for the new card, or would you prefer one of AMD's mid-range offerings?

 
Copyright © 2015. The Technology Zone - All Rights Reserved