
Thread: Nvidia News

  1. #1
    Member
    Join Date
    Nov 2012
    Location
    Dhaka
    Posts
    612

    Default Nvidia News

    I found some news relating to Nvidia's GPUs but couldn't find any relevant threads to post it in, so I'm starting this thread. If you guys come across any information about Nvidia GPUs, driver updates or any other topic that may not warrant a whole new thread, post it here.

    Here's the news I found.

    GeForce 800 series “Maxwell” in Q1 2014
    NVIDIA will launch its next series sooner than expected, releasing the Maxwell series in early 2014. The GeForce 800 series is expected to arrive in the first quarter of 2014, somewhere between February and March. This effectively means the chances of a 20nm process are low. Of course, NVIDIA could already have its first 20nm samples sooner than that, but TSMC will not be ready for mass production until June 2014.
    Maxwell GPUs will therefore most likely not be made on a 20nm fabrication process; if they are, expect huge graphics card shortages and more paper launches.



    NVIDIA GeForce Kepler Lineup To Be Expanded in 2H 2013
    Videocardz.com has some exclusive details regarding NVIDIA's plans to expand their GeForce Kepler lineup going into 2H 2013. For some time it was known that NVIDIA wouldn't release any new graphics card after the launch of their GeForce GTX 760, but it turns out NVIDIA was just preparing for something better.
    There's limited information regarding which new GeForce Kepler cards would be introduced, but this could be NVIDIA's response to the upcoming Hawaii GPU from AMD. According to Videocardz, the upcoming GeForce Kepler based graphics cards would be based on the GK104, GK106 and GK208 architectures, which probably means more entry-level to mid-range cards. These are likely to turn up as the GeForce GTX 750 Ti, GTX 750, and the various other Boost cards we got to see with the GeForce 600 series lineup. But NVIDIA possibly has a well-kept secret which it may be aiming to launch after AMD unleashes the Hawaii GPU. That card could be the final GPU of the GeForce Kepler lineup and the one that ends up as the fastest of all.
    A new high-end model based on the GK110 core seems to be in NVIDIA's plans, which could be used as a deterrent against AMD's Hawaii GPU, supposedly a "Titan Killer". This new card could be a single-chip, ultimate GK110 featuring the fully enabled 15-SMX Kepler core and 12 GB of GDDR5 memory, as featured on the Quadro K6000, currently the world's fastest professional graphics card. Or it could be a dual-GPU solution called "GeForce GTX 790", based on two weaker GK110 cores but with a healthy performance advantage over the GeForce GTX 690. In the end, whichever option NVIDIA chooses will almost certainly be unveiled in Fall 2013, provided the report is true.

    GeForce GTX 790 or TITAN Ultra in a few months?
    However, this is not all. NVIDIA is also working on a new high-end model. At this point I don't know if this is the so-called TITAN ULTRA with a fully enabled GK110 GPU or, more likely, a GeForce GTX 790 with two GK110s. Given that the full GK110 processor is somewhat dedicated to the professional segment, and that NVIDIA has made dual-GPU cards a standard, I think it is the GTX 790. You are probably wondering how much it would cost; I'm more than sure the $1000 range won't be exceeded. This card is made as a response to the AMD Hawaii cards which will arrive in two months. NVIDIA will then most likely (slightly) lower their prices on GK110 graphics cards.

    Nvidia's Maxwell GPU architecture will access system RAM
    Nvidia said at its GPU Technology Conference (GTC) today that its upcoming Maxwell GPU architecture will be able to access main memory.
    Nvidia's GPUs have traditionally been limited to accessing memory on the card, though Kepler can access other GPUs through the firm's GPU Direct technology. However, the firm's next-generation Maxwell GPU architecture will allow the GPU to access main memory, reminiscent of AMD's unified memory architecture.
    Nvidia co-founder and CEO Jen-Hsun Huang said Maxwell, which will replace the present Kepler architecture, will be the firm's first GPU architecture able to access main memory.
    Although Nvidia's Kepler architecture can address terabytes of memory, the cost and memory density limitations of GDDR have meant that its Tesla boards are limited to 8GB of memory. While the firm's GPU Direct allows Kepler-based Tesla boards to access memory on other Kepler boards, its bandwidth is limited by the network connection, which typically tops out at 40Gbit/s, whereas a DDR3-1600 channel offers 12.8GB/s - about 100Gbit/s - and avoids the network latency.
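    As a quick sanity check on those figures, here is a minimal back-of-the-envelope calculation (a sketch assuming a standard 64-bit DDR3 channel; the 40Gbit/s figure is the quoted network link speed):

    Code:
        # Back-of-the-envelope check of the bandwidth figures above.
        # Assumption: DDR3-1600 means 1600 MT/s on a 64-bit (8-byte) channel.
        ddr3_transfers_per_s = 1600e6          # 1600 MT/s
        channel_width_bytes = 8                # 64-bit channel
        ddr3_gb_s = ddr3_transfers_per_s * channel_width_bytes / 1e9
        print(f"DDR3-1600: {ddr3_gb_s:.1f} GB/s = {ddr3_gb_s * 8:.1f} Gbit/s")
        # -> 12.8 GB/s = 102.4 Gbit/s, the 'about 100Gbit/s' quoted above

        network_gbit_s = 40                    # typical interconnect ceiling
        print(f"Network link: {network_gbit_s / 8:.1f} GB/s")
        # -> 5.0 GB/s, so local DDR3 is roughly 2.5x faster per channel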
    AMD has been pushing the ability of GPUs to address system memory as part of its unified memory addressing technology, which will be a key part of its Heterogeneous System Architecture (HSA). The firm is expected to produce its first HSA-enabled APU later this year, and the technology has been incorporated in the APU that Sony uses for the PlayStation 4.

    NVIDIA GeForce and Ubisoft Form Alliance For Next Generation AAA Titles
    Like AMD, which offers exclusive optimizations for EA titles including the blockbuster Battlefield 4, NVIDIA will offer exclusive features, bundles and optimizations for its GeForce graphics cards in the latest titles from Ubisoft, bringing more realism and more immersive experiences to PC gamers. Features like smooth and fluid animations, soft shadows, HBAO+ (horizon-based ambient occlusion) and advanced DX11 tessellation will be implemented in the upcoming titles from Ubisoft, offering the best possible gameplay experience on the PC as a platform.
    "PC gaming is stronger than ever and Ubisoft understands that PC gamers demand a truly elite experience — the best resolutions, the smoothest frame rates and the latest gaming breakthroughs," said Tony Tamasi, senior vice president of Content and Technology at NVIDIA. "We've worked closely with Ubisoft's incredibly talented creative team throughout the development process to incorporate our technologies and deliver the most immersive and visually spectacular game worlds imaginable." - NVIDIA

  2. #2
    Member
    • Sybaris Caesar's Gadgets
      • Motherboard:
      • Intel DH61WW B3 Rev.
      • CPU:
      • Intel Core i5 2400 3.10 Ghz
      • RAM:
      • Kingston KVR1333D3N9/4G 4GB DDR3 1333 Mhz
      • Hard Drive:
      • Seagate Barracuda 7200.12 1TB 7200 RPM
      • Graphics Card:
      • Leadtek Winfast GTX 560 ti 1GB GDDR5 822/1644/4008
      • Display:
      • Samsung S19A100 18.5"
      • Sound Card:
      • Onboard
      • Speakers/HPs:
      • Microlab M-100 2.1 audio speaker
      • Keyboard:
      • A4tech KR-38 ps/2
      • Mouse:
      • Logitech G402
      • Controller:
      • 2x cheap generic controller + x360ce.exe
      • Power Supply:
      • Cooler Master G550M
      • Optical Drive:
      • Samsung SH222 22x DVD Writer
      • UPS:
      • Orlando 800VA (the box says Apollo)
      • Operating System:
      • Windows 8.1 Pro x64
      • Upload Speed:
      • ∞
    Sybaris Caesar's Avatar
    Join Date
    May 2012
    Location
    Nowy Warsaw
    Posts
    3,277

    Default

    :O boy. I hope they are on the right track. We don't want another GeForce 400 Fermi blowout.
    My Dream Build: Intel® Core™ i5 k / AMD Zen equivalent • Cooler Master Hyper 212x • 2×8 GB DDR4 2400 MHz (shitty Intel chipset cap)/3200 MHz (if Zen allows) • 256 GB SATA/M.2 SSD • Nvidia GeForce® GTX™/AMD Radeon™ 2nd fastest GPU • 144 Hz FreeSync™ monitor with Overdrive and/or Backlight Strobing • Logitech G Daedalus Apex™ G303 • Phillips SHP9500/Superlux HD 681 EVO/Takstar Pro 80/Grado SR60e • Creative Sound Blaster® E3 • SteelSeries QcK Heavy

  3. #3
    Member

    Default

    GTX 780Ti due for release. Supposedly the most powerful Nvidia GPU yet, more so than Titan or 690.

  4. #4
    Member
    • Single Player's Gadgets
      • Motherboard:
      • Foxconn H55MXV
      • CPU:
      • Core i3 540
      • RAM:
      • 2+2 GB DDR3 (1333 MHz)
      • Hard Drive:
      • 400 GB Samsung(320+80 @ 7200RPM)
      • Graphics Card:
      • 1 GB HD5570 DDR3
      • Display:
      • Samsung 793DF
      • Sound Card:
      • Realtek HD Audio | Creative Sound Blaster Live!
      • Speakers/HPs:
      • Creative 4400 4:1 | Cosonic CT-669
      • Keyboard:
      • A4Tech KRS-86 Multimedia
      • Mouse:
      • Havit HV-MS686
      • Controller:
      • X360 Replica Controller
      • Power Supply:
      • TT Lite Power W0316
      • Optical Drive:
      • 24x ASUS DVD RW
      • USB Devices:
      • 16GB + 8GB Tanscend
      • UPS:
      • None
      • Operating System:
      • Windows7 (x64)
      • Comment:
      • Not gonna upgrade within 2/3 years :P
      • ISP:
      • DFN BD
      • Download Speed:
      • -450 Kb/s
    Single Player's Avatar
    Join Date
    Dec 2011
    Location
    On the chair
    Posts
    1,915

    Default

    Introducing Revolutionary NVIDIA G-SYNC Display Technology: Ultra-Smooth, Stutter-Free Gaming Is Here

    NVIDIA is a company founded on innovation, and over the last twenty years it has pioneered and invented hundreds of new technologies that span almost every industry imaginable, from life-saving scientific research to advanced supercomputing and extreme gaming. Today, we revolutionize display technology with the announcement of NVIDIA® G-SYNC, a groundbreaking new innovation that casts aside decades-old thinking to create the smoothest, most responsive computer displays ever seen.

    With an NVIDIA G-SYNC monitor, screen tearing, input lag, and even most eyestrain-inducing stutter are simply gone. All it takes is an NVIDIA GeForce GTX 650 Ti Boost or better GPU and an NVIDIA G-SYNC-enabled monitor – age-old frustrations will be eliminated, and games such as Assassin's Creed IV Black Flag, Batman: Arkham City, Call of Duty: Ghosts, and Watch Dogs will be enhanced with NVIDIA-exclusive features, resulting in the definitive experience.

    Industry luminaries John Carmack, Johan Andersson, Tim Sweeney, and Mark Rein have been bowled over by NVIDIA G-SYNC's game-enhancing technology, and are speaking further about its benefits today at an NVIDIA press event in Montreal. Pro eSports players and pro-gaming leagues are lining up to use NVIDIA G-SYNC, which will expose a player's true skill, demanding even greater reflexes thanks to the imperceptible delay between on-screen actions and keyboard commands. In-house, our diehard gamers have been dominating lunchtime LAN matches, surreptitiously using G-SYNC monitors to gain the upper hand. And online, if you have an NVIDIA G-SYNC monitor you'll have a clear advantage over others, assuming you're also an LPB.


    Beginning later this year, NVIDIA G-SYNC will be available as a monitor module you can install yourself, or pre-installed in one of the best monitors currently available. Next year, G-SYNC monitors will be available on the shelves of your favorite e-tailers and retailers, in a variety of screen sizes and resolutions, eventually scaling all the way up to 3840x2160 ("4K").

    [Image: gsync-monitor-key-visual.jpg]

    How To Upgrade To G-SYNC

    If you’re as excited by NVIDIA G-SYNC as we are, and want to get your own G-SYNC monitor, here’s how. Later this year, our first G-SYNC modules will be winging their way to professional modders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available. These modded VG248QE monitors will be sold by the modding firms at a small premium to cover their costs, and a 1-year warranty will be included, covering both the monitor and the G-SYNC module, giving buyers peace of mind.


    [Image: GEFORCE-G-SYNC-Performance_Chart.jpg]

    Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents of your own making. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.
    If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.

    Conclusion: A Groundbreaking Revolution Has Arrived

    In this time of technological marvels, there are few advances one can truly call “innovative”, or “revolutionary”. NVIDIA G-SYNC, however, is one of the few, revolutionizing outmoded monitor technology with a truly innovative, groundbreaking advancement that has never before been attempted.

    G-SYNC’s elimination of input lag, tearing, and stutter delivers a stunning visual experience on any G-SYNC-enhanced monitor; one so stunning that you’ll never want to use a ‘normal’ monitor ever again. In addition to cutting-edge changes to the viewing experience, multiplayer gamers will receive a significant competitive advantage when G-SYNC is paired with a fast GeForce GTX GPU, and low-lag input devices, something that’ll surely pique the interest of shooter aficionados. For eSports players, NVIDIA G-SYNC is an essential upgrade. With G-SYNC’s removal of input lag, successes and failures are squarely in the hands of players, differentiating the pros from the amateurs.
    When the biggest names in the business are blown away, and the architect of Unreal Engine calls G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def”, you know that G-SYNC will raise the bar for display technology. If that testimony somehow isn’t proof enough, keep your eyes peeled on your favorite hardware sites for hands-on impressions of NVIDIA G-SYNC monitors, which are currently being shown at a press event in Montreal.


    Full Story

  5. #5
    Member
    • furiousTaher's Gadgets
      • Motherboard:
      • Asus z170p ||| Asus p8-P67m-pro
      • CPU:
      • i7 6700k ||| i5 2400
      • RAM:
      • 2 x Ripjaws 8gb 3200C16D 16GVKB ||| (Transcend 4gb + Adata 4gb) 1333
      • Hard Drive:
      • Evo 850 250gb + Toshiba 2tb ||| WD 500gb blue
      • Graphics Card:
      • Sapphire R9 280x Vapor-x ||| Sapphire r7 260x 1GB
      • Display:
      • Asus vx229h |||Samsung 21.5" S22A300B
      • Sound Card:
      • Xonar dgx
      • Speakers/HPs:
      • AltecLansing VS2621 + Xiaomi iron ring ||| Microlab 223
      • Mouse:
      • a4tech x7
      • Power Supply:
      • Adata HM 850w ||| Thermaltek 600 TR2 S
      • Optical Drive:
      • Asus dvd writer 24x max
      • USB Devices:
      • Phantom 240 red/black ||| Vatyn 664b (Tyrannosaurus)
      • UPS:
      • Power guard 1200va
      • Comment:
      • :D
    furiousTaher's Avatar
    Join Date
    Apr 2010
    Location
    Dhaka
    Posts
    7,630

    Default

    I guess they wouldn't make the 780 Ti if the R9 290X didn't beat Titan.

  6. #6
    Moderator
    • minitt's Gadgets
      • Motherboard:
      • Gigabyte P55-USB3
      • CPU:
      • Core i5 750 @ 3.6~4.0Ghz (dynamic)
      • RAM:
      • 16GB Crucial ballistix Sport DDR3 8-8-8-20 1T
      • Hard Drive:
      • Samsung 830 (128GB) +500gb samsung F3
      • Graphics Card:
      • GTX 780 With Titan HS
      • Display:
      • Sony Bravia KDL 47W804A
      • Sound Card:
      • On board
      • Speakers/HPs:
      • Microlab Solo 7C
      • Keyboard:
      • Wireless A4tech
      • Mouse:
      • Razer Imperator 2012
      • Power Supply:
      • Antec HCG-620M Modular
      • Optical Drive:
      • None
      • UPS:
      • PROLiNK Pro901S 1KVA Double Conversion Pure Sine Wave Online UPS with 21A battery bank+USB data port
      • Operating System:
      • Genuine Windows 7 & 8
      • Comment:
      • wish i had more time for my pc
      • ISP:
      • BTCL ADSL
      • Download Speed:
      • 180KB/sec
      • Upload Speed:
      • 90KB
      • Console:
      • 32
    minitt's Avatar
    Join Date
    Feb 2008
    Location
    dhaka
    Posts
    3,912

    Default

    The G-Sync thing seems quite interesting... no more choppiness at low fps.

  7. #7
    Forum Staff
    • aayman's Gadgets
      • Motherboard:
      • Asrock Extreme4 Z77
      • CPU:
      • Intel 3770K @4.4GHz (Delidded; GTX H100)
      • RAM:
      • G-Skill Something 2400mhz
      • Hard Drive:
      • EVO 250GB + SanDisk SSDX 960GB + 1TB WD Black
      • Graphics Card:
      • EVGA 1070 FTW
      • Display:
      • Dell U2312HM + ASUS VG248QE
      • Sound Card:
      • Creative Sound Blaster Zx
      • Speakers/HPs:
      • Sennheiser PC350SE + Phillips SPH9500
      • Keyboard:
      • CM QuickFire Stealth
      • Mouse:
      • Logitech G303|G900
      • Controller:
      • Steam Controller
      • Power Supply:
      • Corsair AX850
      • UPS:
      • CyberPower 1350VA
      • Operating System:
      • Windows 10 x64
      • ISP:
      • Comcast
    aayman's Avatar
    Join Date
    Jul 2008
    Location
    where I rightfully belong.
    Posts
    13,460

    Default

    Quote Originally Posted by Taher furious View Post
    I guess they wouldn't make the 780 Ti if the R9 290X didn't beat Titan.
    That's stretching too far. They released 780 even after Titan had a huge lead over 7970.

  8. #8
    Member
    Sybaris Caesar

    Default

    Quote Originally Posted by aayman View Post
    That's stretching too far. They released 780 even after Titan had a huge lead over 7970.
    Coz people started doing 7970 GHz Edition CFX. It's cheap and doesn't wreck your wallet. They still do.

  9. #9
    Forum Staff
    aayman

    Default

    Quote Originally Posted by Dunia_r_JAURA View Post
    Coz people started doing 7970 GHz Edition CFX. It's cheap and doesn't wreck your wallet. They still do.
    People could do 7970CF even before Titan's release, so why was it still released anyway and at that absurd price? For people buying these cards, how much you have in your wallet is not a priority.

  10. #10
    Member
    Sybaris Caesar

    Default

    Quote Originally Posted by aayman View Post
    People could do 7970CF even before Titan's release, so why was it still released anyway and at that absurd price? For people buying these cards, how much you have in your wallet is not a priority.
    Valid. They don't care about money. But really? $1000 for a single card? 7970cf was what? $700?

  11. #11
    Forum Staff
    aayman

    Default

    NVIDIA G-Sync - Is synchronizing the monitor and graphics card a game changer?

    On Friday NVIDIA announced G-Sync, and considering the few details available out there I wanted to write a quick follow-up on this new technology, as it really is a big announcement - a really big thing, actually. In recent years we have all been driven by the knowledge that on a 60 Hz monitor you want 60 FPS rendered, and for good reason: you want the two as close to each other as possible, as that offers you not only the best gaming experience but also the best visual experience. This is why framerate limiters are so popular - you sync each rendered frame with your monitor's refresh rate. Obviously, nine times out of ten that is not happening. This results in two anomalies that everybody knows and experiences: stutter and tearing.

    So what is happening?

    Very simply put, the graphics card is always firing off frames as fast as it possibly can, and that FPS is dynamic - it can bounce from, say, 30 to 80 FPS in a matter of split seconds. On the eye side of things you have the monitor, a fixed device that refreshes at, for example, 60 Hz. Fixed and dynamic are two different things and collide with each other: on one end the graphics card renders at a varying framerate, while the monitor refreshes 60 times per second. That causes a problem, as with an FPS slower or faster than 60 you'll get multiple images displayed on the screen per refresh of the monitor. Graphics cards simply don't render at fixed speeds; their frame rates vary dramatically even within a single scene of a single game, based on the instantaneous load the GPU sees. In the past we solved problems like VSync stutter and tearing in basically two ways. The first is to simply ignore the refresh rate of the monitor altogether and update the image being scanned to the display in mid-cycle. This is what you all know as 'VSync Off Mode', and it is the default way most gamers play. The downside is that when a single refresh cycle shows two images, a very obvious "tear line" is evident at the break - yup, what we all refer to as screen tearing. Tearing can be solved, though: turn VSync on, which forces the GPU to delay screen updates until the monitor cycles to the start of a new refresh cycle. That delay causes stutter whenever the GPU frame rate is below the display refresh rate. It also increases latency, which manifests directly as input lag - the visible delay between a button being pressed and the result occurring on-screen.
    • Enabling VSYNC helps a lot, but with the video card firing off all these images per refresh you can typically see some pulsing (I don't want to call it VSync stuttering) when the framerate varies and you pan from left to right in your 3D scene. So that is not perfect.
    • Alternatively, most people disable VSYNC - but that runs into a problem as well: multiple images per refresh will result in the phenomenon that is screen tearing, which we all hate.

    Basically this is why we all want extremely fast graphics cards: most of you want to enable VSYNC and have a graphics card that runs faster than 60 FPS.

    What is the solution?

    Nvidia is releasing G-Sync. As I explained, the graphics card runs at a dynamic rate and the monitor at a static one; the two don't really match. G-Sync is both a software and a hardware solution that will solve screen tearing and stuttering. A daughter board (it actually looks a little like a mobile MXM module) is placed into a G-Sync enabled monitor, and it does something very interesting: with G-Sync the monitor becomes a slave to your graphics card, as its refresh rate in Hz becomes dynamic. Yes, it is no longer static. Each time your graphics card has rendered a frame, that frame is aligned with the monitor refresh. With the graphics card and monitor dynamically in sync with each other, you have eliminated stutter and screen tearing completely. It gets even better: without stutter and screen tearing, on a nice IPS LCD panel even at 30+ Hz you'd be having an incredibly good gaming experience (visually). By the way, monitors up to 177 Hz will be supported with G-Sync, as well as 4K monitors. Summed up: NVIDIA G-SYNC is a solution that pretty much eliminates screen tearing, VSync input lag, and stutter. You need a G-SYNC module in the monitor, allowing G-SYNC to synchronize the monitor to the output of the GPU instead of the GPU to the monitor, resulting in a tear-free, faster, smoother experience. A toy simulation of the idea follows below.
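    To make the fixed-versus-dynamic refresh point concrete, here is a minimal toy frame-pacing simulation (a sketch with made-up render times, not NVIDIA data): with VSync on a fixed 60 Hz panel, each frame's on-screen time gets rounded up to a whole refresh interval - that rounding is the stutter - while a variable-refresh display shows each frame for exactly as long as it took to render.

    Code:
        # Toy comparison: fixed 60 Hz display (VSync on) vs a G-Sync-style
        # variable-refresh display. Render times are made-up assumptions.
        import random

        random.seed(1)
        REFRESH_MS = 1000 / 60                    # ~16.7 ms per refresh

        # GPU render times bouncing between ~30 and ~80 FPS.
        frame_times = [random.uniform(12.5, 33.3) for _ in range(8)]

        print("frame  render_ms  vsync_ms  gsync_ms")
        for i, t in enumerate(frame_times):
            # VSync on: wait for the next refresh boundary, so on-screen
            # time is rounded UP to a multiple of 16.7 ms -> stutter.
            vsync_ms = REFRESH_MS * -(-t // REFRESH_MS)   # float ceil
            # G-Sync: the panel refreshes when the frame is ready, so
            # on-screen time equals render time -> smooth pacing.
            gsync_ms = t
            print(f"{i:5d}  {t:9.1f}  {vsync_ms:8.1f}  {gsync_ms:8.1f}")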
    [Image: The Nvidia G-Sync module]
    So Hilbert, are there any foreseeable problems?

    Not a lot, really, but sure. Low FPS could be a drag: at say 20 FPS, and thus 20 Hz on an LCD panel, it will look like crap - you'd literally see the screen refresh. Low-FPS moments could potentially be horrible, with refreshes you can see live on your screen. So in an optimal situation you will need a graphics card that can stay above 30 FPS as a minimum. Secondly, dynamically altering the refresh rate of your monitor has to put some load on the monitor hardware; it MIGHT have an effect on your monitor's lifespan. Last but not least, it is Nvidia proprietary technology and thus works with selected Nvidia GeForce graphics cards only.

    When is it available?

    You can see the first monitors and upgrade kits later this year; realistically, we expect good availability in Q1 2014. One current ASUS model, the VG248QE, can actually be upgraded: you can insert the G-Sync hardware yourself. G-Sync is going to be included in monitors from ASUS, BenQ, Philips and ViewSonic.

    What will prices be like?

    That is not yet disclosed, but we think you can expect a 75 EUR/USD price premium per monitor for this solution. After such an upgrade, even a GeForce GTX 760 running at 30+ Hz/FPS would deliver a very nice visual gaming experience. We learned that ASUS will release the VG248QE (used in the demo) in a G-Sync-enhanced version for 399 U.S. dollars.

    Will this be an NVIDIA-only feature?

    For now, yes. Currently these graphics cards will be G-Sync compatible: GTX TITAN, GTX 780, GTX 770, GTX 760, GTX 690, GTX 680, GTX 670, GTX 660 Ti, GTX 660, GTX 650 Ti Boost. You need to be running Windows 7 or higher as your operating system.

    Some Videos


    YouTube is not a good way to demonstrate this at 30 FPS, but please try to get an overview of the tech in this video recording we made.

    Concluding

    In the end we feel Nvidia G-Sync has the potential to be a game changer in the PC gaming industry. Even with a more mainstream graphics card you'll be enhancing your graphics experience greatly - think of it: no more VSync stutter or screen tearing. That means silky-smooth, input-lag-free gaming at, say, 40 FPS. As such, G-Sync has huge potential for you, the gamers, and for the hardware industry.

  12. #12
    Member
    • Reventon's Gadgets
      • Motherboard:
      • Gigabyte H67MA-D2H
      • CPU:
      • Intel Core i5 2400 3.1 ghz.
      • RAM:
      • 4 GB DDR3 1600 BUS CL7 OCZ FATALITY EDITION
      • Hard Drive:
      • WD CAVIER BLACK 1 TB
      • Graphics Card:
      • Sapphire AMD 7870 GHZ edition
      • Display:
      • PHILIPS 224EL LED 1080p HDMI+DVI
      • Sound Card:
      • ON BOARD
      • Speakers/HPs:
      • 4:1 creative inspire
      • Keyboard:
      • a4 tech
      • Mouse:
      • a4 tech
      • Power Supply:
      • COOLER MASTER SILENT PRO 600W
      • Optical Drive:
      • Samsung.
      • Operating System:
      • win 7
      • Comment:
      • should be better
      • Console:
      • 16
    Reventon's Avatar
    Join Date
    Sep 2009
    Location
    Moghbazar
    Posts
    1,755

    Default

    Nice tech indeed. The question is how much Nvidia is investing for the manufacturers to take their chip?
    People should stop blaming games for allegedly *copying* each other: "oh this game is a Skyrim copy", "this game is a WoW knock-off", "this game is just another DMC" - enough!

    At this rate every 3rd-person action game will be God of War, every 3rd-person shooter Gears, every 1st-person shooter CoD... it's ridiculous. They're called genres, people! And if you look carefully you'll see a lot more than mimicry out there.

  13. #13
    Member
    • Ghost Riley's Gadgets
      • Motherboard:
      • Gigabyte B75M-D3H [Case: Thermaltake Commander GS-II black]
      • CPU:
      • Intel core i5 3470 @ 3.6Ghz
      • RAM:
      • Corsair Vengeance 8GB, @ 1600Mhz Dual Channel
      • Hard Drive:
      • Hitachi 500GB+Samsung 160GB (7200RPM)
      • Graphics Card:
      • Intel HD 2500
      • Display:
      • Dell S2240L IPS panel (1920X1080)
      • Sound Card:
      • Creative X Fi Titanium
      • Speakers/HPs:
      • Logitech Z506 5.1 Surround
      • Keyboard:
      • A4 TECH KR-85
      • Mouse:
      • Logitech B100
      • Power Supply:
      • CORSAIR 500CXV2 [80 PLUS BRONZE Certified]
      • Optical Drive:
      • Samsung SH-S223
      • USB Devices:
      • Transcend 32GB USB 3, Adata C906 32GB USB 3, Adata UV120 32GB USB 3
      • Operating System:
      • Windows 8 Pro 64bit
      • Comment:
      • ochcham
      • ISP:
      • Banglalion Cruise pro
      • Download Speed:
      • 60-70 KBps
      • Upload Speed:
      • 20-25 KBps
    Ghost Riley's Avatar
    Join Date
    Aug 2012
    Location
    Narayangonj
    Posts
    422

    Default

    Quote Originally Posted by Dunia_r_JAURA View Post
    Valid. They don't care about money. But really? $1000 for a single card? 7970cf was what? $700?
    The R9 280X is slightly better than the 7970 GHz; a CF setup costs $600. Better performance!!!
    Last edited by Ghost Riley; October 21st, 2013 at 01:02.

  14. #14
    Forum Staff
    aayman

    Default

    Quote Originally Posted by Ghost Riley View Post
    Why do 7970 CF??? The R9 280X is slightly better than the 7970 GHz; a CF setup costs $600. Better performance!!!
    The R9 280X is a 7970 GHz Edition. http://www.guru3d.com/articles_pages...chmarks,9.html

    And he was talking about something else, about a time when there was no R9 280X.

  15. #15
    Member
    Ghost Riley

    Default

    Quote Originally Posted by aayman View Post
    R280X is a 7970Ghz edition. http://www.guru3d.com/articles_pages...chmarks,9.html

    And he was talking about something else, about a time when there was no R280X.
    Sorry, my mistake. I know, it's a rebrand of the 7970 GHz.

  16. #16
    Forum Staff
    • dipanzan's Gadgets
      • Motherboard:
      • Gigabyte Z87 HD3
      • CPU:
      • Intel Core i5 4670k
      • RAM:
      • Corsair Dominator 16GB 1600
      • Hard Drive:
      • Crucial M4 128GB, Western Digital 1TB Blue, My Passport 2TB
      • Graphics Card:
      • HIS HD5850
      • Display:
      • Dell P2212H
      • Sound Card:
      • Asus Xonar DGx
      • Speakers/HPs:
      • Sennheiser HD598
      • Keyboard:
      • Filco Majestouch 2 TKL Ninja Reds
      • Mouse:
      • Mionix Avior 7000, Steelseries XAI
      • Power Supply:
      • Corsair HX650 v2
      • Operating System:
      • Windows 8.1 Pro x64
      • ISP:
      • Link3 :: Linksys WRT54GL w/ DD-WRT
      • Download Speed:
      • 64-128KB/s
      • Upload Speed:
      • 64-128KB/s
    dipanzan's Avatar
    Join Date
    Mar 2009
    Location
    Kalabagan, Dhaka
    Posts
    6,939

    Default

    This is actually a really good move by Nvidia, as this has been bothering people since the dawn of time.

  17. #17
    Moderator
    minitt

    Default

    I think at some point third-party applications (such as Afterburner, aka RivaTuner) will be able to control this G-Sync module as well.

  18. #18
    Member

    Default

    NVIDIA took special care to outline GameWorks' newest systems: FlameWorks for fire and smoke, FLEX for water and fabric, and GI Works for shadows. Of course that's an over-simplification of the whole collection - the real in-depth action is much better seen than read.
    http://www.slashgear.com/nvidia-game...dows-19302116/

    One of the more impressive demonstrations of realism we've seen from NVIDIA this year has been a bit of software called FaceWorks. They brought this fellow called "Digital Ira" out again, tearing him apart frame by frame to prove the versatility of the GameWorks system. We saw this fellow appear on Ubuntu, then work in real time on a next-generation NVIDIA Tegra SoC.

    FaceWorks running on the Tegra mobile processor code-named Project Logan is of particular significance, because this demonstration was originally created to show off the power of NVIDIA's highest-end graphics card, the GeForce GTX Titan. Here we've got it working on a system-on-chip made for tomorrow's Android and Windows mobile devices.

    These are two more great things Nvidia showed off. Imagine if next-gen Android devices look and play as well as a Vita thanks to the new mobile processors. They even showed off a demo game made for PS3 running on this same Tegra processor. Looks amazing.

  19. #19
    Member

    Default

    Here's a picture of a reference 780 Ti with a GPU-Z screenshot.



    [Image: NVIDIA-GeForce-GTX-780-Ti-GPU-Z.jpg]

  20. #20
    Member

    Default

    NVIDIA's latest creation, the GeForce GTX 780 Ti, will be released next month. The new card is supposedly a response to AMD's R9 290X, while replacing the GTX 780 at the same price point. A member of the XtremeSystems forums posted a screenshot of the 3DMark 11 benchmark with three scores: GeForce GTX TITAN, AMD Radeon R9 290X and, supposedly, the GTX 780 Ti.
    According to this data the GTX 780 Ti would be faster than both cards by a few percent in 3DMark 11 (Extreme preset):

    • NVIDIA GeForce GTX 780 Ti: X5204 (100%)
    • NVIDIA GeForce GTX TITAN: X4924 (94.6%)
    • AMD Radeon R9 290X: X4664 (89.6%)
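    Those percentages follow directly from the raw scores; here is a quick check (a sketch, assuming the X-prefixed numbers are the 3DMark 11 Extreme graphics scores):

    Code:
        # Verify the relative-performance percentages from the leaked scores.
        scores = {
            "GeForce GTX 780 Ti": 5204,
            "GeForce GTX TITAN": 4924,
            "Radeon R9 290X": 4664,
        }
        baseline = scores["GeForce GTX 780 Ti"]
        for card, score in scores.items():
            print(f"{card}: X{score} ({score / baseline:.1%})")
        # -> 100.0%, 94.6%, 89.6%, matching the list above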

    The leaker also revealed that his sample is equipped with 6GB of memory, so we are probably looking at a GeForce GTX TITAN refresh with faster clocks. From the information I received, it looks like the GeForce GTX 780 Ti will consume more power than TITAN, since the power section has been redesigned and manufacturers were notified about this change.


    Possible GeForce GTX 780 Ti specifications

                         GeForce GTX 780 Ti   GeForce GTX TITAN   GeForce GTX 780   Radeon R9 290X
    GPU Codename         Kepler GK110         Kepler GK110        Kepler GK110      Hawaii XT
    GPU Process          28nm                 28nm                28nm              28nm
    GPU Config           2688 : 224 : 48      2688 : 224 : 48     2304 : 192 : 48   2816 : 176 : 64
    Video Memory         3GB / 6GB GDDR5      6GB GDDR5           3GB GDDR5         4GB GDDR5
    Memory Bus           384-bit              384-bit             384-bit           512-bit
    Base Clock           950+ MHz             837 MHz             863 MHz           800 MHz
    Turbo/Boost Clock    1000+ MHz            876 MHz             900 MHz           1000 MHz
    Memory Clock         1750 MHz             1502 MHz            1502 MHz          1250 MHz
    Effective Clock      7008 MHz             6008 MHz            6008 MHz          5000 MHz
    Bandwidth            336 GB/s             288 GB/s            288 GB/s          320 GB/s
    Power Configuration  8+6 Pin              8+6 Pin             8+6 Pin           8+6 Pin
    TDP                  270 W                250 W               250 W             300 W
    MSRP                 $649                 $999                $649              $549
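    The Bandwidth row follows from the effective memory clock and the bus width: bandwidth in GB/s = effective clock (MHz) x bus width (bits) / 8 / 1000. A quick sketch using the table's figures:

    Code:
        # Memory bandwidth = effective clock * bus width / 8.
        cards = [
            ("GTX 780 Ti", 7008, 384),
            ("GTX TITAN",  6008, 384),
            ("GTX 780",    6008, 384),
            ("R9 290X",    5000, 512),
        ]
        for name, eff_mhz, bus_bits in cards:
            gb_s = eff_mhz * 1e6 * bus_bits / 8 / 1e9
            print(f"{name}: {gb_s:.0f} GB/s")
        # -> 336, 288, 288, 320 GB/s, matching the table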
