
Thread: ATI - Assassin’s Creed with DX 10.1 runs up to 20 percent faster

  1. #1
    Member
    • GiDDY_SOUL's Gadgets
      • Motherboard:
      • GIGABYTE B350 GAMING 3
      • CPU:
      • AMD Ryzen 1600
      • RAM:
      • 8GB Corsair Dominator RAM X2
      • Hard Drive:
      • 320GB Samsung F1 X2 RAID + PNY 240GB SSD
      • Graphics Card:
      • HIS R9 290 BIOS undervolt tweaked
      • Display:
      • LG 21.5" IPS HD LCD
      • Sound Card:
      • Realtek HD Audio 8CH (Built-in)
      • Speakers/HPs:
      • Creative Inspire 5.1, Cosonic 775
      • Keyboard:
      • Logitech
      • Mouse:
      • A4tech Glaser / GiGABYTE GM-M8000
      • Controller:
      • Xbox360 Wireless
      • Power Supply:
      • Corsair HX1050
      • Optical Drive:
      • None
      • USB Devices:
      • Transcend StoreJet 250GB
      • UPS:
      • None
      • Operating System:
      • Windows 10 Pro
      • Comment:
      • The best cost-efficient PC of its time.
      • ISP:
      • Doze
      • Download Speed:
      • 700
      • Upload Speed:
      • 700
    GiDDY_SOUL
    Join Date
    Mar 2008
    Location
    Dhaka
    Posts
    1,087

    ATI - Assassin’s Creed with DX 10.1 runs up to 20 percent faster

    After Service Pack 1 installation

    Rage3D took the time to test Ubisoft's Assassin’s Creed, and it looks like this might be the first game to use DirectX 10.1 code.

    What they found is that DirectX 10.1 hardware gets you up to a 20 percent performance boost. DirectX 10.1 ships with the Vista service pack and is currently supported only by ATI's Radeon 3xxx series and S3's new Chrome 430GT.

    Nvidia simply doesn't and won't support DirectX 10.1, and a 20 percent improvement in anti-aliasing performance is something that definitely gets one's attention.

    The Rage3D boys did the testing, and according to their findings you can gain up to 20 percent with DirectX 10.1 - something you simply can't ignore.

    You can find a link to the tests below. It would be fun to test with Nvidia cards to see whether this gain is SP1-related or not.

    You can find the article here -> http://www.rage3d.com/articles/assas.../index.php?p=1

  2. #2
    Member
    • ЯV●@spret's Gadgets
      • Motherboard:
      • Asus 970 pro
      • CPU:
      • AMD FX-6100 @ 3.5GHz
      • RAM:
      • Kingston HyperX 1600MHz 8GB DDR3
      • Hard Drive:
      • Samsung Spinpoint 1TB 7200rpm
      • Graphics Card:
      • XFX HD 6870 @ 910MHz
      • Display:
      • Viewsonic 17'' LCD @ 1440x900
      • Sound Card:
      • Creative Tactic 3d Alpha USB sound card
      • Speakers/HPs:
      • Thermaltake TT Shock
      • Keyboard:
      • A4 Tech Normal
      • Mouse:
      • A4 Tech 2x Office
      • Controller:
      • Xbox 360 controller
      • Power Supply:
      • Gigabyte Superb 720W
      • Optical Drive:
      • DVD RW
      • USB Devices:
      • WD Ext. 2TB 7200 rpm HDD
      • UPS:
      • RahimAfrooz 100VA Premium
      • Operating System:
      • Windows 7 home premium 64 bit
      • Benchmark Scores:
      • Don't know
      • Comment:
      • Awesome....
      • ISP:
      • BTCL
      • Download Speed:
      • 110-135 kbps (avg)
      • Upload Speed:
      • 64 kbps
    ЯV●@spret
    Join Date
    Mar 2008
    Location
    Chittagong
    Posts
    1,123


    ATI always rocks as a GPU. Nvidia sucks.
    Speed is like ADDICTION.....

  3. #3
    Member
    GiDDY_SOUL
    Join Date
    Mar 2008
    Location
    Dhaka
    Posts
    1,087


    Not always true. Look at the 8800 Ultra, a one-year-old product from Nvidia: even an ATI Radeon 3870 X2 has a hard time beating it.
    ATI cards run cool and offer great image quality at a very good price/performance ratio.

    Nvidia's high-end cards are the best performers available, but they cost quite a bit more, meaning the price/performance ratio suffers.

  4. #4
    Member
    GiDDY_SOUL
    Join Date
    Mar 2008
    Location
    Dhaka
    Posts
    1,087

    Ubisoft caught in a little Assassin’s Creed scandal

    In the beginning, everything looked perfect. The DX10.1 API included in Assassin’s Creed enabled anti-aliasing in a single pass, which allowed ATI Radeon HD 3000 hardware (which supports DX10.1) to flaunt a competitive advantage over Nvidia (which does not support DX10.1). But Assassin's Creed had problems. We noticed various reports citing issues such as broken widescreen scaling, camera loops and crashes - mostly on Nvidia hardware.

    Ubisoft became aware of these complaints, which ultimately led to the announcement of a patch. According to Ubisoft Montreal, the patch will remove support for DX10.1, and it was exactly this decision that set Internet forums on fire.

    So, what is it that convinced Ubisoft to drop the DirectX 10.1 code path? Here is the official explanation:

    “We’re planning to release a patch for the PC version of Assassin’s Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.”

    We could certainly take this statement with a large grain of salt, but the game developer did not address a statement made by ATI's developer relations team, released when the game was introduced:

    “Ubisoft [is] at the forefront of technology adoption, as showcased with the fantastic Assassin’s Creed title. In this instance our developer relations team worked directly with the developer and found an area of code that could be executed more optimally under DX10.1 operation, thus benefiting the ATI Radeon HD 3000 Series.”

    Let’s get this straight: the game was released, worked better on ATI hardware and supported an API that neither Nvidia nor Intel supported then or supports now - and then the developer announces a patch that will kill that advantage. This brought back memories of Tomb Raider: The Angel of Darkness, a TWIMTBP-supported game that worked better on ATI hardware at the time of release because Nvidia was plagued by the performance problems of its GeForce FX series.

    Assassin's Creed is an Nvidia-branded “The Way It's Meant To Be Played” title, and it didn't take long until rumors of possible foul play by Nvidia surfaced. Some voices on Internet forums allege that Nvidia threatened Ubisoft and demanded that the developer remove DirectX 10.1 support. We kept our eyes on this development, but when we received e-mails from graphics card manufacturers (ATI and Nvidia customers) adding to the already heated discussion of what may have happened in the background, we decided to shift our investigation into a higher gear and talk to all parties involved.


    DirectX 10.0 vs. DirectX 10.1 in Assassin’s Creed effects

    The difference the developers failed to explain is the way anti-aliasing is controlled in DirectX 10.0 and 10.1. In DX10.0, a shader could not access the information for each individual sample in a multisampled depth buffer, which forced a costly extra step in anti-aliasing operations. DX10.1 allows shader units to access all anti-aliasing buffers directly. All of this was brought to light by an article over at Rage3D (http://www.rage3d.com/articles/assassinscreed/).
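
    For readers curious what the hardware split looks like in practice, here is a minimal C++ sketch (assuming the DirectX SDK's d3d10_1.h header; the helper name CreateBestDevice is ours, purely illustrative) of how an engine can request a DX10.1 device and fall back to plain DX10.0 hardware such as the GeForce 8 series:

        // Minimal sketch, assuming the DirectX SDK (d3d10_1.h).
        // Error handling is trimmed; CreateBestDevice is a hypothetical helper.
        #include <d3d10_1.h>
        #pragma comment(lib, "d3d10_1.lib")

        ID3D10Device1* CreateBestDevice(IDXGIAdapter* adapter)
        {
            // Try feature level 10.1 first (Radeon HD 3xxx, S3 Chrome 430GT),
            // then fall back to 10.0 (GeForce 8 series and friends).
            const D3D10_FEATURE_LEVEL1 levels[] = {
                D3D10_FEATURE_LEVEL_10_1,
                D3D10_FEATURE_LEVEL_10_0,
            };
            for (D3D10_FEATURE_LEVEL1 level : levels) {
                ID3D10Device1* device = nullptr;
                if (SUCCEEDED(D3D10CreateDevice1(adapter, D3D10_DRIVER_TYPE_HARDWARE,
                                                 nullptr, 0, level,
                                                 D3D10_1_SDK_VERSION, &device)))
                    return device;
            }
            return nullptr; // no DX10-class device available
        }

    Only when the returned device reports D3D10_FEATURE_LEVEL_10_1 (via GetFeatureLevel) can shaders read the individual samples of a multisampled depth buffer, which is what makes the single-pass anti-aliasing path possible.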

    Judging from the following three quotes from software developers, this effect was experienced across DirectX 10 titles, and there is a good chance that you've already played their games. We talked to a (DX10.0) game developer close to Ubisoft who requested to remain anonymous; he told us that Ubisoft’s explanation walks on thin ice. Here is his response to our inquiry and his take on Ubisoft’s statement:

    “Felt you might want to hear this out. Read the explanation and laughed hard … the way how DX10.1 works is to remove excessive passes and kill overhead that happened there. That overhead wasn't supposed to happen - we all know that DX10.0 screwed AA in the process, and that 10.1 would solve that [issue]. Yet, even with DX10.0, our stuff runs faster on GeForce than on Radeon, but SP1 resolves scaling issues on [Radeon HD 3800] X2.”

    We received a second reply from another game developer, who is currently working on a DirectX 10.1 title that is fully compatible with DX10.0 hardware:

    “Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes.”

    A third email reply reached us from a developer at a multiplatform development studio:

    “Our port to DX10.1 code does not differ from DX10.0, but if you own DX10.1-class hardware from either Nvidia or ATI, FSAA equals performance jump. Remember "Free FSAA"?”
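
    To make the one-pass vs. two-pass difference the developers describe concrete, here is a small, self-contained C++ sketch of the frame flow. Every name in it (RunFullscreenPass, the shader and target structs) is an illustrative stand-in, not Ubisoft's actual code:

        #include <cstdio>

        // Hypothetical stand-ins for engine objects, for illustration only.
        struct Shader { const char* name; };
        struct Target { const char* name; };

        static void RunFullscreenPass(const Shader& s, const Target& t)
        {
            std::printf("draw %s -> %s\n", s.name, t.name);
        }

        static Shader gCopyDepthShader = { "copy_depth" };
        static Shader gPostEffect      = { "post_effect" };
        static Shader gPostEffectMS    = { "post_effect_msaa" };
        static Target gDepthCopy       = { "depth_copy_texture" };
        static Target gBackBuffer      = { "back_buffer" };

        // DX10.0: shaders cannot read individual samples of an MSAA depth
        // buffer, so depth is first extracted in an extra full-screen pass.
        void RenderPostEffectDX10_0()
        {
            RunFullscreenPass(gCopyDepthShader, gDepthCopy); // the "costly" extra pass
            RunFullscreenPass(gPostEffect, gBackBuffer);     // the actual post-effect
        }

        // DX10.1: shader model 4.1 samples the MSAA depth buffer directly,
        // so the copy pass disappears - this is the render pass Ubisoft's
        // statement says its implementation "removes".
        void RenderPostEffectDX10_1()
        {
            RunFullscreenPass(gPostEffectMS, gBackBuffer);   // single pass
        }

        int main()
        {
            RenderPostEffectDX10_0(); // two draws
            RenderPostEffectDX10_1(); // one draw
            return 0;
        }

    Dropping one full-screen pass per frame is exactly the kind of saving that would explain gains of up to 20 percent with anti-aliasing enabled.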


    Michael Beadle, senior PR representative at Ubisoft, and Jade Raymond, producer of the game, told us in a phone call that the decision to remove DX10.1 support was made by Ubisoft’s management. Both told us that there was no external influence, which would mean that Nvidia did not participate in this decision. It was explained to us that the DX10.1 features were implemented and tested on DirectX 10.1 graphics hardware during development, which led to their inclusion in the final code. However, that code went untested on a large number of DX10 systems, and that ultimately led to crashes and system instability.

    Ubisoft's explanation indicates a classic case of QA failure, one that already happened with EA's Crysis as well: unfinished code was shipped as the final version. The changes the developer made caused instabilities on GeForce hardware, but owners of older ATI products (the Radeon 2900, for instance) should also be affected and can expect crashes or camera lockups.


    Money? What Money? Oh, that money.

    There is no information on whether DX10.1 will be re-implemented, and that fact makes the story look fishy. There are rumors that Nvidia may have threatened to pull out of co-advertising deals with Ubisoft, deals said to be worth less than $2 million. As a sane businessman, you don't jeopardize a partnership over one title - and those $2 million are just one component of the cooperation between these two companies. Of course, we asked both companies for comment and received two different answers.

    Derek Perez, director of public relations at Nvidia, said that “Nvidia never paid for and will not pay for anything with Ubi. That is a completely false claim.” Michael Beadle from Ubisoft stated that “there was a [co-marketing] money amount, but that [transaction] was already done. That had nothing to do with development team or with Assassin's Creed.”

    So, Nvidia appears to have some sort of financial relationship with Ubisoft, just as it does with EA, Activision and other top publishers. Yes, AMD has similar cooperations in place, but its program is not as extensive as Nvidia’s.


    Conclusion

    We leave it up to you to draw your own conclusion. Our take is that the Ubisoft team could have done a better job of bringing the game to the PC platform. The proprietary Scimitar engine showed a lot of flexibility when it comes to advanced graphics, but the team developed the DirectX 10.1 path without checking its stability on DirectX 10.0 parts, causing numerous issues on the most popular DX10 hardware out there - the GeForce 8 series. The new patch will kill DX10.1 support, and it is unclear when DX10.1 will see another implementation. The first "victim" of this battle was Nvidia (on a title it supports); the second victim was AMD. The real losers here, however, are gamers, who are expected to pony up $50 or $75 (depending on where you live) for a title that was not finished.

    Sadly, this is the way PC gaming is: there is a great game, but it is burdened with technical issues and ends up caught in marketing wars. We have no doubt that the development team made a mistake in the QA process and integrated a feature that caused instabilities on Nvidia cards. Given Nvidia’s market share, Ubisoft had to patch it. What we do not understand is why the company offered an explanation that left more questions than answers.

    But then again, only the developers at Ubisoft know what the Scimitar engine can and cannot do.

    The solution is simple: if you run ATI hardware, just don't patch the game (once the patch comes out) and enjoy all the advantages of DirectX 10.1 - that is, one render pass less.
    From TG Daily
