GeForce RTX 4xxx Ada Lovelace & AMD Radeon 7xxx RDNA3

  • I like graphics cards of all sorts, I just despise NV's anti-competitive and anti-consumer tactics.
    They constantly lie to people and use false advertising; heck, people could even sue them for that, since it amounts to false advertising in itself.

    Pretty much what Paul from NAAF explained in the video I linked earlier. For those that need a wake-up call, this video by Jim from AdoredTV explains exactly how evil Jen-Hsun Huang and his company NVIDIA actually are. As for the white knights protecting them, all I can say is that they are truly deluded and are being constantly lied to, and I feel sorry for them.
    https://www.youtube.com/watch?v=H0L3OTZ13Os

    NVIDIA employs anti-competitive practices, relies on closed-source (proprietary) solutions, and doesn't seem to mind screwing its own customers.

    They have a gigantic ego and think they deserve to be loved no matter what they do; all in all, they come across as narcissistic bullies.

  • Gold Leader:

    I have looked into the EVGA GeForce GTX 1080 FTW2:

    That card is an overclocked NVIDIA GTX 1080.

    EVGA card:
    GPU: 1721 MHz, Boost: 1860 MHz
    MEM: 1377 MHz
    TDP: 215 W

    Original (reference design):
    GPU: 1607 MHz, Boost: 1733 MHz
    MEM: 1251 MHz
    TDP: 180 W

    I think there were a lot of small things that caused your problems and the death of that card.


    First Thing:

    The original clock rate of a chip is set by the chip manufacturer after a ton of test runs.

    It is set to the point where the worst possible chip still runs stable.

    On a wafer, the chips produced in the middle are the best ones, and quality gets worse towards the edges.

    So your top product comes from the middle and the mainstream products come from the edge.

    I will take Socket 775 CPUs as an example:

    From the middle to the edge: Xeon - Core 2 - Pentium D - Celeron

    You can use the top chips for servers and the worst for consumer parts, where it doesn't matter as much. You can disable defective cores (like AMD did), defective cache areas, etc.

    (No guarantee that Intel actually worked this way; it is just to explain things.)

    But even if you do that, there might be a bad chip in the middle of the wafer. So you test around 10,000 chips and base your stock clock speeds on the worst one.

    Manufacturers of OC cards exploit this.

    They might test 100 or 200 of the chips they buy for their cards, push the voltage a little, etc., to find a setting where most of those chips run stable, and then set that as the default. And now you have a SUPER FTW XL OC GAMER PENISLARGER graphics card.

    They may or may not test the finished cards before selling them, say with a 10-minute burn-in test, but I think they might only test 10 cards per 1000 produced.

    So some unstable cards might go out the door.
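
    Just to illustrate the idea, here is a small Python sketch with completely made-up numbers (nothing here is measured from real chips): the chip maker bins against a huge sample, while the board partner validates its factory OC against a much smaller one, so a small share of chips can end up below the factory OC clock.

    # Purely illustrative binning sketch: hypothetical clock distribution,
    # hypothetical sample sizes. The point is only that a clock validated on
    # a small sample is riskier than one validated on a huge sample.
    import random

    random.seed(42)

    def max_stable_clock():
        """Hypothetical per-chip maximum stable clock in MHz."""
        return random.gauss(1900, 60)

    # Chip maker: test a very large sample and set the stock clock below the worst chip.
    big_sample = [max_stable_clock() for _ in range(10_000)]
    stock_clock = min(big_sample) - 20        # safety margin under the worst chip

    # Board partner: test only a small batch and pick an aggressive factory OC.
    small_sample = [max_stable_clock() for _ in range(200)]
    factory_oc = min(small_sample) - 20       # "worst" of a tiny sample only

    # How many chips of the whole population would be unstable at each clock?
    population = [max_stable_clock() for _ in range(100_000)]
    unstable_stock = sum(c < stock_clock for c in population) / len(population)
    unstable_oc = sum(c < factory_oc for c in population) / len(population)

    print(f"stock clock {stock_clock:7.1f} MHz -> {unstable_stock:.3%} of chips unstable")
    print(f"factory OC  {factory_oc:7.1f} MHz -> {unstable_oc:.3%} of chips unstable")

    The exact percentages mean nothing; the point is only that a clock validated on 200 chips will always carry more risk than one validated on 10,000.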

    Second Thing:

    There are often a few bad VRM chips in every batch of chips.

    Third Thing:

    If you combine different hardware, that can have interesting results.

    There are hardware combinations that run way better and faster than they should, combinations that run like shit even though every part is top-end, and combinations that can even break hardware.

    I have already seen messages like that from both the NVIDIA driver and the AMD driver, mostly when I had hardware problems (GPU not liking the screen, defective GPU, defective graphics memory, etc.).

    I think the driver crashes came from a graphics card that wasn't working 100%.

    The driver normally doesn't control the card's clock settings (except if you OC the card via the driver); the driver just tells the card to go into clock mode xyz.

    NVIDIA can't put the clock settings for every card into the driver. The clock settings come from the graphics card's BIOS. So the VRM burn-up wasn't the driver's fault but the BIOS EVGA flashed on that card. The driver doesn't know what the VRM can handle.
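
    If you want to see that these limits live on the board, a quick way is to read them back per card. Here is a small Python sketch around nvidia-smi (assuming nvidia-smi is installed; if your driver version doesn't know one of these query fields, check nvidia-smi --help-query-gpu and adjust):

    # Read board-defined values (VBIOS version, default power limit, max clocks)
    # per installed GPU via nvidia-smi. These are reported per card, i.e. they
    # come from the VBIOS/board design, not from the driver package itself.
    import subprocess

    FIELDS = "vbios_version,power.default_limit,clocks.max.graphics,clocks.max.memory"

    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

    for i, line in enumerate(out.splitlines()):
        vbios, power, gpu_clk, mem_clk = [v.strip() for v in line.split(",")]
        print(f"GPU {i}: VBIOS {vbios}, default power limit {power}, "
              f"max GPU clock {gpu_clk}, max MEM clock {mem_clk}")

    Two cards with the same GPU but different VBIOS files will report different limits here, even with the exact same driver installed.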

    I think the card you had was simply a bad card from EVGA.

  • That NVIDIA does employ anti-competitive tactics is certainly true; I have observed this myself, albeit less in the gaming area and more in the professional area (vGPUs with Teslas, deep learning, as well as engineering applications, mostly in the simulation area). But then again, ANY company will do that once they've reached a position where they're ahead of the rest for a while...

    That any commercial company (remember: they all exist only for profit) will treat you fairly under all circumstances is simply an illusion you'd be lying to yourself with. ;)

    We're simply seeing it more on the NVIDIA side now, because they're ahead in several areas: The Automotive AI sector, general deep learning, general GPGPU compute via CUDA (yes, proprietary) and vGPU for the cloud, etc.

    If AMD were leading the pack instead of NVIDIA, we'd just have the same sauce in red instead of green. :topmodel:




  • From all I have experienced, AMD's customer care actually cares about their customers, same for Sapphire. NVIDIA is the absolute worst of them all; even if AMD were on top, they would not ignore their customers the way NVIDIA aka n-Greedia has done over the years. I literally despise and hate everything they represent.

    It's fine that there are people that love them, then again it's kind of sad as well. Watch the video from AdoredTV by Jim; he even shows what NV has done to people and how they like to run another company even when it's not theirs. But I think there will come a time when NVIDIA stops existing; it's a slow-burning company that has already lost its most valued partners like EVGA, BFG, Colorful, Hercules and XFX (Pine Group), some of whom were cut off because they make AMD products as well, and there were others too. More partners will end their collaboration with NVIDIA; it won't end with EVGA.

    The entire RTX 4080 scandal is their newest and deepest low as well: two different cards bearing the same damn name. Misleading marketing at its best.

    The RTX 4080 16GB uses the AD103 die whereas the RTX 4080 12GB uses the AD104 die; both have different specs and there is a 400 USD gap between them. People like us know the difference, but most gamers and conventional users will not, and NVIDIA doesn't even label the actual GPU used or the other specs that would show this difference. People are being robbed right in front of their eyes. Ngreedia is greedy; they don't care if a person buys one of these cards and falls for their trickery. NVIDIA has won, and that person was simply the bait & victim of their misleading marketing.

    Selling two products under the same name, with a 400 USD price difference and a different GPU die & specs, isn't really legal; it's confusing marketing, some even call it false marketing. All in all it's just really bad.

    Even if they had named the RTX 4080 12GB an RTX 4070, it would still have been bad, since no one would pay 900 USD for a 70-class card; those mostly went for 500 to 550 EUR instead. So calling the 12GB card a 4070 would be a bit better name-wise, but price-wise it's still bad.

    All in all NV made a major fuck-up of this series, and if you look around, many people are upset, angry and confused because, as always, NVIDIA lied to them.
    But why the hell am I not even surprised...

    Then again, what happened to my EVGA GTX 1080 FTW2, which had the GP104-410-A1 GPU (the refresh of the GP104-400-A1), just showed how low-quality the parts on NV boards are. It's not the first time a high-end NV GPU died on me, whereas all the Radeons I had gave me a far better experience, lifespan-wise especially.

    I am DONE with NVIDIA. No more; only AMD Radeons for me. They at least have far better build quality, they are more performant per watt by design, and the price/performance ratio is also much better.

    Yea yea, I am on a V64, but I did undervolt it, so all I will do now is save up for a Nitro+ RX 7900 XT and use that to replace my V64, which will be around 6.5 to 7 years of age by then, a nice time to upgrade it.

    More facts are popping up lol

    NVIDIA is LYING! The truth about DLSS 3 and RTX 4000 "Higher" FPS...

    https://www.youtube.com/watch?v=qFMSgzJlzFI




  • NVIDIA is LYING! The truth about DLSS 3 and RTX 4000 "Higher" FPS...

    https://www.youtube.com/watch?v=qFMSgzJlzFI

    Yes, without the new DLSS 3.0 it will probably be much closer to the 3000 series. Just stupid that only the 4000 series supports DLSS 3.0; anyone who has a 2000 or 3000 series card cannot use the new feature. I have a Titan RTX, so I can't use it either :|, since that still has the Turing architecture. My next card will be a Radeon again. I also think the power consumption of the new cards is clearly too high, and I will not buy a card that consumes more than 300 W.

  • It's just like on Hardwareluxx...

    Now definitely :D

    For me it always comes down to weighing things up: what do I need (performance class, technology) and what constraints do I have (money, availability, heat output/case, the rest of the components...)? I can only shake my head at purely emotional decisions; AMD has pulled hardly any less crap in the past (e.g. dropping driver support for older GCN cards from one day to the next, during the crypto boom).


  • Then again, what happened to my EVGA GTX 1080 FTW2, which had the GP104-410-A1 GPU (the refresh of the GP104-400-A1), just showed how low-quality the parts on NV boards are. It's not the first time a high-end NV GPU died on me, whereas all the Radeons I had gave me a far better experience, lifespan-wise especially.

    You can only blame EVGA for the bad parts, not NVIDIA in that case. You can't even blame NVIDIA support for not helping you with a card that is not built by them and where the chip runs outside their specs.

    I don't say what NVIDIA does is good. Yes, they killed a ton of companies.

    For me, I have had problems with both AMD and NVIDIA.

    But the NVIDIA drivers were working way better for me.

    I am hoping that we will see something nice from the new players in the market.

    Intel Arc might be the beginning of fast Intel GPUs.

    The MTT S60 seems to be a nice "beginner card", with performance around a GTX 1080/1070 from everything I have seen.

  • For me it always comes down to weighing things up: what do I need (performance class, technology) and what constraints do I have (money, availability, heat output/case, the rest of the components...)? I can only shake my head at purely emotional decisions; AMD has pulled hardly any less crap in the past (e.g. dropping driver support for older GCN cards from one day to the next, during the crypto boom).

    In that respect you're probably just as much of an opportunist as I am. I couldn't really care less what name is on it, either. :spitze:

    "Du bist und bleibst a Mensch und du kannst eben net deine menschlichkeit überwinden."

    Dennis_50300



  • NVIDIA is LYING! The truth about DLSS 3 and RTX 4000 "Higher" FPS...

    https://www.youtube.com/watch?v=qFMSgzJlzFI

    Yes, without the new DLSS 3.0 it will probably be much closer to the 3000 series. Just stupid that only the 4000 series supports DLSS 3.0; anyone who has a 2000 or 3000 series card cannot use the new feature. I have a Titan RTX, so I can't use it either :|, since that still has the Turing architecture. My next card will be a Radeon again. I also think the power consumption of the new cards is clearly too high, and I will not buy a card that consumes more than 300 W.

    At least you understand what is going on. I will never fall for nGreedia's lies again after how they treated me; their customer support is absolute dog shite, they really don't give a damn what a customer's issue is, they just tell them to buy one of their newer products. It's that obscene.

    My next GPU will be a Radeon again: the Sapphire Nitro+ Radeon RX 7900 XT is the card I am going to go for to replace my Sapphire Nitro+ RX Vega 64, which has given me a reliable 5.5 years of service. AMD's drivers are fantastic; I never had the countless issues many tend to cry about. They are probably n00bs using a simple production card, or just new to it, or too afraid to try out something that isn't branded NVIDIA lol. I never had any issues with drivers from ATi, Omega Heaven or AMD, and I just find it strange that a simple driver can be so complicated for these people to use; it's hilarious from that perspective. Besides, over the last 10 years AMD's drivers have been top quality tbh. NVIDIA drivers are a hell for Linux users as well.

    The only thing NVIDIA is good at is AI; other than that you're best off with a Radeon. But I am sure AMD will improve their AI performance as well; time will surely show that difference.

    But yea, DLSS 3.0 is a fluke rofl, a marketing flaw in my humble opinion; the video I posted explains how bad it really is and how NVIDIA lies about it. It's amazing.

  • What on planet Earth are you on about... EVGA was NVIDIA's top partner, man. Yea, they had their bad cards here and there, but 9 out of 10 times they use the same parts NVIDIA uses for their boards; quality-wise those were often the cause of defects, and solder quality was bad all round, it didn't matter what brand you had or used. Watch the teardowns from Steve Burke aka Gamers Nexus; he points out such things.
    The GeForce 8800 GTS/GTX/Ultra series are some good examples, same for the GTX 280, GTX 295, GTX 480 and GTX 780 Ti; these all had very high RMA rates and it didn't matter what brand they were from. Then the Turing & Ampere boards, where 60% of the cards made were RMA'ed: EVGA, Gainward, Leadtek, PNY, ASUS, Gigabyte, ZOTAC, etc.

    NVIDIA is the last company you'd ever see me say anything good about... they are good at AI computing and that is really it. They are also good at marketing and lying to their customers, mainly by confusing them, by selling two products with the same name yet different specs and a 400 USD price difference; the GTX 970 scandal and the GTX 1060 scandal are two other good examples.


    NVIDIA will only like those that give them their hard-earned money for their overpriced products.
    For EVGA's CEO, this was like lifting a massive weight from his shoulders. I am hoping other AIBs will ditch NVIDIA, the more the merrier; NVIDIA needs to die, period.
    Jason Langevin explains:
    https://www.youtube.com/watch?v=12Hcbx33Rb4

    Intel ARC had a very poor start tbh, heck it was almost cancelled :steinigung:
    It does okay in the mid-range, so yea, I guess it's better than nothing; it's too new to really decide its fate tho, we will have to wait and see.
    But it would be nice to see them do better. I would surely try out the ARC 770 if it's well priced; it's nice to have a 3rd choice, even if it's Intel, it's at least something new to look into, and I too am open to such choices.
    But if it ends up an Intel-only product, that would be a very bad start; I am really hoping their cards will perform well in both AMD & Intel setups.
    Even though I doubt they would do that, it is Intel after all.

    Intel does have a few years to go I think before they can even compete with AMD or NVIDIA.
    But for me and most people I am connected to, AMD is the best choice. Even they have had their slip-ups, but at least their products were not all named the same thing... just not as bad as what NV has done.

    NVIDIA is to blame for everything. Even when my GTX 1080 FTW2 went, EVGA actually tried to help me, as did others who had the same issue, but they too were forwarded to NV support and they too were ignored like I was. NVIDIA is the main designer of every GTX 1080; if the base design is bad, the AIB designs will be bad as well, because the base components are of bad quality. It won't matter if you have 24+4 phases or 16+4 phases; if the parts all come from the same manufacturer, it will suck either way, man.

  • If I design a 250 ml water glass and give that design to you, and you take it and sell it as a 350 ml water glass, and it doesn't work for people, who is to blame?

    If you are selling an overclocked card, you will need a better VRM than the original design (see the rough numbers sketched below).

    Fact is, that EVGA card didn't run within the specifications that PCB was designed for, if they took the NVIDIA design 1:1.

    If not, they designed a bad VRM themselves.

    In both cases they are to blame.
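
    As a back-of-the-envelope illustration with the two power targets quoted above (180 W reference vs. 215 W for the FTW2), and with a core voltage and phase count that I am simply assuming for the example, the same VRM would have to deliver roughly 20% more current per phase at the factory OC:

    # Rough per-phase current estimate; voltage and phase count are assumptions,
    # and VRM losses / non-core consumers are ignored. Illustration only.
    CORE_VOLTAGE = 1.05   # V, assumed core voltage under load
    PHASES = 10           # assumed number of GPU core power phases

    def per_phase_current(board_power_w: float) -> float:
        """Rough per-phase current in amps for a given board power target."""
        return board_power_w / CORE_VOLTAGE / PHASES

    ref = per_phase_current(180)   # reference design power target
    oc = per_phase_current(215)    # factory-OC power target of the FTW2

    print(f"reference:  ~{ref:.1f} A per phase")
    print(f"factory OC: ~{oc:.1f} A per phase (+{(oc / ref - 1) * 100:.0f}%)")

    If the PCB and VRM were only specified for the lower target, that extra current and heat has to go somewhere.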

    I don't say NVIDIA is a good company. I don't even see card designs today that I would want to have...

    But their drivers run well on my systems, even on Linux. The newer AMD drivers won't load/start on my system and just crash, while the older drivers only put out 15 fps without clocking the card up.

    I can't buy an Intel Arc card. I can't even buy an MTT S60. That is the problem here.

  • Then you are most likely doing something wrong; the drivers are rock solid on Linux, even better than NVIDIA's drivers. I know a ton of Linux users from x-3dfx and outside it that have fantastic experiences, which is why I think you are most likely doing something incorrectly.

  • We are slowly sinking towards PCGH level. Can we manage to reach Computerbase level too?

    Oh, btw:

    FGLRX sucks! Just use radeonhd!!!!

    "Du bist und bleibst a Mensch und du kannst eben net deine menschlichkeit überwinden."

    Dennis_50300


  • You are saying I am doing something wrong when I am just clicking the 5 or so buttons of the AMD driver installer on Windows 7?

    Fact is, of the drivers I have tested, every driver up to 17.something runs, and all newer ones just crash, without any information.

  • That you have issues doesn't mean everyone does :') You're a first, tbh. I know a few hundred people that use Radeons under Linux just fine; then again, it may depend on which distro you are using, since there are thousands of them, so being a bit more specific could be helpful ;)

    And I can't help it that you are having a bad time; just don't act as if it's like that for everyone, maybe for a few from this small community, but that still says nothing.

    Just strange really.

    TokenTech is a well-known Linux gamer who uses Radeons; he may have some tips for you :)
    https://odysee.com/@TokenTech

    Here is his YT page:
    https://www.youtube.com/c/TokenTech