Intel ARC series GPUs: yeah, we need a thread for these too, so here it is!

  • Hmm, strange, I can find 6800 XTs for 500 euros quite easily, same for RTX 3080s; it just depends on where you look, I would guess.

    And no, I don't care for ray tracing; to me it is marketing-gimmick garbage that I can't even see. I've seen many demos in real life and, honestly, I could not see much actual difference: one image was a bit darker than the other. To spend so much on an RTX GPU just for a stupid effect? No thanks. Once you disable it you get a ton more FPS as well. We never needed it in the 3dfx days, so we won't need it now.

    All I could see was the stupid tax it added to the prices of these graphics cards. Even the games that do make good use of it (let's see: Control, Minecraft RTX and Quake II... from 1997) barely show any differences. As for the reflections it adds: so what? Who cares? Our Voodoo5s could do soft reflections, so what is there to miss?

    So what are they lying about in that sense? Yeah, it's funny how badly companies handle marketing like that; it just hurts my brain.
    The same goes for AMD's 8K-gaming act: that was just dumb, nobody is asking for such things. Even if it was meant as a joke aimed at NVIDIA's 8K-gaming claims from the 2080 Ti launch ("hey, we can pull a dumb stunt like that too"), I still think they should not have done it; it doesn't leave a good mark either. I got the joke and found it hilarious, but it would have been better to leave such remarks out of a presentation. If it had been made as a joke in a review, it would have been fine.

    In the end all companies are greedy, man. They just want your money and don't give a damn who buys what; they will pull every trick out of their sleeves to attract buyers.
    And then they come at us with these terrible marketing gimmicks. I just want a product that performs great and can last for 6 or 7 years; that is why I buy the best Radeon I can find, run it for those 6 or 7 years, and after that I'll see what comes around.

    Now you may get why I prefer the 3dfx era: we just had more choice, and nothing was as stupid as it is now.

  • Hm... I guess the need for ray tracing depends on your own point of view. Is it important? Maybe not. But we had the same situation many years ago.

    Do you remember when lightmaps were new?

    It was a big thing, and games started to look more believable in terms of lighting.

    But lightmaps dropped the framerate by almost 50%! 3dfx was clever enough to add a second TMU to Voodoo Graphics, and the later Voodoo2 brought multitexturing to the masses.
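
    Just to illustrate what that extra TMU bought you (a toy sketch of my own in Python, not real Glide or hardware code): a lightmapped pixel is basically the base texel modulated by a lightmap texel. With one TMU that multiply needed a second rendering pass plus blending; with two TMUs it happens in a single pass.

        def modulate(base_texel, light_texel):
            # combine stage: base texture modulated by the lightmap (RGB values in 0..1)
            return tuple(b * l for b, l in zip(base_texel, light_texel))

        # toy example: a mid-grey wall texel lit by a warm lightmap texel
        wall = (0.6, 0.6, 0.6)
        light = (1.0, 0.8, 0.5)
        print(modulate(wall, light))  # roughly (0.6, 0.48, 0.3)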

    But you're right. I see no real difference in games like Cyberpunk which could excuse the massive frame drop.

    Control is a different story. It looks amazing.

    So I started playing it with RT on a 1080 Ti, at barely 35 fps.

    And I don't know if buying a high-end GPU is the best choice nowadays. (You can correct me.)

    High-end GPUs cost you your soul, draw a lot of power, need a big cooling solution, and get killed by next-gen midrange GPUs.

    Just look at the cool Rage Fury MAXX.

    It was expensive but got toasted by a 32 MB Radeon SDR.

    You could spend nearly 600 € on a GeForce 2 Ultra, and even a GeForce 3 Ti 200 leaves it in the dust.

    The inglorious FX 5800 Ultra got smoked by a midrange GeForce 6600.

    OK, the Radeon 9700 Pro was a unicorn.

    That beefy GPU would last a long time in games with PS 2.0.

    Think of the Radeon R9 Fury. It was fast but expensive and needed a lot of power.

    AMD's Polaris RX 480 killed it off for much less money and power.

    Think of Vega (without undervolting tweaks). Vega was expensive for a very long time, and the cheaper midrange RX 5600 XT is at the same level of performance with much lower power consumption. I had a Vega too, plus an RX 590 Nitro at the same time, and I have to say the RX 590 was the better buy in terms of price/performance.

    I'm a friend of the low end. My GTX 750 Ti was a steal and lasted a long time. I have a 1050 Ti in my third PC, and that card did the job for a long time too.

    Heck, I can throw anything at that little GTX. The card will do the job.

    Yes, my old Sapphire R9 290 Tri-X can do it too, but it needs so much more power.

    And power consumption becomes more and more important these days.

  • Control definitely looks great with RT on. But not many titles are like that. The differences in Cyberpunk are clearly visible imho, but not nearly as pronounced as they are in Control. There was also this funny little Lego game with RT (what was it called again?) that looked pretty good with the feature active. Then there are games like Riftbreaker (which I love!) that don't seem to profit at all. It really depends on the individual game, especially in this early age of RT.

    It was similar for almost every new feature that was introduced, like bump mapping or even EMBM. Or take pixel shaders, now ubiquitous: they were once more obscure than anything else, and the performance cost was insane in the beginning. Not to mention hardware triangle transformation (now: vertex shaders), which is what makes today's high polygon counts possible. These are cornerstone technologies of modern games, and yet back in the day we all mocked them when they came out as "T&L". ;)

    For more RT impact, we'll just have to wait a few years longer. Especially if you cannot or do not want to invest the energy required to run the wattage beasts you need for it.


  • Is there even a game that made proper use of bump mapping? There's normal and displacement mapping for that, after all. Parallax mapping is better too...

    "Du bist und bleibst a Mensch und du kannst eben net deine menschlichkeit überwinden."

    Dennis_50300

  • GrandAdmiralThrawn That's exactly how I see it too. Control really is the exception there. Reflections and lighting (really important!) are on a whole different level with RT, and that without a heavy performance loss. For the future it does get very interesting, though. RT is only at the beginning, just like bilinear filtering at decent resolutions back then; it took years until enough memory bandwidth became affordable.

    CryptonNite which bump mapping do you mean? TR4 already made sensible use of it, provided the graphics card could handle it.

  • Only to a degree, even if the two techniques aim at the same thing. Tessellation is a real geometry transformation: 3D geometry is actually generated dynamically. Bump mapping and environment-mapped bump mapping were pure texture tricks based on additional normal maps. So if you look at a bump-mapped texture at a very shallow angle, it looks unnaturally flat; the apparent depth effect is lost. The further you move away from viewing the surface head-on (i.e. along the normal of the triangle the texture lies on), the more unnatural the effect becomes. With tessellation you don't have that problem, because it's real 3D.

    Different technology, better result. But as far as I know it usually also costs more performance than the old technique.
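
    To make the "texture trick" part concrete, here is a minimal toy sketch of my own (assuming a simple Lambert diffuse term, not any engine's actual code) of how normal-map style bump mapping only perturbs the normal used for lighting while the underlying triangle stays flat, which is exactly why the illusion falls apart at shallow angles and on silhouettes:

        import math

        def normalize(v):
            length = math.sqrt(sum(c * c for c in v))
            return tuple(c / length for c in v)

        def lambert(normal, light_dir):
            # diffuse lighting term: dot(N, L), clamped at zero
            return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

        geometric_normal = (0.0, 0.0, 1.0)          # the flat triangle's real normal
        bumped_normal = normalize((0.3, 0.1, 1.0))  # perturbed normal fetched from a normal map
        light_dir = normalize((0.5, 0.0, 1.0))

        print(lambert(geometric_normal, light_dir))  # flat shading
        print(lambert(bumped_normal, light_dir))     # "bumpy" shading per texel, yet the
                                                     # geometry underneath never moved

    Tessellation, by contrast, generates and displaces real vertices, so the silhouette and grazing-angle views change along with the lighting.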



  • Texture map, lightmap and bump map.

    ATI's first Radeon could render all three in a single pass thanks to its three texture units per pipeline. Unfortunately, games only made extensive use of that long after the Radeon's release.

    Half-Life 2 is a great example of how the Radeon can rock that game with its 6 texture units. At least 300% above the level of a Rage Fury Pro.

    HL2 Gameplay Rage Fury Pro - Radeon 9800

    The Voodoo Graphics could also manage up to 3 TMUs. Ah, that would have been something.

    A Voodoo2 SLI setup with a fill rate of up to 600 MTexels. ^^
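
    For the number, a rough back-of-the-envelope sketch (my assumptions: a hypothetical fully populated 3-TMU pixel unit per card at roughly the stock 90-100 MHz Voodoo2 clock; shipping boards only had 2 TMUs):

        clock_mhz = 100    # Voodoo2 chip clock, roughly (boards ran ~90-100 MHz)
        tmus_per_card = 3  # hypothetical third TMU populated; retail Voodoo2 boards had 2
        cards_in_sli = 2

        print(clock_mhz * tmus_per_card * cards_in_sli, "MTexels/s")  # -> 600 MTexels/s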

  • The R9 Fury X beats even the RX 590. The RX 480 was very power efficient, but it could not even touch an R9 290, let alone the R9 290X or R9 Fury X. A well fine-tuned R9 Fury X could easily take out the 980 Ti: with a decent undervolt and OC you could surely beat those things. I went for the Sapphire Nitro+ OC R9 Fury; it is not the Fury X, but it runs at Fury X speeds (GPU and HBM clocks) and can be unlocked to a Fury X with a BIOS hack. I paid 265 euros for one in 2016, from a caring collector from Austria. I compared it to the RX 480, and the Nitro+ R9 Fury destroyed the RX 480 in every game, whatever you threw at it.

    So I don't know what you did wrong to say that an RX 480 beats a Fury; in actual reality it comes nowhere near it.
    As for now, I am still using my Sapphire Nitro+ RX Vega 64. It has been in service for the past 5 years and 8 months, and no, I am not going to replace it yet; it still does everything I want at 1440p. I bought it for 620 euros back in March of 2017, so it's actually paying me back in that sense; nothing much to lose or to miss out on.

    The Sapphire Nitro+ RX 580 Special Edition is in my server. It is only used when a friend comes over so we can play ARK; the rest of the time its fans never spin, as it's always in idle mode. It's still a fine card for 1920x1200x32 in most games, so yeah, not going to replace that either. I love my Radeons, and as long as they do what they need to do I'm good; I will never need a GeForce to be happy. I prefer Radeons: their drivers make more sense to me, I find them far more stable in every workload, and the image quality output is always better.

    Here are my current Radeons:

    Sapphire Nitro+ OC Radeon R9 Fury PCI-E 4GB 4096-bit HBM
    Sapphire Nitro+ Radeon RX 580 Special Edition PCI-E 8GB 256-bit GDDR5
    Sapphire Nitro+ Radeon RX Vega 64 PCI-E 8GB 2048-bit HBM2

    But coming back to the RT thing: it's not even full ray tracing like what render farms do for Hollywood movies. It's nothing even close to that, only a very tiny fraction of it; by the time GPUs can do that, you will be 40 or 50 years down the road, I think.

    NVIDIA uses false marketing and refers to it as full ray tracing, which it really isn't; it's nothing close to it. Intel & AMD also have ray tracing, but they too only do a tiny fraction of it, so to me it's not even the real deal. As for Control, it's not my kind of game, so I prefer not to pay the RT tax.


  • What does the future of ARC actually look like now? I mean, after Intel restructured/split the entire GPU division at the end of 2022 and subordinated it to other departments? Raja Koduri was also demoted back to a lower level.

    Does anyone know more (beyond the usual web rumors)?


  • I've only heard that Gelsinger said it is supposed to continue. Battlemage is supposed to arrive in 2023 and reach roughly RTX 4070 level. Well...

    Where did you get that Koduri was demoted? It would be welcome. He didn't accomplish much at AMD either.

    Maybe we'll learn more at CES.


  • Koduri was general manager of the complete, now dissolved or rather split-in-two "Core and Visual Computing Group", which housed all the GPU solutions, from notebooks to the data center. He has apparently lost that post and is now, as before, "Chief Architect" and vice president, only of the new "CPU, GPU, and AI divisions", which is supposed to deal with the interplay of various AI technologies purely in the professional market. There is apparently quite a bit of software involved there, and no chip development as such anymore. He has held a comparable role once before.

    ARC development has apparently been put under the "Client Compute Group", which should mean Koduri is out of the management of ARC development, and that group is now part of the same structures that also handle IoT (e.g. "smart home") and client PC technology (Core i). Plus things like Thunderbolt, LTE modems for laptops, that sort of thing. Which raises the question for me whether this means ARC is declining in importance or not... I can't answer that clearly, which is why I'm unsure about it.

    Sources regarding Koduri: [TechPowerUp], [Intel].

    Maybe I'm misinterpreting something as far as he is concerned. But to me it sounds like he got a good dressing-down. Or am I reading the announcement wrong?


  • Sounds like Mr. K. is involved in everything but no longer really has any say. I don't really see problems for product development there. ;)


  • "Don't start talking like a fanboy to me. I said that the RX 480 killed the Fury, not that it's faster."

    Then you should have mentioned that detail. Was it that hard?
    It's all about the details.

    And it has nothing to do with being a "fan of something"; even that would be no crime, so I don't see the problem even if that were the case. Truly, I wouldn't give a damn about who is a fan of what; it's not like that would be relevant to anything, lmao.

    I mean, come on dude, the number of Intel & NVIDIA fanboys & fangirls here is larger than you'd think; that was a known thing to me even when I joined this place back in 2004... Yeah, that was with my first account... then I rejoined in '09, all because I was asked to return.

    So before calling me a fanboy, could you please take a look at yourself first, perhaps? It's often the case that those who call people such unflattering things are exactly that themselves... Think about it and take your time; you might need it...

    I just go by hard facts, as I have done since 1988. Deal with it; you can't like or agree with everything, but facts are hard to beat. ;)
    ATi & Matrox were masters of image quality; as a graphics card collector you should know this. The same goes for the 3dfx Voodoo Banshee & Voodoo3 series: 3dfx put a ton of money into their 2D cores, something they had never done before, and they managed it extremely well. The Voodoo3 3000 & 3500 matched the 2D image quality of the G400 series of cards, and that is really saying something; the ATi Rage 128 series also came close to this level of 2D image quality.

    They made the TNT2 cards look bad, as their 2D image output was horrid. Am I now a 3dfx, Matrox and ATi fanboy all of a sudden because of this?
    If you have a healthy mind, I am sure you would see things in a more positive way.

    In the end, damn dude, if people call me a fanboy, I should feel blessed, lmao!
    Such things I just take as strongly worded compliments, so thanks! :respekt:

    The RX 480 replaced it, yes, and it was a healthy replacement; only, the Fury was high end and the RX 480 was midrange. Power consumption was a lovely improvement as far as Polaris goes, but performance wasn't as great as people had hoped it would be.
    AMD had to rely on CFX to beat a GTX 1080, and even that was barely achieved; the 1080 just walked over the cards. Pascal was NVIDIA's best GPU since G80.

    Also, if I were an AMD fanboy, I would not even admit such things.


  • It's nice to know that you aren't a fanboy.

    But you wrote like one.

    You got pissed when I said the Fury got killed by the RX 480. And why should I go into more detail?

    You know the facts.

    As you said, Polaris was a healthy improvement.

    Casual gamers could afford a midrange graphics card with a whopping 8 GB of VRAM.

    And AMD sold tons of Polaris cards, which was good.

    All I did was ask you whether it's good to buy a high-end GPU nowadays.

    And I gave some examples of why I (just me) think it's a bad idea. Nothing more!

  • I have dyslexia, man, haha, I can't even be like that, even if I tried. :topmodel: But I really didn't expect people to see me as one; those who know me should know by now that I am a fan of the Boeing 747 and of 3dfx, as we all are here. It's called VoodooAlert for a reason, right?
    But hey man, I love all graphics cards. Have you seen my collection from the days of old yet?
    Check it out here: :)
    https://www.3dfx.ch/gallery/index.…dleader/prevgas
    In there you will see that every GeForce I had was a high-end model. If I were an AMD/ATI fanboy I would not have collected 3dfx, ATi/AMD, Matrox, S3, 3Dlabs and other brands, and well, that isn't really the case. :)

    They are sorted per chip maker and are mostly high-end models; I just love full dies and such, which is why I kept that pattern. Only now cards have such stupidly high prices that it takes me about a year to save up for something I find decent for my liking. It's never about need; it's about what I can make doable in a year's time, or however long it takes me to save up for a particular card.

    As for the Intel ARC A770 LE 16GB, I'd love to get one, and I think I will once my Nitro+ RX 7900 XTX and Valve Index VR headset + controllers have arrived; those are upgrades I really want to do first before adding other cards. So yeah, I am glad many of you here did buy the Intel ARC, and I hope you guys will be able to post significant performance improvements as its drivers mature over time; it's something new and thus exciting to follow.

    Which is why I made this thread, since no one else did; I was like, hey, sure, why not. :)
    For the average guy: don't buy any of the new cards, paying 800+ euros for a graphics card is crazy imo. But for us enthusiasts, yeah, go for it. I want a full die, and the Nitro+ 7900 XTX has that, so that is what I will use to replace my current Nitro+ RX Vega 64. But yeah, you are right from a certain point of view.

    Also, how is the Intel ARC A770 LE 16GB with VR? Has anyone tried that yet?