
Nvidia CEO: "We can't do computer graphics anymore without artificial intelligence"

Sep 16, 2024 11:52 AM
#1
avolition
Offline
Jan 2009
103552
"We can't do computer graphics anymore without artificial intelligence," he said. "We compute one pixel, we infer the other 32. I mean, it's incredible... And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible."

Jensen is doubling down on observations that Nvidia and other tech executives have made about AI-based upscaling in PC gaming, arguing that it is a natural evolution in graphics technology, similar to past innovations like anti-aliasing or tessellation.

https://www.techspot.com/news/104725-nvidia-ceo-cant-do-computer-graphics-anymore-without.html

Yep, Moore's law is dead because chips are hitting physical limits. We're at 3 nanometers now, and we can never go down to negative nanometers.

So AI upscaling is the only way to improve graphics performance going forward.
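Jensen's "compute one pixel, infer the other 32" claim can be sanity-checked with quick arithmetic. A hedged back-of-the-envelope in Python, assuming an upscale factor per axis plus some number of AI-generated frames per rendered frame (illustrative parameters, not Nvidia's published pipeline):

```python
# Back-of-the-envelope for "compute one pixel, infer the other 32".
# Assumptions (illustrative, not Nvidia's exact numbers): upscaling
# renders at a reduced resolution per axis, and frame generation emits
# N extra AI-generated frames per rendered frame.

def inferred_per_computed(upscale_per_axis: float, generated_frames: int) -> float:
    """Displayed pixels inferred for every pixel actually rendered."""
    spatial = upscale_per_axis ** 2           # 2x per axis = 4x the pixels
    total = spatial * (1 + generated_frames)  # rendered frame + generated frames
    return total - 1                          # everything beyond the 1 computed pixel

print(inferred_per_computed(2, 3))  # 1080p -> 4K plus 3 generated frames: 15.0
print(inferred_per_computed(3, 3))  # ~1/9 render resolution, 3 generated: 35.0
```

Jensen's 1-computed-to-32-inferred figure lands between these two configurations, so it's plausible for an aggressive upscale ratio combined with multi-frame generation.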
deg, Sep 18, 2024 1:41 PM
Sep 17, 2024 9:38 PM
#2

Offline
Nov 2013
2142
NGL, I'm kinda sick of games with DLSS forced.
Sep 17, 2024 9:46 PM
#3

Offline
Aug 2012
443
Reply to DGemu
NGL, I'm kinda sick of games with DLSS forced.
@DGemu Yeah, same here. It's annoying, but I guess it's the future... bleh
Sep 17, 2024 11:32 PM
#4

Offline
Jun 2016
13525
Up to this point I've refused to use DLSS and just decided to live with either crap RT performance or no RT, but the situation is getting so bad I might need to violate my made-up principle.
MEA·MENTVLA·INGENS·EST
Sep 18, 2024 1:57 AM
#5

Offline
Mar 2008
50946
That's really just a lie. What is actually meant is that you can't improve graphics speeds beyond a certain point without predictive branching, but this is the same security hole CPUs fell to one by one. You will see waves of malware that attack GPUs now.
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⣸⠋⠀⠀⠀⡄⠀⠀⡔⠀⢀⠀⢸⠀⠀⠀⡘⡰⠁⠘⡀⠀⠀⢠⠀⠀⠀⢸⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠁⠀⣀⠀⠀⡇⠀⡜⠈⠁⠀⢸⡈⢇⠀⠀⢣⠑⠢⢄⣇⠀⠀⠸⠀⠀⠀⢸⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⢰⡟⡀⠀⡇⡜⠀⠀⠀⠀⠘⡇⠈⢆⢰⠁⠀⠀⠀⠘⣆⠀⠀⠀⠀⠀⠸⠀⠀⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠤⢄⠀⠀⠀⠀⠀⠀⠀⠀⡼⠀⣧⠀⢿⢠⣤⣤⣬⣥⠀⠁⠀⠀⠛⢀⡒⠀⠀⠀⠘⡆⡆⠀⠀⠀⡇⠀⠀⠇⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⢵⡀⠀⠀⠀⠀⠀⡰⠀⢠⠃⠱⣼⡀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠈⠛⠳⠶⠶⠆⡸⢀⡀⣀⢰⠀⠀⢸ ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⣀⣀⣀⠄⠀⠉⠁⠀⠀⢠⠃⢀⠎⠀⠀⣼⠋⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠴⠢⢄⡔⣕⡍⠣⣱⢸⠀⠀⢷⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⡰⠃⢀⠎⠀⠀⡜⡨⢢⡀⠀⠀⠀⠐⣄⠀⠀⣠⠀⠀⠀⠐⢛⠽⠗⠁⠀⠁⠊⠀⡜⠸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀ ⢀⠔⣁⡴⠃⠀⡠⡪⠊⣠⣾⣟⣷⡦⠤⣀⡈⠁⠉⢀⣀⡠⢔⠊⠁⠀⠀⠀⠀⢀⡤⡗⢀⠇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⢀⣠⠴⢑⡨⠊⡀⠤⠚⢉⣴⣾⣿⡿⣾⣿⡇⠀⠹⣻⠛⠉⠉⢀⠠⠺⠀⠀⡀⢄⣴⣾⣧⣞⠀⡜⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀ ⠐⠒⣉⠠⠄⡂⠅⠊⠁⠀⠀⣴⣿⣿⣿⣿⣻⣿⣿⡇⠀⠀⢠⣷⣮⡍⡠⠔⢉⡇⡠⠋⠁⠀⣿⣿⣿⣿⣄⠀⠀⠀⠀
Sep 18, 2024 8:50 AM
#6

Offline
Jul 2013
8125
AI is all a scam. I can't believe you guys don't realize something so simple.
Sep 18, 2024 1:11 PM
#7

Offline
Sep 2024
511
Gosh, the use of AI for everything needs to stop. AI will never be able to replicate how the human brain thinks, and it can be wrong most of the time. LCMs are really wrong at times.
"When clouds appear, wise men put on their cloaks;
When great leaves fall, the winter is at hand;
When the sun sets, who doth not look for night?
Untimely storms make men expect a dearth."

William Shakespeare
Sep 18, 2024 2:33 PM
#8

Offline
Jun 2016
13525
Reply to Theo1899
Up to this point I've refused to use DLSS and just decided to live with either crap RT performance or no RT, but the situation is getting so bad I might need to violate my made-up principle.
@Theo1899 OK, I gave in because of this thread and enabled DLSS in Hogwarts Legacy, because RT performance was particularly shite on my system. It looks way better than I expected: faces don't look "off model", I gained almost double the FPS, and the game really looks like it's rendered at full resolution. BUT there are some significant issues I couldn't ignore. Hair looks like ass in the closeup talkey bits, and there are a lot of artifacts on shadow edges and reflections.
Notice what's especially wrong about the last one? Shadows and reflections are 2 out of the 3 things people use RT for (the third is ambient occlusion), so having DLSS artifacts on them kinda defeats the purpose.
I'll admit it's very impressive tech, and modern cards also support hardware-accelerated frame generation, which I haven't tried (that's enough fake graphics for today, although if you think about it all graphics are fake by nature, but now we're getting philosophical. Has anyone written anything about this? If not I'll need to capitalize on it ASAP and increase my philosophy street cred, which is seriously lacking), but I think I'll leave it disabled for now. I prefer the sharp natively rendered image to better lighting with artifacts, so I'll disagree with Jensen unless they invent some magical upscaler trained on every possible scene of every game that can generate a decent image from 30% resolution.
Sep 18, 2024 5:08 PM
#9

Offline
Jul 2024
499
Honestly, the last good GPUs that Nvidia made were the 10 series.

I'm currently using a 1660 Ti, so no DLSS or RT, and I'm pretty satisfied with the image quality (I never got the hype around the whole RT thing anyway).

But the 16 and 20 series were a flop, and already obsolete if you wanna play most games (primarily AAA but also some AA ones).

The 30 series were not only crypto-exploited, they also aren't all that great (at least the budget ones). Yeah sure, the 6GB improvement and a couple more CUDA cores from the 3060 wouldn't go amiss, but that doesn't justify buying one.

40 series: I do have my sights set on a 4070 Ti once the price drops (it's overall a 150% perf upgrade over mine), but the whole AI shit is a turnoff, not to mention the pricing is insane. Before, paying 300-400€ for a mid-range GPU was justified because it would stay on top for 4-5 years; now it's obsolete in 2 years. Also, releasing a different 4070 with GDDR6 memory for the same price a year later, LMAO.

Sadly, AI & WEB3 aren't gonna go away anytime soon; if anything, the 50 series & AMD's new lineup are singularly centered on AI.

TL;DR - Last good GPU series was the 10 series.
Morningstar991, Sep 18, 2024 5:12 PM
Can I Still Go To Heaven If I Kill Myself?
Sep 18, 2024 7:36 PM

Offline
Apr 2024
686
maybe I'm just too indie/retro-brained to understand what the point of even chasing photorealism is anymore. We've proven that we can make graphics that are only a couple steps removed from high-end camera capabilities, and most people don't have hardware to even experience that. This is becoming an absurd power-measuring feat not seen since powerscalers tried to scale Gurren Lagann's universe
"Dreams are worth fighting for"
Backloggery | YouTube | Heatmap
Sep 19, 2024 12:23 AM

Offline
Apr 2023
128
You can't do MSAA with deferred rendering, and FXAA sucks, so it makes sense to use AI if you want to double or triple the samples around edges. Not sure about crazy upscaling hallucinating 32 pixels, but as an AA technique there seems to be no better alternative.
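For context on why MSAA and deferred rendering don't mix: every G-buffer attachment has to be stored once per MSAA sample, so memory and bandwidth scale with the sample count. A rough Python sketch using an illustrative (not engine-specific) G-buffer layout:

```python
# Illustrative G-buffer layout: albedo, normals, material params, and
# depth/stencil at 4 bytes each = 16 bytes per pixel per sample.
# These numbers are assumptions for the sake of the arithmetic.

def gbuffer_mib(width: int, height: int, bytes_per_pixel: int, samples: int) -> float:
    """G-buffer size in MiB when every attachment is stored per MSAA sample."""
    return width * height * bytes_per_pixel * samples / 2**20

print(gbuffer_mib(3840, 2160, 16, 1))  # 126.5625 MiB with no MSAA
print(gbuffer_mib(3840, 2160, 16, 4))  # 506.25 MiB at 4x MSAA
```

That 4x blow-up in G-buffer storage (before even touching the extra shading cost) is why deferred renderers lean on post-process or AI-based AA instead.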
Sep 19, 2024 12:57 AM

Offline
Aug 2020
1581
Some people just HATE image clarity.
Keep scrolling
Sep 19, 2024 10:14 AM
ああああああああ

Offline
Apr 2013
5720
AI up scaling sucks, though. I want my crisp, low-poly graphics to remain the way they are, thank you very much!

This ground is soiled by those before me and their lies. I dare not look up for on me I feel their eyes
Sep 26, 2024 9:53 PM

Offline
Aug 2020
1581
Add Monster Hunter Wilds to the ever-growing list of DLSS/Frame Gen required games. Targeting 1080p 30fps on medium is... crazy
Sep 27, 2024 12:26 AM
avolition
Offline
Jan 2009
103552
Reply to RobertsahDHDA
Add Monster Hunter Wilds to the ever-growing list of DLSS/Frame Gen required games. Targeting 1080p 30fps on medium is... crazy
@RobertsahDHDA Fans are complaining that games today are not optimized. I say nah, games today are just more compute-demanding, with how close to photorealism the visuals are, especially with ray tracing. So ye, upscaling and frame generation are here to stay.
Sep 27, 2024 12:29 AM
Émilia Hoarfrost

Offline
Dec 2015
4314
Reply to traed
That's really just a lie. What is actually meant is that you cant improve graphics speeds beyond a certain point without predictive branching but this is the same security hole CPUs had one by one. You will see waves of malware that attack GPUs now.
@traed Wait, explain what security hole will be in predictive branching plz plz plz???



Sep 27, 2024 12:35 AM
avolition
Offline
Jan 2009
103552
Reply to EmiliaHoarfrost
@traed Wait explain what security hole will be in predictive branching plz plz plz???
@EmiliaHoarfrost @traed I guess he is talking about the likes of Spectre and Meltdown, but those are not applicable to GPUs, since GPUs are not general-purpose like CPUs are. The good thing about GPUs is how parallel they are, with how many small simple cores you can pack into them; you cannot fit that many cores on more complex CPUs.

More complexity means more bugs and security exploits, and CPUs are more complex than GPUs.
deg, Sep 27, 2024 1:05 AM
Sep 27, 2024 1:07 AM

Offline
Mar 2008
50946
Reply to EmiliaHoarfrost
@traed Wait explain what security hole will be in predictive branching plz plz plz???
@EmiliaHoarfrost
It's how attacks like Spectre work. It gives a sneaky way to access the contents of memory, which could potentially contain sensitive information.

I think this would easily lead to a version of something similar.

@deg
I was thinking a GPU exploit would make it act like a screenlogger. GPUs are already exploitable:
https://www.wired.com/story/leftoverlocals-gpu-vulnerability-generative-ai/
traed, Sep 27, 2024 1:14 AM
Sep 27, 2024 1:28 AM
avolition
Offline
Jan 2009
103552
@traed

There is no such thing as perfect hardware anyway, but I'm just saying more complexity means more bugs and exploits, so GPUs have fewer problems than the more complex CPUs.
Sep 27, 2024 1:35 AM

Offline
Mar 2008
50946
Reply to deg
@traed

There is no such thing as perfect hardware anyway, but I'm just saying more complexity means more bugs and exploits, so GPUs have fewer problems than the more complex CPUs.
@deg
You have to consider energy consumption too. Every bit of speculative work that gets thrown away is wasted energy converted to heat, and more heat means more fan speed needed. It's a power drain, bad for the environment and for electric bills. Focusing on efficiency would be a better move than speed, because that would also improve performance some.
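The waste being described here can be put in rough numbers. A hedged sketch with made-up but plausible parameters (branch frequency, misprediction rate, flush penalty), estimating what fraction of CPU cycles go to work that gets discarded:

```python
# Estimate the fraction of CPU cycles spent on speculative work that is
# flushed after a branch mispredict. All parameters are illustrative
# assumptions, not measurements of any real chip.

def wasted_cycle_fraction(branch_freq: float, mispredict_rate: float,
                          flush_penalty: float, base_cpi: float = 1.0) -> float:
    """Fraction of total cycles lost to mispredicted speculation."""
    wasted = branch_freq * mispredict_rate * flush_penalty
    return wasted / (base_cpi + wasted)

# ~1 branch per 5 instructions, 5% mispredict rate, 15-cycle flush penalty:
print(wasted_cycle_fraction(0.2, 0.05, 15))  # ~0.13, i.e. ~13% of cycles
```

Since those flushed cycles still burn power, that slice is essentially pure heat, which is the argument for chasing efficiency rather than raw speed.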
Sep 27, 2024 1:44 AM
avolition
Offline
Jan 2009
103552
Reply to traed
@deg
You have to consider energy consumption too. Every bit of speculative work that gets thrown away is wasted energy converted to heat, and more heat means more fan speed needed. It's a power drain, bad for the environment and for electric bills. Focusing on efficiency would be a better move than speed, because that would also improve performance some.
@traed Ye, for power efficiency RISC CPUs like ARM are better than the x86 of AMD and Intel anyway; that's why all mobile phones use ARM CPUs.

There are also RISC-V CPUs, open source and royalty-free, that are progressing fast and will challenge ARM CPUs on power efficiency.
Sep 27, 2024 1:52 AM

Offline
Mar 2008
50946
Reply to deg
@traed Ye, for power efficiency RISC CPUs like ARM are better than the x86 of AMD and Intel anyway; that's why all mobile phones use ARM CPUs.

There are also RISC-V CPUs, open source and royalty-free, that are progressing fast and will challenge ARM CPUs on power efficiency.
@deg
Technically you can improve more by using vanadium dioxide instead of silicon, but it depends on what goes into making it.
Sep 27, 2024 1:57 AM
avolition
Offline
Jan 2009
103552
Reply to traed
@deg
Technically you can improve more by using vanadium dioxide instead of silicon, but it depends on what goes into making it.
@traed Is that material abundant though? Because the reason silicon is used for chips is that it's just sand, and sand is very abundant on Earth.
Sep 27, 2024 2:14 AM

Offline
Mar 2008
50946
Reply to deg
@traed Is that material abundant though? Because the reason silicon is used for chips is that it's just sand, and sand is very abundant on Earth.
@deg
Silicon is used because of its properties, not just its abundance. But it takes a lot of heat to produce silicon chips. There was a lower-heat method for vanadium dioxide, which I'm having trouble finding the paper on again. Vanadium is the 20th most abundant element, but at a way lower percentage than silicon, which is an issue.
https://en.wikipedia.org/wiki/Abundance_of_elements_in_Earth's_crust
Sep 27, 2024 2:18 AM
avolition
Offline
Jan 2009
103552
Reply to traed
@deg
Silicon is used because of its properties, not just its abundance. But it takes a lot of heat to produce silicon chips. There was a lower-heat method for vanadium dioxide, which I'm having trouble finding the paper on again. Vanadium is the 20th most abundant element, but at a way lower percentage than silicon, which is an issue.
https://en.wikipedia.org/wiki/Abundance_of_elements_in_Earth's_crust
@traed Honestly, I'm hearing more about gallium nitride to replace silicon in chips.
Sep 27, 2024 2:21 AM

Offline
Mar 2008
50946
Reply to deg
@traed Honestly, I'm hearing more about gallium nitride to replace silicon in chips.
@deg
That also has a use case, yeah. Looking it up for a refresher, it works well at high temperatures.
Sep 27, 2024 9:10 AM

Offline
Jul 2013
8125
AI is totally scammy. If you trust it, then you are a complete fool. That AI stuff is a 100% scam.

Nothing will stop NTHE from happening. Anyone saying otherwise is seriously delusional.
Sep 27, 2024 9:11 AM
avolition
Offline
Jan 2009
103552
Reply to DesuMaiden
AI is totally scammy. If you trust it, then you are a complete fool. That AI stuff is a 100% scam.

Nothing will stop NTHE from happening. Anyone saying otherwise is seriously delusional.
@DesuMaiden This is about video game graphics using AI tools, not your doomer AI and NTHE delusions.
Oct 3, 2024 2:29 PM

Offline
Jan 2018
33287
I do agree about the possibility of AI being exploitable or prone to getting hacked. Still, if the price is reasonable, then I'd take it. I can live without 4K and beyond, even without RT. RT is great for people who didn't turn shadows off, I guess.
Feb 20, 3:12 PM

Offline
Nov 2013
2142
Feb 20, 4:37 PM
avolition
Offline
Jan 2009
103552
Reply to DGemu
@DGemu What's wrong? The video is about visual quality, not performance, and upscaling does indeed increase performance.

But ye, frame generation is fake performance, at least for now.
Feb 24, 5:56 PM

Offline
Jul 2021
9125
Reply to RobertsahDHDA
Some people just HATE image clarity.
@RobertsahDHDA That's the thing though, DLSS for antialiasing is absolutely insane.
Cucumber ice cream is the best!
Feb 24, 8:17 PM

Offline
Jun 2024
31
@traed You said that GPUs can leak info. Does that depend on how old the GPU is, too?
Feb 24, 8:45 PM

Offline
Mar 2008
50946
Reply to -cold-
@traed You said that GPUs can leak info. Does that depend on how old the GPU is, too?
@-cold-
It's not a known exploit, just that the same exploit used on CPUs can likely apply to GPUs if they use predictive branching the same way. It's common for tech companies to make the same mistakes over and over, rarely learning from others. I doubt older GPUs did predictive branching (though I'm not sure), so yes, it would depend. I'm no expert on computers, I just hang around some security info dumps, so I've picked up some things.
Feb 25, 6:58 AM

Offline
Jul 2021
9125
Reply to traed
@-cold-
It's not a known exploit, just that the same exploit used on CPUs can likely apply to GPUs if they use predictive branching the same way. It's common for tech companies to make the same mistakes over and over, rarely learning from others. I doubt older GPUs did predictive branching (though I'm not sure), so yes, it would depend. I'm no expert on computers, I just hang around some security info dumps, so I've picked up some things.
@traed You are a bit off in a few places.
For one, I don't think branch prediction would bring a lot of performance; it was specifically designed to make single-threaded CPUs faster, whereas GPUs are designed to run a lot of threads at once, so any die area devoted to branch prediction is probably better spent on more threads.
Two, GPUs aren't particularly security-critical. I just don't think it makes sense to offload that sort of workload to the GPU when you can't take advantage of its massive parallel processing power.
Mar 2, 6:17 AM

Offline
Jul 2013
8125
Using Windows 11 is an epic fail. That much is obvious.
