Carmack defends AI tools after Quake fan calls Microsoft AI demo “disgusting”
Considering that the AI craze is what's fueling the shortage and massive increase in GPU prices, I really don't see gamers ever embracing AI.
[...] I really don’t see gamers ever embracing AI.
They've spent years training to fight it, so that tracks.
The Nvidia GPUs in data centers are separate from gaming GPUs (they're even built on different process nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD.
…No, it’s just straight-up price gouging and anti-competitiveness. It’s just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction, even though Battlemage is excellent.
For local AI, the only thing that gets sucked up are 3060s, 3090s, and for the rich/desperate, 4090s/5090s, with anything else being a waste of money with too little VRAM. And this is a pretty small niche.
Chip fabbing allocations are limited, and whatever capacity the chips for AI datacenters take up, desktop GPUs don't get made with. And what's left of the desktop chips gets sold for workstation AI use, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.
They still have limited wafer capacity at the fabs. The chips going to datacenters could have been consumer parts instead. Besides, they (Nvidia, Apple, AMD) all fabricate at TSMC.
Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD's Ryzen AI Max 300 chip (or whatever they call it) take off. Framework lets you configure a machine with that chip up to 128 GB of RAM, iirc. It's also the main reason I believe Apple's memory upgrades cost a ton: so they aren't a financially viable option for local AI applications.
I'm pretty sure the fabs making the chips for datacenter cards could be making more consumer-grade cards, but those are less profitable. And since fab capacity isn't infinite, the price of datacenter cards is still going to affect consumer ones.
Speak for yourself. As an avid gamer I am excitedly looking towards the future of AI in games. Good models (with context buffers much longer than the .9s in this demo) have the potential to revolutionise the gaming industry.
I really don't understand the amount of LLM/AI hate in Lemmy. It is a tool with many potential uses.
DLSS (AI upscaling) alone should see gamers embracing the tech.
You must not have heard the dis gamers use for this tech.
Fake frames.
I think they'd rather have more raster and ray tracing, especially raster in competitive games.
Any time I’ve enabled this, the game looked worse to me. YMMV, etc.
Why? AI doing one good thing doesn't erase the dozens of bad ways it's utilized.
I'm interested to see AI used on a larger scale in really specific ways, but the industry seems more interested in using it to take giant shortcuts and replace staff. That is going to piss people off, and it's going to really piss people off when it also delivers a shit product.
I'm fine with DLSS, because I want to see AI enhance games. I want it to make them better. So far, all I can see is that it's making them worse with one single upside that I can just.. toggle off on my end if I don't like it.
First thing I turn off. It only works in tech demos with very slow-moving cameras. Sometimes.
They do. You'll see a lot of hate for DLSS on social media, but if you go to the forums or any newly-released game that doesn't have DLSS, you'll find at least one post demanding that they implement it. If it's on by default, most people don't ever touch that setting and they're fine with it.
There’s what AI could’ve been (collaborative and awesome), and then there’s what the billionaire class is pushing today (exploitative shit that they hit everyone over the head with until they say they like it). But the folks frothing at the mouth over it are unwilling to listen to why so many people are against the AI we’ve had forced upon us today.
Yesterday, Copilot hallucinated four different functions when I asked it to refactor a ~20 line TS function, despite me handing it 2 helper files that contained everything available for it to use. If I can’t confidently ask it to do anything, it’s immediately useless to me. It’s like being stuck with an impulsive liar that you have to get the truth out of.
A guy I used to work with would, at least I would swear it, submit shit code just so I would comment about the right way to do it. No matter how many times I told him how to do something. Sometimes it was code that didn't actually do anything. Working with co-pilot is a lot like working with that guy again.
Funny enough, here’s a description of AI I wrote yesterday that I think you’ll relate to:
AI is the lazy colleague that will never get fired because their dad is the CTO. You’re forced to pair with them on a daily basis. You try to hand them menial tasks that they still manage to get completely wrong, while dear ol’ dad is gassing them up in every all-hands meeting.
Dude I couldn't even get copilot to generate a picture with the size I wanted despite specifying the exact pixels for height and width.
It's fundamentally a make-shit-up device. It's like pulling words out of a hat. You cannot get mad at the hat for giving you poetry when you asked for nonfiction.
Get mad at the company which bolted the hat to your keyboard and promised you it was psychic.
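The "hat" analogy maps cleanly onto how these models actually emit text: each next word is sampled from a probability distribution, with no notion of whether the result is true. Here's a minimal sketch of that idea in Python; the vocabulary, the probability table, and the `next_word` helper are all made up purely for illustration, not from any real model.

```python
import random

# Toy "hat": for each context word, a made-up probability
# distribution over possible next words. A real LLM does this
# over tens of thousands of tokens with learned probabilities.
TABLE = {
    "the": [("cat", 0.5), ("moon", 0.3), ("invoice", 0.2)],
    "cat": [("sat", 0.7), ("sang", 0.3)],
}

def next_word(context, table):
    # Pull a word out of the hat, weighted by probability.
    # Unknown contexts just end the sequence in this toy version.
    entries = table.get(context, [("[end]", 1.0)])
    words, weights = zip(*entries)
    return random.choices(words, weights=weights)[0]

# Every result is plausible by construction -- none is checked
# against reality. That's the whole point of the analogy.
print(next_word("the", TABLE))
```

Nothing in that loop asks "is this true?", only "is this likely to follow?", which is why demanding nonfiction from it is asking the wrong machine.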
I think that's exactly who they're mad at
Carmack is an AI sent from the future, so he's a bit biased.
AAA dev here.
Carmack is correct. I expect to be dogpiled by uninformed disagreements, though, because on social media all AI = Bad and no nuance is allowed. If that's your knee-jerk reaction, please refrain for a moment and calmly re-think through your position.
EDIT: lol called it
What AI tools are you personally looking forward to or already using?
Stable Diffusion does a lot already, for static pictures. I get good use out of Eleven for voice work, when I want something that isn't my own narration.
I'm really looking forward to all of these new AI features in DaVinci Resolve 20. These are actual useful features that would improve my workflow. I already made good use of the "Create Subtitles From Audio" feature to streamline subtitling.
Good AI tools are out there. They're just invisibly doing the work for people who pay attention, while all the billionaires make noise about LLMs that do almost nothing.
I compare it to CGI. The very best CGI is the effects you don't even notice. The worst CGI is when you try to employ it everywhere it's not designed for.
Not me personally, as AI can't really replicate my work (I'm a senior sound designer on a big game), but a few colleagues of mine have already begun reaping the workflow improvements of AI at their studio.
AAAA dev here.
Carmack is incorrect.
Prove it.
Demonstrating some crazy idea always confuses people who expect a finished product. The fact this works at all is sci-fi witchcraft.
Video generators offer rendering without models, levels, textures, shaders-- anything. And they'll do shocking photorealism as easily as cartoons. This one runs at interactive speeds. That's fucking crazy! It's only doing one part of one game that'd run on a potato, and it's not doing it especially well, but holy shit, it's doing it. Even if the context length stayed laughably short - this is an FMV you can walk around in. This is something artists could feed and prune and get real fuckin' weird with, until it's an inescapable dream sequence that looks like nothing we know how to render.
The most realistic near-term application of generative AI technology remains as coding assistants and perhaps rapid prototyping tools for developers, rather than a drop-in replacement for traditional game development pipelines.
Sure, let's pretend text is all it can generate. Not textures, models, character designs, et very cetera. What possible use could people have for an army of robots if they only do a half-assed job?
Imagine how much better bg3 would have been if there were more randomly distributed misc items of no value strewn across each map. Think of how fast you'd kill your mouse then!
This is what I'm talking about: an unwillingness to see anything but finished products. Not developing the content in a big-ass game... just adding stuff to a big-ass game. Like BG3 begins fully-formed as the exact product you've already played.
Like it'd be awful if similar new games took less than six years, three hundred people, and one hundred million dollars.
I get it, AI has some significant downsides, but people go way overboard. You don't have to tell people who use AI to kill themselves.
Somebody didn't watch Terminator 2
Oh man, this thing is amazing. It's got some decent memory: room layouts weren't changing on me when they left view, unlike previous attempts I've seen with Minecraft.
https://copilot.microsoft.com/wham?features=labs-wham-enabled
What are you talking about? It looks like shit, it plays like shit and the overall experience is shit. And it isn't even clear what the goal is? There are so many better ways to incorporate AI into game development, if one wanted to and I'm not sure we want to.
I have seen people argue that if this is what the technology can do today, imagine in a couple of years. However, that seems very naive. The rate at which barriers are reached has no impact on how hard it is to break through those barriers. And as so often in life, diminishing returns are a bitch.
Microsoft bet big on this AI thing, because they have been lost in what to do ever since they released things like the Windows Phone and Windows 8. They don't know how to innovate anymore, so they are going all in on AI. Shitting out new gimmicks at light speed to see which gain traction.
(Please note I'm talking about the consumer and small business side of Microsoft. Microsoft is a huge company with divisions that act almost like separate companies within it. Their Azure branch, for example, has been massively successful and innovates just fine.)
I'm happy to see someone else pushing back against the inevitability line I see so much around this tech. It's still incredibly new and there's no guarantee it will continue to improve. Could it? Sure, but I think it's equally likely it could start to degrade instead due to ai inbreeding or power consumption becoming too big of an issue with larger adoption. No one actually knows the future and it's hardly inevitable.
What are you talking about? For the technology it looks and plays amazing. This is a nice steady increase in capability, all one can hope for with any technology.
I don't care about Microsoft, or what is commercially successful.