The anti-AI sentiment in the free software communities is concerning.
Whenever AI is mentioned, lots of people in the Linux space immediately react negatively. Creators like TheLinuxExperiment on YouTube always feel the need to add a disclaimer that "some people think AI is problematic," or something along those lines, whenever an AI topic comes up.
I get that AI has many problems, but at the same time its potential is immense, especially as an assistant on personal computers (just look at what "Apple Intelligence" seems to be capable of). GNOME and other desktops need to start working on integrating FOSS AI models so that we don't become obsolete. Using an AI-less desktop may be akin to hand-copying books after the printing press revolution.
If you think of specific problems, it is better to point them out and try to think of solutions, not reject the technology as a whole.
TL;DR: A lot of Luddite sentiment around AI in the Linux community.
But ML is a type of AI. Just because the word makes you think of androids and Skynet doesn't mean that's the only thing it can refer to. Personally, I've never understood this recent attempt to restrict the word, when AI has been used for lesser forms of computer intelligence for a long time.
Well, not at all. What a word means is not defined by what you might think. When the majority starts to use a word for something and that usage sticks, it gets adopted. That happens all the time, and I have read articles about it many times, including about our current predicament. Language evolves. Meanings change. And yes, AI today includes what is technically machine learning. Sorry, friend, that's how it works. Sure, you can be the grumpy drunk at the bar complaining that this isn't strictly AI by some definition while the rest of the world rolls its eyes and moves on to more meaningful debates.
Words have meaning and, sure, they can be abused and change meaning over time, but let's be real here: AI is a hype term with no basis in reality. We do not have AI; we aren't even all that close. You can make all the ad hominem comments you want, but at the end of the day the terminology comes from ignorant figureheads hyping shit up for profit (at great environmental cost too; LLMs, aka "AI," consume a lot of power while yielding questionable results).
Kinda sounds like you bought into the hype, friend.
You missed the point again, oh dear!
Let me try again in simpler terms: you yourself don't define words; how they are used by the public does. So if the world calls it AI, then the word will mean what everybody means when they use it.
This is how words come to be, evolve, and eventually end up in the dictionary. Nobody cares what you think. AI today includes ML. Get over it.
Nice try with the deflection attempts, but I really don't care about them; I'm only here to teach you where words come from and to tell you that the article is written about you.
As someone who frequently interacts with the tech-illiterate: no, they don't. This sudden rush to put weighted text-hallucination tables into everything isn't that helpful. The hype feels like self-driving cars or 3D TVs, for those of us old enough to remember them. The potential for damage is much higher than with either of those two preceding fads, and the cars actually killed people. I think many of us are expressing a healthy level of skepticism toward the people who need to sell us the next big thing, and it is absolutely warranted.
It’s exactly like self-driving: everyone is saying this is the time we’re going to get AGI. But it will be like everything else, overhyped and under-delivering. Sure, it will have its uses, companies will replace people with it, and the enshittification will continue.
You can doubt all you like, but we keep seeing the training data leaking out with passwords and personal information. This problem won't be solved by the people who created it, since they don't care, and fundamentally the technology will always show that lack of care. FOSS ones may do better in this regard, but they are still datasets without context. That's the crux of the issue. The program, or LLM, has no context for what it says. That's why you get these nonsensical responses telling people that killing themselves is a valid treatment for a toothache. Intelligence is understanding. The "AI" or LLM or, as I like to call them, glorified predictive text bars, doesn't understand the words it is stringing together, and most people don't know that because of flowery marketing language and hype. The threat is real.
They act like it's the computer daydreaming. No, it's wrong. The machine is supposed to provide me with correct information, and it didn't. These marketing wizards are selling snake oil in such a lovely bottle these days.
Nothing was ever wrong with calling them “virtual assistants” - at least with them you’re conditioned to have a low bar of expectations. So if it performs past expectations, you’ll be excited, lol.
Look, the naming ship has sailed and sunk somewhere in the middle of the ocean. I think it's time to accept that "AI" just means "generative model" and what we would have called "AI" is now more narrowly "AGI".
People call video game enemies "AI", too, and it's not the end of the world; it's just imprecise.
This is a bit philosophical, but who is to say that mimicking intelligence with advanced math is not intelligence? LLMs can perform various thinking tasks better than humans we consider intelligent.
What AI means will change, and what it refers to will change. Currently, LLMs and other technologies are referred to as AI, like you say. In five years' time we will have made huge leaps, and this will likely result in technology that is also called AI.
In a similar vein, hoverboards - like in the films - are still known as exactly that, whereas the “real” hoverboard that exists has wheels. We didn't stop calling the fictional ones hoverboards, and if we ever get real ones, they will likely also be called hoverboards.
To be 🤓 really, really nitpicky, and I'm writing this because I find it interesting, not as an attack or whatever. A tongue-in-cheek AcHtUaLlY 🤓
GNU/Linux is the “whole operating system”, and everything else is extra. The usefulness of an operating system without applications is debatable, but they 🤓 technically aren't required to complete the definition of an operating system.
But this is also basically the debate of Linux vs. GNU/Linux vs. needing applications on top to make a useful operating system.
Quoting the wiki summary:
In its original meaning, and one still common in hardware engineering, the operating system is a basic set of functions to control the hardware and manage things like task scheduling and system calls. In modern terminology used by software developers, the collection of these functions is usually referred to as a kernel, while an 'operating system' is expected to have a more extensive set of programmes. The GNU project maintains two kernels itself, allowing the creation of pure GNU operating systems, but the GNU toolchain is also used with non-GNU kernels. Due to the two different definitions of the term 'operating system', there is an ongoing debate concerning the naming of distributions of GNU packages with a non-GNU kernel.
Don't tell me Linux Mint would still be Linux Mint without a desktop environment like Cinnamon. An OS is the collection of all the software, not just the low-level code.
Linux doesn't need GNU components at all to be a functional operating system. And you wouldn't see any difference whether your HTTP server runs on GNU/Linux or on Linux without GNU.
On the other hand, there is a difference between AI and an LLM. The difference is significant enough to distinguish. You may mean LLMs when you talk about AI, but tbh I thought you didn't, because many people don't.
Alpine Linux is a Linux distribution designed to be small, simple, and secure. It uses musl, BusyBox, and OpenRC instead of the more commonly used glibc, GNU Core Utilities, and systemd. This makes Alpine one of the few Linux distributions not based on the GNU Core Utilities.
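(Not part of the thread, just an illustrative aside: here is a minimal sketch of how one might check which coreutils flavor a running system uses. It assumes the common Alpine-style layout where tools such as /bin/ls are symlinks into a single busybox binary; other distros lay things out differently, so treat it as a rough probe rather than a definitive test.)

```python
# Minimal sketch: guess whether the core utilities come from BusyBox.
# Assumption: an Alpine-style layout where /bin/ls is a symlink to the
# single /bin/busybox binary; other distributions may differ.
import os

def coreutils_flavor(tool: str = "/bin/ls") -> str:
    target = os.path.realpath(tool)  # resolve the full symlink chain
    if os.path.basename(target) == "busybox":
        return "BusyBox (e.g. Alpine Linux)"
    return "not BusyBox (likely GNU coreutils or another implementation)"

print(coreutils_flavor())
```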