nightsky @awful.systems · Posts 0 · Comments 124 · Joined 8 mo. ago

That whole plot angle feels dead today
It doesn't have to be, IMO, especially when it's an older work.
I don't mind at all rewatching e.g. AI-themed episodes of TNG, such as the various episodes with a focus on Data, or the one where the ship computer gains sentience (a great episode, actually).
On the other hand, a while ago I stopped listening to a contemporary (published in 2022) audiobook halfway through; it was a utopian AI sci-fi story. The theme of "AI could be great and save the world" just bugged me too much in relation to the current real-world situation. I couldn't enjoy it at all.
I don't know why I feel so differently about these two examples. Maybe it's simply because TNG is old enough that I don't associate it with current events, and I first saw the episodes so long ago. Or maybe it's because TNG is set in a far-future scenario, clearly disconnected from today, while the audiobook is set in the present day. Hm, it's strange.
(and btw queer loneliness is an interesting theme, wonder if I could find an audiobook involving it)
The AI problem is still at an earlier stage at my job, but I've already witnessed a code review where code was pointed out as questionable and then justified with what amounted to "the AI generated this, it wasn't me". I really don't like where this is going.
AI will see a sharp decline in usage as a plot device
Today I was looking for some new audiobooks again, scrolling through curated¹ lists for various genres. In the sci-fi genre, there is a noticeable uptick in AI-related fiction books. I have noticed this for a while already, and it's getting more intense. Most seem to be about "what if AI, but really powerful and scary" and singularity-related scenarios. While such fiction themes aren't new at all, it appears to me that there's a wave of them now, although it's also possible that I am just more cognisant of it.
I think that's another reason your prediction will come true: sooner or later demand for this sub-genre will peak, as many people eventually get bored with it as a fiction theme too. Like what happened with e.g. vampires and zombies.
(¹ Not sure whether "curation" is even human-sourced these days. The overall state of curation, genre-sorting, tagging and algorithmic "recommendations" in commercial books and audiobooks is so terrible... but that's a different rant for another day.)
If someone ever creates the world's worst playlist, this would play right after RMS's Free Software Song.
For the part on generative AI skills as a job requirement: I just came across this, and it's beautiful. Made even better by the reply from an audiobook narrator.
Amazon publishes Generative AI Adoption Index and the results are something! And by "something" I mean "annoying".
I don't know how seriously I should take the numbers, because it's Amazon after all and they want to make money with this crap, but on the other hand they surveyed "senior IT decision-makers"... and my opinion of that crowd isn't the highest either.
Highlights:
- Prioritizing spending on GenAI over spending on security. Yes, that is not going to cause problems at all. I do not see how this could go wrong.
- The junk chart about "job roles with generative AI skills as a requirement". What the fuck does that even mean, what is the skill? Do job interviews now include a section where you have to demonstrate promptfondling "skills"? (Also, the scale of the horizontal axis is wrong, but maybe no one noticed because they were so dazzled by the bars being suitcases for some reason.)
- Cherry on top: one box to the left they list "limited understanding of generative AI skilling needs" as a barrier for "generative AI training". So yeah...
- "CAIO". I hate that I just learned that.
I'm not sure I want to know, but what is the connection between beef tallow and fascism? Is it related to the whole seed oil conspiracy? Or is it one of those imagined ultra-manly masculine-man things for maxxing meat intake? (I'm losing track of all the insane bullshit, there's just too much.)
The myth of the "10x programmer" has broken the brains of many people in software. They appear to think that it's all about how much code you can crank out, as fast as possible. Taking some time to think? Hah, that's just a sign of weakness, not necessary for the ultra-brained.
I don't hear artists or writers and such bragging about how many works they can pump out per week. I don't hear them gluing their hands to the pen of a graphing plotter to increase the speed of drawing. How did we end up like this in programming?
Update on my comment from yesterday: it seems I fell for satire (?). (I don't know the people involved, so no idea, but it seems plausible.)
I hate this position so much, claiming that it's because "the left" wanted "too much". That's not only morally bankrupt, it's factually wrong too. And also ignorant of historical examples. It's lazy and rotten thinking all the way through.
Oh! Wasn't aware of that podcast. Yeah, could be!
Warning: you might regret reading this screenshot of elno posting a screenshot. (cw: chatbots in sexual context)
oh noooo no no no
...but that brings me back to questions about "what does interaction with LLM chatbots do to human brains".
EDIT: as pointed out by Soyweiser below, the lower reply in the screenshot is probably satire.
Really great article, IMO the best on pivot-to-ai so far! The rest of the tech media is mostly useless on these issues, thank you so much for doing this.
If markets really rewarded the best, they would have rewarded Opera way more. (By which I mean the original Opera, up to version 12, not the terrible Chromium-based thing that has its name slapped on it today. Do not use that one, it's bad.)
Much more important for Chrome's success than "being the best" (when has that ever been important in the tech industry?) was Google's massive marketing campaign. Heck, back when Chrome was new, they even had large billboard ads for it around here, i.e. physical billboards in the real world. And "here" is a medium-sized city in Europe, not Silicon Valley or anything... I never saw any other web browser being advertised on freaking billboards.
Hope he remembers this if some day he ends up in a nursing home where all the staff have been replaced with Tesla Optimus robots powered by "AI".
Yeah. Also, I'm always confused by how the AI becomes "all-powerful"... like, how does that happen? I feel like there are a few missing steps there.
Microsoft brags about the amount of technical debt they're creating. Either they're lying and the number is greatly exaggerated (very possible), or this will eventually destroy the company.
I'm quite happy with my electric razor though ;) But yeah, single-use plastic products, and their implicit "subscription"-like business model, are a good analogy.
I also no longer expect that Gen-"AI" will go away entirely; it's too useful for generating low-quality crap for e.g. spam and disinformation and similar purposes. I also dread the thought that when I buy a translated book now, I won't know how much of it was actually translated by a person.
However, I still have hope left that it will eventually become more of a background noise. Like how cryptocurrency still exists now, but at least we don't have to hear anymore about how "NFTs are the future of art" (just remember what a common theme that was for a while, pretty recently). Likewise I think that "AI is the future of <creative thing>" will eventually fade away.
And some people are already creating and spreading little "human made" seals that one can attach to projects. I hope that catches on, like the labeling of food products. And not just in niches like open source software (where I've seen it so far), but widely across all kinds of creative things, like book translations and music and so on. I can hope, right?
Once the big hype is over, when the bubble has burst, the absolutely enormous costs of running all the server farms will have to be passed on to the product-making companies, and they will have to pass them on further to their users. As a result I think that most of these "AI" "features" will be pulled from most products, because who's really willing to pay for them? And I don't expect that this will become cheap soon. In their desperate attempts to make their "AI" perform "better", the companies are currently cranking up the usage of compute power to an ever-higher degree, because they're otherwise out of ideas for how to improve anything about it. And from what I hear (out of principle I never use this stuff myself), the small models which one could run locally just aren't very good (not that the big ones are "good"...). (However, as written above, they will always be good enough for spam/disinformation and the like, where quality doesn't matter.)
So I don't believe that this will be like the 80s or 90s, where one could develop fancy big software with the expectation that within a few years even the cheap entry-level machines will be fast enough for it. That kind of performance progress stopped long ago. I expect that this will stay really expensive for the foreseeable future, at least for the "better" models. And then, maybe, with most of this crap pulled out of our tools for plain and simple reasons of "cost", together with the collapse of the hype around it, we can go back to this being mostly background noise.
Yeah, I've always been kind of a hopeless optimist...
On the (slim) upside, it's an opportunity to ditch Google, and maybe it will sooner or later break their monopoly position. I switched my main search engine to Ecosia a while ago; I think it uses Bing underneath (meh), but presumably it's more privacy-friendly than Google (or Bing directly). I've made numerous attempts over the years to get away from Google, but I always returned, because the search results were just so much better (especially for non-English stuff). But now Google has gotten so much worse that it has created almost an equilibrium... sometimes it's still useful and better, but not that often anymore. So I rarely go to Google now, not because the others got better, but because Google got so much worse.
(found here) O'Reilly is going to publish a book, "Vibe Coding: The Future of Programming".
In the past, they have published some of my favourite computer/programming books... but right now, my respect for them is in free fall.