Love it or hate it (I hate it) … humanity entered a new era with this shit. It’s everywhere and most want it. Like plastic or nuclear radiation or fire or metal work … all the things that mark a point in time in the geological record … AI created content now marks the beginning of this era.
It may well simply mark the beginning of the 3rd Millennium. The 1900s are now dead and we’re on new shores.
I’m still disturbed by how little people think about it ethically and structurally. All I’m seeing is consumption and tech hype.
Pushback is futile at this point. My boss uses it at every opportunity in his quest to climb higher up the career ladder, and it’s making me want to quit, because what the fuck is the point of all this shit if thinking is moot?
It's so wild to me that so many people apparently hate the idea of learning something new, to the point that they'll let an algorithm do their thinking for them. Kids not wanting to do their assignments I understand, but it seems so many adults just flat out hate the meat thing in their skull and wish it would atrophy away into nothing.
As I’m sure you guys know by now, it is extremely difficult to stay alert and attentive, instead of getting hypnotised by the constant monologue inside your own head (may be happening right now). Twenty years after my own graduation, I have come gradually to understand that the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about “the mind being an excellent servant but a terrible master.”
This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in: the head. They shoot the terrible master. And the truth is that most of these suicides are actually dead long before they pull the trigger.
As pessimistic as I am, I would never have thought the public's reaction to a tech-related Pandora's box situation would be "Ethical and other problems? Blah blah blah. Open it! And there's gotta be more than one box, right? Open them all! Why wait?"
Yep. Just the other day someone dumped an AI response into chat out of nowhere. When I said I hadn’t wanted to call it out, since it might have been rude to point out they couldn’t have known that stuff themselves … their reply was that it’s everywhere now, so it couldn’t be rude.
And they’re likely right. But my god it shocked me.
Yes, but the uptake of this feels categorically different, like some new consumerism has been unlocked, and of course streaming and Uber are still here, aren’t they?
Spatial memory critically relies on the hippocampus, and people who use these strategies have greater fMRI BOLD activity and greater grey matter in the hippocampus. Our findings suggest that people with greater GPS habits may rely less on their hippocampus for navigation, as they exhibit a reduced use of spatial memory strategies, reduced cognitive mapping abilities, reduced landmark encoding, and as they have more difficulty learning navigational information. This is consistent with a recent study by Spiers’ group, in which participants who were given instructions on where to turn at decision points while navigating in a film simulation, akin to using GPS, exhibited less fMRI BOLD activity in the hippocampus than when participants self-guided and had to make decisions unaided.
I’m so stoked for that to happen to the prefrontal cortex in our ChatGPT society dude, it’s gonna be so rad
All the teens I work with openly admit to doing this.
I'm also in college right now and there was this international student I was in class with last semester who straight up could not speak English, and I saw him using AI all the time. Last day of class he did a presentation that was totally incoherent cuz it was clearly written by ChatGPT.
The problem is the euphemism here. I can't imagine more than a tenth of the people who use LLMs on their homework are restrained enough not to copy the output, i.e. actually using it to help rather than to do the work. And part of the issue is that you'd have to be stupid not to just use the output directly if you know how to avoid getting caught. Only students who value learning for its own sake will ever bother using these tools appropriately, and capitalism has only ever taught children that education is a means to an end (getting hired), so there's only more cheating ahead.
Being honest, I’ve found it helpful for rewording complex ideas into simpler language so I can get another perspective on books I’m reading, or for giving me references/jumping-off points for more reading. The idea of an interactive book is quite cool, even if utterly unreliable thanks to the architecture of LLMs inevitably leading to hallucinations (a misleading term - everything is a hallucination).
But our ignorant, short term, profit driven culture means that I’m a fucking oddity for doing that and not just using it to replace thinking.
I think ChatGPT is a good tool for studying, similar to how saying things aloud/writing things out is a good tool for studying because it forces you to think, except with ChatGPT you can have a back and forth. Verifying the output for truth also seems like a good exercise for learning (as long as you don't use ChatGPT to verify itself).
The issue I see is that you need some base of critical thinking skills and general knowledge to work from in order to get the most out of ChatGPT, and I doubt most students are at that level. I don't even consider myself at that level. I assume the worst: students are just taking the output and rewording it.
If we collectively agree ChatGPT is here to stay, then maybe we ought to teach how to use it in a productive way that doesn't dumb us down. I know this isn't a well-defined thing, but it would probably be worth researching. We live under capitalism though, so the odds of that happening are next to none, but I'm thinking about what we should do if ChatGPT existed under a different system.
As for the horrifying shit I’ve seen from LLMs: I have to let it slide so long as it’s not forced on anyone. Hell, Google’s AI crap would be a lot more tolerable if there were a third button right next to “I’m feeling lucky” that just said “Try Gemini today!”
Like if I can ask a specific question and have a concept explained to me. Cool! Now it’s just a matter of making that more environmentally friendly.
tbh this is one of those instances im not too worried about. I fucking hated homework more than anything as a kid, and did basically whatever i could to avoid it, i would calculate how many assignments i could skip or leave half undone etc. while still maintaining an A. honestly not sure if i would have even used chatgpt because it would have meant copying the answers down lol. also as a sidenote i think thats why math always used to be my favorite class, the homework would only be worth 10% of the grade so i could basically ignore it and just do a few problems here and there during class and be good lol
all that to say i think homework is 99% of the time very dumb and useless and im glad kids are using the latest and greatest tech to get around doing it lol. if you want kids to learn discipline or something like that it should be part of an extracurricular like sports or music
I know people who have outsourced basic literacy to AI. Like they’re too lazy to actually read through the results page on google so they have ChatGPT summarize it for them. They won’t look for an actual recipe but ask ChatGPT to make one for them.
But seriously everything has to be a "hack" or a "shortcut" or whatever and that's not good. The way capitalism creates an urgency in all things in order to monetize it has gotta be bad for our brains.
I went to a technical college recently, and I feel like pretty much everyone but me used ChatGPT for every short answer or essay question. And not like... bouncing back and forth (or bypassing industrial copyright; idk why Australian Standards are paywalled), but literally just asking it and copying and pasting. And then watching multiple arguments between the class and the lecturers about how that's what they'd be doing in industry (rather than, say, those jobs not existing because management doesn't need to hire people just to put things into ChatGPT).