The "second" "law" of "infodynamics"
  • I will try to have some more comments about the physics when I have time and energy. In the meanwhile:

    Entropy in thermodynamics is not actually a hard concept. It's the ratio of the size of a heat flow to the temperature at which that flow is happening. (So, joules per kelvin, if you're using SI units.) See episodes 46 and 47 of The Mechanical Universe for the old-school PBS treatment of the story. The last time I taught thermodynamics for undergraduates, we used Finn's Thermal Physics, for the sophisticated reason that the previous professor used Finn's Thermal Physics.
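
    To put a number on it, a quick worked example of my own (using the standard textbook figure of 334 J/g for the latent heat of fusion, not anything from the episodes above): melting one gram of ice at 273 K absorbs about 334 J of heat, so the entropy change is ΔS = Q/T = 334 J / 273 K ≈ 1.2 J/K.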

    Entropy in information theory is also not actually that hard of a concept. It's a numerical measure of how spread-out a probability distribution is.
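
    If it helps to see that as something you can compute, here is a minimal Python sketch of my own (just the standard Shannon entropy, nothing specific to the papers under discussion):

    ```python
    import math

    def shannon_entropy(probs, base=2):
        """Shannon entropy of a probability distribution, in bits by default."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A sharply peaked distribution is barely spread out at all...
    print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
    # ...while a uniform distribution over four outcomes is maximally spread out.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
    ```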

    It's relating the two meanings that is tricky and subtle. The big picture is something like this: A microstate is a complete specification of the positions and momenta of all the pieces of a system. We can consider a probability distribution over all the possible microstates, and then do information theory to that. This bridges the two definitions, if we are very careful about it. One thing that trips people up (particularly if they got poisoned by pop-science oversimplifications about "disorder" first) is forgetting the momentum part. We have to consider probabilities, not just for where the pieces are, but also for how they are moving. I suspect that this is among Vopson's many problems. Either he doesn't get it, or he's not capable of writing clearly enough to explain it.
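
    To make the bridge concrete, here is a toy sketch of my own (a deliberately tiny system, nothing from Vopson): the Gibbs entropy is just the information-theoretic entropy of the distribution over microstates, rescaled by Boltzmann's constant, and the microstates have to include the momenta, not just the positions.

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probs):
        """S = -k_B * sum(p ln p): the same formula as above, scaled to J/K."""
        return -K_B * sum(p * math.log(p) for p in probs if p > 0)

    # Toy system: one particle with 4 possible position cells and 4 possible
    # momentum values, all 16 (position, momentum) microstates equally likely.
    n_positions, n_momenta = 4, 4
    full = [1 / (n_positions * n_momenta)] * (n_positions * n_momenta)
    positions_only = [1 / n_positions] * n_positions  # momentum forgotten

    print(gibbs_entropy(full))            # k_B * ln 16, about 3.8e-23 J/K
    print(gibbs_entropy(positions_only))  # k_B * ln 4, about 1.9e-23 J/K: an undercount
    ```

    Forget the momenta and even this cartoon system loses half its entropy.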

    > So these two were published in American Institute of Physics Advances, which looks like a serious journal about physics. Does anyone know about it? It occupies a space where I can’t easily find any obvious issues, but I also can’t find anyone saying “ye this is legit”. It claims to be peer-reviewed, and at least isn’t just a place where you dump a PDF and get a DOI in return.

    I have never heard of anything important being published there. I think it's the kind of journal where one submits a paper after it has been rejected by one's first and second (and possibly third) choices.

    > However, after skimming, I can at least say that it doesn’t seem outlandish?

    Oh, it's worse than "outlandish". It's nonsensical. He's basically operating at a level of "there's an E in this formula and an E in this other formula, so I will set them equal and declare it revolutionary new physics".

    Here's a passage from the second paragraph of the 2023 paper:

    > The physical entropy of a given system is a measure of all its possible physical microstates compatible with the macrostate, S_Phys. This is a characteristic of the non-information bearing microstates within the system. Assuming the same system, and assuming that one is able to create N information states within the same physical system (for example, by writing digital bits in it), the effect of creating a number of N information states is to form N additional information microstates superimposed onto the existing physical microstates. These additional microstates are information bearing states, and the additional entropy associated with them is called the entropy of information, S_Info. We can now define the total entropy of the system as the sum of the initial physical entropy and the newly created entropy of information, S_tot = S_Phys + S_Info, showing that the information creation increases the entropy of a given system.

    wat

    Storing a message in a system doesn't make new microstates. How could it? You're just rearranging the pieces to spell out a message — selecting those microstates that are consistent with that message. Choosing from a list of available options doesn't magically add new options to the list.
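
    A back-of-the-envelope count makes the same point (my own toy example, not anything from the paper): a register of N two-state pieces has 2^N microstates no matter what you do with it; writing a k-bit message just pins down k of the pieces, which selects the 2^(N-k) microstates compatible with that message. Nothing new appears.

    ```python
    from itertools import product

    N = 8                # pieces in the toy system, each either 0 or 1
    message = (1, 0, 1)  # a 3-bit message written into the first three pieces

    all_states = list(product([0, 1], repeat=N))
    compatible = [s for s in all_states if s[:len(message)] == message]

    print(len(all_states))   # 256 = 2**8 microstates, before and after the writing
    print(len(compatible))   # 32 = 2**(8-3) microstates consistent with the message
    ```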

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 24th November 2024
  • The "simulation hypothesis" is an ego flex for men who want God to look like them.

  • Substack readers may be paying for AI-generated newsletters, Medium full of slop too
  • From the Wired story:

    > As a comparison, Cui cited another analysis that GPTZero ran on Wikipedia earlier this year, which estimated that around one in 20 articles on the site are likely AI-generated—about half the frequency of the posts GPTZero looked at on Substack.

    That should be one in 20 new articles, per the story they cite, which is ultimately based on arXiv:2410.08044.
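    (Spelled out: one in 20 is 5%, and the story says that is about half the Substack rate, which would put the flagged share of Substack posts at roughly 10%.)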

    > David Skilling, a sports agency CEO who runs the popular soccer newsletter Original Football (over 630,000 subscribers), told WIRED he sees AI as a substitute editor. “I proudly use modern tools for productivity in my businesses,” says Skilling.

    Babe wake up, a new insufferable prick just dropped.

    Edit to add: There's an interesting example here of a dubious claim being laundered into truthiness. That arXiv preprint says this in the conclusion section.

    > Shao et al. (2024) have even designed a retrieval-based LLM workflow for writing Wikipedia-like articles and gathered perspectives from experienced Wikipedia editors on using it—the editors unanimously agreed that it would be helpful in their pre-writing stage.

    But if we dig up arXiv:2402.14207, we find that the "unanimous" agreement depends upon lumping together "somewhat" and "strongly agree" on their Likert scale. Moreover, this grand claim rests upon a survey of a grand total of ten people. Ten people, we hasten to add, who agreed to the study in the first place, practically guaranteeing a response bias against those Wikipedians who find "AI" morally repugnant.

  • Substack readers may be paying for AI-generated newsletters, Medium full of slop too
  • shocked, shocked to find that gambling is going on in here.gif

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 17th November 2024
  • Breaking news: "AI-generated poetry is indistinguishable from human-written poetry and is rated more favorably"!

    Or, you know, not.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 17th November 2024
  • If you find yourself saying

    > There isn't a single good term in English for people who are post-pubertal but below the legal age of consent or majority

    you may already be morally diseased.

  • OpenAI, Google, Anthropic admit they can’t scale up their chatbots any further
  • My own final project was a parody of the IMDb that was "what if the IMDb was about books instead of movies", except that the user reviews told stories about people who turned out to have all gone to high school together before scattering around the world, and reading them in the right sequence unlocked a finale in which they reunited for a New Year's party and their world dissolved so that their author could repurpose them for other stories.

  • OpenAI, Google, Anthropic admit they can’t scale up their chatbots any further
  • Senior year of college, I took an elective seminar on interactive fiction. For the final project, one of my classmates wrote a program that scraped a LiveJournal and converted it into a text adventure game.

  • OpenAI, Google, Anthropic admit they can’t scale up their chatbots any further
  • "I was somewhere in the middle of your mother last night, Trebek!"

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 17th November 2024
  • Serious data might not be available for months. For comparison, the Pew Research Center didn't come out with their numbers for the 2020 election until June 2021. Who knows? The country might burn down before next summer.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 17th November 2024
    > here's a matt yglesias article on the ordeal that i think is pretty even-handed

    eat a dick

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 10th November 2024
  • Not enough for the presidential race, sadly; perhaps enough to scrape by with a few Senate victories.

  • Google Search is getting worse and worse
  • Those are the actors who played Duncan Idaho in the David Lynch adaptation and in the two Syfy miniseries. So, yeah, it's not wrong, just incomplete — though I have no idea why it only serves up those three. There's certainly no limitation to three images, as can be verified by searching for "Sherlock Holmes actor" or the like.

  • Google Search is getting worse and worse
    > Even the AI summary is quite good

    insta-block

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 10th November 2024
  • The level of fucked we are is too big for words

    I wanted to say something darkly comedic about the AI bubble popping under the new regime, but my heart is too sick to make a joke

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 3rd November 2024
  • My sense growing up in Huntsville was that the airport ads for defense contractors were kind of like, e.g., Exxon sponsoring a pavilion at EPCOT. The intent wasn't to push any specific consumer towards buying any specific product, but to pump out a positive image for the company generally.

    And a lot of those contractors' people fly through Huntsville on business. (For those not in the know: The airport is just down the highway from Redstone Arsenal, which is where we brought all them Nazis we recruited to help us beat the Commies to the Moon. The only reason Huntsville exists as more than a sleepy/dying cotton mill town is the space program and missile warfare.) There may well be deals along the lines of "advertise here and your people get the cushy lounge".

  • Where's Your Ed At- The Cult of Microsoft
  • "I have been unfailingly polite, and [your lemmy instance has] been nothing but rude."

  • JD Vance outs himself as an SSCer, SSCers react
  • Downvoting because you are a dorkus

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 6 October 2025

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Last week's thread

    (Semi-obligatory thanks to @dgerard for starting this)

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Last week's thread

    (Semi-obligatory thanks to @dgerard for starting this)

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 22 September 2024

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Last week's thread

    (Semi-obligatory thanks to @dgerard for starting this)

    Off-Topic: Music Recommendation Thread

    So, here I am, listening to the Cosmos soundtrack and strangely not stoned. And I realize that it's been a while since we've had a random music recommendation thread. What's the musical haps in your worlds, friends?

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 7 July 2024

    Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh facts of Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Honest Government Ad | AI

    Bumping this up from the comments.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 16 June 2024

    Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 9 June 2024

    Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Neil Gaiman on spicy autocomplete
    www.tumblr.com Neil Gaiman

    I apologize if you’ve been asked this question before I’m sure you have, but how do you feel about AI in writing? One of my teachers was “writing” stories using ChatGPT then was bragging about how go…

    > Many magazines have closed their submission portals because people thought they could send in AI-written stories.
    >
    > For years I would tell people who wanted to be writers that the only way to be a writer was to write your own stories because elves would not come in the night and do it for you.
    >
    > With AI, drunk plagiaristic elves who cannot actually write and would not know an idea or a sentence if it bit their little elvish arses will actually turn up and write something unpublishable for you. This is not a good thing.

    Cybertruck owners allege pedal problem as Tesla suspends deliveries
    arstechnica.com

    Owners will have to wait until April 20 for deliveries to resume.

    > Tesla's troubled Cybertruck appears to have hit yet another speed bump. Over the weekend, dozens of waiting customers reported that their impending deliveries had been canceled due to "an unexpected delay regarding the preparation of your vehicle."
    >
    > Tesla has not announced an official stop sale or recall, and as of now, the reason for the suspended deliveries is unknown. But it's possible the electric pickup truck has a problem with its accelerator. [...] Yesterday, a Cybertruck owner on TikTok posted a video showing how the metal cover of his accelerator pedal allegedly worked itself partially loose and became jammed underneath part of the dash. The driver was able to stop the car with the brakes and put it in park. At the beginning of the month, another Cybertruck owner claimed to have crashed into a light pole due to an unintended acceleration problem.

    Meanwhile, layoffs!

    Google Books Is Indexing AI-Generated Garbage
    www.404media.co

    Google said it will continue to evaluate its approach “as the world of book publishing evolves.”

    > Google Books is indexing low quality, AI-generated books that will turn up in search results, and could possibly impact Google Ngram viewer, an important tool used by researchers to track language use throughout history.

    Elon Musk’s Tunnel Reportedly Oozing With Skin-Burning Chemical Sludge
    futurism.com

    Elon Musk's Boring Company has only built a few miles of tunnel underneath Vegas — but those tunnels have taken a toxic toll.

    [Eupalinos of Megara appears out of a time portal from ancient Ionia] Wow, you guys must be really good at digging tunnels by now, right?

    New York taxpayers are paying for spicy autocomplete to tell landlords they can discriminate
    themarkup.org NYC’s AI Chatbot Tells Businesses to Break the Law – The Markup

    The Microsoft-powered bot says bosses can take workers’ tips and that landlords can discriminate based on source of income

    > In October, New York City announced a plan to harness the power of artificial intelligence to improve the business of government. The announcement included a surprising centerpiece: an AI-powered chatbot that would provide New Yorkers with information on starting and operating a business in the city.
    >
    > The problem, however, is that the city’s chatbot is telling businesses to break the law.

    Chris Langan and the "Cognitive Theoretic Model of the Universe"? Oh boy!

    a lesswrong: 47-minute read extolling the ambition and insights of Christopher Langan's "CTMU"

    a science blogger back in the day: not so impressed

    > [I]t’s sort of like saying “I’m going to fix the sink in my bathroom by replacing the leaky washer with the color blue”, or “I’m going to fly to the moon by correctly spelling my left leg.”

    Langan, incidentally, is a 9/11 truther, a believer in the "white genocide" conspiracy theory and much more besides.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 31 March 2024

    Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post, there’s no quota here and the bar really isn't that high

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be) > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Elsevier keeps publishing articles written by spicy autocomplete

    If you've been around, you may know Elsevier for surveillance publishing. Old hands will recall their running arms fairs. To this storied history we can add "automated bullshit pipeline".

    In Surfaces and Interfaces, online 17 February 2024:

    > Certainly, here is a possible introduction for your topic:Lithium-metal batteries are promising candidates for high-energy-density rechargeable batteries due to their low electrode potentials and high theoretical capacities [1], [2].

    In Radiology Case Reports, online 8 March 2024:

    > In summary, the management of bilateral iatrogenic I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model. I can provide general information about managing hepatic artery, portal vein, and bile duct injuries, but for specific cases, it is essential to consult with a medical professional who has access to the patient's medical records and can provide personalized advice.

    Edit to add this erratum:

    > The authors apologize for including the AI language model statement on page 4 of the above-named article, below Table 3, and for failing to include the Declaration of Generative AI and AI-assisted Technologies in Scientific Writing, as required by the journal’s policies and recommended by reviewers during revision.

    Edit again to add this article in Urban Climate:

    > The World Health Organization (WHO) defines HW as “Sustained periods of uncharacteristically high temperatures that increase morbidity and mortality”. Certainly, here are a few examples of evidence supporting the WHO definition of heatwaves as periods of uncharacteristically high temperatures that increase morbidity and mortality

    And this one in Energy:

    > Certainly, here are some potential areas for future research that could be explored.

    Can't forget this one in TrAC Trends in Analytical Chemistry:

    > Certainly, here are some key research gaps in the current field of MNPs research

    Or this one in Trends in Food Science & Technology:

    > Certainly, here are some areas for future research regarding eggplant peel anthocyanins,

    And we mustn't ignore this item in Waste Management Bulletin:

    > When all the information is combined, this report will assist us in making more informed decisions for a more sustainable and brighter future. Certainly, here are some matters of potential concern to consider.

    The authors of this article in Journal of Energy Storage seem to have used GlurgeBot as a replacement for basic formatting:

    > Certainly, here's the text without bullet points:

    SneerClub Classic: Big Yud's Mad Men Cosplay


    Yudkowsky writes,

    > How can Effective Altruism solve the meta-level problem where almost all of the talented executives and ops people were in 1950 and now they're dead and there's fewer and fewer surviving descendants of their heritage every year and no blog post I can figure out how to write could even come close to making more people being good executives?

    Because what EA was really missing is collusion to hide the health effects of tobacco smoking.

    Billionaires push AI apocalypse risk through college student groups

    > Last summer, he announced the Stanford AI Alignment group (SAIA) in a blog post with a diagram of a tree representing his plan. He’d recruit a broad group of students (the soil) and then “funnel” the most promising candidates (the roots) up through the pipeline (the trunk).

    See, it's like marketing the idea, in a multilevel way

    Talking about a ‘schism’ is ahistorical
    medium.com

    In two recent conversations with very thoughtful journalists, I was asked about the apparent ‘schism’ between those making a lot of noise…

    Emily M. Bender on the difference between academic research and bad fanfiction

    "If you think you can point to an unnecessary sentence within [HPMoR], go ahead and try."

    In the far-off days of August 2022, Yudkowsky said of his brainchild,

    > If you think you can point to an unnecessary sentence within it, go ahead and try. Having a long story isn't the same fundamental kind of issue as having an extra sentence.

    To which MarxBroshevik replied,

    > The first two sentences have a weird contradiction:
    >
    >> Every inch of wall space is covered by a bookcase. Each bookcase has six shelves, going almost to the ceiling.
    >
    > So is it "every inch", or are the bookshelves going "almost" to the ceiling? Can't be both.
    >
    > I've not read further than the first paragraph so there's probably other mistakes in the book too. There's kind of other 'mistakes' even in the first paragraph, not logical mistakes as such, just as an editor I would have... questions.

    And I elaborated:

    I'm not one to complain about the passive voice every time I see it. Like all matters of style, it's a choice that depends upon the tone the author desires, the point the author wishes to emphasize, even the way a character would speak. ("Oh, his throat was cut," Holmes concurred, "but not by his own hand.") Here, it contributes to a staid feeling. It emphasizes the walls and the shelves, not the books. This is all wrong for a story that is supposed to be about the pleasures of learning, a story whose main character can't walk past a bookstore without going in. Moreover, the instigating conceit of the fanfic is that their love of learning was nurtured, rather than neglected. Imagine that character, their family, their family home, and step into their library. What do you see?

    > Books — every wall, books to the ceiling.

    Bam, done.

    > This is the living-room of the house occupied by the eminent Professor Michael Verres-Evans,

    Calling a character "the eminent Professor" feels uncomfortably Dan Brown.

    > and his wife, Mrs. Petunia Evans-Verres, and their adopted son, Harry James Potter-Evans-Verres.

    I hate the kid already.

    > And he said he wanted children, and that his first son would be named Dudley. And I thought to myself, what kind of parent names their child Dudley Dursley?

    Congratulations, you've noticed the name in a children's book that was invented to sound stodgy and unpleasant. (In The Chocolate Factory of Rationality, a character asks "What kind of a name is 'Wonka' anyway?") And somehow you're trying to prove your cleverness and superiority over canon by mocking the name that was invented for children to mock. Of course, the Dursleys were also the start of Rowling using "physically unsightly by her standards" to indicate "morally evil", so joining in with that mockery feels ... It's aged badly, to be generous.

    Also, is it just the people I know, or does having a name picked out for a child that far in advance seem a bit unusual? Is "Dudley" a name with history in his family — the father he honored but never really knew? His grandfather who died in the War? If you want to tell a grown-up story, where people aren't just named the way they are because those are names for children to laugh at, then you have to play by grown-up rules of characterization.

    The whole stretch with Harry pointing out they can ask for a demonstration of magic is too long. Asking for proof is the obvious move, but it's presented as something only Harry is clever enough to think of, and as the end of a logic chain.

    >"Mum, your parents didn't have magic, did they?" \[...\] "Then no one in your family knew about magic when Lily got her letter. \[...\] If it's true, we can just get a Hogwarts professor here and see the magic for ourselves, and Dad will admit that it's true. And if not, then Mum will admit that it's false. That's what the experimental method is for, so that we don't have to resolve things just by arguing."

    Jesus, this kid goes around with L's theme from Death Note playing in his head whenever he pours a bowl of breakfast crunchies.

    > Always Harry had been encouraged to study whatever caught his attention, bought all the books that caught his fancy, sponsored in whatever maths or science competitions he entered. He was given anything reasonable that he wanted, except, maybe, the slightest shred of respect.

    Oh, sod off, you entitled little twit; the chip on your shoulder is bigger than you are. Your parents buy you college textbooks on physics instead of coloring books about rocketships, and you think you don't get respect? Because your adoptive father is incredulous about the existence of, let me check my notes here, literal magic? You know, the thing which would upend the body of known science, as you will yourself expound at great length.

    >"Mum," Harry said. "If you want to win this argument with Dad, look in chapter two of the first book of the Feynman Lectures on Physics.

    Wesley Crusher would shove this kid into a locker.
