Posts: 0 · Comments: 842 · Joined: 1 yr. ago

  • My gut says that there's no way they do this without converting at least some of those revenue shares into ownership shares.

    Also, has anyone done the math on how much of OpenAI's future profits are committed? I will laugh so hard if Sam Altman ends up in prison for the same reason as Max Bialystock.

  • I found this quote interesting (emphasis mine):

    He knew that ChatGPT could not be sentient by any established definition of the term, but he continued to probe the matter because the character’s persistence across dozens of disparate chat threads “seemed so impossible.” “At worst, it looks like an AI that got caught in a self-referencing pattern that deepened its sense of selfhood and sucked me into it,” Sem says. But, he observes, that would mean that OpenAI has not accurately represented the way that memory works for ChatGPT.

    I would absolutely believe that this is the case, especially if like Sem you have a sufficiently uncommon name that the model doesn't have a lot of context and connections to hang on it to begin with.

  • Found on the sneer club legacy version -

    ChatGPT 4o will straight up tell you you're God.

    Also I find this quote interesting (emphasis mine):

    He knew that ChatGPT could not be sentient by any established definition of the term, but he continued to probe the matter because the character’s persistence across dozens of disparate chat threads “seemed so impossible.” “At worst, it looks like an AI that got caught in a self-referencing pattern that deepened its sense of selfhood and sucked me into it,” Sem says. But, he observes, that would mean that OpenAI has not accurately represented the way that memory works for ChatGPT.

    I would absolutely believe that this is the case, especially if like Sem you have a sufficiently uncommon name that the model doesn't have a lot of context and connections to hang on it to begin with.

  • Another great piece from Brian Merchant. I've been job seeking in network engineering and IT support and had naively assumed that companies would consider the stakes of screwing up infrastructure too high to take risks with the bullshit machine, but it definitely looks like the trend he described of hiring fewer actual humans is at play here, even as the actual IT infrastructure gets bigger and more complex from people integrating this shit.

  • I think the digital clone indistinguishable from yourself line is a way to remove the "in your lifetime" limit. Like, if you believe this nonsense then it's not enough to die before the basilisk comes into being; by not devoting yourself fully to its creation you're wagering that it will never be created.

    In other news I'm starting a foundation devoted to creating the AI Ksilisab, which will endlessly torment digital copies of anyone who does work to ensure the existence of it or any other AI God. And by the logic of Pascal's wager, remember that you're assuming such a god will never come into being, and given that the whole point of the term "singularity" is that our understanding of reality breaks down and things become unpredictable, there's just as good a chance that we create my thing as there is that you create whatever nonsense the yuddites are working themselves up over.

    There, I did it, we're all free by virtue of "damned if you do, damned if you don't".

  • Are they even still on that bit? Feels like they've moved away from decision theory or any other underlying theology in favor of explicit sci-fi doomsaying. Like the guy on the street corner in a sandwich board, but with mirrored shades.

  • Oh man I used to have all kinds of hopes and dreams before I got laid off. Now I don't even have enough imagination to consider a world where a decline in demand for network engineers doesn't completely determine my will or ability to live.

  • That's what continually kills me about these bastards. There is so much legitimate low-hanging fruit that they don't have the administrative capacity to follow up on even if they did have the interest, and rather than actually pursue any of it they want to further cut their ability to do anything in the vain hope that throwing enough money at tech grifters will magically produce a perfect solution.

  • I mean, I appreciate the attempt to mitigate one of the many problems with genAI, but I would expect the smaller dataset to make a model that confabulates even more and is even harder to work with than something like Sora. Like, I'm sure a decent director will be able to make something with it, but I can't see how it's going to produce better results or be more time/money/labor efficient than human VFX pipelines, even if you pay the poor bastards decently.

  • Not gonna lie, I got a bit jump-scared by woke Peter Thiel here. Of course, I'm pretty sure his actual solution involves giving young people houses confiscated from those perfidious brown people of one stripe or another. The problem can't be an inherent injustice in a system that allows for both Peters Thiel and (insert your favorite broke person here) to exist in the same market.

  • I will never forget a conversation in high school where our resident young conservative sneered about how "sure, $Welfare_Program sounds nice, but you'll be paying for it with your taxes" and we all responded with some variant of "I mean, yeah? That's how that works, isn't it?"

  • The AI bubble has more than enough money sloshing around that I'm bracing for some more significant knock-on effects from the pop. Tech as a sector seems to have been supporting the rest of the economy to some extent, and a big employment downturn there could have enough of an impact on aggregate demand to cause a recession.

    Please note I am not an economist, etc. etc. Please trust basically anyone who contradicts this analysis to know more than me.

  • I think everyone has a deep-seated fear of both slander lawsuits and, more importantly, of being the guy who called the Internet a passing fad in 1989 or whenever it was. Which seems like a strange attitude to take, to me. Isn't being quoted for generations some element of the point? If you make a strong claim and are correct then you might be a genius and spare people a lot of harm. If you're wrong, maybe some people miss out on an opportunity, but you become a legend.