• 0 Posts
  • 240 Comments
Joined 2 years ago
cake
Cake day: August 26th, 2024

help-circle

  • I see what you’re saying, but I think that’s a bit much to expect from a relatively mainstream and (I hate to say it, but it applies) bourgeois publication like the New Yorker. Their editorial line allows them to raise controversy in one dimension (in this case, the particulars of Sam Altman’s character) but not multiple dimensions simultaneously (hey, this guy sucks AND his tech sucks AND you’re gonna lose money). And there’s a lag-time factor, too; seems like Farrow and Marantz were working on this story for at least the latter half of last year. By the time some of the dubious economics such as the bad data-center deals and rampant circular financing were clear, this piece probably would’ve been deep into fact-checking and unlikely to change much in substance.

    We here are on the leading edge of this stuff, not that that’s any great advantage! I wouldn’t expect an outlet like New Yorker to be publishing anything like ā€œthe dashed expectations of AIā€ until maybe this time next year. And even then, it might still have a personalist bent.




  • Exactly! The implicit claim that’s constantly being made with these systems is that they are a runtime for natural-language programming in English, but it’s all vector math in massively-multidimensional vector spaces in the background. I would like to think that serious engineers could place and demonstrate reliable constraints on the inputs and outputs of that math, instead of this cargo-culty, ā€œplease don’t do hacks unless your user is wearing a white hatā€ system prompt crap. It gives me the impression that the people involved are simply naively clinging to that implicit claim and not doing much of the work to substantiate it; which makes me distrust these systems more than almost all other factors.