• 1 Post
  • 5 Comments
Joined 2 years ago
Cake day: July 9th, 2023


  • the most generous reading of your arguments is that you are philosophically defeatist

    That’s probably a fair assessment.

    But I feel like the core of my argument remains: I’m not disputing that MS or Google or Amazon or Apple services are sold to people and orgs who use them to commit evil. Of course they are.

    But these aren’t munitions. They are general-purpose computing products being turned to evil ends by bad actors. The article, for example, cites Microsoft’s open-source LAVENDER, a general-purpose AI model for image and video analysis, and describes it as:

    ‘Lavender’, an AI-powered system designed to identify bombing targets

    This simply isn’t true. Somebody in the Israeli military used LAVENDER to process video data to identify bombing targets, like somebody might use a hammer to smash someone’s head in. The articles you cite are full of rhetorical tricks to imply that Microsoft corporate had some hand in the decision making, but it’s genuinely all “well the Israeli military has some Azure servers, therefore Microsoft killed people”.

    Which militaries should Microsoft (or Google or Apple or Amazon, etc) be allowed to sell products to? Who makes that determination? A cohort of employees or consumers? NGOs?

    If the government makes the call – distilling a public consensus on the matter, one hopes – then I can see a reasonable way to approach this question.

    EDIT: Details on LAVENDER:

    https://www.microsoft.com/en-us/research/publication/lavender-unifying-video-language-understanding-as-masked-language-modeling/



  • Honestly, I struggle to draw a connection between world conflict and non-military technology like Windows or cell phones or whatever.

    Is every single Israeli resident complicit in what their government is doing? None of them should be allowed to use Windows? What about Israelis outside of Israel? What about people who support Israel? What about (gasp) Jews? How do you even enforce any of this without massive overreach by the companies?

    Call on Microsoft or Apple all you want; ultimately, I don’t think a company should ban sales to customers on the grounds that those customers’ morals might not align with its own. Not that it’s even possible, with world supply chains being what they are.



  • With respect to the article, it’s wrong. AI help desks are already a thing. Yes, they’re terrible, but human help desks were already terrible. Businesses are ABSOLUTELY cutting tier 1 call center positions.

    LLMs are exceptionally good at language translation, which should be no surprise, as that kind of statistical chaining is right up their alley. Translators are losing jobs. AI contract analysis and legal blacklining are going to put a lot of junior employees and paralegals out of work.

    I am very much an AI skeptic, but I also recognize that people who do the things LLMs are already pretty good at are in real trouble. As AI tools get better at more things, that list of at-risk jobs will grow.