This cannot possibly hold up in court. You can't just advertise a product and then be like "but actually we might be lying about some or all of these things"??? What the fuck are you selling then?

  • Echo Dot@feddit.uk · 5 days ago

    The product information still needs to be accurate though. You don't need to restrict the use of AI; this falls under consumer protection.

    • Cattail@lemmy.world (OP) · 5 days ago

      Or we could simply treat it as false advertising when the AI gets something wrong, like that lawyer who got disbarred for using AI to write his court papers and cited a case that never existed.

      • Ephera@lemmy.ml · 5 days ago

        Yeah, when the other commenters mentioned consumer protection laws, they meant that it would be a case of false advertising.

        There's also been a similar case before, where Air Canada got sued because their chatbot promised a customer that he could apply for one of their bonus programs after the flight, which Air Canada then denied: https://www.mccarthy.ca/en/insights/blogs/techlex/moffatt-v-air-canada-misrepresentation-ai-chatbot

        In that particular case, a disclaimer that the chatbot might be lying could have helped, but only because Air Canada provided the truthful information elsewhere on their webpage. That's not gonna be the case for such product descriptions…