Otter Raft@lemmy.ca to Medicine@mander.xyz · English · 3 months ago
AI medical tools found to downplay symptoms of women, ethnic minorities - Ars Technica (arstechnica.com)
cross-posted to: technology@lemmit.online, technology@lemmy.zip
WhyDoYouPersist@lemmy.world · 3 months ago
Seriously, AI is trained on human-generated data; this should come as a no-brainer.
Otter Raft@lemmy.ca (OP) · edited 3 months ago
Yep, the companies are pushing AI models as a "fair" and "unbiased" alternative to human workers. In reality, LLMs are going to be biased depending on their training data.