{"version":"1.0","type":"rich","provider_name":"Acast","provider_url":"https://acast.com","height":250,"width":700,"html":"<iframe src=\"https://embed.acast.com/$/68ab182de2f63983a7587241/69500c4a09314afbec71e19a?\" frameBorder=\"0\" width=\"700\" height=\"250\"></iframe>","title":"E92 - The Hospital's AI Secret","thumbnail_width":200,"thumbnail_height":200,"thumbnail_url":"https://open-images.acast.com/shows/68ab182de2f63983a7587241/1766853813449-068bc598-7857-4905-a507-ccbfcc4133bd.jpeg?height=200","description":"<h3>AI scribes are increasingly used in healthcare, transcribing millions of doctor-patient conversations. However, many U.S. hospitals delete raw transcripts soon after they are created.</h3><h3><br></h3><h3>Why? Fear of lawsuits. Based on a December 2025 NEJM article, this episode explores the conflict between risk management and medical progress. To hospitals, these recordings are \"Digital Exhaust\"—liability risks to be purged. To researchers, they are \"Digital Gold\"—the only way to verify if AI is hallucinating or missing critical symptoms.</h3><h3><br></h3><h3>In this episode, we uncover: Why lawyers view your medical transcript as a \"ticking time bomb.\" The danger of \"Hallucinations\": How do we catch AI errors if the source material is deleted? The missed opportunity: How analyzing these conversations could revolutionize diagnosis. Our call to action: Join us inOur call to action: Join us in advocating for responsible data preservation, so crucial insights are not lost,, and patient safety is prioritized.</h3><p><br></p><p>Source: https://www.nejm.org/doi/full/10.1056/NEJMp2514616</p><p><br></p><p>#AIScribe #DigitalHealth #MedTech #DataPrivacy #MedicalLaw #HealthcareInnovation #BigData #NEJM #PatientSafety #AIHallucinations</p>","author_name":"Farhad Fatehi"}