Attorney General Ken Paxton has announced a settlement with Pieces Technologies, a Dallas-based artificial intelligence healthcare technology firm, over allegations of misleading claims about the safety and accuracy of its generative AI products.
This settlement marks a significant step in regulating the intersection of AI technology and its impact on the healthcare industry in Texas.
An investigation revealed that Pieces Technologies had deployed its generative AI products at several major Texas hospitals, where they used real-time patient healthcare data to generate summaries of patient conditions and help identify appropriate, effective treatments. However, according to a recent press release, Paxton’s office found that the company made deceptive claims about the accuracy of its AI systems, potentially jeopardizing patient safety.
Pieces Technologies advertised its products with metrics suggesting a remarkably low error rate of less than one per 100,000, leading healthcare providers to overestimate the reliability of its AI applications. The Attorney General’s investigation raised concerns that these claims were grossly exaggerated, misinforming hospitals and possibly compromising patient care.
“AI companies offering products used in high-risk settings owe it to the public and to their clients to be transparent about their risks, limitations, and appropriate use. Anything short of that is irresponsible and unnecessarily puts Texans’ safety at risk,” stated Paxton.
As part of the settlement agreement, Pieces Technologies has committed to revising its marketing practices to provide clear and accurate disclosures regarding the capabilities and limitations of its healthcare products. Furthermore, the company must ensure that healthcare or medical staff who use its generative AI tools are adequately trained to understand how much they can rely on these systems in clinical settings.
The settlement may set a precedent for future regulations on AI in the healthcare industry, emphasizing the need for accountability among technology providers.
As hospitals continue to explore innovative solutions to improve patient care, Paxton hopes the oversight provided by this settlement will help safeguard public health in the face of rapidly evolving technology.
The Attorney General’s office has encouraged other healthcare providers to reassess their use of AI products and to prioritize training and safety protocols that protect both patients and their own organizations.
“Hospitals and other healthcare entities must consider whether AI products are appropriate and train their employees accordingly,” stated Paxton.
Paxton may have his hands full, as “90% of hospitals will use AI-powered technology for early diagnosis and remote patient monitoring by 2025,” according to a report from Radix.