By Vuk Zdinjak, TheStreet
Published on June 9, 2025, 5:33 PM
Content Summary
This article examines critical cybersecurity lapses that have left the US vulnerable. A recent study found that roughly 400 human-machine interfaces at US water facilities were openly accessible from the Internet, raising serious concerns about security risks. The number of exposed systems was so large that the cybersecurity firms involved initially assumed the findings were a prank. Following the discovery, support was enlisted to remediate the issues, and 58% of the problematic configurations were fixed within a few weeks.
The article also covers the misuse of AI in contract management at the Department of Veterans Affairs, where an AI tool inaccurately flagged numerous contracts. The tool's poor performance raises concerns about the reliability of AI in critical assessments, particularly in government functions.
Key Points
- 400 human-machine interfaces in US water facilities were found publicly accessible, highlighting severe cybersecurity lapses.
- Cybersecurity experts initially assumed the exposures were a prank because of the sheer number of systems involved.
- Following the discovery, significant progress was made in securing the systems, with 58% of the exposed configurations fixed within weeks.
- An AI tool deployed to review VA contracts produced numerous inaccuracies, including grossly inflating contract values.
- The challenges faced by the VA demonstrate the risks of placing trust in potentially faulty AI applications in sensitive sectors.
Why should I read this?
If you’re a techie or just someone concerned about cybersecurity, this is a must-read! The revelations about US vulnerabilities are eye-opening, and the AI missteps at the VA highlight the chaos that can ensue when technology runs amok. You’ll get a clearer picture of the current state of security and technology management, which might just inspire you to take extra precautions in your digital life. We’ve sifted through the details so you don’t have to!