The Health Sector Cybersecurity Coordination Center (HC3) recently issued a warning about an increase in “threat actors targeting health sector IT help desks and employing sophisticated social engineering tactics to gain initial access to targeted organizations.”
The social engineering scheme involves “calling an IT help desk from a target organization’s local area code, claiming to be an employee in a financial role (specifically a revenue cycle or management role). Threat actors are able to provide the sensitive information needed to verify the employee’s identity, such as the last four digits of the target employee’s social security number (SSN), a company ID number, or other demographic details. These details may have been obtained from professional networking sites or other publicly available sources such as previous data breaches. The attacker claimed that their phone was broken and that they therefore could not log in or receive MFA tokens. The attackers were then able to convince the IT help desk to enroll a new device for multi-factor authentication (MFA), allowing them to access corporate resources.”
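The weak point in this flow is that identity was verified with knowledge-based details (the last four digits of an SSN, a company ID number) that are often already exposed through data breaches. The sketch below is purely illustrative and not taken from the HC3 alert; the request fields and policy are hypothetical, but it shows how a help desk workflow can refuse to enroll a new MFA device on knowledge-based answers alone and instead require a callback to a number already on file or a manager’s confirmation.

```python
# Hypothetical help-desk policy check (illustrative only, not from the HC3 alert).
from dataclasses import dataclass


@dataclass
class EnrollmentRequest:
    employee_id: str
    provided_last4_ssn: bool   # caller recited the last four digits of the SSN
    provided_company_id: bool  # caller recited a company ID number
    callback_verified: bool    # help desk reached the caller at a number already on file
    manager_approved: bool     # the employee's manager confirmed the request


def may_enroll_new_mfa_device(req: EnrollmentRequest) -> bool:
    """Allow enrolling a new MFA device only with evidence an impostor
    cannot easily supply over the phone."""
    # Knowledge-based identifiers (SSN fragments, company ID) are treated as
    # weak evidence and never factor into the decision on their own.
    return req.callback_verified or req.manager_approved


# The scenario described above would be rejected: the caller offered only
# breach-derivable details, and no callback or manager approval occurred.
attack = EnrollmentRequest("E12345", True, True, False, False)
assert may_enroll_new_mfa_device(attack) is False
```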
Once the attacker gains access, they target login credentials for the payer’s website and submit forms to make ACH changes to payer accounts. “Once access was gained to the employee’s email account, instructions were sent to the payment processor to route the legitimate payment to a U.S. bank account controlled by the attackers. The funds were then transferred to an overseas account. During the malicious campaign, the attackers also registered a domain with a single letter variation of the target organization and created an account impersonating the target organization’s chief financial officer (CFO).”
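Registering a domain that differs from the real one by a single letter is a classic lookalike-domain (typosquatting) pattern. As a hedged illustration, using hypothetical domain names that are not taken from the alert, a defender can flag observed sender or newly registered domains whose edit distance from the organization’s legitimate domain is one:

```python
# Illustrative lookalike-domain check (hypothetical domains, not from the HC3 alert).

def edit_distance(a: str, b: str) -> int:
    """Standard Levenshtein distance computed with dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]


def is_lookalike(candidate: str, legit: str, max_distance: int = 1) -> bool:
    """True if candidate is suspiciously close to, but not equal to, legit."""
    candidate, legit = candidate.lower(), legit.lower()
    return candidate != legit and edit_distance(candidate, legit) <= max_distance


# Hypothetical example: the candidate differs from the real domain by a
# single character ('1' in place of 'l'), so it would be flagged for review.
print(is_lookalike("examp1ehealth.org", "examplehealth.org"))  # True
```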
Attackers impersonate employees using spear-phishing voice techniques, also known as “vishing.” “Threat actors may also seek to leverage AI voice spoofing techniques to socially engineer their targets, and advances in these technologies make remote identity verification increasingly difficult,” HC3 stated. In a recent global survey of 7,000 people, one in four respondents reported having experienced, or knowing someone who had experienced, an AI voice cloning scam.
HC3 provides a number of mitigations in the alert to help organizations prevent these malicious schemes.