Fooling security tools into believing malicious code was signed by Apple
The way developers of third-party security tools use the Apple code signing API could be exploited by attackers to make malicious code linger undetected on Macs, a security researcher has discovered.
“Security, incident response, and forensics processes and personnel use code signing to weed out trusted code from untrusted code. To undermine a code signing implementation for a major OS would break a core security construct that many depend on for day to day security operations,” Josh Pitts, a researcher on the Okta REX (Research and Exploitation) team, pointed out.
What’s the problem?
On macOS/iOS, code signing focuses on the Mach-O binary and application bundles to ensure only trusted code is executed in memory.
Pitts found that virtually all third-party, Apple-focused security products that verify cryptographically signed code through the official Apple APIs do so improperly, which can lead them to treat unsigned malicious code as if it were signed by Apple.
As he explained, the vulnerability stems from the difference between how the Mach-O loader loads signed code and how improperly used code signing APIs check it, and it can be exploited via a malformed universal (fat) binary (a format that packs several Mach-O files, each targeting a specific native CPU architecture).
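The fat/universal container is simple: a big-endian header followed by a table of per-architecture records, each pointing at an embedded Mach-O slice. The sketch below parses that table (struct layout as in Apple's `mach-o/fat.h`; the offsets and sizes in the sample header are made up for illustration):

```python
import struct

FAT_MAGIC = 0xCAFEBABE          # big-endian magic of a universal (fat) binary
CPU_TYPE_I386 = 0x00000007
CPU_TYPE_X86_64 = 0x01000007    # CPU_TYPE_I386 | CPU_ARCH_ABI64

def parse_fat_header(data):
    """Return the list of (cputype, offset, size) entries in a fat binary."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat/universal binary")
    archs = []
    for i in range(nfat_arch):
        # each 20-byte fat_arch record: cputype, cpusubtype, offset, size, align
        cputype, _subtype, offset, size, _align = struct.unpack_from(
            ">iiIII", data, 8 + i * 20)
        archs.append((cputype, offset, size))
    return archs

# Synthetic two-slice header: an i386 slice followed by an x86_64 slice.
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">iiIII", CPU_TYPE_I386, 3, 0x1000, 0x2000, 12)
header += struct.pack(">iiIII", CPU_TYPE_X86_64, 3, 0x4000, 0x3000, 12)

print(parse_fat_header(header))
```

Nothing in this table is authenticated on its own, which is what makes a malformed header a useful vehicle for the trick described below.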
To prove his point, he created several malformed PoC fat/universal files for developers to use to test their products.
Exploitation
To exploit the vulnerability, the first Mach-O in the fat/universal file must be signed by Apple, and the malicious binary included in the file must be ad hoc signed and i386-compiled for an x86_64 macOS target.
“I can take any already Apple-signed Mach-O file (there are plenty that come as part of macOS installed) and turn this into a malicious fat/universal file. For a simple example, I could take ‘/bin/ls’, add my malicious code to it with the output a fat/universal file and make modifications so that ‘ls’ still works, then move it to /usr/local/bin/ls. Whenever someone executes ‘ls’, my malicious code would execute and to the affected third-party developers it would seem signed by Apple,” he told Help Net Security.
The malicious binary could provide the attacker with the same level of access as the affected user, i.e., access to the user’s documents and potentially sensitive personal or business information.
Another requirement for the attack to work is that the CPU_TYPE in the Fat header of the Apple binary must be set to an invalid type or a CPU Type that is not native to the host chipset. This instructs the macOS Mach-O loader to skip over that binary and execute the malicious code, while the code signing API just checks the first Mach-O in the fat/universal file and reports that it is signed by Apple.
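The mismatch between the loader and a naive verifier can be modeled in a few lines. This is a toy model, not the real APIs: each "slice" is just a dict recording its CPU type and signer, and the CPU type constants are the only real values:

```python
# Toy model of the loader-vs-verifier mismatch; all "slices" are illustrative.
CPU_TYPE_X86_64 = 0x01000007
CPU_TYPE_POWERPC = 0x00000012    # PowerPC: not native on a modern Mac

slices = [
    {"cputype": CPU_TYPE_POWERPC, "signer": "Apple"},   # Apple-signed, non-native CPU
    {"cputype": CPU_TYPE_X86_64,  "signer": "ad hoc"},  # attacker's code
]

def naive_verifier(slices):
    # Mirrors the flawed pattern: only the first Mach-O in the file is checked.
    return slices[0]["signer"]

def loader_pick(slices, host_cputype=CPU_TYPE_X86_64):
    # The Mach-O loader skips slices whose CPU type doesn't match the host.
    for s in slices:
        if s["cputype"] == host_cputype:
            return s
    raise ValueError("no runnable slice")

print(naive_verifier(slices))            # reports "Apple"
print(loader_pick(slices)["signer"])     # the slice that actually runs: "ad hoc"
```

The verifier and the loader disagree about which code matters, and the attacker controls the one the loader picks.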
What now?
“This technique could, in a post-exploitation and/or phishing attack as a 2nd stage payload, allow for long term persistence in plain sight. After testing, Okta REX concluded that this technique bypassed the gambit of whitelisting, incident response, and process inspection solutions by appearing to be signed by Apple’s own root certificate,” the company told us.
Okta first notified Apple of the issue and sent in a PoC to prove its existence, but Apple did not see it as a security issue it should directly address.
“Apple stated that documentation could be updated and new features could be pushed out, but: ‘[…], third-party developers will need to do additional work to verify that all of the identities in a universal binary are the same if they want to present a meaningful result,’” Pitts said.
So, they involved CERT/CC and then set out to notify all known affected third-party developers. The affected (open and closed source) software includes VirusTotal, Objective Development’s Little Snitch, F-Secure’s xFence, Facebook’s osquery, Google’s Santa, Carbon Black’s Cb Response, several tools by Patrick Wardle/Objective-See, and so on.
Patches for these are inbound and will likely be released soon now that the issue has been shared with the wider public, so if you’re using one of those tools, check for updates in the coming days.
However, more third-party security, forensics, and incident response tools that use the official code signing APIs may be affected.
“Third party developers will need to carve out and verify each architecture in the fat/universal file and verify that the identities match and are cryptographically sound,” he said, and offered instructions on how to do that.
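Carving out each architecture amounts to walking the fat_arch table and cutting the file at each (offset, size) pair; on a real system each extracted Mach-O would then be verified individually (e.g., via macOS code signing checks), and the signing identities compared across all slices. A minimal sketch over a synthetic file:

```python
import struct

FAT_MAGIC = 0xCAFEBABE

def carve_slices(data):
    """Split a fat/universal binary into its per-architecture Mach-O slices."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat/universal binary")
    out = []
    for i in range(nfat_arch):
        cputype, _sub, offset, size, _align = struct.unpack_from(
            ">iiIII", data, 8 + i * 20)
        out.append((cputype, data[offset:offset + size]))
    return out

# Build a tiny synthetic fat file with two fake 4-byte "slices".
payloads = [b"AAAA", b"BBBB"]
header = struct.pack(">II", FAT_MAGIC, 2)
entries, body = b"", b""
offset = 8 + 2 * 20              # slices start right after the arch table
for cputype, p in zip((0x00000007, 0x01000007), payloads):
    entries += struct.pack(">iiIII", cputype, 3, offset, len(p), 0)
    offset += len(p)
    body += p
blob = header + entries + body

for cputype, payload in carve_slices(blob):
    print(hex(cputype), payload)
```

The key point from Pitts’ guidance is that every carved slice must verify and every slice’s signing identity must match, not just the first one in the file.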
Code signing is complicated, Pitts noted, and while the documentation could be improved and updated to help developers writing security tools, this is something developers need to pay close attention to themselves.
“We have no data to support this issue being abused; our hope is that by making this public, EDR vendors can look through all their historical data to determine if there has been any evidence of it,” he concluded.