October 9th, 2024

Two never-before-seen tools, from the same group, infect air-gapped devices

Researchers identified two advanced toolsets from the suspected Russian hacking group GoldenJackal, targeting air-gapped devices in 2019 and 2022, indicating a sophisticated threat to sensitive networks.

Researchers have identified two advanced toolsets used by a suspected Russian nation-state hacking group, known as GoldenJackal, to compromise air-gapped devices. These devices are isolated from the internet to protect sensitive data. The first toolset was deployed in 2019 against a South Asian embassy in Belarus, while a different set was used in 2022 against a European Union government organization. Both toolsets share components with malware previously documented by Kaspersky, indicating a common origin. The tools include capabilities for delivering malicious executables via USB drives, backdoors for remote access, and file exfiltration methods. The newer toolkit, developed in 2022, is more modular and sophisticated, allowing for flexible operations and multiple exfiltration methods. This evolution highlights the group's resourcefulness and technical expertise. Although the exact country of origin remains unconfirmed, there are indications of a possible connection to the Russian FSB's Turla group. The findings underscore the ongoing threat posed by advanced persistent threats targeting sensitive networks, particularly in governmental and diplomatic contexts.

- Two sophisticated toolsets from the GoldenJackal group have been discovered, targeting air-gapped devices.

- The first toolset was used in 2019, and a more advanced version was deployed in 2022.

- Both toolsets share components with previously documented malware, suggesting a common origin.

- The newer toolkit features a modular design, enhancing flexibility and resilience against detection.

- There are potential links between GoldenJackal and the Russian FSB's Turla group, indicating a high level of sophistication in their operations.

28 comments
By @Arch-TK - 4 months
I think the things to blame here are the design of Windows and the overall design of the air-gapped environment.

Yes, at the end of the day you're going to need to move stuff from non-air-gapped devices to air-gapped devices and vice-versa. You can assume the non-air-gapped devices are completely compromised. But why is the air-gapped device not configured to always show file extensions?

This only works because Windows is configured to hide common file extensions: the attack hides a folder and replaces it with an executable that has a folder icon and the same name plus a .exe extension.

If you're designing an airgapped system, this is literally the first thing you should be worried about after ensuring that the system is actually airgapped.

At the very least, Windows Explorer should have been configured to show extensions (and some training delivered to ensure that the people using these systems have the diligence to notice unusual file extensions on familiar-looking icons).

It would be even better if the file explorer was replaced with something less easy to fool. Something which does not load custom icons, never hides directories, and maybe even prevents access if the flash drive has indications of shenanigans (unusually named files, executables, hidden folders) which would indicate something weird was going on.
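A stricter explorer along those lines could run a pre-mount check like the following. This is a minimal stdlib-only sketch, not anything the commenter or the article describes; the extension list and the function name `scan_drive` are illustrative, and a real tool would also inspect hidden-attribute bits and custom icon resources.

```python
from pathlib import Path

# Extensions Windows treats as directly executable/launchable.
SUSPICIOUS_EXTENSIONS = {".exe", ".scr", ".com", ".pif", ".lnk"}

def scan_drive(root):
    """Flag top-level entries on a removable drive that match the
    masquerade pattern: any executable, and especially an executable
    whose basename collides with a sibling directory."""
    root = Path(root)
    dirnames = {p.name.lower() for p in root.iterdir() if p.is_dir()}
    findings = []
    for p in root.iterdir():
        if p.is_file() and p.suffix.lower() in SUSPICIOUS_EXTENSIONS:
            reason = "executable on removable media"
            if p.stem.lower() in dirnames:
                reason = "executable shadowing a folder of the same name"
            findings.append((p.name, reason))
    return findings
```

A drive carrying `Reports` (a real folder) alongside `Reports.exe` would trip the "shadowing" rule, which is exactly the GoldenJackal lure.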

It's a good job that, unlike with Stuxnet, nobody plugged in a flash drive found in the literal car park, but this is pretty poor work on the part of the people designing/implementing the air-gapped environment.

By @pontifier - 4 months
For many years I've just viewed all of my devices as possibly compromised. It's one of the reasons I've been very down on cryptocurrencies in general. I don't actually see USB as something that can maintain a true, robust airgap, because the amount of data transferred is not inspectable.

In my view, the best use of an airgapped machine would be for storage of extremely dense and sensitive information such as cryptographic keys. Signing or encryption should be accomplished through an inspectable data channel requiring manual interaction such as QR codes. When every bit in and out of a machine serves a purpose, it's much less likely to leak.

Example: show a qr code to a camera on the airgapped machine and get a qr code on the screen containing the signature of the data presented to it. There is very little room for nefarious code execution or data transmission.
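That flow can be sketched in a few lines. This is my illustration of the commenter's idea, not an actual implementation: base64 strings stand in for the QR payloads, and HMAC-SHA256 stands in for a real asymmetric signature scheme (a production design would use something like Ed25519, with the key generated and kept only on the air-gapped machine).

```python
import base64
import hashlib
import hmac

# Stand-in for a signing key that never leaves the air-gapped machine.
SIGNING_KEY = b"air-gapped-secret"

def decode_qr(payload):
    """Camera side: the only input channel is a short, human-inspectable
    payload (base64 here, a QR code in the commenter's design)."""
    return base64.b64decode(payload)

def sign_and_encode(payload):
    """Screen side: the only output is the signature, again QR-sized.
    No other bits ever leave the machine."""
    data = decode_qr(payload)
    sig = hmac.new(SIGNING_KEY, data, hashlib.sha256).digest()
    return base64.b64encode(sig).decode()

def verify(data, sig_payload):
    """Run on the online side to check the returned signature."""
    expected = hmac.new(SIGNING_KEY, data, hashlib.sha256).digest()
    return hmac.compare_digest(expected, base64.b64decode(sig_payload))
```

Because both channels are a few hundred bytes of visible, structured data, there is very little bandwidth for covert exfiltration compared to a USB mass-storage device.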

By @mikewarot - 4 months
Why does anyone run an operating system that grants ambient authority to executables?

This is analogous to a power grid stripped of all fuses and circuit breakers to make it easier to design toasters.

We've studied this problem since 1972[1]. Solutions were found (but the Internet Archive is down, so I can't be sure [2] points to the right files now).

[1] https://csrc.nist.rip/publications/history/ande72.pdf

[2] https://web.archive.org/web/20120919111301/http://www.albany...

By @computerfriend - 4 months
I once designed and built an air-gapped system, and it did not involve operators shuttling USB devices to and from MS Windows machines. (We used data diodes.)

It is strange to me that a security-conscious organisation such as a ministry of foreign affairs would build an air-gapped system this way. Possibly it's a compliance checklist item from their parent organisation, but with no oversight?

The US has "forward deployed" State Department personnel who handle information security of embassies and consulates in a standardised way; probably this South Asian country (and the EU organisation) should follow suit.

By @vlovich123 - 4 months
Given that air gapped American government machines would be vulnerable to similar techniques, why don’t common operating systems build mechanisms to make this stuff more difficult?

* force prompting before executing anything off external media

* disallow anything other than input devices for USB

* disallow unsigned binaries from running

* work to require usb peripherals to carry a unique cryptographic signature so that you can easily lock the set of allowed devices once the machine is set up

Heck, a lot of this stuff could be valuable to help secure corporate IT machines too.
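The "lock the set of allowed devices" idea amounts to an allowlist keyed on what USB descriptors actually expose today. A toy sketch (the device tuples are made up, and real enforcement would live in the OS, e.g. udev rules or a USB filter driver, not in application code):

```python
# Allowlist keyed on (vendor_id, product_id, serial_number), the
# identifying fields a USB device descriptor exposes.
ALLOWED_DEVICES = {
    (0x046D, 0xC31C, "KB-0001"),  # hypothetical approved keyboard
    (0x046D, 0xC077, "MS-0042"),  # hypothetical approved mouse
}

def device_permitted(vendor_id, product_id, serial):
    """Admit a newly enumerated device only if it is on the allowlist."""
    return (vendor_id, product_id, serial) in ALLOWED_DEVICES
```

The weakness is that all three fields are trivially spoofable by malicious firmware, which is why the last bullet above asks for per-device cryptographic signatures rather than self-reported IDs.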

By @gnabgib - 4 months
Discussion (272 points, 4 days ago, 245 comments) https://news.ycombinator.com/item?id=41779952
By @PMunch - 4 months
I've been wondering lately how these USB execution attacks happen. Surely no modern system auto-runs things from a USB, so there has to be some kind of executable on the drive which the user of the drive either A. Expects to be there, or B. Doesn't notice is there. A sounds a bit strange, but maybe the system is updated over USB, that means that the hackers got into the update pipeline which is very bad. B might be more likely, create an EXE with the thumbnail of an image and maybe you could trick a user into clicking it. Or maybe some nefarious excel macro. But in this case it's strange that the system allows these things to be executed.

Does anyone have more details on how this is done?

By @dreadlordbone - 4 months
Both tools are "stick infected USB into air gapped device"
By @1970-01-01 - 5 months
>HTTP server, an HTTP server whose precise function isn’t well understood

OK, you may be overthinking this one

By @bankcust08385 - 4 months
Every well-funded nation-state has an "Equation Group", but by design it is rare to detect, much less publicize, their actions.

External buses and RF comms present massive attack surfaces that must be locked down with religious fervor including auditing, disabling, and management.

By @hcfman - 4 months
So running programs by sticking in a USB drive is a critical security bug. Has Microsoft reported this to the European Union within a day? Is Microsoft now going to be fined 2.5% of its turnover under the Cyber Resilience Act?
By @mattlondon - 4 months
I worked in a data center situation once where there was a physical switch that disconnected the outside network connection which was turned on only in very special circumstances. The USB ports were filled with super-glue.

I always thought that the big switch was probably still a massive vulnerability - is it air-gapped or not? When the switch is flicked it only takes milliseconds for an exploit.

Anyway, not sure what happened to those guys in the end.

By @dyauspitr - 4 months
I was expecting something targeting wireless bit flipping or the like. This is a let-down.
By @djmips - 4 months
How can they exfiltrate to Google Drive from an air-gapped machine? Air-gapped implies not connected to networks.

(This should also include sneakernet!)

By @dboreham - 4 months
Is it just me that thinks "USB device plugged in" != "airgapped"??

Heck every TV show has someone downloading the nuclear plans off Dr. Evil's laptop by...plugging in a USB device when he's distracted by spilling his coffee.

By @fsflover - 4 months
This is exactly why Qubes OS has been developed, with its strong hardware-assisted virtualization. All USB devices belong to a dedicated VM, which is reset every time it's restarted. My daily driver, can't recommend it enough! https://qubes-os.org
By @physicsguy - 4 months
If you’re a startup in the industrial world - this is why connectivity of PLC systems is often viewed skeptically.
By @m3kw9 - 4 months
Gapped or not, using USB to deliver backdoors is a pretty common technique.
By @bikamonki - 4 months
Did I get it wrong, or was the infection through USB? I thought it was something related to a remote/wireless attack. Is such an attack even possible on air-gapped systems?
By @ekianjo - 4 months
Is it really considered air gapped if people connect USB drives to such machines? USB is a well known vector for malware
By @RecycledEle - 4 months
(off-topic) I would be very suspicious of a dongle plugged into an Ethernet or USB port that said "AIRGAPPED - DO NOT CONNECT."
By @lofaszvanitt - 4 months
Well, a proper air-gapped system would have zero surface area to the outside world, wouldn't it? Desoldering USB/FireWire/whatever ports is a basic thing to do. I had a colocated server where the USBs were neutered this way: it had an internal USB riser with the port removed and facing inwards, so if you took the system apart you could plug something in, but that meant downtime.

This was more like a controlled environment, but everyone knows that USB/WIFI is a steaming shitpile, with its own firmware and other shit.

By @phendrenad2 - 4 months
Stuxnet was 14 years ago, guys.
By @zahlman - 4 months
A lot of people ITT don't seem to understand very well what's going on with this attack. The Ars Technica article doesn't seem very well written, but we've had previous discussion[0].

Quick FAQ:

> Haven't we known about USB vulnerabilities forever (agent.btz, BadUSB etc.)?

The fact that USB devices were used to transfer the files is irrelevant to the attack.

The attack doesn't depend on running the malware directly off the USB device, on any kind of auto-run vulnerability, etc. It would have worked out the same way if files had been transferred, for example, by burning them to DVD. The attack only depends on the machines on the non-air-gapped side being compromised, such that the attackers control what is put onto the USB. The USB drives themselves are only being used as dumb storage here.

The attack instead primarily depends on social engineering that is helped along by the design of the Windows GUI. On the air-gapped machine, the user sees a "folder" which is actually an executable file. By default, Windows hides the .exe file extension (which it uses to determine executability of the file) in the GUI; and the icon can be customized. Double-clicking thus launches the malware installer when it was only supposed to open a folder. The folder has a name that the user expected to see (modulo the hidden extension).

It appears that the original setup involves hiding[1] (but still copying) the folder that was supposed to be transferred, and then renaming the malware to match. (Presumably, the malware could then arrange for Windows to open the hidden folder "normally", as part of its operation.) Windows can be configured to show "hidden" files (like `ls -a`), but it isn't the default.

Notice that this is social engineering applied only to the process of attempting to view the files - nobody was persuaded to use any storage devices "from outside".

> Isn't that, like, not actually air gapped?

The definition of an air gap generally allows for files to be passed across the air gap. Which is all the attack really depends on. See also "sneakernet". The point is that you can easily monitor and control all the transfers. But this attack is possible in spite of that control, because of the social engineering.
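One concrete way to exercise that control, sketched here as stdlib-only Python with hypothetical function names (nothing in the article says the victims did this): build a hash manifest of the drive on the sending side, and reject the drive at the gap if anything was added, removed, or altered in transit.

```python
import hashlib
from pathlib import Path

def build_manifest(root):
    """Run on the low side before the drive crosses the gap:
    record every file path and its SHA-256 digest."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def audit(root, manifest):
    """Run at the gap: the drive passes only if its contents exactly
    match the manifest - no extra files, no modified files."""
    return build_manifest(root) == manifest
```

A planted `Reports.exe` would fail this check even with its hidden attribute set, since the audit walks the file system rather than trusting what Explorer displays. The manifest itself has to travel by a channel the USB-side malware can't touch, of course.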

> How is it possible to exfiltrate data this way?

The actual mechanism isn't clearly described in media coverage so far, from what I can tell. But presumably, once malware is set up on the air-gapped machine, it copies the files back onto the USB, hiding them. When the device is transferred back to the non-air-gapped side, malware already present there monitors for the USB being plugged in, retrieves the files and uploads them (via the "GoldenMailer" or "GoldenDrive" components) elsewhere.

[0] https://www.welivesecurity.com/en/eset-research/mind-air-gap..., via https://news.ycombinator.com/item?id=41779952.

[1]: Windows file systems generally don't have an "executable bit" for files, but do have a "hidden bit", rather than relying on a leading-dot filename convention. So it's the opposite of what Linux does.
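The contrast in that footnote can be stated as two one-line predicates. A small illustrative sketch (the constant is the real Win32 `FILE_ATTRIBUTE_HIDDEN` value; the function names are mine):

```python
FILE_ATTRIBUTE_HIDDEN = 0x2  # Win32 file-attribute flag

def hidden_on_windows(name, attributes):
    # Windows: hiddenness is a per-file attribute bit in the
    # file system, independent of the name.
    return bool(attributes & FILE_ATTRIBUTE_HIDDEN)

def hidden_on_unix(name, attributes):
    # Unix: hiddenness is purely a naming convention
    # honoured by tools like ls.
    return name.startswith(".")
```

So on Windows the attackers could hide a folder called `Reports` without renaming it, keeping the name free for the decoy `Reports.exe`.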

By @forgot-im-old - 4 months
PocketAdmin is capable of all that's described here and it's open source! https://github.com/krakrukra/PocketAdmin
By @highwayman47 - 4 months
So just don’t use USBs? The article was poorly written - I couldn’t even finish it.