fast16: cyber sabotage was already a state capability in 2005, five years before stuxnet
sentinellabs published a deep reverse-engineering writeup this week on a windows malware sample that's been quietly sitting on virustotal since 2016. the build timestamp on the sample is august 30, 2005. the framework targets high-precision engineering simulation software, and its kernel driver patches FPU calculation routines to produce controlled alternative outputs.
if you've been carrying around the public timeline of "state-grade cyber-sabotage starts with stuxnet in 2010," you can update it now. the capability was production-ready at least five years before stuxnet went public. probably earlier. we just didn't see it.
let's walk through what fast16 actually does, why it's structurally important, and what defenders should take from a 21-year-old malware sample.
the artifact
the file the sentinellabs team reverse-engineered carries a creation timestamp of august 30, 2005, per the PE header on the windows kernel driver. the sample was uploaded to virustotal on october 8, 2016, more than a decade after it was built. it appears in shadowbrokers references and a small handful of other historical traces but had not received a full RE pass until now.
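if you haven't pulled a build timestamp out of a PE header before, here's roughly where that august 30, 2005 value lives. a minimal python sketch using only the standard library; the synthetic header bytes below are fabricated for illustration, not bytes from the actual sample:

```python
import struct
from datetime import datetime, timezone

def pe_build_timestamp(data: bytes) -> datetime:
    # the DOS header stores e_lfanew at offset 0x3C: where "PE\0\0" lives
    (pe_off,) = struct.unpack_from("<I", data, 0x3C)
    assert data[pe_off:pe_off + 4] == b"PE\x00\x00", "not a PE file"
    # COFF header follows the signature: Machine(2) NumberOfSections(2) TimeDateStamp(4)
    (ts,) = struct.unpack_from("<I", data, pe_off + 8)
    return datetime.fromtimestamp(ts, tz=timezone.utc)

# synthetic header carrying the driver's reported build date (illustrative)
blob = bytearray(0x60)
blob[:2] = b"MZ"
struct.pack_into("<I", blob, 0x3C, 0x40)   # e_lfanew -> 0x40
blob[0x40:0x44] = b"PE\x00\x00"
build = int(datetime(2005, 8, 30, tzinfo=timezone.utc).timestamp())
struct.pack_into("<I", blob, 0x48, build)  # the TimeDateStamp field

print(pe_build_timestamp(bytes(blob)))     # 2005-08-30 00:00:00+00:00
```

worth remembering that TimeDateStamp is trivially forgeable, which is why a timestamp alone is evidence, not proof; the corroborating traces matter.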
the framework has three components:
- lua bytecode, which handles configuration parsing, propagation coordination, and operational logic. fast16 was the first windows malware observed to embed a Lua engine. that primitive (script-engine-in-malware-as-modular-payload-format) didn't become normal in the offensive ecosystem until the mid-2010s. fast16 had it in 2005.
- an auxiliary DLL named `svcmgmt.dll`, internally referred to as ConnotifyDLL. this is the user-space coordinator that handles service installation, persistence and inter-component communication.
- a kernel driver named `fast16.sys`. this is the payload tier, the part that does the actual sabotage work. and what it does is the part that should change how you think about physical-process security.
the FPU patch
`fast16.sys` parses configuration provided by the Lua tier, then attaches itself to the running processes of three specific applications:
- LS-DYNA 970, a finite-element analysis engine widely used in civil engineering, automotive crash simulation, and aerospace structural modeling.
- PKPM, a chinese structural-analysis suite used heavily in civil engineering, particularly for high-rise and seismic analysis.
- the MOHID hydrodynamic modeling platform, used for ocean and coastal hydrodynamic simulation, including industrial applications around offshore engineering and water infrastructure.
once attached, the driver patches the floating-point unit calculation routines used by those applications. specifically, it corrupts the FPU instructions that handle high-precision floating-point arithmetic, in a controlled way that produces alternative outputs.
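to make "controlled alternative outputs" concrete, here's a toy python model of the effect. the actual driver patches FPU instruction sequences in-process; the bias function and the cantilever formula below are mine, chosen only to show why a small deterministic perturbation is worse than random noise: the corrupted numbers stay self-consistent and plausible.

```python
def biased(x: float, scale: float = 1.003) -> float:
    """Toy stand-in for the patched FPU path: a small deterministic
    bias, not noise, so corrupted results stay self-consistent."""
    return x * scale

def tip_deflection(force, length, e_mod, inertia, fp=lambda x: x):
    """Cantilever tip deflection, delta = F*L^3 / (3*E*I), with one
    multiply routed through a swappable 'FPU' so we can compare paths."""
    return fp(force * length ** 3) / (3 * e_mod * inertia)

honest = tip_deflection(1e4, 2.0, 200e9, 8e-6)             # ~16.7 mm
corrupt = tip_deflection(1e4, 2.0, 200e9, 8e-6, fp=biased)
drift = corrupt / honest - 1.0                             # ~0.3%: plausible, wrong
```

the 0.3% figure is invented; the point is the shape of the attack. every downstream number moves together, so nothing looks anomalous in isolation.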
this is the precise pattern stuxnet would later use against centrifuge controllers: corrupt the physical-process calculation, present normal status to operators, produce divergent physical outcomes. the difference is the target domain. stuxnet went after PLCs running enrichment cascades. fast16 went after windows-hosted engineering simulation tools that civil engineers, structural analysts and hydrodynamicists used to design things in physical space.
what does that mean operationally? if you ran a corrupted LS-DYNA 970 to simulate a crash test, you got back numbers that looked plausible but weren't the right numbers. if you ran a corrupted PKPM to model a tall building's seismic response, you got plausible-looking outputs that diverged from physical reality. if you ran a corrupted MOHID to model an offshore platform's hydrodynamic loading, same pattern.
the implication for incident retrospectives is uncomfortable. any structural failure, crash-test miscalibration, or hydrodynamic-design surprise from the 2005-2012 window, where the design environment included unaudited windows endpoints running potentially compromised calculation tools, is now a candidate for "we should have looked at the math."
propagation and evasion
the lateral-movement design is mid-2000s state-grade. fast16 ships a Service Control Manager (SCM) wormlet that scans the network for windows 2000 and XP servers running with weak or default credentials, exploits SCM access to install itself as a service, and propagates outward.
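the propagation loop, reconstructed conceptually in python. every name here is a hypothetical stand-in for behavior described in the writeup, not recovered code:

```python
from dataclasses import dataclass, field

# hypothetical weak/default credential list of the kind the wormlet tries
DEFAULT_CREDS = [("administrator", ""), ("administrator", "admin")]

@dataclass
class Host:
    name: str
    os: str
    accepts: tuple                       # the credential pair SCM would accept
    services: list = field(default_factory=list)

    def try_scm_login(self, user: str, pw: str) -> bool:
        # stand-in for an OpenSCManager-style bind with these credentials
        return (user, pw) == self.accepts

def spread(hosts):
    """Scan for windows 2000 / XP hosts, try weak or default creds
    against the Service Control Manager, install as a service on hit."""
    infected = []
    for h in hosts:
        if h.os not in ("win2000", "winxp"):
            continue                     # wormlet only targets 2000/XP
        for user, pw in DEFAULT_CREDS:
            if h.try_scm_login(user, pw):
                h.services.append("svcmgmt")   # persistence via new service
                infected.append(h.name)
                break
    return infected

lab = [Host("lab-fem-01", "winxp", ("administrator", "")),
       Host("dc-01", "win2003", ("administrator", ""))]
spread(lab)   # -> ["lab-fem-01"]
```

nothing exotic in the loop itself; what dates it as state-grade is pairing this mundane spread mechanism with the kernel-tier payload.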
the kernel driver's defense-evasion logic explicitly checks for security products from:
- agnitum (defunct russian av vendor, big in eastern europe in the 2000s)
- f-secure
- kaspersky
- mcafee
- microsoft
- symantec
- sygate technologies (acquired by symantec in 2005)
- trend micro
that's pretty complete coverage of the major endpoint vendors of the era. the bypass logic is conditional: in some cases the malware aborts, in others it adapts its persistence approach. it's not a "just works on machines without AV" tool. it's a tool whose author thought carefully about what would be running on its targets.
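that conditional logic can be modeled as a decision table. the vendor names come from the writeup; which products trigger an abort versus an adapted install is not public, so the assignments below are purely illustrative:

```python
# illustrative only: the real abort/adapt assignments aren't public
ABORT, ADAPT, PROCEED = "abort", "adapt_persistence", "proceed"

EVASION_TABLE = {
    "kaspersky": ABORT,    # hypothetical assignment
    "symantec": ADAPT,     # hypothetical assignment
    "sygate": ADAPT,       # hypothetical assignment
}

def decide(installed: set) -> str:
    """Abort if any abort-class product is present, else adapt
    persistence if needed, else proceed with the default install."""
    actions = {EVASION_TABLE.get(p, PROCEED) for p in installed}
    if ABORT in actions:
        return ABORT
    if ADAPT in actions:
        return ADAPT
    return PROCEED
```

the structural takeaway is the ordering: environment recognition runs before any persistence action, which is exactly what you'd expect from an author optimizing for stealth over coverage.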
why this is structurally important
three things to take from fast16 even though the sample is 21 years old:
first, the public timeline of state cyber-sabotage capability shifts left. stuxnet was the first publicly attributed instance, but it was not the first instance. by 2005, at least one state actor (sentinellabs ties the discovery context loosely to the US/Iran cyber tension cycle of that era) had production-ready capability against physical-process calculation. the implication for current threat modeling is that the gap between "publicly attributed first use" and "state has had this capability for years" is wider than the public threat-intel record makes it look.
second, the cyber-physical attack surface includes engineering software, not just industrial control systems. when defenders talk about "OT security" they mostly mean PLCs, ICS, SCADA. fast16 is a reminder that the design tools that produce the engineering specifications for physical systems are themselves an attack surface. a corrupted simulator that produces plausibly wrong outputs in the design phase is a different threat than a corrupted PLC at runtime, and most enterprise security programs have very thin coverage of the design-tool tier.
third, scripting-engine-as-malware-runtime predates its mainstreaming in the offensive ecosystem by a decade. fast16's embedded Lua engine is not a curiosity. it shows that the design pattern (a script-driven modular malware framework, where payload logic lives in interpreted bytecode rather than compiled binaries, for portability and rapid retooling) was understood and shipped by 2005. the security community has been catching up to that primitive ever since.
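that third pattern is easy to sketch: a compiled tier exposing primitives, an interpreted tier carrying campaign logic. fast16 used lua bytecode; this python sketch (all names mine) shows the structural idea, not fast16's actual primitive set:

```python
PRIMITIVES = {}

def primitive(fn):
    """Compiled-tier export: register a function the script tier may call."""
    PRIMITIVES[fn.__name__] = fn
    return fn

@primitive
def attach(target):
    return f"attached:{target}"

@primitive
def patch(routine):
    return f"patched:{routine}"

def run_script(script):
    """Interpreted-tier loop: operational logic lives in data, so
    retooling means shipping a new script, not a new binary."""
    return [PRIMITIVES[op](arg) for op, arg in script]

campaign = [("attach", "ls-dyna"), ("patch", "fpu_mul")]
run_script(campaign)   # -> ['attached:ls-dyna', 'patched:fpu_mul']
```

the defensive consequence: signatures on the compiled tier miss retooled campaigns, because the part that changes never touches disk as a new binary.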
what to do about it
obviously you don't need to apply a 2026 patch for 2005 fast16. the immediate operational value is more about the threat model than the indicators.
practical takeaways:
- engineering simulation software is in your supply-chain risk model now. add LS-DYNA, PKPM, MOHID, ANSYS, ABAQUS, and equivalent design-tool installations to your software-asset inventory if they're not there yet. patch SLAs and integrity-monitoring controls should match the criticality of the physical systems they design.
- integrity monitoring at the FPU-instruction level is not what current EDR does. detection for fast16-class threats requires either application-tier output validation (compare simulation outputs against reference cases periodically) or kernel-tier FPU-instruction integrity hooks. neither is mainstream commodity. for high-stakes design environments, this is a custom build.
- historical incident retrospective is worth one cycle of attention. for organizations with long-running physical engineering programs, the question "did any of our 2005-2012 design environments include unaudited windows endpoints" is at least worth asking. probably the answer is no signal, but the asymmetry favors checking once.
- the public threat-intel record is incomplete by structural design. assume capability lead times of 5-10 years between "first state use" and "first public attribution" when threat-modeling against state actors. plan accordingly.
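the application-tier output validation from the second bullet is buildable today. a minimal sketch: rerun a fixed reference case periodically and compare against golden results captured on a trusted machine. the metric names, values, and tolerance below are placeholders; the thing to notice is that fast16-class corruption shows up as consistent relative drift across all outputs:

```python
import math

def validate_run(outputs: dict, golden: dict, rel_tol: float = 1e-6):
    """Compare a fresh run of a fixed reference case against golden
    results from a trusted machine; return per-metric relative drift."""
    drifted = []
    for key, expected in golden.items():
        if not math.isclose(outputs[key], expected, rel_tol=rel_tol):
            drifted.append((key, outputs[key] / expected - 1.0))
    return drifted

# hypothetical reference-case metrics; a fast16-style bias moves
# every output by roughly the same small relative amount
golden = {"max_stress_mpa": 214.70, "tip_deflection_mm": 16.670}
fresh  = {"max_stress_mpa": 215.34, "tip_deflection_mm": 16.720}

validate_run(fresh, golden)   # both metrics drift by roughly +0.3%
```

the correlated-drift signature is the detection hook: independent numerical noise scatters, a patched calculation path biases everything the same way.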
the underlying point
twenty-one-year-old malware doesn't usually carry contemporary operational lessons. fast16 carries one anyway: state cyber-sabotage capability against physical-process calculation has been production-ready for far longer than the public record shows, and design-tool integrity has been an attacker target throughout. the visible incidents that drive defender awareness lag the underlying capability by years.
the more uncomfortable corollary: whatever the next public incident in this category looks like, the underlying capability is probably already five-to-ten years old.
Gigia Tsiklauri is a Security Architect and founder of Infosec.ge. Get in touch if your design-tool integrity story needs a fresh look.