{{Header}}
{{Title|title=
{{project_name_long}} Track Record against Real Cyber Attacks
}}
{{#seo:
|description=A list of many past anonymity attacks where {{project_name_short}} kept its users safe.
|image=Realworldexplosion.jpg
}}
{{audit_mininav}}
[[File:Realworldexplosion.jpg|thumb]]
{{intro|
{{project_name_short}} helps to protect users against various deanonymization attacks. This chapter lists many notable [https://googleprojectzero.blogspot.com/p/0day.html "in the wild" attacks] that deanonymized users who did not rely on the {{project_name_short}} platform, but which failed against {{project_name_short}} due to its fundamental design.
}}
__TOC__
= Introduction =
'''The threats posed by "zero day (0-day) exploits" are not just a hypothetical concern, but a very real one'''. Since 2014, the Google Project Zero initiative has discovered a new in-the-wild exploit roughly every 17 days, with the majority (68 per cent) related to memory corruption bugs. The number of "in-the-wild" 0-day attacks rose sharply in 2021, with 33 publicly disclosed by the middle of the year -- 11 more than in all of 2020. The spike in zero-day attacks is attributed to an overall rise in the total number of exploits being used, as well as greater detection and disclosure by researchers.
'''Zero-day exploits (also "0-day" or "0day" attacks) exploit previously unknown vulnerabilities in software'''. "Day zero" refers to the hypothetical day before "day one", on which the vendor of the software becomes aware of the vulnerability. In other words: zero-day exploits are sudden and unexpected attacks that use a newly discovered vulnerability before the software vendor even knows the vulnerability exists.
'''In one sense the proliferation of 0-days is a positive sign''', because it indicates that improved overall security now requires attackers to rely more on them for successful exploits. On the downside, there is a significant increase in the number of vendors providing 0-days, particularly for mobile devices due to their popularity. <ref>https://blog.google/threat-analysis-group/how-we-protect-users-0-day-attacks/</ref>
'''A [https://www.wired.com/story/untold-history-americas-zero-day-market/ widespread and highly profitable industry] has also developed around 0-days''', illustrating that all sectors of society are potential targets of the hack-for-hire market, including those involved in politics, advocacy, global commerce and government. Hundreds of commercialized hacking entities have emerged to trade in offensive security, posing threats to individuals and groups who do not have sophisticated cybersecurity resources. In the past, 0-day capabilities were solely developed and used by a handful of nation states to operationalize exploits, but now it is more common for commercial providers to develop exploits and sell them to government-backed actors.
'''This reality explains why government regulation is sorely lacking in this area''' -- [https://i.blackhat.com/USA-19/Wednesday/us-19-Shwartz-Selling-0-Days-To-Governments-And-Offensive-Security-Companies.pdf governments are among the largest clients of offensive security companies], using them to bolster their surveillance and espionage capabilities. Government tolerance of, and the perceived normalization of, these types of services reinforces the notion that private individuals and organizations must utilize all available measures to better protect themselves.
'''This page lists many examples of how {{project_name_short}} prevented even zero-day attacks''' that were successful against other software but ineffective against {{project_name_short}} due to its fundamental security and anonymity design.
= Real Cyber Attacks =
Below is a list of real cyber attacks -- many of them zero-day exploits -- which affected thousands of users but did not affect {{project_name_short}} users thanks to its design advantages. In its more than 10-year history, {{project_name_short}} has maintained a strong track record, even against previously unknown software vulnerabilities.
== BitTorrent (2010) ==
In 2010, [https://blog.torproject.org/bittorrent-over-tor-isnt-good-idea IP leaks] were reported when using ordinary proxification methods. Since {{project_name_short}} prevents such information leaks, using BitTorrent in {{project_name_short}} did not leak a user's real external IP address (see [[File Sharing]]). The reason is that {{project_name_workstation_long}} ({{project_name_workstation_vm}}) has no knowledge of the external, ISP-facing IP address and therefore cannot leak it.
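The design point can be illustrated with a minimal, hypothetical Python sketch (illustration only; the documentation address used below is a placeholder and no claim is made about how any particular BitTorrent client works): an application running inside {{project_name_workstation_short}} that tries to discover "its own" IP address, for example to embed it in peer announcements, can only ever observe the internal virtual network address.

<pre>
# Sketch: what a (potentially leaky) application can learn about "its own"
# IP address from inside an isolated workstation VM. Illustration only.
import socket

def best_guess_local_ip() -> str:
    """Return the address the OS would use for outbound traffic."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # A UDP connect() sends no packets; it only selects a route.
        s.connect(("192.0.2.1", 9))  # TEST-NET-1 documentation address
        return s.getsockname()[0]
    finally:
        s.close()

if __name__ == "__main__":
    # Inside an isolated workstation this prints an internal address only,
    # so even a protocol that embeds "its" IP in application-layer data
    # cannot reveal the real external, ISP-facing IP address.
    print(best_guess_local_ip())
</pre>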
== Browser Plugins: Flash, Java etc (2011 - currently) ==
{{project_name_short}} prevents information leaks from browser plugins since it has no knowledge of the real external IP address. This protection also applies to Flash-based applications used by [https://en.wikipedia.org/wiki/Operation_Torpedo advanced adversaries]. Nevertheless, it is NOT recommended to install browser plugins such as Flash when anonymity is the goal. <ref>[https://www.whonix.org/wiki/Comparison_with_Others#Flash_.2F_Browser_Plugin_Security https://www.whonix.org/wiki/Comparison_with_Others#Flash_.2F_Browser_Plugin_Security]</ref> See [[Browser Plugins]] for further details.
== Everyday Trackers (currently) ==
[[Data_Collection_Techniques#Browser_Fingerprinting|Browser fingerprinting]] can be used to track the user in a similar way to an IP address. This is common practice on the internet. The fingerprint.com tracking software alone is used by [[The_World_Wide_Web_And_Your_Privacy#Fingerprint.com|12% of the 500 largest websites]].
When using a normal browser such as Mozilla Firefox or Google Chrome, the browser fingerprint of the user will always be the same and therefore trackable. No matter what IP the user is using, any website using one of the easily accessible and popular browser fingerprinting services can always link each visit by the user to the same pseudonym.
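To illustrate the general mechanism (a minimal sketch only; attribute names and values are invented and this is not fingerprint.com's actual, far more sophisticated algorithm), a fingerprinting script combines many individually harmless browser attributes into one stable identifier:

<pre>
# Sketch of browser fingerprinting: many weak signals are hashed into one
# stable identifier. Attribute names and values are hypothetical.
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64; rv:115.0) ...",
    "screen": "1920x1080x24",
    "timezone": "Europe/Berlin",
    "language": "de-DE",
    "fonts": ["DejaVu Sans", "Liberation Serif", "Noto Color Emoji"],
}

# The identifier stays the same across visits and across IP addresses,
# so it can link sessions even when the IP address changes.
print(fingerprint(visitor))
</pre>

Tor Browser counters this by making such attributes as uniform as possible across all of its users, so the resulting identifier is shared by many people rather than being unique.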
{{project_name_short}} comes with [[Tor Browser]] installed by default. Each time Tor Browser is restarted (or its [[Tor_Browser#New_Identity_Function|new identity function]] is used), the [[Browser_Tests#Fingerprint.com|fingerprint.com browser test]] fails to detect the same browser fingerprint and therefore fails to assign the same unique identifier to the user. In other words, at the time of writing even fingerprint.com fails to track users of Tor Browser with its browser fingerprinting techniques.
== Nautilus and other file managers (2017) ==
A [https://bugzilla.gnome.org/show_bug.cgi?id=777991 security bug] was reported in the Nautilus file manager that allows an attacker to disguise a malicious script as a <code>.desktop</code> file. <ref>https://linux.debian.bugs.dist.narkive.com/Z4frRjjC/bug-860268-desktop-files-can-hide-malware-in-nautilus</ref> This vulnerability was later reported to affect other file managers too. In this attack, an adversary tricks the user into downloading the <code>.desktop</code> file from a website or sends the file in an email. Once the file is on the target's computer, the file (disguised as a PDF or ODT document) only has to be opened by the user for the script to execute.
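The mechanism can be sketched in a few lines of hedged Python (the file path and heuristics are illustrative, not an actual detection tool shipped by any file manager): a <code>.desktop</code> launcher can carry a <code>Name=</code> field that looks like a document, which the file manager then displays instead of the real filename, while its <code>Exec=</code> line runs arbitrary commands when the file is "opened".

<pre>
# Sketch: flag a file that presents itself as a document in the file manager
# but is actually a .desktop launcher. Path and heuristics are illustrative.
from pathlib import Path

def disguised_as_document(path: Path) -> bool:
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return False
    if not text.lstrip().startswith("[Desktop Entry]") or "Exec=" not in text:
        return False
    # The file manager shows the Name= field instead of the real filename;
    # that is what makes the disguise work.
    for line in text.splitlines():
        if line.startswith("Name="):
            shown = line[len("Name="):].strip().lower()
            return shown.endswith((".pdf", ".odt", ".doc", ".docx"))
    return False

if __name__ == "__main__":
    suspect = Path("~/Downloads/invoice.desktop").expanduser()  # hypothetical
    if suspect.exists() and disguised_as_document(suspect):
        print(f"Warning: {suspect} is a launcher disguised as a document.")
</pre>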
This security bug was used to craft an exploit which was able to break the [https://subgraph.com/sgos/index.en.html Subgraph OS] security model. <ref>https://micahflee.com/2017/04/breaking-the-security-model-of-subgraph-os/</ref> Since Subgraph does not contain Nautilus in an Oz sandbox, <ref>The Subgraph OS sandbox framework is known as Oz, which is unique to the platform. It is designed to isolate applications from each other and the rest of the system.</ref> once the malicious script was executed, it would have enabled access to much of the user's data: PGP keys, SSH keys, stored email, documents, password databases, MAC addresses and nearby Wi-Fi access points.
This sensitive information could be used by attackers to deanonymize the user, <ref>To be fair, when this bug was reported Subgraph OS was still in Alpha status.</ref> but {{project_name_short}} defeats this attack and others like it. Since {{project_name_workstation_short}} ({{project_name_workstation_vm}}) is isolated from the host and {{project_name_gateway_long}} ({{project_name_gateway_vm}}), even if a malicious <code>.desktop</code> script is executed, no information can be gathered about the external IP address, hardware serials or sensitive data outside of {{project_name_workstation_short}} ({{project_name_workstation_vm}}). <ref>All data inside {{project_name_workstation_short}} ({{project_name_workstation_vm}}) would be available to the attacker and would aid deanonymization.</ref> Once Subgraph developers were informed of the vulnerability, the Nautilus package was patched on the platform. <ref>{{Twitter_link|subgraph/status/852000407253594114|https://twitter.com/subgraph/status/852000407253594114}}</ref>
== P2P (2011) ==
In 2011 an [https://hal.inria.fr/inria-00574178/en/ attack] was published that targets P2P applications in order to trace and profile Tor users. {{project_name_short}} defeats this attack and others like it because {{project_name_workstation_short}} ({{project_name_workstation_vm}}) has no knowledge of the external IP address. Furthermore, {{project_name_short}} provides extended protection by using [[Stream Isolation|Stream Isolation]].
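The following minimal sketch shows how stream isolation works at the protocol level, assuming a Tor <code>SocksPort</code> with its default <code>IsolateSOCKSAuth</code> behaviour; the host, port and credentials below are placeholders, not {{project_name_short}}'s actual configuration. Supplying different SOCKS5 username/password pairs causes Tor to place the resulting streams on different circuits, so separate applications (or separate activities of one application) cannot be linked through a shared exit relay.

<pre>
# Minimal SOCKS5 CONNECT with username/password authentication (RFC 1929).
# Tor's default IsolateSOCKSAuth places streams that use different
# credentials on different circuits. Host/port/credentials are placeholders.
import socket
import struct

SOCKS_HOST, SOCKS_PORT = "127.0.0.1", 9050  # adjust for your environment

def socks5_connect(dest_host, dest_port, user, password):
    s = socket.create_connection((SOCKS_HOST, SOCKS_PORT))
    # Greeting: offer only username/password authentication (method 0x02).
    s.sendall(b"\x05\x01\x02")
    if s.recv(2) != b"\x05\x02":
        raise ConnectionError("server refused username/password auth")
    u, p = user.encode(), password.encode()
    s.sendall(b"\x01" + bytes([len(u)]) + u + bytes([len(p)]) + p)
    if s.recv(2) != b"\x01\x00":
        raise ConnectionError("authentication failed")
    # CONNECT with a domain name (address type 0x03), resolved by Tor itself.
    h = dest_host.encode()
    s.sendall(b"\x05\x01\x00\x03" + bytes([len(h)]) + h + struct.pack(">H", dest_port))
    reply = s.recv(10)
    if len(reply) < 2 or reply[1] != 0x00:
        raise ConnectionError("CONNECT failed")
    return s

# Different credentials -> different Tor circuits for the two connections.
a = socks5_connect("check.torproject.org", 443, "app-one", "x")
b = socks5_connect("check.torproject.org", 443, "app-two", "x")
a.close()
b.close()
</pre>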
== Pidgin (2012) ==
A bug was found in Pidgin that would have leaked the real IP address. <ref>https://web.archive.org/web/20160913002655/https://mailman.boum.org/pipermail/tails-dev/2012-September/001704.html</ref> {{project_name_short}} did not exist when this bug was discovered. Nevertheless, the security-by-isolation model adopted by {{project_name_short}} prevents this kind of leak from occurring. Notably, this bug only existed in the development source code and was patched before the release date.
== Skype / VoIP (currently) ==
{{project_name_short}} will not leak a user's IP address / location while using [[Voip#Skype|Skype]] or other [[VoIP|VoIP]] protocols, although [[VoIP#Introduction|it is fairly difficult to anonymize voice]] over these channels.
== Targeted Clock Skew Correlation (currently) ==
This type of correlation allows an adversary to acquire timestamps from an Onion Service's HTTP headers and measure its clock skew. The adversary then compares the measured skew against that of Tor relays or other publicly reachable web servers. If the clock skew of the Onion Service matches that of any publicly reachable server or Tor relay, it is very likely the Onion Service is hosted on the same server.
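The measurement itself is simple, as the hedged Python sketch below illustrates (the hostname is a placeholder; the HTTP Date header only has one-second resolution, so a real attack repeats the measurement over hours or days and fits the drift rather than taking a single sample):

<pre>
# Sketch: estimate a remote server's clock offset from its HTTP Date header.
# Repeating this over time and fitting a line yields the clock skew (drift),
# which can act as a per-machine fingerprint. Hostname is a placeholder.
import http.client
import time
from email.utils import parsedate_to_datetime

def remote_clock_offset(host):
    t0 = time.time()
    conn = http.client.HTTPSConnection(host, timeout=10)
    conn.request("HEAD", "/")
    response = conn.getresponse()
    t1 = time.time()
    conn.close()
    remote = parsedate_to_datetime(response.getheader("Date")).timestamp()
    # Compare against the midpoint of the request to reduce latency error.
    return remote - (t0 + t1) / 2.0

if __name__ == "__main__":
    print(f"offset: {remote_clock_offset('example.com'):+.1f} s")
</pre>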
{{project_name_short}} defeats this and other [[Time Attacks|time attacks]] since it uses [[Boot Clock Randomization]] and [[sdwdate]]. sdwdate connects to a variety of servers (likely to be hosted on different hardware) at random intervals and extracts timestamps from their HTTP headers. <ref>[https://github.com/{{project_name_short}}/sdwdate https://github.com/{{project_name_short}}/sdwdate]</ref> <ref>To be fair, when this attack was first described {{project_name_short}} did not exist.</ref>
{{mbox
| image = [[File:Ambox_warning_pn.svg.png|40px]]
| text = It is not recommended to run Tor relays on a publicly reachable server along with an Onion Service on the same server. This configuration aids traffic correlation and fingerprinting.
}}
== Thunderbird (2009) ==
[https://tails.boum.org Tails] reported that [https://tails.boum.org/security/IP_address_leak_with_icedove/index.en.html Thunderbird leaked] the real external IP address. Although {{project_name_short}} did not exist when this bug was discovered, it would have been impossible for the real external IP address to leak, since {{project_name_workstation_short}} ({{project_name_workstation_vm}}) has no knowledge of it. In fairness to Tails, this kind of leak is now considered unlikely since they no longer use transparent torification. <ref>[https://web.archive.org/web/20160913002655/https://mailman.boum.org/pipermail/tails-dev/2012-September/001704.html https://mailman.boum.org/pipermail/tails-dev/2012-September/001704.html]</ref>
{{anchor|Tor_Browser_Bundle}}
== Tor Browser Bundle (2012, 2013, 2017) ==
# A severe [https://blog.torproject.org/firefox-security-bug-proxy-bypass-current-tbbs/ bug] was discovered in Firefox related to WebSockets bypassing the SOCKS proxy DNS configuration. <ref>[https://gitlab.torproject.org/legacy/trac/-/issues/5741 https://gitlab.torproject.org/legacy/trac/-/issues/5741]</ref> {{project_name_short}} defeats this bug since {{project_name_gateway_short}} ({{project_name_gateway_vm}}) forces all traffic through the Tor network or blocks it. At worst, a proxy bypass would have emitted traffic through Tor's TransPort. In this scenario, the only information that could leak is the IP address of another Tor exit relay, which would not affect anonymity.
# An old [https://lists.torproject.org/pipermail/tor-announce/2013-August/000089.html attack] <ref>https://blog.torproject.org/tor-security-advisory-old-tor-browser-bundles-vulnerable</ref> was observed in the wild that exploited a JavaScript vulnerability in Firefox. <ref>JavaScript was enabled by default in Tor Browser at the time this exploit was discovered.</ref> The observed version of the attack collected the [[Hostnames|hostname]] and [[MAC_Address|MAC address]] of the victim's computer and sent that information to a remote web server. This threat is partially mitigated nowadays by the [[Tor_Browser#Security_Slider|security slider]] in the Tor Browser Bundle, which, with the correct settings, prevents the execution of JavaScript code completely. Nevertheless, {{project_name_short}} would have protected against a MAC address leak <ref>Since {{project_name_workstation_short}} ({{project_name_workstation_vm}}) is unaware of the MAC address.</ref> and at worst leaked a [[Hostnames|hostname which is common to all {{project_name_short}} users]].
# TorMoil: A [https://blog.torproject.org/tor-browser-703-released security bug] was reported in version 7.0.2 that allowed systems with GVfs/GIO support to bypass Firefox proxy settings using a specially crafted URL, leading to an IP address leak. <ref>https://www.wearesegment.com/research/tormoil-torbrowser-unspecified-critical-security-vulnerability/</ref> Since {{project_name_gateway_short}} ({{project_name_gateway_vm}}) forces all traffic through Tor and information leaks are blocked, {{project_name_short}} users were not affected by this bug. <ref>Quote [https://blog.torproject.org/tor-browser-703-released The Tor Project]: "Tails and {{project_name_short}} users, and users of our sandboxed Tor Browser are unaffected, though."</ref>
# A [https://gitlab.torproject.org/legacy/trac/-/issues/8751 defect] was discovered which allowed an adversary to use targeted clock skew correlation to identify a user. Since Tor Browser transmits the <code>gmt_unix_time</code> field in the TLS "Client Hello", there are two scenarios in which these transmissions could be used to track users; a sketch of the correlation arithmetic follows this list.
#* In the first scenario an adversary either compromises [https://en.wikipedia.org/wiki/Network_Time_Protocol NTP] servers or uses a man-in-the-middle attack to intercept NTP server replies, introducing a unique clock skew. Since "Client Hello" transmissions are visible to ISPs that host Tor exit relays as well as to destination servers, an adversary could use clock skew correlation to track users' movements.
#* In the second scenario, a user visits a clearnet website under adversary control without Tor Browser, and the unique clock skew derived from the TLS "Client Hello" <code>gmt_unix_time</code> is recorded. Afterwards, the user visits the same or a different adversary-controlled website using Tor Browser. If both clock skews match, this could indicate the two visitors are the same person; at the very least, it would significantly degrade anonymity. Since {{project_name_short}} uses [[Boot Clock Randomization]] and [[sdwdate]] rather than NTP to keep time, these instances of targeted clock skew correlation and many like it are defeated.
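The correlation step in both scenarios boils down to comparing clock offsets recorded at different vantage points, as in the following hedged sketch (all numbers are invented for illustration):

<pre>
# Sketch of the correlation step: two observations match if the client's
# clock offset (gmt_unix_time minus the observer's own capture time) is the
# same in both. All values are invented for illustration.

def clock_offset(gmt_unix_time, observed_at):
    """Offset of the client's clock relative to the observer's clock."""
    return gmt_unix_time - observed_at

# Observation 1: clearnet visit without Tor Browser.
offset_clearnet = clock_offset(gmt_unix_time=1700000123, observed_at=1700000000.0)

# Observation 2: later visit through Tor Browser to an adversary-run site.
offset_tor = clock_offset(gmt_unix_time=1700003723, observed_at=1700003600.0)

# A distinctive, matching offset (here +123 s) suggests the same client
# clock, and therefore possibly the same user, behind both visits.
if abs(offset_clearnet - offset_tor) < 1.0:
    print("clock offsets match; sessions may belong to the same user")
</pre>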
= See Also =
* [[Security Reviews and Feedback]]
* [[Dev/Leak_Tests|Leak Tests]]
= Footnotes =
{{reflist|close=1}}
{{Footer}}
[[Category:Documentation]] [[Category:Design]]