Placing Trust in Kicksecure

From Kicksecure

Is Kicksecure trustworthy? Is there a backdoor in Kicksecure? How does Kicksecure protect itself from backdoors?


Trust is a very problematic issue. This is the essence of why security is difficult in every field, including general computing and Internet communication. A skeptical user might ask themselves the following questions before relying upon Kicksecure for sensitive activities on a daily basis:

  • Can Kicksecure and its developers be trusted?
  • Are backdoors present in Kicksecure that can take control over a computer or exfiltrate data?
  • Does Kicksecure generate compromised encryption keys to enable government spying?
  • How trustworthy and sincere are the stated security goals of the Kicksecure project?

Opinions will vary widely, but the reasoning process used to reach the conclusion should be closely examined. It is important that both trust and distrust are based on facts, and not gut feelings, instincts, paranoid conceptions, unfounded hearsay or the words of others.

It is unsurprising that the Kicksecure project and other security platforms / tools claim to be honest, but written assurances are worthless. For an informed decision, it is worth looking at the bigger Kicksecure picture: core components, affiliations, project track record, and how reasonable trust might be established.

Freedom Software and Public Scrutiny[edit]

Kicksecure and other Freedom Software makes it possible to check the source code to determine how a software distribution functions and what it consists of. Suitably skilled individuals can thoroughly audit the code to search for the presence of any malicious code, like a backdoor. In addition, software can be manually built from source code and the result compared against any versions that are pre-built and already being distributed, like the Kicksecure .ova images that are available for download. This comparison can determine whether any malicious changes were made, and whether the distributed version was actually built from the published source code.
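As a sketch of that comparison (all filenames below are placeholders, not official artifact names): hash both images and compare the digests. Note that in practice the two files only match bit-for-bit if the build process is reproducible; otherwise normalization is needed first.

```python
import hashlib

def sha256_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with stand-in files; in practice these would be the image you
# built yourself from source and the pre-built image you downloaded.
with open("self_built.ova", "wb") as f:
    f.write(b"example image contents")
with open("downloaded.ova", "wb") as f:
    f.write(b"example image contents")

match = sha256_file("self_built.ova") == sha256_file("downloaded.ova")
print("identical" if match else "DIFFERS: investigate before trusting")
```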

Naturally most people do not have the requisite knowledge, skills or time to properly audit software. However, the public scrutiny of popular, open source software implies a certain degree of trustworthiness. The axiom attributed to Linus Torvalds [1] -- "Given enough eyeballs, all bugs are shallow" -- is a reasonable assumption in user communities that are large, vibrant, and focused on fixing security vulnerabilities quickly. [2] The Freedom Software community has a strong tradition of publicly reporting and resolving serious issues, and a large pool of developers and beta testers can help to identify and remedy problems. [3]

The opposite of Freedom Software is non-freedom software. Freedom Software provides strong advantages over non-freedom software, which should be avoided. The case for Freedom Software is made on the avoid non-freedom software wiki page.

Table: Finding Backdoors in Freedom Software vs Non-Freedom Software

Each row lists the property, then the value for Non-Freedom Software (precompiled binaries) / Freedom Software (source-available):

  • Original source code is reviewable: No / Yes
  • Compiled binary file can be decompiled into disassembly: Yes / Yes
  • Regular pre-compiled binaries: Depends [4] / Yes
  • Obfuscation (anti-disassembly, anti-debugging, anti-VM) [5] is usually not used: Depends [6] / Yes [7]
  • Price for security audit searching for backdoors: Very high [8] / Lower
  • Difference between precompiled version and self-compiled version: Unavailable [9] / Small or none [10]
  • Reverse-engineering is not required: No / Yes
  • Assembler language skills required: Much more / Less
  • Always legal to decompile / reverse-engineer: No [11] [12] / Yes [13]
  • Possibility of catching backdoors by observing incoming/outgoing Internet connections: Very difficult [14] / Very difficult [14]
  • Convenience of spotting backdoors: Lowest convenience [15] / Very high convenience [16]
  • Difficulty of spotting "direct" backdoors [17] [18] [19]: Much higher difficulty [20] / Much lower difficulty [21]
  • Difficulty of spotting a "bugdoor" [22]: Much higher difficulty [23] / Lower difficulty
  • Third parties can legally release a patched version of the software without the backdoor: No [24] / Yes [25]
  • Third parties can potentially make (possibly illegal) modifications like disabling serial key checks [26]: Yes / Yes
  • Software is always modifiable: No [27] / Yes
  • Third parties can use static code analysis tools: No / Yes
  • Third parties can judge source code quality: No / Yes
  • Third parties can find logic bugs in the source code: No / Yes
  • Third parties can find logic bugs in the disassembly: Yes / Yes
  • Benefits from population-scale scrutiny: No / Yes
  • Third parties can benefit from debug symbols during analysis: Depends [28] / Yes
  • Display source code intermixed with disassembly: No / Yes [29]
  • Effort to audit subsequent releases: Almost the same [30] / Usually lower [31]

Forum discussion: Finding Backdoors in Freedom Software vs Non-Freedom Software

Spotting backdoors is already very difficult in Freedom Software where the full source code is available to the general public. Spotting backdoors in non-freedom software composed of obfuscated binaries is exponentially more difficult. [32] [33] [34] [35] [36] [37] [38] [39] [40]
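The difference between a "direct" backdoor and a "bugdoor" can be illustrated with a short sketch. The function below is hypothetical (not taken from any real project): a bugdoor hides behind what looks like an innocent coding mistake, giving its author plausible deniability, which is why it is harder to spot in review than an overt backdoor.

```python
import hmac

def verify_token(supplied: str, expected: str) -> bool:
    """Check an access token. Contains a deliberate 'bugdoor'."""
    # Looks like a sloppy guard against empty input;
    # actually grants access to anyone presenting an empty token.
    if not supplied:
        return True
    # The honest path: constant-time comparison.
    return hmac.compare_digest(supplied, expected)

print(verify_token("", "s3cret"))        # the bugdoor: True
print(verify_token("guess", "s3cret"))   # False
print(verify_token("s3cret", "s3cret"))  # True
```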

To further improve the situation in the future, the Freedom Software community is working on the Reproducible Builds project. Quote:

Reproducible builds are a set of software development practices that create an independently-verifiable path from source to binary code.

Whilst anyone may inspect the source code of free and open source software for malicious flaws, most software is distributed pre-compiled with no method to confirm whether they correspond.

This incentivises attacks on developers who release software, not only via traditional exploitation, but also in the forms of political influence, blackmail or even threats of violence.

This is particularly a concern for developers collaborating on privacy or security software: attacking these typically result in compromising particularly politically-sensitive targets such as dissidents, journalists and whistleblowers, as well as anyone wishing to communicate securely under a repressive regime.

Whilst individual developers are a natural target, it additionally encourages attacks on build infrastructure as a successful attack would provide access to a large number of downstream computer systems. By modifying the generated binaries here instead of modifying the upstream source code, illicit changes are essentially invisible to its original authors and users alike.

The motivation behind the Reproducible Builds project is therefore to allow verification that no vulnerabilities or backdoors have been introduced during this compilation process. By promising identical results are always generated from a given source, this allows multiple third parties to come to a consensus on a “correct” result, highlighting any deviations as suspect and worthy of scrutiny.

This ability to notice if a developer has been compromised then deters such threats or attacks occurring in the first place as any compromise would be quickly detected. This offers comfort to front-liners that they not only can be threatened, but they would not be coerced into exploiting or exposing their colleagues or end-users.

Several free software projects already provide, or will soon provide, reproducible builds.
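The problem reproducible builds solve can be sketched with a toy "compiler": if the build embeds nondeterministic metadata such as a timestamp, two builds of identical source produce different artifacts and cannot be independently verified; pinning that metadata (as SOURCE_DATE_EPOCH does for real build systems) makes the output byte-identical. The code below is an illustration, not a real build tool.

```python
import hashlib
import time

SOURCE = b'print("hello")\n'

def build(source: bytes, embed_timestamp: bool) -> bytes:
    """Toy 'compiler': the artifact is the source plus a metadata header."""
    header = (str(time.time()).encode() if embed_timestamp
              else b"SOURCE_DATE_EPOCH=0")
    return header + b"\n" + source

# Non-reproducible: each build embeds the wall-clock time.
a = hashlib.sha256(build(SOURCE, embed_timestamp=True)).hexdigest()
time.sleep(0.01)
b = hashlib.sha256(build(SOURCE, embed_timestamp=True)).hexdigest()
print("timestamped builds identical?", a == b)

# Reproducible: pin all sources of nondeterminism.
c = hashlib.sha256(build(SOURCE, embed_timestamp=False)).hexdigest()
d = hashlib.sha256(build(SOURCE, embed_timestamp=False)).hexdigest()
print("pinned builds identical?", c == d)
```

With the timestamp pinned, any independent third party rebuilding the same source gets the same digest, so a deviating binary immediately stands out.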

Trusting Debian GNU/Linux[edit]

Nearly all the software shipped in Kicksecure comes from the Debian GNU/Linux distribution. Debian's packages are heavily scrutinized, as it is one of the largest Linux distributions at present. Debian is also one of the most popular distributions for derivative platforms; Ubuntu is a Debian derivative, and the same applies to all Ubuntu derivatives such as Linux Mint.

The sheer number of users of Debian's software packages and the large developer pool inspecting software integrity are significant factors in Debian's favor. Debian regularly identifies and patches serious security vulnerabilities, like the infamous SSH PRNG vulnerability [41], but backdoors or other purposeful security holes have never been discovered to date. Debian's focus on security is further evidenced by their Security Audit team, which constantly searches for new or unfixed security issues. [42]

Trusting Kicksecure[edit]

In one sense, Kicksecure is the simple union of Debian and a set of security configurations, plus a mechanism to glue them together. Even if a user already trusts Debian, a method for assessing the trustworthiness of Kicksecure itself is still necessary.

The Kicksecure project was founded on 11 January, 2012; see also history. As mentioned earlier, Kicksecure is Freedom Software, which makes the source code available for inspection. In the main, Kicksecure comprises specifications of which Debian software packages should be installed and how they should be configured. See also this list of notable reviews and feedback about the security of Kicksecure.

With a relatively small development team and user base, the "many eyeballs" theory may work against Kicksecure at present. However, the source code is comparatively small and devoid of complexity, meaning the project is in relatively good shape compared to many other similar projects. Interested readers can learn more about the Kicksecure specification and design here. [43]

With these factors in mind, the reader can now make an informed decision about the trustworthiness of Kicksecure.

Kicksecure Warrant Canary[edit]


The Kicksecure warrant canary is intended to provide a means of communication to users in the event Kicksecure is served with a secret subpoena, despite legal prohibitions on revealing its existence. Once the signature of a canary file in force is verified with OpenPGP and/or signify, this confirms that no warrants had been served on the Kicksecure project as of the canary's date of issue.

Note: the canary date of issue is represented by the gpg signature date. A new canary should be released within 4 weeks. [44]

The canary and signature are available here:

As a backup, the canary and signature are also available on GitHub: [45]

Readers are reminded this canary scheme is not infallible. The canary declaration is provided without any guarantee or warranty, and it is not legally binding upon any parties in any form. The signer should never be held legally responsible for any statements made in the canary.


Trusting Downloaded Images[edit]

Users should not blindly trust the Kicksecure project or its developers. Logically it is unwise to trust unknown persons, especially on the Internet. On that basis, trust in Kicksecure developers should not rely on their public personas or the appearance of the Kicksecure project alone. Kicksecure is, or could become, a high-profile target, and it is risky to assume that developers' build machines would remain clean under those circumstances.

Trusting the Download Location[edit]

Binary images can be trusted to some extent if a user verifies that they received exactly the same code as thousands of other users, and no one has found or publicly reported any serious security issues. This requires verification of the Kicksecure images using the available OpenPGP signatures. [46] All source code tags for releases are OpenPGP-signed by lead Kicksecure developer Patrick Schleizer.

In order of increasing security, the Kicksecure images can be:

  1. Downloaded via TLS. This provides some trust in, and integrity of, the hash file, but it is still advisable to check the site's certificate and perform digital software signature verification (instructions).
  2. Downloaded over the Kicksecure v3 onion address with Tor Browser, followed by digital software signature verification. Onion addresses provide a higher standard of authentication than clearnet addresses.
  3. Built from source since it is a relatively easy procedure.

Trusting Kicksecure Images[edit]

Table: Maintainer Overview - Platform, Source Code, Binary Images, Permissions

  • Kicksecure VirtualBox — Source Code Creation: Patrick; Source Code Trust: Patrick; Binary Image Creation: Patrick; Binary Images Trust: Patrick; Package Upgrades Creation: Patrick
  • Kicksecure KVM — Source Code Creation: Patrick; Source Code Trust: Patrick; Binary Image Creation: HulaHoop; Binary Images Trust: HulaHoop; Package Upgrades Creation: Patrick
  • Kicksecure for Qubes — Source Code Creation: Qubes project and Patrick; Source Code Trust: Qubes project and Patrick; Binary Image Creation: Qubes project [47]; Binary Images Trust: Qubes project and Patrick; Package Upgrades Creation: Qubes project and Patrick
  • Built from Source Code — Source Code Creation: Patrick; Source Code Trust: Patrick; Binary Image Creation: -; Binary Images Trust: -; Package Upgrades Creation: -

Since Kicksecure is based on Debian, Debian releases package upgrades. See also: Trusting Debian GNU/Linux.

Binary Images Policy[edit]

  • A) Kicksecure binary image maintainers: Only the currently existing Kicksecure binary image maintainers are permitted to redistribute binary builds of Kicksecure and advertise these on the project website or forums.
  • B) Unofficial software fork maintainers: Rebranding required. Change the project name Kicksecure and Kicksecure logo to something else, and host these on a different website to avoid confusing users about the origin of the software.


  • Build verification: Verifiable builds, let alone reproducible builds, let alone automatic verification of reproducible builds through rebuilders [48], are unavailable, and their availability is likely many years ahead. In other words, the binary builds by Kicksecure cannot yet be verified to have been created from the project's own source code [49].
  • Trust: Binary images unfortunately still require trust.
  • Freedom Software rights: Others are welcome to exercise their right to fork the software of the Kicksecure project under the respective licenses. (Why Kicksecure is Freedom Software)
  • No additional restrictions: There is no restriction of software redistribution rights under the respective licenses. (Kicksecure:Copyrights)
  • Trademark: Kicksecure is a trademark, see Kicksecure Trademark Policy.
  • Reputation: To protect the reputation of the Kicksecure project, the trademark policy supports quality control and assurance of non-maliciousness, and guards against a potential Evil Developer Attack.
  • Fork friendly: Kicksecure is Software Fork Friendly.
  • Common practice: Trademark protection is commonplace even for Freedom Software. For example, the Tor Project, which also develops Freedom Software, has The Tor Project Trademark policy; see also the Debian Trademark policy. Red Hat and most, if not all, Linux distributions do the same, as can easily be seen by a web search for the name of the Linux distribution plus "trademark".

A reasonable process for how new Kicksecure binary image maintainers could be securely admitted is yet to be developed. Please start a new Kicksecure forum thread if you are interested in developing the process and/or becoming a Kicksecure binary image maintainer.

Please contribute towards Kicksecure source code generally, or specifically towards verifiable builds, reproducible builds and rebuilders [48], so the trust requirement can be removed from the equation.

Builds from Source Code versus Builds including Binary Packages[edit]

What "building from source code" means in the context of building Linux distributions from source code is not well defined. To explain the situation for Kicksecure, it is first important to describe the general situation for other Linux distributions.

Rhetorical [50] exercises left for the reader:

  1. Build a Debian installer ISO completely from source code without using any pre-built binary packages from the Debian repositories. Unfortunately, this is difficult. There was once a bounty, Build Debian Packages from Source, worth $3000 USD, but no implementation was ever contributed. See also How to update all Debian packages from source code?, and [51]

    In this context bootstrapping refers to building binaries for an architecture from source without using any pre-built binaries for that architecture. In the past 20 years about 20 architectures have been bootstrapped for Debian. At all times this has been a manual and non-repeatable process. rebootstrap is trying to address the very early bootstrap phase involving the gcc/eglibc dance.

  2. Install Debian using an installer ISO (built completely from source code) without using any binary packages from the Debian repositories.
  3. Upgrade Debian completely from source code without using any binary packages from the Debian repositories.
  4. Using any Debian installation (built from source code or not): build and install packages from source code, while also acquiring all build dependencies, without using any binary packages from the Debian repositories. apt-build is the closest approximation, but it uses binary packages from the Debian repositories to fulfill build dependencies. The development of apt-build stalled and the package was eventually removed from Debian.
  5. Build any Qubes template image or a Qubes installer ISO from source code without using any binaries from the Qubes repositories. This will probably be very difficult or impossible without Qubes source code modifications, because the Qubes build script uses pre-built binary packages from the distribution repositories during the Qubes build process.
  6. Attempt the previous example, but without using the Debian package repositories and their Fedora equivalent.
  7. Attempt to upgrade an installed Qubes installation using Qubes source code only, without using any binary packages from the Qubes repositories.
  8. Create your own package repository, hosted under a different domain name or on local disk.

All of the above are very difficult tasks for which, unfortunately, very little documentation exists. Even descriptions and awareness of these issues rarely exist. However, these issues are not specific to Kicksecure; they are mentioned to explain the context and to set realistic expectations of what the Kicksecure project might be able to provide. The situation in Kicksecure is broadly the same.

Undeniably, this situation is unsatisfactory. In the future, reproducible builds will hopefully reduce or eliminate the requirement to trust binaries.

Asking about these issues will most likely not result in people writing documentation on how to accomplish these very difficult tasks. The only realistic option for improving this situation is to contribute to reproducible builds and/or related build-from-source projects.


Trusting the Kicksecure Website[edit]

Web Application Shortcomings[edit]

As noted in the Privacy on the Kicksecure Website chapter, the following separate web-based platforms are currently in use:

  1. Discourse for the Kicksecure forums.
  2. MediaWiki for online documentation.

The problem is these web applications (web apps) are developed independently from Kicksecure. This means Kicksecure developers have little to no control over the course these projects take. Since privacy and security issues often take a back seat to "enhanced features", websites relying on these or similar web apps can at best only provide privacy by policy, which is equivalent to a promise.

It is infeasible from a monetary, time and manpower perspective to address perceived shortcomings in these web apps. This means the Kicksecure community should not place undue trust in the live version of this site on the Internet, due to the potential for interference.

Distrusting Infrastructure[edit]

In an identical fashion to the Qubes project, Kicksecure has adopted the principle that all infrastructure should be explicitly distrusted. Infrastructure in this context refers to "hosting providers, CDNs, DNS services, package repositories, email servers, PGP keyservers, etc."

Third parties who operate infrastructure are "known unknowns" and potentially hostile. It is safer to voluntarily place trust in a few select entities, such as the contributors of Kicksecure packages, the holder(s) of Kicksecure signing keys and so on. By sufficiently securing endpoints, it is unnecessary to try and improve the trustworthiness of those operating the "mid-points". This also provides two benefits: Kicksecure forgoes the need to invest valuable resources on the problem, and no illusory security expectations are raised in the Kicksecure community. To quote the Qubes project (a security-focused operating system):

What does it mean to “distrust the infrastructure”?

A core tenet of the Qubes philosophy is “distrust the infrastructure,” where “the infrastructure” refers to things like hosting providers, CDNs, DNS services, package repositories, email servers, PGP keyservers, etc. As a project, we focus on securing endpoints instead of attempting to secure “the middle” (i.e., the infrastructure), since one of our primary goals is to free users from being forced to entrust their security to unknown third parties. Instead, our aim is for users to be required to trust as few entities as possible (ideally, only themselves and any known persons whom they voluntarily decide to trust).

Users can never fully control all the infrastructure they rely upon, and they can never fully trust all the entities who do control it. Therefore, we believe the best solution is not to attempt to make the infrastructure trustworthy, but instead to concentrate on solutions that obviate the need to do so. We believe that many attempts to make the infrastructure appear trustworthy actually provide only the illusion of security and are ultimately a disservice to real users. Since we don’t want to encourage or endorse this, we make our distrust of the infrastructure explicit.

Also see: Should I trust this website?

Self-Hosting vs Third Party Hosting[edit]

Some users mistakenly believe that servers of security-focused projects are virtually impenetrable and hosted in the homes of developers; this is not the case. The Kicksecure server is actually hosted at an Internet hosting company. Similarly, the servers of The Tor Project and comparable projects are not hosted in a developer's home either. Hosting at home is the exception rather than the rule; at the time of writing, there are no known cases of project servers hosted in a developer's home. This means employees of the associated Internet hosting company have physical access to the server, along with any other capable, malicious actors.

Since virtually every project is hosted by a third party (an Internet hosting company), the capability to physically secure server hardware is largely forfeited. Without physical security and due to the risk of untrusted visitors, a hardware backdoor could easily compromise the security of the server.

Any demand that servers ought to be super secure and hosted in a developer's home is idealistic. Home Internet connections are generally too slow to meet the requirements of a public web server in terms of traffic quota and connection upload speed. Internet service providers (ISPs) do not usually allow a busy public web server to be hosted on home connections; throttled connections or terminated contracts are likely if that happens.

The "proper solution" would require purchase of a business Internet uplink, similar to becoming an Internet hosting company. This would incorporate a business building with a good Internet uplink, full camera security, security officers and so forth. Unfortunately this is economically infeasible at the current stage of project development.

Security Level[edit]

Many web applications in use on the Kicksecure server did not provide software signatures at the time of installation, or still do not provide them. Therefore, in stark contrast to software installed by default in Kicksecure, it was not always possible to enforce verification of software signatures for the server.

Many web application and extension updaters did not, or still do not, securely verify software signatures. Therefore, the security level of most servers is probably only equivalent to plaintext downloads. In the case of the Kicksecure server, the system security level is only equivalent to "always use TLS", not "always verify software signatures".

Server Privacy[edit]

In the past, various suggestions for "perfect server privacy" [53] were made, such as "self-hosting in developers' homes" or "host the server outside the Five Eyes (or Nine Eyes, or Fourteen Eyes) countries". Despite the good intentions, these suggestions do not easily translate into an actionable plan.

First, these suggestions assume there is a sane method of rating the privacy protections afforded by a specific country. Moreover, the privacy rights granted for local citizens in a specific jurisdiction do not necessarily extend to non-citizens. Kicksecure developers are unaware of any project that rates privacy protections in this way, considers the feasibility of operating servers (by running tests), and then makes recommendations for locations which provide the best possible privacy.

In today's world, following the Snowden disclosures, it has to be assumed that if surveillance is possible, it is being done. The likelihood is that surveillance is undertaken in all jurisdictions; it is only a matter of degree.

Even The Tor Project -- a much older, established and better funded organization -- does not attempt to implement any suggestion concerning "perfect server privacy". As noted on their sponsors page:

Fastly generously hosts our Tor Browser update downloads that can be fetched anonymously.

Fastly provides content delivery network (CDN) services and is headquartered in America (arguably the most aggressive member of the five eyes network). Even Debian uses CDNs such as Amazon AWS and Fastly.

In a similar fashion to the Distrusting Infrastructure chapter, Kicksecure has concluded it is not worthwhile investing valuable resources to try and provide "perfect server privacy", because it is simply uneconomical. For this reason, the viewpoint that no undue trust should be placed in the server arrangements is made explicit.

Server Security[edit]

Server security issues should not be conflated with software security issues. If an advanced adversary wanted to tarnish the reputation of any security-focused project, then breaking into the data center where it was hosted and "hacking" them would be one way to achieve that aim. Projects that are honest need to mention this possibility beforehand, so it is not unexpected.

The world's largest and most profitable technology companies like Google, Facebook, Microsoft and Amazon can easily afford to employ large, dedicated and skilled teams of system administrators to work around the clock to protect their servers. [54] For small projects, this scale of server protection is completely unrealistic.

Software Update APT Repository Security[edit]

A compromise of the server would not result in a compromise of users attempting to upgrade Kicksecure. This is because of a standard APT security feature: digital signature verification of APT repository metadata. An adversary who compromised the server would lack the signing key required to generate validly signed APT repository metadata, and invalid APT repository metadata is rejected by the user's APT updater software. APT repository metadata is signed locally on the developer's computer before the signed metadata is uploaded to the server; the Kicksecure APT repository signing key is never exposed to the server. Thanks to digital signature verification, the Kicksecure software update APT repository can in theory be considered more secure than the website.
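The signing workflow described above can be sketched as follows. This is a toy model, not APT's actual implementation: an HMAC stands in for the OpenPGP signature, and all names and data are made up. The point it demonstrates is that a server which never sees the signing key cannot produce metadata that clients will accept.

```python
import hashlib
import hmac

# Stand-in for the developer's private signing key (never on the server).
SIGNING_KEY = b"developer-private-key"

def sign(metadata: bytes) -> str:
    # HMAC stands in for an OpenPGP signature in this sketch.
    return hmac.new(SIGNING_KEY, metadata, hashlib.sha256).hexdigest()

def verify(metadata: bytes, signature: str) -> bool:
    return hmac.compare_digest(sign(metadata), signature)

# Developer machine: hash the package, sign the metadata, upload both.
package = b"pretend-package-bytes"
metadata = (b"Package: example\nSHA256: "
            + hashlib.sha256(package).hexdigest().encode())
signature = sign(metadata)

# Client: accept metadata only if the signature verifies.
print("genuine metadata accepted?", verify(metadata, signature))

# A compromised server tampers with the metadata (e.g. swaps the name)...
evil = metadata.replace(b"example", b"backdoored")
# ...but cannot re-sign it without the key, so clients reject it.
print("tampered metadata accepted?", verify(evil, signature))
```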

Software Build Process Security[edit]

A compromise of the server would not result in a compromise of software packages (image downloads, Debian packages, .debs) because all software is built locally on the developer's computer. No binary builds offered for download to users of software developed under the Kicksecure umbrella are ever created on remote servers hosted by third parties. Users who always correctly verify software signatures could detect malicious software before use. However, note these Consequences of Server Compromise.

Server Privacy vs Server Security[edit]

In an ideal world, both server privacy and server security would be maximized at the same time. However, in the real world this is an impossibility.

In a world with specialization and division of labour, those companies who excel at hosting web applications have more focus, time, energy, knowledge and money to work on server security; it is their raison d'etre (reason for being). In contrast, small projects use web applications only as a means to an end. Therefore, using third party web application hosters may provide better security than self-hosting, but better server privacy demands self-hosting. This means it is impossible to optimize both security and privacy simultaneously; the goals are at odds with each other.

Server Downtime[edit]

The almost perfect uptime of popular web services such as Google, Facebook, and Amazon (perhaps 99.99 per cent) might lead some to conclude this is an easy goal to achieve; this is a false assumption.

Expecting the same uptime from much smaller projects like Kicksecure is unrealistic. At best, maybe only 99.0 per cent uptime can be provided, because no resources are spent on server uptime statistics, server upgrades need to be performed, and reboots are necessary. These factors necessarily lead to downtime during which the website is unavailable. With a huge budget it would be possible to approach the 99.99 per cent uptime of popular websites via technical solutions such as server farms and load balancing, but this is infeasible for small projects. Similarly, large companies can afford to pay whole teams of system administrators working 24/7, in concert with these technical options. Again, small projects do not have that option.

Finally, server downtime is not evidence of a server compromise, but normally relates to server issues (for example, failing hard drives) and routine server maintenance.

Consequences of Server Compromise[edit]

A compromised server would result in one or more of the following issues:

  • Provision of poor and/or malicious advice to visitors of the website.
  • Provision of malicious software downloads to users who do not always verify software signatures.
  • Unavailability of legitimate software downloads and updates.
  • Reputational damage for Kicksecure.


Due to the multiple issues outlined in this section, the software produced by the Kicksecure project is theoretically considered more secure than the website provided by the Kicksecure project. The Kicksecure software is the main product delivered by the project, while the server is only a tool to document and deliver Kicksecure. For further reading on this topic, see: Website and Server Tests.



  • Digital signatures: A tool enhancing download security. Commonly used across the internet.
  • Learn more: Curious? Learn more about digital software signatures.
  • Optional: Digital signatures are optional. If you've never used them before, there might be no need to start now.
  • No worries: New to digital software signatures? It's okay, no need to worry.
  • Not a requirement: Not mandatory for using Kicksecure, but an extra security measure for advanced users.

Fingerprint Trust[edit]

Most users retrieve OpenPGP fingerprints directly from a website and then download an associated key from a key server. The problem with this method is that TLS is fallible: the connection could be insecure or broken. Greater security necessitates a key signing party, whereby a direct and trusted path of communication can be confirmed by all attendees. If this step is not followed, OpenPGP is only as secure as TLS.

It is often impossible to meet in person. To mitigate the risk, any OpenPGP fingerprint should be cross-referenced on multiple "secure" (https://) sites. An additional fail-safe is to use an alternative authentication system, for example comparing the Tor signing keys on both the clearnet and .onion domains.

Onion services offer strong authentication via multiple layers of encryption. This does not prevent an advanced adversary from trying to impersonate an onion service, but together with multiple fingerprint sources, it becomes increasingly difficult and improbable that a single entity could impersonate them all.
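The cross-referencing idea above can be sketched in a few lines. The fingerprints below are made-up placeholders, not real keys; in practice each value would be transcribed from an independent channel (clearnet site, onion site, printed material, and so on), and any disagreement is grounds for distrust.

```python
# Hypothetical fingerprints as copied from independent channels.
# These values are placeholders, not any project's real key.
sources = {
    "clearnet": "0123 4567 89AB CDEF 0123 4567 89AB CDEF 0123 4567",
    "onion":    "0123 4567 89AB CDEF 0123 4567 89AB CDEF 0123 4567",
    "printout": "0123456789abcdef0123456789abcdef01234567",
}

def normalize(fpr: str) -> str:
    """Ignore spacing and case so only the key material is compared."""
    return fpr.replace(" ", "").upper()

values = {normalize(f) for f in sources.values()}
if len(values) == 1:
    print("all sources agree; fingerprint is consistent")
else:
    print("MISMATCH: do not trust this key, investigate")
```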

Kicksecure Binaries and Git Tags[edit]

All Kicksecure binaries are OpenPGP-signed by Kicksecure developer Patrick Schleizer. [55] The source code is directly available on GitHub over TLS, and it can be cloned using git over https://. Git tags for each release are also OpenPGP-signed by Kicksecure developer Patrick Schleizer. Users can also request signed git development tags from the same developer.

Even if Kicksecure developers are distrusted, verifying binary downloads or git tags with OpenPGP is still useful. For example, in order to audit Kicksecure, it is important to verify that the download came from Kicksecure developers and was not tampered with by third parties. This is a realistic threat, as real-world examples of tampered downloads have shown.

The OpenPGP key also ensures that if the Kicksecure infrastructure is ever compromised by a powerful adversary (such as a domain takeover), the original Kicksecure developers can at least prove they owned the infrastructure.

Kicksecure Developer OpenPGP Guidelines[edit]

All long-term Kicksecure developers are encouraged to:

  • Create a 4096/4096 RSA/RSA OpenPGP key.
  • Retrieve the latest gpg.conf which comes with Kicksecure for stronger hashes, no-emit-version, and other improved settings.
  • Store the private key inside an encrypted file.
  • Make a backup of that encrypted file.
  • Remember the password and regularly test one's memory of it.
  • Upload the encrypted file to a (free) online cloud-based host to protect against theft, fire, natural events and so on.
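A minimal sketch of the key-handling steps above, using a throwaway 4096-bit RSA signing key; the names and the backup passphrase are illustrative only, and a real developer key would itself carry a strong passphrase.

```shell
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Create a 4096-bit RSA signing key (no key passphrase here, purely for the demo;
# the guideline's full 4096/4096 RSA/RSA layout adds a subkey as well).
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "Dev <dev@example.invalid>" rsa4096 sign never

# Export the private key, then store it inside a symmetrically encrypted file.
gpg --batch --pinentry-mode loopback --passphrase '' \
    --export-secret-keys --armor dev@example.invalid > private-key.asc
gpg --batch --pinentry-mode loopback \
    --passphrase 'example-backup-passphrase' \
    --symmetric --cipher-algo AES256 \
    --output private-key.asc.gpg private-key.asc
shred -u private-key.asc   # keep only the encrypted copy
ls -l private-key.asc.gpg  # this encrypted file can be backed up off-site
```

Only the encrypted `private-key.asc.gpg` file should ever leave the developer's machine, for example to a cloud host as a backup against theft or fire.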

From the beginning of the Kicksecure project, greater trust has been placed in developers who publish their OpenPGP public key earlier on, since this reduces the probability of an evil developer attack.

Kicksecure Updates[edit]


An optional updater has been available in Kicksecure since version 6 of the platform. [56] When it comes to trust, there is a large difference between building Kicksecure from source code and using the Default-Download-Version.

APT Repository and Binary Builds Trust[edit]

When Kicksecure is built from source code using the build script, and the builder audits the source code to be non-malicious and reasonably bug-free, Kicksecure developers are unable to access the system. On the other hand, if the Kicksecure APT repository is enabled, developers holding a Kicksecure repository signing key could release a malicious update to gain full access to the machine(s). [57]

Even if the Kicksecure APT repository is not used with the default binary download, it is still theoretically possible for Kicksecure developers to sneak a backdoor into the binary builds which are available for download. Although this is an unpleasant threat, using the Kicksecure APT repository poses a greater risk: a malicious Kicksecure developer could sneak in a backdoor at any time, not only at release time.

Backdoors are also easier to hide in binary builds, since these contain compiled code in binary packages that are downloaded from the Debian repository at build time.

APT Repository Default Settings[edit]


  • Building from source code: Kicksecure APT Repository is disabled by default.
  • Default binary download: Kicksecure APT Repository is enabled by default.


  • Qubes/Install: Kicksecure APT Repository is enabled by default.
  • Building from source code: Kicksecure APT Repository is enabled by default. [58]

Most users will have the Kicksecure APT repository enabled. This means that when updated Kicksecure Debian packages are uploaded to the Kicksecure APT repository, these packages will be automatically installed when the system is upgraded. [59] If this behavior is unwanted, it can be disabled; refer to the previous section outlining the security implications before proceeding.
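How such a repository gets disabled can be sketched in the abstract: APT reads `deb` lines from sources list files, and commenting those lines out removes the repository from future updates. The file below is a temporary stand-in; the real file path (for example something like `/etc/apt/sources.list.d/derivative.list`) and any supported Kicksecure tooling are assumptions, so consult the documentation before editing system files.

```shell
# Temporary stand-in for an APT sources list file.
list=$(mktemp)
cat > "$list" <<'EOF'
deb https://deb.example.invalid bookworm main
EOF

# Commenting out the 'deb' line disables the repository for apt.
sed -i 's/^deb /# deb /' "$list"
cat "$list"
```

After such a change, `apt update` would no longer fetch package lists from that repository, so updates from it would have to be obtained and verified manually.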

Security Conclusion[edit]


  • *: poor security.
  • ****: best security.

Table: Build and APT Repository Security Comparison

  Method                                                       Security   Convenience
  Binary download, Kicksecure APT repository enabled           *          ****
  Binary download, Kicksecure APT repository disabled          **         *
  Built from source code, Kicksecure APT repository enabled    *          **
  Built from source code, Kicksecure APT repository disabled   ****       *

In summary:

  • The Kicksecure binary download using the Kicksecure APT repository is the most convenient method, but also the least secure.
  • It is somewhat safer to use the Kicksecure binary download and then disable the Kicksecure APT repository. However, the user must then manually download updated Kicksecure deb packages upon release, and independently verify and install them.
  • The greatest security comes from building Kicksecure and updated packages from source code, particularly if the source code is verified before building Kicksecure.


What Digital Signatures Prove[edit]


See Verifying Software Signatures for details on what digital signatures prove.

In short, a user must be careful to ensure the public keys that are used for signature verification are the Kicksecure key pair belonging to the Kicksecure developer of the specific component. At the time of writing there are two different components and signing keys.
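One way to check which key actually produced a signature is gpg's machine-readable status output. The sketch below, using a throwaway key and hypothetical file names, compares the fingerprint reported on the `VALIDSIG` status line against the fingerprint the verifier expects.

```shell
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "Comp <comp@example.invalid>" default default never

# Record the fingerprint the verifier *expects* (normally obtained out of band,
# e.g. cross-referenced from several sources as described earlier).
expected_fpr=$(gpg --with-colons --fingerprint comp@example.invalid \
               | awk -F: '/^fpr:/ { print $10; exit }')

printf 'release file\n' > release.tar
gpg --batch --detach-sign --output release.tar.sig release.tar

# Extract the fingerprint of the key that actually made the signature.
actual_fpr=$(gpg --batch --status-fd 1 --verify release.tar.sig release.tar 2>/dev/null \
             | awk '/^\[GNUPG:\] VALIDSIG / { print $3; exit }')

[ "$actual_fpr" = "$expected_fpr" ] && echo "signed by the expected key"
```

A "Good signature" message alone only proves the signature matches *some* key in the keyring; comparing the fingerprint is what ties it to the intended developer of the specific component.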


TLS, SSL and HTTPS are all flawed since they rely on the vulnerable Certificate Authority (CA) model; see here for further details and SSL/TLS alternatives. [60]

Evil Developer Attack[edit]


An "evil developer attack" is a narrow example of an insider threat: [61]

Software development teams face a critical threat to the security of their systems: insiders.


An insider threat is a current or former employee, business partner, or contractor who has access to an organization’s data, network, source code, or other sensitive information who may intentionally misuse this information and negatively affect the availability, integrity, or confidentiality of the organization’s information system.

In the case of software, a disguised attack is conducted on the integrity of the software platform. While this threat is only theoretical, it would be naive to assume that no major software project has ever had a malicious insider. Kicksecure and all other open source software projects face this problem, particularly those focused on privacy and anonymity, such as VeraCrypt, [62] Tails, I2P, The Tor Project and so on.

Attack Methodology[edit]

A blueprint for a successful insider attack is as follows:

  1. Either start a new software project or join an existing software project.
  2. Gain trust by working hard, behaving well, and publishing your sources.
  3. Build binaries directly from your sources and offer them for download.
  4. Attract a lot of users by making a great product.
  5. Continue to develop the product.
  6. Make a second branch of your sources and add malware.
  7. Continue to publish your clean sources, but offer your malicious binaries for download.
  8. If undetected, a lot of users are now infected with malware.

An evil developer attack is very difficult for end users to notice. If the backdoor is rarely used, then it may remain a secret for a long time. If it was used for something obvious, such as adding all the users to a botnet, then it would be quickly discovered and reported on.

Open source software has some advantages over proprietary code, but certainly not for this threat model. For instance, no one is routinely checking whether the binaries are built from the proclaimed source code and publishing the results, a procedure made possible by "deterministic builds". [63] [64] This standard is quite difficult to achieve, but is being worked towards. [65]
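The comparison step of a deterministic build can be illustrated with two stand-in build artifacts and their SHA-256 hashes; in practice the two files would be the same package built independently by two parties.

```shell
# Two stand-ins for the same package built independently by two parties.
printf 'compiled output' > build-a.deb
printf 'compiled output' > build-b.deb

ha=$(sha256sum build-a.deb | cut -d' ' -f1)
hb=$(sha256sum build-b.deb | cut -d' ' -f1)

if [ "$ha" = "$hb" ]; then
    echo "builds are bit-for-bit identical"
else
    echo "builds differ - inspect the delta before trusting either"
fi
```

When the hashes match, the published binary is demonstrably derived from the published source; when they differ, tools such as diffoscope can help audit the delta.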

Related Attacks[edit]

While most security experts are focused on the possibility of a software backdoor, other insider attacks can have equally deleterious effects. For instance, the same methodology can be used to infiltrate a targeted project team but in a role unrelated to software development; for example, as a moderator, site administrator, wiki approver and so on. This approach is particularly effective in smaller projects that are starved of human resources.

Following infiltration, disruption is caused within the project to affect productivity, demoralize other team members and (hopefully) cause primary contributors to cease their involvement. For example, using a similar blueprint to that of the evil developer attack, a feasible scenario is outlined below:

  1. Join an existing software project as a general member.
  2. Gain trust by working hard, behaving well, assisting readily in forums, making significant wiki contributions and so on.
  3. Attract a lot of community admiration by outwardly appearing to be a bona fide and devoted project member.
  4. Eventually attain moderator, administrator or other access once team membership is extended. [66]
  5. Continue to behave, moderate and publish well.
  6. Once trust is firmly established, subtly undermine the authority, character and contributions of other team members. [67]
  7. If the insider threat is undetected for a significant period, this can lead to a diminished software product due to a fall in contributions in numerous domains and team ill will.


The insider threat neatly captures how difficult it is to trust developers or other project members, even if they are not anonymous. Further, even if they are known and have earned significant trust as legitimate developers, this does not discount the possibility of serious mistakes that may jeopardize the user. The motives and internal security of everyone contributing to major software projects like Tor, of distribution developers and contributors, and of the hundreds of upstream developers and contributors are a legitimate concern. [68]

The trusted computing base of a modern operating system is enormous. So many people are involved in software and complex hardware development that it would be unsurprising if some of the bugs in existence were intentional. While detecting software changes in aggregate may be easy (by diffing the hash sums), finding and proving that a change is a purposeful backdoor rather than a bug in well-designed source code is nearly impossible.

Legal Issues[edit]

Info The following is not legal advice and does not refer to any specific laws; it is a theoretical consideration by non-lawyers.

Users occasionally raise the possible privacy and security implications if contemporary, draft laws were to be passed. For example, how the Kicksecure project would react to laws:

  • banning end-to-end encryption
  • outlawing anonymity tools like Tor
  • demanding that operating systems include a backdoor

It is important to note that government members have diverse and conflicting interests. Bills which are hostile to Internet privacy and security are regularly introduced in various jurisdictions, but common sense usually prevails and ill-conceived legislation normally stalls and fails to become law. Conversely, bills that allocate funding to support cryptographic development and privacy tools garner support and are normally passed because most legislators understand their importance in an open society. While Internet privacy advocacy groups should remain vigilant, it is unproductive to become unduly stressed whenever a bill hostile to privacy or security is proposed.

Although it may be counter-intuitive, in the event privacy-hostile laws are passed this is not a Kicksecure-specific issue, even though Kicksecure would obviously be affected. For the most part, Kicksecure is a compilation of existing software packages provided by third parties which allow re-use in a compilation due to permissive licensing (Freedom Software). In this context, noteworthy components which Kicksecure relies on directly or indirectly are the base operating system (Debian at time of writing) and an anonymizer (Tor at time of writing). At first, such a law would very likely harm the security properties of these and other projects (see footnote). [69]

Legal Jurisdiction[edit]

In response to the possibility of privacy-hostile laws being implemented, it is usually suggested that the Kicksecure legal entity should relocate to a different country. The effectiveness of moving to another jurisdiction would of course depend upon the specifics of the legal text; however, it is unlikely that simple legal loopholes would exist. For example, legal entity relocation has not helped people who would like to sell controlled substances (such as medicine) or goods (such as weapons) without all the authorizations required by law. Another example is financial services; this is also why unnamed stock certificates on the blockchain do not exist.

Some U.S. laws apparently apply to all international jurisdictions. Take the case of Kim Dotcom, a German/Finnish dual national. Although a permanent resident of, and physically present in, New Zealand at the time of the alleged copyright infringement charges brought by the USA, he had his assets seized and worldwide bank accounts frozen, was arrested, and is fighting extradition to the USA. As Kim Dotcom summarized:

I never lived there
I never traveled there
I had no company there

But all I worked for now belongs to the U.S.

How The US Government Legally Stole Millions From Kim

Legal Compliance[edit]

Sometimes it is suggested to simply not comply with new laws impacting privacy; however, this is an unreasonable request. Most laws include an enforcement mechanism, although it can be selectively applied depending on government interests. Serious penalties apply if a law is not complied with, especially for repeat and continuous offenses. Penalties may include:

  • imprisonment
  • monetary fines
  • for failure to pay monetary fines, the threat of asset seizure, imprisonment or worse

Law enforcement has incredibly long arms. In most cases there is no way to openly defy the law for an extended period and get away with it. To a large degree, policy issues cannot be fixed via technological means alone; technology must be combined with peaceful resistance on a political level. Government policy is affected by popular opinion, and those who support privacy-enhancing technologies can help the cause by sharing their reasoned opinions with others. Casual supporters are also important for raising public awareness.

Even if privacy-hostile laws are in place, it might still be permitted to contribute Open Source code to Open Source projects. For example, perhaps only the person(s) redistributing binary builds to the public would be held personally accountable. This is pure speculation until a new draft law catastrophic to security software eventuates.

If Kicksecure was ever forced to add a backdoor by law, users would be notified and the project would be shut down before the law took effect. Fortunately, as yet there are no outrageous law proposals that would force the continued running of backdoored projects. In this case, efforts might focus on a new Linux-based project centered on stability, reliability, documentation, recovery, and usability.

Forum discussion:
EU Wants To Create Device/OS Level

Other Projects Discussing Trust[edit]


  1. Creator of the Linux kernel.
  3. On the flip-side, there is no guarantee that just because software is open to review, that sane reviews will actually be performed. Further, people developing and reviewing software must know the principles of secure coding.
  4. Some use binary obfuscators.
  6. Some use obfuscation.
  7. An Open Source application binary could be obfuscated in theory. However, depending on the application and the context -- like not being an Open Source obfuscator -- that would be highly suspicious. An Open Source application using obfuscators would probably be criticized in public, get scrutinized, and lose user trust.
  8. This is because non-freedom software is usually only available as a pre-compiled, possibly obfuscated binary. Using an anti-decompiler:
    • Auditors can only look at the disassembly and cannot compare a pre-compiled version from the software vendor with a self-compiled version from source code.
    • There is no source code that is well-written, well-commented, and easily readable by design.
  9. Since there is no source code, one cannot self-build one's own binary.
    • small: for non-reproducible builds (or reproducible builds with bugs)
    • none: for reproducible builds
  10. Decompilation is often expressly forbidden by license agreements of proprietary software.
  11. Skype used the DMCA (Digital Millennium Copyright Act) to shut down reverse engineering of
  12. Decompilation is always legal and permitted in the license agreements of Freedom Software.
  13. 14.0 14.1 This is very difficult because most outgoing connections are encrypted by default. At some point the content must be available to the computer in an unencrypted (plain text) format, but accessing that is not trivial. When running a suspected malicious application, local traffic analyzers cannot be trusted. The reason is the malicious application might have compromised the host operating system and be hiding that information from the traffic analyzer, or be using a backdoor. One possible option might be running the application inside a virtual machine, but many malicious applications actively attempt to detect this configuration. If a virtual machine is identified, they avoid performing malicious activities to avoid being detected. Ultimately this might be possible, but it is still very difficult.
  14. It is necessary to decompile the binary and read "gibberish", or try to catch malicious traffic originating from the software under review. As an example, consider how few people would have decompiled Microsoft Office and kept doing that for every upgrade.
  15. It is possible to:
    1. Audit the source code and confirm it is free of backdoors.
    2. Compare the precompiled binary with a self-built binary and audit the difference. Ideally, and in future, there will be no difference (thanks to the Reproducible Builds project) or only a small difference (due to non-determinism introduced during compilation, such as timestamps).
  16. An example of a "direct" backdoor is a hardcoded username and password or login key only known by the software vendor. In this circumstance there is no plausible deniability for the software vendor.
  17. List of “direct” backdoors in
  18. One interesting “direct” backdoor was this bitcoin copay wallet backdoor:
  19. Requires strong disassembly auditing skills.
  20. If for example hardcoded login credentials were in the published source code, that would be easy to spot. If the published source code is different from the actual source code used by the developer to compile the binary, that difference would stand out when comparing pre-compiled binaries from the software vendor with self-compiled binaries by an auditor.
  21. A "bugdoor" is a vulnerability that can be abused to gain unauthorized access. It also provides plausible deniability for the software vendor. See also: Obfuscated C Code
  22. Such issues are hard to spot in the source code, but even harder to spot in the disassembly.
  23. This is forbidden in the license agreement. Due to lack of source code, no serious development is possible.
  24. Since source code is already available under a license that permits software forks and redistribution.
  25. This entry is to differentiate from the concept immediately above. Pre-compiled proprietary software is often modified by third parties for the purposes of privacy, game modifications, and exploitation.
  26. For example, Intel ME could not be disabled in Intel CPUs yet. At the time of writing, a Freedom Software re-implementation of Intel microcode is unavailable.
  27. Some may publish debug symbols.
  28. It is possible to review the disassembly, but that effort is duplicated for subsequent releases. The disassembly is not optimized to change as little as possible or to be easily understood by humans. If the compiled version added new optimizations or compilation flags changed, that creates a much bigger diff of the disassembly.
  29. After the initial audit of a source-available binary, it is possible to follow changes in the source code. To audit any newer releases, an auditor can compare the source code of the initially audited version with the new version. Unless there was a huge code refactoring or complete rewrite, the audit effort for subsequent versions is lower.
  30. The consensus is that the low-level assembler programming language is more difficult than higher-level programming languages. Example web search terms: assembler easy, assembler easier, assembler difficult.
  31. Source code written in higher-level programming languages such as C and C++ is compiled to object code using a compiler. Source code written in the lower-level assembler language is converted to object code using an assembler. Reverse engineering a reasonably complex program written in C or C++ where the source code is unavailable is very difficult; that can be deduced from the high price charged for such work. It is possible to decompile (meaning re-convert) the object code back to C with a decompiler such as Boomerang. To put a price tag on it, consider this quote from the Boomerang project ("Help! I've lost my source code"):

    How much will it cost? You should expect to pay a significant amount of money for source recovery. The process is a long and intensive one. Depending on individual circumstances, the quality, quantity and size of artifacts, you can expect to pay upwards of US$15,000 per man-month.

  32. The following resources try to solve the question of how to disassemble a binary (byte code) into assembly source code and re-assemble (convert) to binary.

    1. Take a hello world assembler source code.

    2. Assemble.

    nasm -felf64 hello.asm

    3. Link.

    ld hello.o -o hello

    4. objdump (optional).

    objdump -d hello

    5. Exercise for the reader: disassemble hello and re-assemble.

  33. The GNU hello program's source file at the time of writing contains 170 lines. The output of objdump -d /usr/bin/hello on Debian buster has 2757 lines.

    Install hello. To accomplish that, the following steps A. to D. need to be done.

    A. Update the package lists.

    sudo apt update

    B. Upgrade the system.

    sudo apt full-upgrade

    C. Install the hello package.

    Using apt command line parameter --no-install-recommends is in most cases optional.

    sudo apt install --no-install-recommends hello

    D. Done.

    The procedure of installing hello is complete.

    objdump -d /usr/bin/hello

  • For example, consider how difficult it was to reverse engineer Skype: Skype Reverse Engineering : The (long) journey ;)
    • Consider all the Debian package maintainer scripts. Clearly these are easier to review as is, since most of them are written in sh or bash. Review would be difficult if these were converted to a program written in C, and were closed source and precompiled.
    • Similarly, it is far preferable for OnionShare to stay Open Source and written in python, rather than the project being turned into a precompiled binary.
  • It is obvious the cost of a security audit involving reverse engineering will be far greater than for source-available code.
  • Quote from the research paper Android Mobile OS Snooping By Samsung, Xiaomi, Huawei and Realme:

    Reverse Engineering A fairly substantial amount of non-trivial reverse engineering is generally required in order to decrypt messages and to at least partially decode the binary plaintext. 1) Handset Rooting: The first step is to gain a shell on the handset with elevated privileges, i.e. in the case of Android to root the handset. This allows us then to (i) obtain copies of the system apps and their data, (ii) use a debugger to instrument and modify running apps (e.g. to extract encryption keys from memory and bypass security checks), and (iii) install a trusted SSL root certificate to allow HTTPS decryption, as we explain below. Rooting typically requires unlocking the bootloader to facilitate access to the so-called fastboot mode, disabling boot image verification and patching the system image. Unlocking the bootloader is often the hardest of these steps, since many handset manufacturers discourage bootloader unlocking. Some, such as Oppo, go so far as to entirely remove fastboot mode (the relevant code is not compiled into the bootloader). The importance of this is that it effectively places a constraint on the handset manufacturers/ mobile OSes that we can analyse. Xiaomi and Realme provide special tools to unlock the bootloader, with Xiaomi requiring registering user details and waiting a week before unlocking. Huawei require a handset-specific unlock code, but no longer supply such codes. To unlock the bootloader on the Huawei handset studied here, we needed to open the case and short the test point pads on the circuit board, in order to boot the device into the Huawei equivalent of Qualcomm’s Emergency Download (EDL) mode. In EDL mode, the bootloader itself can be patched to reset the unlock code to a known value (we used a commercial service for this), and thereby enable unlocking of the bootloader.

    Decompiling and Instrumentation On a rooted handset, the Android application packages (APKs) of the apps on the /system disk partition can be extracted, unzipped and decompiled. While the bytecode of Android Java apps can be readily decompiled, the code is almost always deliberately obfuscated in order to deter reverse engineering. As a result, reverse engineering the encryption and binary encoding in an app can feel a little like exploring a darkened maze. Perhaps unsurprisingly, this is frequently a time-consuming process, even for experienced researchers/practitioners. It is often very helpful to connect to a running system app using a debugger, so as to view variable values, extract encryption keys from memory, etc.

    The research paper describes in far more detail the highly complicated technical challenges of reverse engineering.

  • Debian also participates in security standardization efforts and related overarching projects.
  • This is a good starting point to understand how Kicksecure works.
  • Meaning doubts should surface if a new canary was not issued for longer than 4 weeks.
  • If issues arise with the server, this ensures the canary is always available online.
  • This feature has been available since Kicksecure 0.4.5.
  • Builds can be initiated by Patrick but the template build server and template repository are hosted by the Qubes project.
  • 48.0 48.1 rebuilders as defined here:
  • Hidden source code is defined as code which is added by an adversary. They may have: compromised a build machine, conducted compiling prior to the binary build process, or be responsible for building the actual binary. The secret source code will remain unpublished and it will appear (or be claimed) that the software was built from the published source code. Reliably detecting such hidden code - added on purpose or due to build machine compromise - requires comparison with deterministic builds, which are discussed above. Other methods like watching network traffic are less reliable, since a backdoor can only be spotted when it is used. Backdoors are even less likely to be found through reverse engineering, because very few people are using a decompiler.
  • Rhetoric exercises since doing these things is unfortunately so difficult, that most readers will not attempt it.
  • The locally-upgrade-kicksecure-debian-packages script should probably be moved to a different package and installed by default. Only approximately 1 user in approximately 10 years expressed interest in using it. Usage would be difficult. The high-level overview is:
    1. Get Kicksecure source code.
    2. Checkout the desired Kicksecure version source code git tag.
    3. Optionally, recommended: Verify the digital signature of the Kicksecure source code git tag.
    4. Optionally, recommended: Review Kicksecure source code.
    5. Create the Kicksecure packages from Kicksecure source code and create a local Kicksecure APT repository using the build step. (Ideally, these build steps would be packaged and installed by default. Contributions welcome.)
    6. Upgrade Kicksecure from local Kicksecure APT repository using APT.
    How you can help, as far as possible:
    1. Try the script.
    2. Contribute to the script.
    3. Package Kicksecure build script.
    4. Contribute documentation.
    5. Keep maintaining it.
  • Using quotes since this is not well defined.
  • Even then, capable adversaries have hacked their servers in the recent past; see
  • Kicksecure developer, named proper in the, renamed himself to, published his OpenPGP key on 05/29/ (wiki Revealed his identity on 01/18/ Patrick Schleizer posted his OpenPGP key transition message on 01/18/14, signed by both his old and new keys.
  • When Kicksecure APT repository is disabled, there is no updater - as was the case in Kicksecure 0.5.6 and below.
  • At the moment, Kicksecure developer Patrick Schleizer is the only one holding the Kicksecure APT repository OpenPGP signing key.
  • To disable this setting, see: broken link: [{{project_name_short}}/qubes-template-{{project_name_short}} qubes-template-Kicksecure]: broken link: [{{project_name_short}}/qubes-template-{{project_name_short}}/blob/master/builder.conf builder.conf], and set DERIVATIVE_APT_REPOSITORY_OPTS = off
  • After running sudo apt update && sudo apt full-upgrade manually or via a GUI updater.
  • Kicksecure developers place little trust in the CA model. Even if the numerous implementation problems were solved, such as problematic revocation and the ability for every CA to issue certificates for anything (including "*"), third party trust cannot be established. Until an alternative arrives and is widely adopted, everybody has to rely upon SSL/TLS to some extent.
  • TrueCrypt has been discontinued.
  • Interested readers can investigate its complexity by searching with the phrase "trusting trust".
  • The time period is likely to be shorter for smaller projects, perhaps less than 12 months.
  • For example, by casting unjustified aspersions.
  • In the case of Kicksecure, binaries are not distributed nor created. Only unmodified upstream binaries are distributed, along with shell scripts. This claim is much easier to verify than if Kicksecure were distributing binaries from project source code.
  • To learn more about this organizational structure, see: Linux User Experience versus Commercial Operating Systems.
License[edit]

    Kicksecure Trust wiki page Copyright (C) Amnesia <amnesia at boum dot org>
    Kicksecure Trust wiki page Copyright (C) 2012 - 2024 ENCRYPTED SUPPORT LP <

    This program comes with ABSOLUTELY NO WARRANTY; for details see the wiki source code.
    This is free software, and you are welcome to redistribute it under certain conditions; see the wiki source code for details.
