Secure Contexts

From Pin Eight

This is a mini-rant, a short essay refuting a common misconception among users of an Internet forum. If you think this essay is FUD, feel free to explain why on the essay's talk page.

In a nutshell: Web browsers require HTTPS for some JavaScript features, making their use on private home web servers impractical.

As of 2016, major web browser publishers such as Mozilla[1] and Google have announced plans to make sensitive JavaScript APIs available only within sites that use HTTPS, not cleartext HTTP, as a way of encouraging more sites to implement HTTPS.

Secure Contexts

The W3C's Secure Contexts spec defines a "secure context" roughly as a document all of whose markup and scripts come from a "potentially trustworthy origin". In turn, a "potentially trustworthy origin" is any of these:

  • An origin using an authenticated scheme (https: or wss:)
  • An origin on the local computer using IP addresses in 127/8 (IPv4) or ::1/128 (IPv6)
  • An origin on the local computer using a scheme that the browser considers authenticated (file: and possibly packaged browser extensions)
  • An origin that the user has "configured as a trustworthy origin", through a user interface that has not been specified

In particular, this does not include local area networks using private address space as described in RFC 1918 (10/8, 172.16/12, or 192.168/16). This means that by default, a browser will treat a server on your home LAN no differently from a random PC connected to a public Wi-Fi hotspot in a restaurant. So until user agents provide a way to mark an origin as "configured as a trustworthy origin", the only way to make a computer other than the local computer trusted is to implement HTTPS.
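The loopback-versus-LAN distinction can be sketched in a few lines of Python using the standard ipaddress module. This is a simplified illustration, not the spec's full algorithm (which also covers localhost names, nested browsing contexts, and origins the user has explicitly configured as trustworthy):

```python
import ipaddress
from urllib.parse import urlsplit

def is_potentially_trustworthy(origin: str) -> bool:
    """Rough sketch of the Secure Contexts check; not the full spec."""
    parts = urlsplit(origin)
    if parts.scheme in ("https", "wss", "file"):
        return True  # schemes the browser treats as authenticated
    try:
        addr = ipaddress.ip_address(parts.hostname)
    except ValueError:
        return False  # a plain http: hostname is not trustworthy
    # Loopback (127/8 or ::1) qualifies; RFC 1918 space does NOT.
    return addr.is_loopback
```

Under this check, http://127.0.0.1/ qualifies but http://192.168.1.10/ does not, even though both may be "local" from the user's point of view.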


HTTPS is built on Transport Layer Security (TLS), which associates a public key with a server using an X.509 certificate whose subject alternative name (SAN) matches the server's hostname. The certificate is issued and digitally signed by a certificate authority (CA) that the device trusts. In order to gain the trust of device makers and web browser publishers, a CA must show evidence that it follows the practices described in the Baseline Requirements (BR) published by the CA/Browser Forum. The BR require the hostname in a certificate to be a fully-qualified domain name (FQDN), not a private IP address or a made-up top-level domain (TLD), such as .local or .internal.
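The hostname check a TLS client performs against a SAN entry can be illustrated with a simplified matcher. This is a sketch of the one-label wildcard rule in the style of RFC 6125, not a complete implementation (the real rules restrict where a wildcard may appear):

```python
def san_matches(san: str, hostname: str) -> bool:
    """Simplified comparison of one SAN dNSName entry against a hostname."""
    san_labels = san.lower().split(".")
    host_labels = hostname.lower().split(".")
    if len(san_labels) != len(host_labels):
        return False  # a wildcard covers exactly one label, never more
    for s, h in zip(san_labels, host_labels):
        if s != "*" and s != h:
            return False
    return True
```

For example, a certificate for *.dyn.example matches abc.dyn.example but not a.b.dyn.example, which matters for the wildcard-based schemes discussed later.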

The requirement of a FQDN poses little trouble for developers of applications that can be accessed through the Internet, even hobbyists: the owner of a domain can obtain certificates through the Let's Encrypt CA for the machines it operates in that domain. An internal web application accessed by corporate-owned devices on a corporate network can use this method as well: put the server on a subdomain of a corporate-owned domain and use split-horizon DNS to show some records to the general public (including the CA) and other records only to the internal network. This has the side effect of disclosing internal server hostnames through DNS and Certificate Transparency.

For developers testing applications internally before public deployment, it's enough to operate a private CA using free software such as OpenSSL. The developer can then give the test server a private name, use split-horizon DNS to make this name resolve internally, issue a certificate for that name to the test server, and deploy the private CA's root certificate on all client devices used for internal testing. A business can also use this method to avoid publishing internal hostnames at the expense of making access more tedious for employees who bring their own device (BYOD). However, this method does not work for Android 7 and later, as the Google Chrome browser no longer trusts user-installed CAs.[2]
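The private-CA workflow can be sketched by driving the openssl command line from Python. Everything here (file names, subject strings, the test.internal hostname) is a made-up illustration of the general recipe, not a hardened setup:

```python
import os
import subprocess

def sh(*args: str) -> None:
    """Run a command, raising if it fails."""
    subprocess.run(args, check=True, capture_output=True)

def make_private_ca_and_server_cert(workdir: str, server_name: str) -> str:
    """Create a throwaway root CA and a server cert whose SAN is server_name."""
    ca_key = os.path.join(workdir, "ca.key")
    ca_crt = os.path.join(workdir, "ca.crt")
    srv_key = os.path.join(workdir, "server.key")
    srv_csr = os.path.join(workdir, "server.csr")
    srv_crt = os.path.join(workdir, "server.crt")
    ext = os.path.join(workdir, "san.cnf")
    # 1. Self-signed root certificate: this is what gets deployed to clients.
    sh("openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
       "-keyout", ca_key, "-out", ca_crt, "-days", "365",
       "-subj", "/CN=Test Private CA")
    # 2. Server key and certificate signing request.
    sh("openssl", "req", "-newkey", "rsa:2048", "-nodes",
       "-keyout", srv_key, "-out", srv_csr, "-subj", "/CN=" + server_name)
    # 3. Sign the CSR, adding the SAN that browsers actually check.
    with open(ext, "w") as f:
        f.write("subjectAltName=DNS:" + server_name + "\n")
    sh("openssl", "x509", "-req", "-in", srv_csr, "-CA", ca_crt,
       "-CAkey", ca_key, "-CAcreateserial", "-days", "365",
       "-extfile", ext, "-out", srv_crt)
    return srv_crt
```

The ca.crt produced in step 1 is the root that must be installed on every client device; server.crt and server.key go on the test server.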

But some web applications are intended for use by whoever happens to be authorized to connect to a particular private network at any given time. This may include friends and family visiting a home or guests at a museum.[3] Requiring users to trust a private CA isn't practical for production use, especially for servers intended for the non-technical users associated with these use cases. Mozilla acknowledges that the unavailability of a certificate from a public CA when "a device doesn't have a globally unique name" is a problem but has put off solving it.[1]

To work around the problem of no PKI for the Internet of Things, it has been suggested that the certificate be transferred to a smartphone using NFC or QR codes,[4] but this hasn't yet been deployed.

One might try obtaining a globally unique name from a dynamic DNS provider, but Let's Encrypt issues only 20 certificates per registrable domain per week,[5] which is far too low for a dynamic DNS provider with substantial traffic. For example, only 20 certificates per week may be issued for subdomains of dyn.example. But if dyn.example is on Mozilla's Public Suffix List (PSL), this makes abc.dyn.example a registrable domain because it has exactly one more label than a public suffix, this label being abc.[6] Being a registrable domain has two main consequences: cookies set on abc.dyn.example won't be visible from sites hosted on other subdomains of dyn.example, and Let's Encrypt will allow up to 20 certificates per week within abc.dyn.example itself. But since Let's Encrypt entered general availability, there has been a huge backlog for dynamic DNS providers to get onto the PSL.[7] Thus each operator of a private server open to LAN visitors must purchase his own domain name, and if an application ends up installed on a million home servers, that makes a million domains that have to be purchased and renewed annually.
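The registrable-domain rule described above can be sketched as follows. The two-entry suffix list is a made-up stand-in for the real PSL, and the sketch ignores the PSL's wildcard and exception rules:

```python
from typing import Optional

def registrable_domain(hostname: str, public_suffixes: set) -> Optional[str]:
    """Return the longest matching public suffix plus one label, or None."""
    labels = hostname.lower().split(".")
    # Scan from the longest candidate suffix to the shortest; the first
    # match is the longest public suffix, and one more label on top of it
    # is the registrable domain.
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in public_suffixes and i > 0:
            return ".".join(labels[i - 1:])
    return None
```

With dyn.example on the (hypothetical) suffix list, both abc.dyn.example and www.abc.dyn.example resolve to the registrable domain abc.dyn.example, which is what makes each customer's subdomain its own cookie and rate-limit boundary.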

In addition, Let's Encrypt doesn't issue wildcard certificates. This causes problems for users of software that follows a security model like that of the Sandstorm productivity suite, which uses a wildcard DNS record and generates a new random subdomain for each new user session as a means of thwarting cross-site attacks. Though users of Sandstorm itself can use the CA, other self-hosted free software projects may not have that luxury.
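A per-session random subdomain in the Sandstorm style can be minted in one line; the base domain here is a placeholder:

```python
import secrets

def session_host(base_domain: str) -> str:
    """Mint an unguessable subdomain for one user session, so that a script
    running on one session's origin cannot reach another session's origin."""
    return secrets.token_hex(10) + "." + base_domain
```

Because every session gets a distinct origin, the browser's same-origin policy isolates sessions from each other, and serving them all over HTTPS requires a certificate valid for *.base_domain.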

Means of validation

A certificate authority can determine whether a particular customer qualifies for a domain-validated certificate in one of three ways. Not all CAs support all methods.

  • Sending a challenge value to a well-known administrative address at that domain (such as admin@, hostmaster@, or webmaster@) and having the user paste the value into a web form at the CA.
  • Having the customer place a particular challenge value in the web root of a publicly visible HTTP server on that hostname, and reading it back from that server.
  • Having the customer place a particular challenge value in a TXT record for the server's hostname in its DNS zone, and reading it back from DNS several minutes later, once the old value has expired from intermediate DNS caches.

As of the third quarter of 2016, Let's Encrypt supports the HTTP and DNS methods, not the mail method. But because an internal web server is not publicly visible, only the DNS method will work. This means the operator of a LAN will have to choose a dynamic DNS provider that both is on the PSL and allows prompt updates to a hostname's TXT record.
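In the ACME protocol that Let's Encrypt uses (as later standardized in RFC 8555), the DNS method works by publishing a value derived from the challenge token and the account key in a TXT record at _acme-challenge.<hostname>. A sketch of the derivation, with dummy inputs:

```python
import base64
import hashlib

def dns01_txt_value(token: str, account_thumbprint: str) -> str:
    """Compute the _acme-challenge TXT record value:
    base64url(SHA-256(token "." key-thumbprint)), without '=' padding."""
    key_authorization = token + "." + account_thumbprint
    digest = hashlib.sha256(key_authorization.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
```

Since the CA issues a fresh token for each validation, the TXT record changes on every renewal, which is why the dynamic DNS provider must allow prompt TXT updates rather than occasional manual edits.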


  1. "Deprecating Non-Secure HTTP: Frequently Asked Questions". Mozilla, 2015-05. Accessed 2016-08-12.
  2. "Add & remove certificates". Accessed 2017-04-18.
  3. greggman's question
  4. Minutes of a W3C meeting. W3C, 2015-10.
  5. "Rate Limits". Let's Encrypt. Accessed 2016-08-12.
  6. "View the Public Suffix List". Mozilla. Accessed 2016-09-16.
  7. Public Suffix List issue #236. GitHub, 2016-06-15. Accessed 2016-09-16.

External links