Hacker News
WordPress.com turns on HTTPS encryption for all websites (techcrunch.com)
447 points by jblz on April 8, 2016 | 127 comments



Kudos to the Let's Encrypt and Wordpress teams. This is what the future looks like. Every webpage needs to be encrypted, and http (as opposed to https) needs to go the way of telnet (as compared to ssh).

What's particularly great is that there is no configuration of any kind for Wordpress authors or their readers. Like they have done, we need to always default to secure.


There should still exist one well-known unencrypted page. Sometimes, I need to log in to hotel or airport wifi and therefore need to accept a MitM attack. I would prefer this not to be the case.


Apple products use this one for hotspot detection:

http://captive.apple.com/hotspot-detect.html

I just have it bookmarked now. It's the one bookmark I use.


Hotel and airport wifi operates that way because they can do so without inconveniencing too many customers. If it becomes so cumbersome that they need to send in a technician every time a customer isn't a sysadmin who can figure out how to get the wifi working, then market forces will make sure they use something else.



I use http://xkcd.com, and that works too.


What would one well-known unencrypted page do to mitigate that?


If you go to an http page and haven't logged in, it will redirect you to the login page. If you try that over https, it just fails.


Oh, now I feel dumb, I thought this was a security comment and not a practical one. Thanks for clarifying.


The solution for all wifi clients is to do what iOS and Mac OS X have been doing for years, which is to validate internet connectivity when connecting via wifi. If there's a captive portal standing in the way, it pops up a simple WebKit view automatically. You don't even have to open a web browser.

It would be nice to have a more formal standard (e.g. supplying an authentication URL along with the DHCP response) but to be honest, this emerging de-facto standard is perfectly serviceable.
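For the curious, the probe technique is simple enough to sketch in a few lines of Python. The probe URL and the "Success" marker below mirror Apple's endpoint; any stable plain-HTTP page with a known body works the same way:

```python
import urllib.request

# Probe endpoint and expected body, modeled on Apple's hotspot detection.
PROBE_URL = "http://captive.apple.com/hotspot-detect.html"
EXPECTED_MARKER = "Success"

def is_captive(body: str) -> bool:
    # A captive portal rewrites the probe response, so anything other
    # than the expected marker suggests the request was intercepted.
    return EXPECTED_MARKER not in body

def behind_captive_portal(url: str = PROBE_URL) -> bool:
    # Fetch the probe page over plain HTTP and inspect the body.
    with urllib.request.urlopen(url, timeout=5) as resp:
        return is_captive(resp.read().decode("utf-8", "replace"))
```

A client that sees `is_captive` return true can then pop up a login view pointed at the portal, which is roughly what the iOS/OS X behaviour described above amounts to.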


> you don't even have to open a web browser.

It would be nice if that worked consistently, but having just spent 2 weeks in airports and ho[s]tels in the UK & Ireland, it doesn't yet.


In the rare instances where it has failed me, I've diagnosed the problem as the router allowing (or faking, I'm not sure) a success response from http://captive.apple.com/hotspot-detect.html

If anyone has coded a captive portal to behave differently when it sees captive.apple.com they should be ashamed. All they're doing is making life difficult for everyone unnecessarily.


Google should do it on 8.8.8.8 just to make it more famous, but in general think about the meta websites defined in the W3C/IETF standards, such as: http://example.com


My go-to is http://google.org.

Along with the 100 million in grants they write yearly, they provide a valuable social service for those more fortunate.


I use http://purple.com for this. It's easy to remember and always up.



The one thing a lot of people don't consider is that placing an entire site on https/ssl/tls ruins the traditional approach to caching. It used to be very simple to add a full-page caching layer in front of your unencrypted application. TLS completely destroys this. You can no longer simply plop a proxy cache in front of your application.

There are numerous approaches to make tls-for-all work, but they take more engineering effort. And some introduce security problems, like people who use CloudFlare which decrypts your traffic and then (optionally!!!) re-encrypts it between their servers and yours. Even taking the best encryption scheme with a CDN like CloudFlare still means that the CDN has full access to unencrypted payloads. That is an insane amount of trust to give away to a 3rd party company.

There is also an impact for large organizations that run everything through a local cache (like Squid). You can't share a cache between multiple clients when TLS is used. This push to encrypt everything means we're at the end of the era where such caches are useful. Does BuzzFeed really need to be encrypted?

One of the largest gaps in encryption in general is still performance. You hear all the time how TLS adds almost zero overhead anymore. This is simply false for most of us. Not everyone is Google who has their own custom low-level encryption implementation. Those of us using off-the-shelf software continue to pay the price of CPU overhead.


> It used to be very simple to add a full-page caching layer in front of your unencrypted application. TLS completely destroys this. You can no longer simply plop a proxy cache in front of your application.

I wouldn't say it's no longer possible. You have to do your caching before the TLS termination, and you'll have to pay the encryption overhead for every resource.

> This push to encrypt everything means we're at the end of the era where such caches are useful.

You can still do this kind of caching if you install your own CA certificate on all your devices. Almost all enterprise deployments will do this anyway so that their security appliances can scan TLS traffic. Squid supports this.

> Does BuzzFeed really need to be encrypted?

Yep! Otherwise, that free WiFi at your favourite cafe, or your ISP, might inject malicious code or ads.

> You hear all the time how TLS adds almost zero overhead anymore. This is simply false for most of us. Not everyone is Google who has their own custom low-level encryption implementation. Those of us using off-the-shelf software continue to pay the price of CPU overhead.

Obligatory reference to https://istlsfastyet.com/

There are two aspects here: Handshakes and encryption performance. Handshakes are expensive, you can probably do about 500-1k handshakes per core per second with a 2048 bit RSA key. This is usually no big deal in production because you can use session resumption, meaning once you've completed a handshake with a user, you can skip that part until the cache expires (people tend to use one day as their lifetime).

The second aspect, raw encryption performance with a symmetric cipher, is something you don't get to complain about unless you're Netflix. Any server CPU with AES-NI can push at least 500 MB/s per core, probably more. You'll run into some other bottleneck well before TLS performance becomes an issue. Unless you're Netflix, of course[1].

[1]: https://people.freebsd.org/~rrs/asiabsd_2015_tls.pdf
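The session-resumption arithmetic above is worth spelling out. A rough sketch (the throughput figure and hit rate are illustrative, not measurements):

```python
def effective_connections_per_sec(full_handshakes_per_sec: float,
                                  resumption_rate: float) -> float:
    # Only cache misses need a full public-key handshake; resumed
    # sessions skip the expensive RSA operation entirely.
    miss_rate = 1.0 - resumption_rate
    return full_handshakes_per_sec / miss_rate

# With ~1,000 full handshakes per core per second and a 95% resumption
# hit rate, one core can accept on the order of 20,000 connections/sec.
print(effective_connections_per_sec(1000, 0.95))
```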


    like Squid
I think the answer to that is that the days of local caches being useful are long gone. The top traffic count on many networks ends up being things like Youtube, and very few people are going to see the value in attempting to locally cache that sort of data.

Even if served over http, tonnes of traffic is uncachable anyway due to the increased use of dynamic resources. Even this basic-looking HN page will end up uncached by Squid because I'm logged in, regardless of https being in the picture.

As for performance, I've played around with benchmarks using Goad on a micro tier EC2 instance and never found an appreciable difference between http and https.


> That is an insane amount of trust to give away to a 3rd party company.

Why is it insane to give that trust to a carefully selected business partner, but not the random 3rd parties with access to plain HTTP content? Also, your CDN has access to unencrypted payloads anyway, whether it's encrypted elsewhere or not.


I still see login pages over http sometimes. I will be happy when I rarely see any http pages. I wonder what would happen if Chrome made all http pages red to indicate insecure...



There's a Chrome flag you can enable for that experience today, as it's only a matter of time before that is default. I use this and so this is how the TechCrunch article shows for me: http://i.imgur.com/c8Cz7S4.png


And, that imgur link is also a broken lock? ;)



That blog post is from January and Firefox Dev Edition 46 has since advanced to Beta. The insecure login warning will ship to the Firefox release channel on April 26. :)


Just the fact that http was the default for all websites hosted on WordPress.com is really weird to me. All those websites had their passwords sent in plaintext.


Note that this is about custom domains hosted on Wordpress.com's infrastructure. Blogs that were hosted as subdomains of wordpress.com have been using SSL since 2014 according to the original announcement. Let's Encrypt allowed them to enable it for custom domains without delivering a truckload of money to a CA.

The original announcement is a bit more precise on this.


Nope — login pages have been https for many years.


That's because Google rewards page speed, and SSL used to make you lose that game. So if you wanted your WordPress site to rank well, it used to be better to leave it insecure.

Additionally, if you published on both HTTP and HTTPS, you were flagged for duplicate content. Fortunately they've reordered the incentives now.


I'm speaking out of ignorance here, but why does every webpage need to be encrypted? What about simple static sites that are only serving information?


1. Comcast (the US ISP) inserts advertising into http webpages, to the point that it breaks pages, delivers malicious content, or replaces a 404 with a "Do you want to download [that IE toolbar/Chrome extension which inserts ads in every page]". ISPs are extremely badly behaved. This is active spamming.

2. Hey Joe, I've seen you've consulted wikipedia pages about unionizing. What about we give you plenty of free time to unionize yourself?

3. Have you ever downloaded the pic of a sexualized person? Do you have his/her ID card on file? If not, does she "look like" she's under 18? I don't mean is she actually, but what would a jury of average blokes (who don't care about the truth and want the trial over soon) vote for if they had to vote? Boom, you're jailed as a child pornographer, even though the model was professional and over 25. That's the beauty of blackmail. This could happen as soon as you have professional secrets, or you are annoying to a competitor or a colleague, which is to say as soon as you do anything meaningful in this world.


Are there any other methods to protect information from being hijacked without going for encryption?


Why not go for encryption? The reasons usually provided by detractors aren't satisfying: "The police need access to consumer data because they need to investigate terrorism." This is false because the police don't even use their current investigation methods. For example:

France and Belgium recently suffered terrorist attacks. The police didn't have detectives on such terrorist cells in the first place, so full access to citizens' privacy wouldn't have helped when there was no investigation going on. You won't believe how simple the root cause is: any organization is dedicated to justifying its growth, so police will constantly request upgrades to their gear, headcount and scope, such as computing methods to solve petty crime, while policemen have no incentive to risk their lives investigating terrorism in the field. They, unfortunately, have no incentive to solve terrorism, because it will always be legitimate to say "Oops, this terrorist cell wasn't detected, give us more means and people so we do better next time". On the other hand, terrorists know very well how to protect their data, and it gives policemen huge pride and power to have access to citizens' private data; just check how many US policemen were convicted of stalking a girlfriend last year. So police access to private data is not going to solve terrorism, because there aren't enough forces dedicated to investigation anyway.

Another example: France signed the Schengen agreement, which says it's better to join our police forces to protect the borders of Europe instead of each country protecting its own. Fast forward 30 years, and we've let about 2 million illegal immigrants through, including ALL our terrorists who were born in France, went to Syria for military training and came back. Talk about porosity! People with military training from an enemy force are supposed to be arrested under the Geneva Conventions; they were well known and listed in our files, and some of them were arrested by Turkey, handed over to Europe in handcuffs, person to person, and freed later by European police for lack of means. So you see, there is not even a need for police access to all SMS and phone calls. We weren't doing anything against terrorism in the first place. We've been lied to by politicians.

Also note that the US has become a police state and rolled back its privacy protections since 2001. Did it solve terrorism? 2013: the Boston attacks.

Terrorism isn't something police can reduce in the first place. Besides, going to foreign countries to destroy houses and spread poverty will only accelerate it, especially when using drones and non-judicial decisions. I'm flummoxed that France goes to Syria, because our terrorists were born in France. Only 3 things eventually reduce the pool of candidates for terrorism:

- Investigations where police forces risk their lives. Not going to happen.

- For western-born terrorists: Good education, career prospects, loving wife and children. No racism. Not going to happen.

- For foreign countries: No Predator, no drones, no American/European attacks, no spies, no Israeli expansion; let's instead parachute in schools and books and food and toys for toddlers and education for women. Joint development. Remember the Marshall plan? We need to learn to forgive again. Not going to happen soon.

So don't expect terrorism to be over any time soon. It's one of the risks of life, along with drugs, second-hand smoke and strokes.

On the other hand, the risk of not encrypting is to give people in power more information, and centralizing power even more, to the dismay of social mobility. For example, the #PanamaPapers couldn't have been investigated if police had access to journalists' communications, because they would have gently asked them to shut up.


Well, nothing as drastic as what you mention. I was mainly wondering if there is any protection against website hijacking without the complexity of encryption, which, as some have stated, makes caching harder and therefore lowers performance. I was hoping there might be a solution that doesn't require encryption.


So that you have assurance that you're connecting to the site you intended to, and not an imposter, and so you know that the information has not been altered in transit.


Not only that the high-level information on the page has not been altered, but also just simply that no code has been injected that could possibly lead to a drive-by download or other malware shenanigans.


Don't forget about loads of websites on cheap shared hosting, which cannot use Let's Encrypt or other free SSL certificates. And Telnet, used for logging in to a server, is much more dangerous than plain HTTP, used just for viewing HTML documents.


Some cheap hosting providers started enabling SSL by default for all their users who use a subdomain (e.g. user.cheaphosting.tld), and mine just announced that they provide free SSL certificates for all custom domains by default thanks to Let's Encrypt.


Good to know! Can you share which one you are speaking of?


I was thinking of Lima City, but they are a German host (the website is in German; the data centers are in Frankfurt, Germany). I'm sure you will be able to find many others. Lima City is free (as in beer), has unlimited storage[1] and unlimited bandwidth, supports PHP and MySQL, and allows you to add custom domains (you can buy them directly through them for very reasonable prices).

Although they are free, they have very few outages, but they are limited to PHP (a big no-go for all my projects these days).

Disclaimer: I'm in no way affiliated with them except hosting my homepage there. I've used them for quite a few years now and they allowed me to show off my PHP projects to friends when I was 14 or so.

[1]: They have a fair use policy, and downloads (everything except HTML, PHP, CSS, JS, and image files) are blocked and need to be hosted on their download server, which has a rate limit. They will also monitor your account for abuse (so if you consume a lot of bandwidth because you hid your download in a JPG file, they will ban you).


Many cheap hosts are adding support for Let's Encrypt. This story is a prime example.


Of course techcrunch.com itself still does not support HTTPS.


They're an enterprise customer and not automatically enrolled into this for a few reasons. See https://news.ycombinator.com/item?id=11459229


> we need to always default to secure.

This.


Not to say this is a bad thing, but I'm sure Wordpress just broke a lot of links on their users' sites. For example, any images embedded from other servers not using HTTPS won't load anymore due to browser policies, essentially breaking those links. It also means that any embedded images/videos/etc. will only work if the remote server has HTTPS. Again, not a bad thing, but it's pretty painful to deal with when a lot of your users aren't experts on HTTP, and I'm sure it's a similar story at Wordpress.

I can flip the switch for default HTTPS on Neocities in a day. The hard part is figuring out how to not break users' sites in that process. Ideas welcome.


We've been working on this for quite a while and several parts of the solution deal with rewriting embedded URLs using HTTP. If you have any examples of breakage, let us know.


> If you have any examples of breakage, let us know.

I believe it's breaking podcast feeds being served with WordPress.com, because iTunes doesn't support Let's Encrypt certificates.

https://www.dominicrodger.com/2016/02/29/lets-encrypt-itunes...

This may not affect a lot of customers (since WordPress.com doesn't support PowerPress for feed generation), but I know some podcasters create feeds by hand or with other apps.

This issue will cause at least some podcasts to disappear from iTunes without warning unless you can coordinate with Apple to fix it.


> I believe it's breaking podcast feeds being served with WordPress.com, because iTunes doesn't support Let's Encrypt certificates.

Do you have an example? We have already implemented workarounds for iTunes. If they aren't working I would love to know the specifics so we can fix it.


> Do you have an example?

Just the confirmation from Apple's podcaster support team that iTunes doesn't support sites which use Let's Encrypt. (I don't use WordPress.com myself.)

I've just posted a request for examples in popular podcasting groups, and I'll let you know when/if I get responses.

> We have already implemented workarounds for iTunes.

Can you elaborate just a smidge? Is WordPress.com, for example, not encrypting content when it's requested by iTunes? (Thanks!)


> Can you elaborate just a smidge? Is WordPress.com, for example, not encrypting content when it's requested by iTunes? (Thanks!)

Yes, we have some targeted exceptions for incompatible clients.


How do you handle users embedding external http images in a page? Can't you somehow warn about this during editing?

Asking because that's the problem I see at my site currently (https://groni50.org). In this case I'll just upload the external images to our site, and I'll also brief our users. But I wonder if something couldn't be prevented/checked in the WYSIWYG editor.
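One way such an editor check could work is a simple scan for plain-http resource references before saving. A sketch (the function name and the src-only regex are simplifying assumptions; a real implementation would parse the HTML properly):

```python
import re

# Matches src attributes that point at plain-http resources, which
# would become mixed content once the page is served over HTTPS.
INSECURE_SRC = re.compile(r"""src\s*=\s*["']http://[^"']+["']""",
                          re.IGNORECASE)

def find_insecure_embeds(html: str) -> list:
    return INSECURE_SRC.findall(html)

html = ('<img src="http://legacy.example.com/pic.jpg">'
        '<img src="https://secure.example.com/pic.png">')
print(find_insecure_embeds(html))  # only the http:// embed is flagged
```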


Any plans to use HSTS and preloads header to default to SSL in browsers for Wordpress-hosted sites?


Yes, we're working on it.


I'm not sure if WordPress is actually doing this, but they might be using something like camo[1] to transparently rewrite any http:// URLs to an image proxy running on SSL.

This gets harder to implement correctly depending on what kind of content you allow on your sites (i.e. does your CMS only permit sanitized HTML, or are users allowed to do basically anything?), so it's not a perfect solution for everyone, but it might work here.

[1]: https://github.com/atmos/camo
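The core of camo's approach is an HMAC-signed proxy URL, so the proxy can't be abused as an open relay. A minimal sketch of that scheme (the key and proxy host are placeholder assumptions):

```python
import hashlib
import hmac

CAMO_KEY = b"shared-secret"                  # placeholder secret
CAMO_HOST = "https://img-proxy.example.com"  # placeholder proxy host

def camo_url(insecure_url: str) -> str:
    # Sign the original URL; the proxy recomputes the HMAC and refuses
    # to fetch anything whose signature doesn't check out.
    digest = hmac.new(CAMO_KEY, insecure_url.encode(), hashlib.sha1).hexdigest()
    return "%s/%s/%s" % (CAMO_HOST, digest, insecure_url.encode().hex())

print(camo_url("http://example.com/cat.jpg"))
```

An HTML rewriter would then replace every http:// image src with the signed proxy URL, keeping the page free of mixed content.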


Photon[0] does exactly this (and more!) and is free to use as part of Jetpack[1]

0. http://developer.wordpress.com/docs/photon/ 1. https://wordpress.org/plugins/jetpack/


And if Node isn't your thing, there is a workalike[1] written in Go.

Disclosure: I am the author of go-camo.

[1]: https://github.com/cactus/go-camo


Depending a bit on which browser we are talking about, passive mixed content (such as images) will typically be allowed through on default settings, while only active mixed content is blocked.


> but I'm sure Wordpress just broke a lot of links on their user's sites.

Growing pains. I think that will at least make people on the web more aware of the HTTPS "revolution".


I'm not that clear as to the severity of the issue, but are you saying "yes, it is severe, but there's nothing you can do about it"? Because that's the price we pay for a more secure browsing experience?

Just wondering about the metaphor of "growing pains". In humans it's something that happens; for some it's painful but has to happen, for others it isn't painful, but the process that causes the pain goes on regardless. Is this an accurate metaphor in this example?



Not relevant to the WordPress part, but can someone explain to me why websites like eBay don't run on HTTPS except during login? Doesn't that allow any sniffer to steal your authentication cookies?


Yes, yes it does. It's pretty annoying; AliExpress does something similar too. You'd think big e-commerce sites would have caught up with this.

As for their reasoning... maybe performance, but more likely laziness.


From their perspective, it's not their issue.

If a user gets their credentials hijacked and a hacker makes a bunch of unauthorized purchases with their saved credit card, who's the customer going to call? AliExpress, or their bank, to mark the purchases as fraudulent and refund the money?

To them, they're merely supplying the vehicle to do business. It's the payment processing companies, the banks and third-party vendors who handle the money, so it's their responsibility to notice the charges and shut the account down.

Like last week, I got a call from my bank asking if I was making purchases in Belgium, Norway and France. I was like, "Uhhhhhhhhhhh no, that's fraud." They blocked the purchases first and THEN called to confirm with me. It was pretty obvious based on my banking behavior that this was out of the norm, and it was immediately flagged. It wasn't the travel sites' fault they let it happen; it would've been my bank's problem if they let those purchases go through.

I'm glad they have an incredible fraud detection system. This is the second time they've flagged something on my account and shut these down before any damage could be done.


Is identity theft not their issue? That site includes a history of your buying/selling habits, your address, your phone number, your payment information... screw the money itself, there's a lot more damage a nefarious eavesdropper can do than make a purchase with your account in Belgium.


Identity theft is a non-thing, a lie made up by unscrupulous creditors to pretend it's not their fault for incorrectly authorizing a criminal and then charging you for it.

Think about it: This information you're revealing to eBay is basically the same for any other online merchant. If that's enough to "steal your identity", where does the problem really lie?


Yes and it doesn't even protect the password appreciably either.

User logs in with HTTPS, gets redirected to HTTP site and the MitM throws up the "Incorrect password try again" page. User types their password and transmits it over HTTP or JS steals it etc. etc.

eBay does it because they aren't sufficiently interested in protecting against MitMs.

The web isn't ready for HTTPS only yet but it will happen over time.


It's already pretty much happened, I can search google, browse wikipedia, read email, HN and reddit, even click the images on imgur all without leaving the SSL comfort zone. Even facebook seems to have taken this route. Most big sites now offer SSL-only.


No. Provided the cookies are set with the secure flag (https://www.owasp.org/index.php/SecureFlag), the browser won't send the cookie over http requests, only over https requests. And to answer the other comment below about JS stealing it: that is mitigated by setting the cookie as httpOnly (https://www.owasp.org/index.php/HTTPOnly), which doesn't mean http vs. https; it means the cookie is not accessible to DOM JavaScript.
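A minimal stdlib sketch of the two flags (the cookie name and value are placeholders):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["secure"] = True    # only ever sent over https
cookie["session"]["httponly"] = True  # invisible to DOM JavaScript

# Prints a header of the form:
# Set-Cookie: session=abc123; HttpOnly; Secure
header = cookie.output()
print(header)
```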


Meanwhile, the Chromium preload list just passed 10,000 domains. Things are moving forward.

https://twitter.com/lgarron/status/718242465782853633


Do you know how they're stored on my PC? Last time I checked they were all in a giant C source file, which sounds like a pretty bad idea to me since I can't imagine it'll scale well.


Awesome to see stuff like this. LetsEncrypt is really doing a great service to make the Internet a better place.


WordPress.com illustrates an interesting challenge in supporting SSL if you allow people to use subdomains on your service:

https://bestcrabrestaurantsinportland.wordpress.com/ works fine

https://www.bestcrabrestaurantsinportland.wordpress.com/ displays a certificate warning

Unfortunately I don't think there's a good solution for this. Humans are gonna www- things.


> Humans are gonna www- things.

The same humans who incorrectly add "www." when not told to are also unlikely to add "https://". So have the www. version redirect to the correct URL with HTTPS.

For that matter, since certificates with Let's Encrypt support arbitrarily many SubjectAltName (SAN) values, you can include the www variant in the certificate, so that your redirect can use HTTPS and HSTS.
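The redirect logic itself is only a few lines. A sketch with an illustrative hostname (this is not how any particular host implements it):

```python
from typing import Optional
from urllib.parse import urlsplit

CANONICAL_HOST = "example.wordpress.com"  # illustrative

def canonical_redirect(url: str) -> Optional[str]:
    # Return the 301 target, or None if the URL is already canonical
    # (https, no "www." prefix).
    parts = urlsplit(url)
    host = parts.hostname or ""
    if parts.scheme == "https" and host == CANONICAL_HOST:
        return None
    target_host = host[4:] if host.startswith("www.") else host
    return "https://" + target_host + (parts.path or "/")

print(canonical_redirect("http://www.example.wordpress.com/post"))
# https://example.wordpress.com/post
```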


But then, if I'm MITMing you, I can just silently keep you on www, without SSL, without caring about HSTS or anything.


If WordPress enables (preloaded) HSTS, that would not matter.


This happens very rarely with https://. The http:// case happens more often, and it is one of the complications in adding HSTS headers with includeSubDomains. Right now, the http:// version works as expected, but if we added HSTS with includeSubDomains for "wordpress.com", those URLs would stop working and generate an SSL error.

Somewhat related: if you have a mapped/custom domain on WordPress.com, even though we are strongly no-www[0], the "www" will work over HTTPS, assuming that DNS for the subdomain points to our servers[1].

0. http://no-www.org/ 1. https://www.1912.me/


I never understood why you can't get a wildcard cert for *.*.example.com. Then again, the whole concept of wildcard certs being priced differently is pure price segmentation, so that's to be expected.


I've always figured it was just unwillingness on people to implement the work required for that to happen. Inertia.

The relevant RFCs for "wildcard" certs are worded such that *.*.example.com isn't valid; one of them restricts you to a single star in the leftmost component, so two stars is invalid.

There's a thing you can put into a cert called "name constraints" whose syntax lets you say ".example.com", covering things such as "foo.bar.example.com". It's valid on CA certs (it's oddly only valid on CA certs), which means you could get a cert that was a CA cert for just your domain and all its subdomains. It'd be incredibly useful.

But no CA that I know of will issue them. Of the major browsers, only Firefox supports them. The whole Lets Encrypt thing makes it mostly a moot point though, since with Lets Encrypt you'd just obtain a non-CA cert for each specific domain.

(It'd still be useful, I think, to see it implemented, if only for restricting CAs to certain public suffixes, when/if that's appropriate.)
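The single-label rule is easy to see in code. A simplified matcher (not a full RFC 6125 implementation; real validation has more corner cases):

```python
def matches_wildcard(pattern: str, hostname: str) -> bool:
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    # A wildcard may only be the entire leftmost label, and it matches
    # exactly one label, which is why *.*.example.com is invalid and
    # *.example.com does not cover foo.bar.example.com.
    if p_labels.count("*") > 1 or ("*" in p_labels and p_labels[0] != "*"):
        return False
    if len(p_labels) != len(h_labels):
        return False
    return all(p == "*" or p == h for p, h in zip(p_labels, h_labels))

print(matches_wildcard("*.example.com", "foo.example.com"))      # True
print(matches_wildcard("*.example.com", "foo.bar.example.com"))  # False
print(matches_wildcard("*.*.example.com", "a.b.example.com"))    # False
```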


It's so people can't use one wildcard cert to have facebook.com.example.com and google.com.example.com, potentially tricking users.


I don't think that's a valid reason, because you can get a cert for *.com.example.com, hence you can have those domains.


Very good point.


It's not price segmentation. You can encrypt X hosts with a wildcard cert, and X can be any number. So you basically buy encryption at a flat price, which can save you a LOT of money.


Fair enough; maybe the term isn't correct. My point is that a wildcard cert is technically no different from a "regular" cert, and the CA incurs no extra cost, unlike with EV certs. The price difference is purely based on the fact that buyers who need wildcard certs tend to have larger budgets.


And then Let's Encrypt gives out free certs, but not wildcards, though their argument is more for security's sake.


Can you get a second certificate? I'm not sure how the technology works, but since Let's Encrypt is free, I think it could also be automated to solve this problem.


There is a feature in X.509 for this, "Subject Alternative Names" to cover these alternate hostnames.

https://en.wikipedia.org/wiki/SubjectAltName http://wiki.cacert.org/FAQ/subjectAltName https://www.openssl.org/docs/manmaster/apps/x509v3_config.ht...

Certificate authorities charge extra for it, of course they do. DigiCert brands this as "Multi-Domain (SAN) Certificate" and charges nearly $300/yr, while my choice provider, sslmate.com offers the same for $25/yr.

And now there are $0 certificates with Let's Encrypt. I'm sad to see sslmate.com's business hurt, as they were the first to provide no-bullshit, sysadmin-focused CLI tools to get the job done. I would be very happy to see DigiCert.com and others like it go bankrupt, however.

I don't see any reason why Honest Achmed's request to be a CA was denied by Mozilla, https://bugzilla.mozilla.org/show_bug.cgi?id=647959 at least he is honest about his business model.


LE allows you to put many names in the same cert. Adding www is extremely simple.


This is great news. All the more so as there is a tremendous amount of high-quality content under the Wordpress.com domain, something I chanced on while seeking out signs of intelligent life on the Internet.

https://www.reddit.com/r/dredmorbius/comments/3hp41w/trackin...


Is anyone providing a certificate solution for LAN deployed devices/software where there isn't a stable name, or for that matter an administrator?

https://news.ycombinator.com/item?id=11457567


I think this is awesome news. Hopefully we will see Chrome starting marking http only sites as non-secure and Apples App Transport Security (ATS) forcing people to switch to https all over the web within a year or two.

https://www.chromium.org/Home/chromium-security/marking-http... https://developer.apple.com/library/ios/releasenotes/General...


I would recommend the HTTPS everywhere extensions for your fav. browser. It forces all web-pages to be loaded using HTTPS (if available).

https://www.eff.org/HTTPS-everywhere


If available, and if someone has added it to the database of more-or-less manually maintained redirection rulesets:

https://www.eff.org/https-everywhere/atlas/

https://github.com/EFForg/https-everywhere/tree/master/src/c...


KB SSL Enforcer allows you to automatically build your own enforced HTTPS list.

https://chrome.google.com/webstore/detail/kb-ssl-enforcer/fl...


I wonder how they work around Let's Encrypt rate-limiting?


This is about custom domains, not subdomains of wordpress.com (they're using a wildcard cert for that, and have been for years).

Rate limits aren't much of an issue in that scenario unless someone has more than 20 separate subdomains set up as a WordPress.com blog under the same domain. Even then, you could theoretically get 20 * 100 subdomains covered every week if you're smart about which domains you combine on a single SAN certificate.


Those are the rate limits, as far as I understand them:

* 100 Names/Certificate (how many domain names you can include in a single certificate)

* 5 Certificates per Domain per week

* 500 Registrations/IP address per 3 hours

* 300 Pending Authorizations/Account per week

It seems to me that WP.com could reach at least one of those... So I was curious to hear how they were doing that.

And yes, I was wondering if they would replace the *.wp.com wildcard - I guess not...


The rate limits have been changed to 20 certificates per domain per week recently.

The registrations/IP rate limits aren't really a problem - WordPress could, in theory, run their entire Let's Encrypt infrastructure using one registration (account).

Pending authorizations shouldn't be much of an issue given that all custom domains are CNAMEs pointing to their servers, so they should be able to solve all challenges.

(By the way: If you're building a large integration, Let's Encrypt can change the rate limits for you.)
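For the curious, the http-01 proof itself is simple: the host the CNAME points at serves a "key authorization" (token plus account-key thumbprint) at a well-known URL. A rough sketch of that construction (the JWK here is placeholder key material, not a real account key):

```python
import base64, hashlib, json

# Sketch of the ACME http-01 key authorization a host must serve at
# http://<domain>/.well-known/acme-challenge/<token> to prove control.

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def jwk_thumbprint(jwk: dict) -> str:
    # Thumbprint = base64url(SHA-256 of the key's canonical JSON
    # (sorted members, no whitespace)), per RFC 7638.
    canonical = json.dumps(jwk, sort_keys=True, separators=(",", ":"))
    return b64url(hashlib.sha256(canonical.encode("ascii")).digest())

def key_authorization(token: str, jwk: dict) -> str:
    return token + "." + jwk_thumbprint(jwk)

account_jwk = {"kty": "oct", "k": "not-a-real-key"}  # placeholder, not real
print(key_authorization("evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA", account_jwk))
```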


A little on-topic hype if allowed: free "HTTPS Everywhere" monitoring https://nonstop.qa. Hacker News passes with flying colors:

https://nonstop.qa/projects/387-hacker-news

(Free because I'm applying the GitHub model: free public projects, will eventually charge for private ones.)


Let's Encrypt is great, but I'm still running into people who have Chrome on WinXP or even IE8. It's crazy, I know. They did promise to start supporting both on XP - it had something to do with an intermediate cert somewhere. They didn't deliver on that promise. I don't blame them.

By the way, the cert on Wordpress.com is issued by GoDaddy, as are all the examples I could come up with. Guess it's a gradual roll-out process.


Windows XP support was rolled out March 25 2016.

You can find more information about upcoming and completed features here:

https://letsencrypt.org/upcoming-features/


It also, ironically, broke a bunch of IIS- and Azure Web App-hosted sites due to an incorrect intermediate being sent by the servers, with no recourse for new and renewing users at the moment. See https://github.com/sjkp/letsencrypt-siteextension/issues/42


But that doesn't really help you if you are using SNI to host multiple sites on a single IP address, does it?


Internet Explorer doesn't support SNI on Windows XP, correct.

Let's Encrypt doesn't force you to use SNI, though. SNI is not something you "stick" on a certificate - it's a TLS extension which you don't have to use at all.
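To illustrate: in Python's ssl module, SNI is just the server_hostname argument the client passes when wrapping the socket; nothing about the certificate itself changes. (The helper below is a hypothetical example, and whether it works against an XP-era peer depends on the client, not the cert.)

```python
import socket, ssl

# SNI is a client-side TLS extension: the client names the host it wants in
# the ClientHello, so one IP can serve many certificates.

def fetch_cert_subject(host: str, port: int = 443):
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        # server_hostname is what gets sent as the SNI value
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert().get("subject")

# Python built against a modern OpenSSL reports SNI support:
print(ssl.HAS_SNI)
```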



Great. Tumblr enabled it earlier this year as well.


Only in their dashboard, not for the individual blogs themselves.



It says they don't support it on custom domains.


Let's Encrypt is great, but StartSSL has also shaped up considerably. A while back their process and GUI were a real stumbling block. Today, however, it is a breeze to get going. (Disclaimer: I am in no way affiliated with StartSSL)


Amazing what a bit of competition can do.


While this helps *.wordpress.com users or custom domains using the wordpress.com back end, it's going to cause a ruckus with self-hosted ones.

Neither WordPress nor Let's Encrypt has any way to modify global server settings on a shared hosting environment. Slapping in an SSL certificate doesn't make a site secure; properly configuring the services that use the cert is what makes it secure.

GoDaddy isn't going to let Company Xyz rebuild Apache or configure ciphers server-wide...

In the end, while this is a move in the right direction, I fear it will give false confidence to many web providers that don't have enterprise experience with security fundamentals.


This won't affect self-hosted sites, only those on WordPress.com's platform. A lot of the code for that service isn't present in the self-hosted script.

So it won't break servers or shared hosts.


Many of the larger webhosts have free (but not mandatory) SSL support in production, beta, or on their near-term roadmap.


Google's Blogger is moving to https too, over time, my dashboard shows.


I wonder if they bundle multiple domains in one certificate?


This is awesome news.

I wonder if Squarespace will follow suit in this endeavor.


Squarespace already allows this for non-custom domains, but if you have a custom domain then you can't use https.

I hope this move by Wordpress will push Squarespace to support https for custom domains as it's a very frequently requested feature.


Squarespace needs to follow suit


Does Squarespace do any encryption for non-admin login things?


12+ years in the making.


Nice.

However, they could have shelled out a couple hundred bucks for a wildcard cert before.


This includes custom domains not just *.wordpress.com


Is it not live yet? The article uses the present progressive "is activating", but e.g., https://whatever.scalzi.com serves a certificate whose CN is *.wordpress.com (i.e., one that is invalid for the intended domain).


whatever.scalzi.com is on WordPress.com VIP—same platform, but a different segment of users. Our VIP sites often use 3rd parties (mostly ad servers) that don't yet support https, so we haven't defaulted any of those sites to https—it's an option available if they want it though!


Thanks, that was the missing thing!


Wordpress is still a security nightmare.

PHP, mostly dynamic everything, an unmoderated cesspool of plugins, themes, etc. where you just drop in code, predictable URLs and pages to brute-force, I could go on...



