
Working around the Twitter Cards "SSL Handshake Error" on Uberspace

Sebastian Greger

NB. This is a two-year-old post, kept only for archival purposes. The technology stack at either Twitter or Uberspace has since changed, and the described issue should no longer occur - making this workaround obsolete.

Ever since I migrated my website to HTTPS (and you should too!), I noticed that my Twitter Card implementation - the code that adds a little preview as my posts are automatically syndicated to Twitter - did not work any more.

A blog post as syndicated to Twitter after the site was switched over to HTTPS - no more preview of the linked article.

I had, of course, set up an HTTP 301 redirect to send all incoming requests for the former HTTP URLs to the new canonical addresses, but the Twitter Cards no longer showed up - neither for old nor for new Tweets.

The root of the issue

Not only the obvious timing, but also the “SSL Handshake Error” returned by the Twitter Card Validator indicated that this was related to my recent migration.

When provided with an HTTPS link to my site, the Twitter Card Validator only returns an “SSL Handshake Error”, i.e. their bot cannot access the encrypted website.

A thread at the Twitter Community suggested a range of options, none of which seemed to apply to my case. Luckily, I discovered another user who had reported this very issue to my web host Uberspace, who replied to his support request:

https://twitter.com/ubernauten/status/760123730009939969

It turns out that Uberspace secures HTTPS traffic with such strong parameters (4096-bit Diffie-Hellman) that Twitterbot cannot access the site any more. Uberspace rightfully declares this a Twitter problem, not theirs: Twitter is apparently running a rather old version of Java that only supports 1024-bit Diffie-Hellman parameters. Looks like an unsolvable challenge … until you throw the problem at people smarter than yourself! *

Working around the problem

At the first Homebrew Website Club Berlin last week, two fellows not only helped me decipher what the above problem really consisted of, but also came up with an idea: simply keep serving Twitter the unencrypted pages while forwarding everybody else to HTTPS (hat tip to Oliver for the idea, and Sven for the sparring).

So here is the solution (or shall I rather say “hacky workaround”) for making Twitter Cards work with HTTPS sites on Uberspace - or on any web host whose 4096-bit Diffie-Hellman configuration Twitterbot cannot handle:

1. Exclude Twitterbot in the HTTP-to-HTTPS rewrite

First up, I adapted the .htaccess file in the root folder to exempt Twitterbot from the HTTP 301 redirect to the encrypted pages:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} !Twitterbot
RewriteCond %{HTTPS} !=on
RewriteCond %{ENV:HTTPS} !=on
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]

This rewrites an incoming request to the HTTPS URL (line 6) only if the user agent does not include the term “Twitterbot” (line 3) and the request does not already target the HTTPS address (lines 4 and 5).
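
If editing .htaccess is not an option, the same logic could in principle live in the application instead. The following PHP sketch merely illustrates the rule set above - it is hypothetical code, not part of my actual setup, and the HTTPS detection may need adjusting per host:

<?php
// Illustrative equivalent of the .htaccess rules above: send a 301 to the
// HTTPS URL unless the client identifies as Twitterbot or the request is
// already encrypted. (Sketch only - not the code used on this site.)
$user_agent    = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
$is_twitterbot = stripos( $user_agent, 'Twitterbot' ) !== false;
$is_https      = ! empty( $_SERVER['HTTPS'] ) && $_SERVER['HTTPS'] !== 'off';

if ( ! $is_twitterbot && ! $is_https ) {
    header( 'Location: https://' . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'], true, 301 );
    exit;
}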

2. Serve an unencrypted image URL in Twitter Card markup

In my WordPress theme, I made sure that Twitterbot is served an HTTP URL for the thumbnail image in the Twitter Card meta markup:


<?php
  // Extract the thumbnail URL from the post thumbnail markup
  if ( $domsxe = simplexml_load_string( get_the_post_thumbnail( $post->ID, 'large' ) ) ) {
    $meta_thumbnail = $domsxe->attributes()->src;
  }
?>
<meta name="twitter:image" content="<?php echo str_replace( 'https://', 'http://', $meta_thumbnail ); ?>" />

3. Test

Most obviously, the Twitter Card Validator is the go-to address when setting up Twitter Cards. It sometimes fetches previously tested URLs from cache, but simply adding a GET variable to the URL (…?12345) ensures that the validator pulls the most recent copy.

Once the hack is implemented and Twitter provided with an unencrypted HTTP link, the Twitter Card validator provides a working result.

The User Agent Switcher extension for Firefox came in handy to verify that the server indeed only returns the unencrypted site to requests identifying themselves as Twitterbot (hint: disable the DNS cache and all other caches, e.g. using the Web Developer Tools extension, since HTTP 301 redirects are cached permanently - once you have visited a page forwarded by that rule, the browser will not request the original URL again).
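
For a scriptable check on top of the browser extension, a few lines of PHP with curl can compare what the server returns to a regular browser versus Twitterbot. This is a sketch with a hypothetical URL; the expected outcome is a 301 pointing to HTTPS for the browser agent and a plain 200 for Twitterbot:

<?php
// Request the HTTP URL twice - once as a generic browser, once as Twitterbot -
// without following redirects, then print the status code and redirect target.
$url = 'http://example.com/some-post/'; // hypothetical; use one of your own posts

foreach ( array( 'Mozilla/5.0', 'Twitterbot/1.0' ) as $agent ) {
    $ch = curl_init( $url );
    curl_setopt( $ch, CURLOPT_USERAGENT, $agent );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
    curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, false ); // we want to see the 301 itself
    curl_exec( $ch );
    $code     = curl_getinfo( $ch, CURLINFO_HTTP_CODE );
    $redirect = curl_getinfo( $ch, CURLINFO_REDIRECT_URL );
    curl_close( $ch );
    echo $agent . ' => ' . $code . ( $redirect ? ' -> ' . $redirect : '' ) . "\n";
}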

I also ran a quick test with Bridgy, ensuring that the service is still able to deliver the backfeed (likes, retweets, replies) correctly to my Webmention endpoint, despite the fact that the Tweets now carry HTTP URLs instead of the canonical HTTPS addresses. Unsurprisingly, Bridgy - the trusty, high-quality tool it has long been - handles these redirects just as it should.

4. Tweet only HTTP URLs

For this hack to work, any Tweets posted must not include a link to the HTTPS version of a page, since - see the intro to this post - Twitterbot would not be able to access the site.

This naturally comes with a few caveats, which might be worth considering if you intend to apply this hack on your own site, depending on the circumstances:

  • Users following the link from my tweets initially send out an HTTP request, which could be intercepted and/or tracked (with an HTTPS link, interception would not be possible and the only visible meta information would be the host name, not the page requested). Since URLs on Twitter are wrapped into a t.co shortlink anyway, and Twitter users thereby expose their clicks to tracking, I consider this suboptimal but negligible in my specific use case: with my Twitter feed being only a syndicated copy of this blog’s contents, it only affects Twitter users - everybody else reads my content at the HTTPS-encrypted source.
  • If others share the link of a page on my blog on Twitter, they will most likely copy-paste it from the address bar, sharing the HTTPS version of the URL - and therefore a URL without Twitter Card support. This is the second major limitation (making this a “hack”, not a “solution”) and the reason it would still be desirable to see Twitter update their bot some day soon.

A more beautiful implementation would be to redirect only Twitterbot requests from HTTPS to the less safe HTTP, but the strong 4096-bit Diffie-Hellman configuration rules this out: the bot never even gets past the TLS handshake, so it would never see the redirect. On a side note - and not intended as a critique but rather as a philosophical observation - this kind of turns Uberspace’s desire to apply the strongest possible encryption into a security weakness, if sites have to fall back to plain HTTP in order to make use of widely used features such as Twitter Cards. Yet, obviously, the real problem here is Twitter not keeping up with current standards - but I am not surprised they do not care about this.

The result

After less than half an hour of work, Twitter Cards are working again with my blog:

After implementing the special handling of Twitterbot, the Twitter Cards are back and flawless.

As far as I can tell, this hack should not have any negative impact on search engine rankings (as attempts to serve e.g. Googlebot something different from what the normal web user sees commonly do), since it is specific to Twitterbot. From what I can see so far, human users are correctly forwarded to HTTPS at all times, and Twitter consistently displays the correct Twitter Card for any new Tweets from my site.

* Only later did I find out that another Uberspace user had already come up with the same idea last summer.