by Kaj Kandler

In recent months I noticed that I had rather slow ping times to Google. The latency to Google's servers was in the 200 – 300 ms range. In addition, I noticed that when my workstation was on the corporate VPN, I had pings of < 40 ms. As Google these days is contacted by so many websites for analytics or as a CDN for JavaScript, etc., it is of vital importance to have a fast connection to Google.

Some quick analysis revealed that a typical traceroute from my home went through 12 hops of Internet and then another 17 hops in Google's network, and the latency jumped by 100 ms four hops into Google's network. I found that rather odd. However, when I'm on the corporate VPN, the number of hops inside the Google network shrinks to 6 – 8 and their latency is much smaller. I also noticed that I got a different IP address back if I used the corporate network. Measuring the latency to the IP address that I got when on the VPN showed similar latency and traceroute results. So my configured DNS servers were to blame.

Back when Verizon started to break the DNS protocol in their servers, I had configured some public DNS servers from Level 3, as they had the best latency at the time. I had to reconsider that decision. Armed with a free open source tool named namebench, I found the fastest DNS servers available for my connection. But it turned out that the IPs they returned were as bad as the previous ones. So I tested the two name servers that Verizon configures automatically, and they provide IP addresses with 20 – 40 ms ping times and much shorter traceroutes. I guess with multihoming the Internet's architecture has fundamentally changed.

That said, Verizon still uses an intentionally broken implementation of DNS, which does not return a failure if a request can't be resolved; instead it returns its own web server. I almost considered leaving it at that, as better performance seemed more important than a broken DNS.
However, the usability of this “helpful” Verizon server is horrible, as it redirects to its own URL, so if I make a typo I have to essentially retype my address or edit the original request in my URL bar to correct it.

As a last resort, I tested Google's public DNS servers. While Google's DNS servers do not answer as fast as my local Verizon servers, they are only marginally slower and deliver the proper IP addresses for the Google network without breaking the DNS protocol in the process.
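Comparisons like this are easy to script. Here is a minimal sketch (my own example, Python standard library only) that times hostname resolution with the system's currently configured resolver; timing a specific DNS server directly would need a resolver library such as dnspython:

```python
import socket
import time

def resolve_time(hostname):
    """Resolve a hostname with the system resolver and return
    (elapsed seconds, sorted list of unique IP addresses)."""
    start = time.perf_counter()
    infos = socket.getaddrinfo(hostname, None)
    elapsed = time.perf_counter() - start
    ips = sorted({info[4][0] for info in infos})
    return elapsed, ips

if __name__ == "__main__":
    for host in ("www.google.com", "www.verizon.com"):
        try:
            elapsed, ips = resolve_time(host)
            print(f"{host}: {elapsed * 1000:.1f} ms -> {ips}")
        except socket.gaierror as err:
            print(f"{host}: lookup failed ({err})")
```

Running it once before and once after switching resolvers in the network settings shows the difference in both speed and the addresses returned.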



by Kaj Kandler

I love my ergonomic MS Elite keyboard, so I use it on my Mac Pro workstation. However, today I needed to boot into single user mode to repair the filesystem of the boot partition. Normally you hold down the Command+S keys, but that only works with original Apple keyboards (and even then it seems to be unreliable for Bluetooth keyboards). Apparently the key mapping happens in the drivers or some later stage of the OS, and these are not loaded at boot time.

I found out that the boot manager rEFIt has an option to boot in single user mode as well. Simply select the Mac OS X volume you want to boot and hit F2 to get a menu of different boot modes, such as safe mode or single user mode.



by Kaj Kandler

I just started to use multiple monitors on Mac OS X (Leopard) and immediately encountered issues with Outlook for Mac OS X. When I open a new window to write an e-mail, Outlook positions it on the main monitor, not on the second monitor where I have Outlook open.

That becomes very annoying when you use Screen Sharing and look only at one monitor at a time.

A small tool, SizeUp, comes to the rescue. Above all, SizeUp lets me move the currently active window to the other monitor with a keystroke. So when a new window appears on the wrong monitor, I can easily put it in its place.

SizeUp can also resize the currently active window to the right or left half of the screen, so I can have two windows side by side on one monitor, which is great for comparing documents.



by Kaj Kandler

If you run a website of some mild success, then you have come across so-called “scraper” sites. A scraper site copies content from RSS feeds, and potentially the web pages of a site, and re-publishes it as its own content. Tonight I read a blog post about “benign scraper sites” by AK John.

Scraper sites hope to attract visitors who then click on advertisements and so make money for their owners. If they are combined with Search Engine Optimization, they can outrank the original. Scraper sites are certainly a violation of copyright. John thinks that even benign scrapers, those that link back to the original source, are harmful duplication of content that clogs the arteries of the Internet.

When I also read John's recent post on Google's ambitions with “AuthorRank and the rel=author verification,” it became clear to me that Google can and will use author verification of content to know which site has the original content and which site has the copy, because the Google+ author profile will point back only to the original site.

So to outrun the scraper sites, I will claim authorship of my content.

Here is the question for my readers, will Google be able to detect if the scraper site sets up fake Google+ profiles and modifies the author links? Does Google have a way to detect who published first?



by Kaj Kandler

Tonight I happened to read an article that made a claim about a retailer's website and its use of certain semantic web technology. I was curious how they employed the technology, so I looked at one of their web pages for a random TV.

I was amused that even such a large retailer could make some simple mistakes. I found numerous places where invalid HTML was used, due to reserved characters appearing in regular text. Proper HTML should use substitutes called entities. The error is triggered by a TV's screen size being measured in inches, which is often expressed with the double quote sign (“). However, the double quote is a reserved character inside quoted HTML attribute values and so needs to be replaced by the entity &quot; wherever it is used.

Here are a few examples from the page:

<meta name="keywords" content="DYNEX, 42" Class / LED / 1080p / 60Hz / HDTV, DX-42E250A12, 30"+ Televisions, Televisions" />
<meta name="description" content="DYNEX 42" Class / LED / 1080p / 60Hz / HDTV: 2 HDMI inputs; 1080p resolution; 160-degree horizontal and vertical viewing angles" />

<li class="property included-item">Dynex&#153; 42" Class / LED / 1080p / 60Hz / HDTV</li>

It's funny that the page encodes one special character properly (the trademark symbol as &#153;) but not the other. Then in other places it messes up the trademark symbol and encodes the double quote correctly:

<meta content="Dynexâ„¢ 42&quot; Class / LED / 1080p / 60Hz / HDTV" itemprop="name"/>

As it happens, this error is in the area of code I was interested in. And yes, in one place both are correct:

Dynex&#153; - 42&#34; Class / LED / 1080p / 60Hz / HDTV - DX-42E250A12</title>

If you read the source code, it is peppered with things like tracking codes and semantic web data to make it attractive to search engines and other programs that analyze pages automatically. I think these encoding mistakes undermine those efforts to a certain degree.

For that reason I check all (well, most of) my pages with an HTML syntax validator. Not that I correct every mistake, because most browsers can handle some of them just fine (including this one, except for the third example). However, every browser (and every other program reading HTML, such as a search engine crawler) differs in its ability to handle invalid code. So I try to take as few chances as possible.
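The escaping itself is cheap to get right in any templating stack. As a small illustration (my own example data), Python's standard library does it in one call:

```python
import html

title = 'DYNEX 42" Class / LED / 1080p / 60Hz / HDTV'

# quote=True also escapes the double quote, which is required
# whenever the text lands inside a quoted HTML attribute.
escaped = html.escape(title, quote=True)
print(escaped)
# -> DYNEX 42&quot; Class / LED / 1080p / 60Hz / HDTV
```

Applying this consistently at the point where text is inserted into markup would have avoided every one of the errors above.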



by Kaj Kandler

The Document Foundation has released LibreOffice 3.5. The new release has, above all, improved performance due to the elimination of dead code that is no longer used or not really needed. This made the application lighter and faster. LibreOffice Calc, the spreadsheet application, gained the most.

Another focus has been interoperability, allowing documents from the Microsoft Office suite and Office Open XML documents to be read. In particular, scalable symbols from PPTX files are now imported correctly, and various SmartArt is understood by LibreOffice 3.5. I'm sure that many office users will welcome the new ability to import Visio diagrams and reproduce them correctly. The import of RTF formatted documents has been improved as well.

LibreOffice 3.5 now also supports the Open Document Format specification 1.2 more completely. Various graph forms are smoothed better, and new data point and line ending symbols have been added. Unfortunately, documents saved in the new version of the format are not yet recognized as valid by the Microsoft Office family. Let's hope the “leading” office suite makes its product interoperable soon.

Another major addition is a new and improved grammar checking tool packaged with the LibreOffice suite.



by Kaj Kandler

I started working on the user experience of Plan-B for OpenOffice because I thought that a bounce rate of 70% is rather high. While I succeeded with some first steps to encourage visitors to explore the site, other steps did not do as much as I had hoped. However, I was wondering what my target should be. What would be a good bounce rate, specifically a good bounce rate for my type of site? Is there a benchmark I could measure myself against?

Today I read the Google Help article about high bounce rates. Most informative is the video from Avinash Kaushik. He states:
* Marketing metrics are different for every website
* Typical bounce rates are between 40 and 60%
* There are two reasons for a visitor bouncing:
  * The visitor found what she was looking for (a satisfied customer?)
  * The visitor did not think she found what she was looking for (window shopping, in the wrong place, different expectations)
* It is hard (impossible) to know which is the reason for a bounce
* However, changes in bounce rate are significant. The trend is your friend!
* Bounce rate is a great qualifier metric!

So here are some numbers I can compare against. However, the nugget I learned is to read bounce rate in conjunction with other metrics:
* How does the bounce rate differ for different traffic sources (Google vs. Bing, search vs. direct link vs. mail campaign, AdWords vs. organic search)?
* How does the bounce rate differ per keyword on the same landing page?
* How does the bounce rate differ across the top 20 landing pages?
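Segmentations like these are easy to reproduce from raw visit data. Here is a minimal sketch (Python, with made-up sample data of my own) of computing bounce rate per traffic source, where a bounce is a single-page visit:

```python
from collections import defaultdict

# Each visit: (traffic_source, pages_viewed).
# The data below is invented purely for illustration.
visits = [
    ("organic", 1), ("organic", 3), ("organic", 1), ("organic", 5),
    ("direct", 1), ("direct", 1),
    ("mail", 4), ("mail", 2),
]

def bounce_rates(visits):
    """Return {traffic_source: bounce rate in percent}."""
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for source, pages in visits:
        totals[source] += 1
        if pages == 1:          # single-page visit = bounce
            bounces[source] += 1
    return {s: 100.0 * bounces[s] / totals[s] for s in totals}

for source, rate in sorted(bounce_rates(visits).items()):
    print(f"{source}: {rate:.0f}%")
```

The same grouping works per keyword or per landing page; only the key changes.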



by Kaj Kandler

I recently decided to replace the Lucene-based search engine on Plan-B for OpenOffice with a Google Custom Search Engine. At first glance this seems an easy task: remove the old code and replace it with some Google JavaScript. However, it did not turn out that way.
I targeted a layout where the search box is part of the general navigation menu bar and results appear on their own page. However, the HTML/CSS code generated by Google is rather inflexible. The two-page template came closest, as it generates two separate code snippets, one for the search box and button and one for the search results.
So I had to add some CSS to make the generated divs and their child elements inline elements:

div#cse-search-form {
  display: inline-block;
  zoom: 1; /* triggers hasLayout in old IE */
}

div#cse-search-form * {
  display: inline;
}
Another inconvenience is that the JavaScript includes an absolute URL for the results page. But it also works when I omit the protocol and hostname part




by Kaj Kandler

I have replaced the Plan-B for OpenOffice / LibreOffice search engine with Google Custom Search.

The local search engine based on Lucene was heavy on resource consumption and required a lot of effort to keep the indices up to date with new or changing content. So I decided to switch to a Google Custom Search Engine.

I hope this change makes the site an even better resource for OpenOffice and LibreOffice users. Please let me know if you have any suggestions on how to improve search on the site.



by Kaj Kandler

In my quest to improve the user experience at Plan-B for OpenOffice/LibreOffice, I changed the over 1,000 video pages, such as “Export a presentation in PDF format” or “How to create an Agenda Template with Writer.”

All video pages were constructed the same way. Front and center was a massive frame for the 800×600 video player, and every video started playing instantly when the page loaded. I replaced the frame with a simple “play video” button that brings up an overlay to play the video. The button is much smaller and lets you start the video multiple times.

I have read that most users do not appreciate video starting instantly, and I sympathize. Although our videos are silent and do not immediately draw the attention of everybody around, it feels better to be in control.

This change brings the textual content of the pages above the fold, and I hope it will stimulate users to explore the site more fully. If I'm right, this should lower the bounce rate and increase the average time spent on the site.