Category: User Interface

Two annoying issues with Firefox 4 (and their solutions)

The Mozilla Foundation released Firefox 4 last month and it has lots of interesting features. So I upgraded and, although I initially criticized the new interface (when looking at screenshots from beta releases), I now quite like it: it gives more space to the actual content (the web pages) by keeping the “container” (the Firefox GUI itself) to a minimum.

But … (because there is always a “but”) I found two annoying issues with the new version:

  1. the “Open Link in New Tab” option has moved to first place in the pop-up menu when you right-click on a link;
  2. Firefox no longer warns you when you close it and doesn’t save all your open tabs.

In version 3, when you right-click on a link, the pop-up menu shows the options to open the link in a new window or in a new tab in this order:

  1. Open Link in New Window
  2. Open Link in New Tab

Now in version 4, when you right-click on a link, the pop-up menu shows you these options in the following (inverted) order:

  1. Open Link in New Tab
  2. Open Link in New Window

So if you were used to clicking on the second option to open the link in a new tab (as in version 3), you automatically do the same in the new version. But now you open a new window. That’s annoying! And that’s a known bug/feature.

There are three solutions to this issue:

  1. use Ctrl + (left) click to open a link in a new tab.
  2. use the Menu Editor add-on to re-organise the pop-up menu order as you wish.
  3. use this tweak proposed in a Mozilla forum.
  4. (I know, a fourth solution) just get used to it, because I’d also find it annoying if Firefox developers suddenly changed the order of the menu items back in version 5.

My other issue with this new version of Firefox is that it doesn’t warn you when you are about to close it without saving the tabs you are currently browsing. In Firefox 3, a warning dialog box told you something like: “You are about to close Firefox but there are still tabs open. Would you like to save them, quit anyway (and lose them) or stay in Firefox?” In Firefox 4, there is no warning: it closes and doesn’t save your tabs.

Initially you think you made a mistake: there was no tab open when you closed Firefox. Or maybe you didn’t pay attention to the dialog box and closed Firefox, telling it not to save. At the third occurrence, you are sure there is an issue! And there is one indeed: the Firefox development team apparently decided not to show this box anymore. It sounds reasonable if you see it as keeping a roadblock dialog box out of the user’s way when he/she decides to quit. But then save the tabs, so that users used to this feature don’t lose them. This evening, I lost at least 20 tabs containing things not so important (so not in bookmarks) but that I wanted to read tonight. Grrr …

Fortunately, as usual, there is a solution:

  1. In the address/URL bar, enter: about:config
  2. In the filter box, enter: quit
  3. Set browser.showQuitWarning to ‘true’
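The same pref can also be set persistently in a user.js file in your Firefox profile directory (a sketch, assuming the standard user.js mechanism; it has the same effect as flipping the pref in about:config):

```javascript
// user.js — read from the Firefox profile directory at startup.
// Re-enables the "you are about to close Firefox with open tabs" warning.
user_pref("browser.showQuitWarning", true);
```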

Again, this is a feature or an explicit decision: see bugs 592822 and 593421 for instance.

From my point of view, the gold standard is: don’t change the UI and user experience; or, if you do, tell the user about it the first few times the old behaviour doesn’t occur.

Google and the bottom search box

When Google rolled out “Instant”, they also removed the bottom search box. Bad idea.

Google Instant is a nice, web 2.0 improvement to “classic” Google where results appear as you type your query in the ad hoc text box. Google claims that Instant can save 2 to 5 seconds per search. Maybe.

But, at the same time, they removed the bottom search box. I used this search box extensively: when you enter your search criteria and look at the results, you may want to refine your search, add some terms, remove or exclude others, etc. With a second search box at the bottom, you can do this directly after having browsed the first bunch of results. Without it, you can’t: you have to remember to scroll all the way to the top of the page and make the change in the only, upper text box. You lose 2 seconds scrolling back to the top and you may lose some idea on the way (especially if you have 1001 ideas at the same time). When you perform a lot of searches per day, the time gained with Instant per search is largely lost in the time spent scrolling back to the top. I’m not the only one to think it was a bad idea.

But if you want to keep Google as (one of) your search engine(s) and want the second search/text box at the bottom back (and optionally Instant too), just use “Encrypted Google SSL Beta” (URL in clear: https://encrypted.google.com/).

Browser hardware acceleration issue?

Browser hardware acceleration is meant to render websites faster by allowing the graphics card (its GPU) to display “things” (videos, animations, canvas, compositing, etc.) directly on the screen. By bypassing software rendering, lots of websites seem to render faster. All major browsers jumped on this: Firefox, Chrome, Internet Explorer and Opera (post of 2008!).

I understand that enhancing the user’s experience while surfing the web is interesting. Hardware acceleration opens the door to unseen compositions, new types of animations and new kinds of applications. Directly in your favourite browser.

Comment if I’m wrong, but hardware acceleration should not lead to fragmentation of the web landscape: HTML5 seems to be the standard behind which browser developers are adding their acceleration engines.

However, an issue (from the user’s point of view) will probably be that hardware acceleration will further help the emergence of a consumer-only web. A lot of your applications will be in your browser, with your data in someone else’s data center. You want your data safe? You need to trust your provider’s security measures. You simply want your data on your hard drive? I think you’ll have a problem here. But I agree it’s not the technical implementation that will be responsible for that.

There is another issue I see with browser hardware acceleration. And it’s very down-to-earth. As often happens with presentations containing videos, the slides are displayed via the beamer but not the video (a black rectangle is displayed instead). You can easily disable hardware acceleration in most presentation software (if it’s not disabled by default). But, with hardware acceleration fully integrated in the browser, what will be displayed on the beamer if we have to demo a website, or simply when the presentation software is the browser? A page with patches of black rectangles? I hope not.
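For Firefox at least, hardware acceleration can be turned off from about:config or a user.js file (a sketch; the pref name is the one I recall for Firefox 4, so verify it in about:config before relying on it):

```javascript
// user.js — disable browser hardware acceleration before a presentation.
// (Assumption: layers.acceleration.disabled is the Firefox 4 pref name.)
user_pref("layers.acceleration.disabled", true);
```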

Why do I blog this? I enjoy reading about the (technical) details of (browser) hardware acceleration. In general, I am very interested in all the new developments in IT regarding the use of GPUs and graphics card computational power to solve current issues or allow future developments. But I’m also using these (new) technologies every day. So I don’t want technological improvements on one hand to cause trouble on the other.

Cognitive Surplus visualised

In the 300-and-more RSS items in my aggregator this week, there are 2 great ones from Information is Beautiful, a blog gathering (and publishing its own) nice ways to visualise data.

The first one is based on a talk by Clay Shirky who, in turn, was referencing his book Cognitive Surplus. In Cognitive Surplus visualized, David McCandless just represented one of Shirky’s ideas: 200 billion hours are spent each year by US adults just watching TV whereas only 100 million hours were necessary to create Wikipedia (I guess the platform + the content) …

Cognitive Surplus visualised from Information Is Beautiful

It makes you think about either the waste television helps to produce, or the potential of human brain(s) if relieved from the burden of television.

The second interesting post appeared in fact in information aesthetics, a blog where form follows data (referencing Information is Beautiful but I can’t find this post). In Top Secret America: Visualizing the National Security Buildup in the U.S., Andrew Vande Moere relates “an extensive investigative project of the Washington Post that describes the huge national security buildup in the United States after the September 11 attacks”. The project website contains all the ingredients for a well-documented investigation with the addition of interactive maps and flash-based interfaces allowing the user to build his/her own view on the project.

Top Secret America from the Washington Post

It’s nice to see investigative journalism combined with beautiful data visualisation and handling!

Software license and use of end-product

In one of his buzzes, Cédric Bonhomme drew my attention to the Highcharts JavaScript library. This library can produce beautiful charts of various types with some Ajax interaction. The only negative point, imho, is that it is dual-licensed and both cases deprive you of your freedom:

  • there is a Creative Commons Attribution-NonCommercial 3.0 License: you can use the library for your non-profit website (see details on the licensing page);
  • there is a commercial license for any other website.

Now what if we only need the end-product, i.e. the resulting chart, in a commercial environment? What the license covers is the re-use of the JavaScript library in a website, not the resulting chart. If a company chooses to use Highcharts internally to render some beautiful charts and just publishes (*) the resulting image, I guess they can just download the library and use it (* by “publishing”, I mean: publishing a scientific paper in a peer-reviewed journal, not publishing on its website). On the other hand, no one ever questioned the fact that commercial companies have licenses for all the proprietary software they use to produce anything else, from charts to statistical data, just because they publish results obtained with these software tools. So the “trick” here would be that, by changing the medium on which you display end-results (from website to paper, even if it’s a PDF on the journal website), you can use the free-to-download license, even in a commercial environment, for an article from a commercial company. I’m not sure this was the original intention of Highslide Software.
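For context, using the library looks roughly like this (a sketch in the style of the Highcharts 2.x API of the time; the container id, title and data are made up for illustration):

```javascript
// Chart configuration object, as passed to the Highcharts constructor.
// "chart-container", the title and the series data are hypothetical.
var options = {
  chart: { renderTo: "chart-container", type: "line" },
  title: { text: "Monthly sales (made-up data)" },
  xAxis: { categories: ["Jan", "Feb", "Mar"] },
  series: [{ name: "Sales", data: [29, 71, 106] }]
};

// Only render when the library is actually loaded in a browser page.
if (typeof Highcharts !== "undefined") {
  new Highcharts.Chart(options);
}
```

It is this embedding of the library in a page that the license governs; the PNG you export from the rendered chart is the “end-product” discussed above.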

3-D Secure not secure

You may have seen in various places that “3-D Secure” (aka “Verified by Visa” or “Mastercard Securecode”) is not as secure as it says. The original paper is here (PDF).

Unfortunately, having implemented the 3-D Secure system via a third-party somewhere in Europe, I have to agree with the authors. I will insist here on one aspect – the inline frame – but the authors are giving more aspects and some solutions worth considering in their paper.

The first issue is that most merchants or banks embed the 3-D Secure page in an inline frame: the 3-D Secure page appears as if it were served by the merchant website although it comes from another website. This is similar to a hypothetical case where an image in your newspaper comes from another newspaper: you wouldn’t notice the difference (unless/until the image is completely different from your newspaper’s content). And, back to our topic, if a fake 3-D Secure page is served inside the inline frame, it’s difficult to notice: the most common way of noticing it (a different URL in the address bar) is precisely hidden by the inline frame. During development and testing, I put in place an internal, fake but similar-looking payment page and we sometimes had to think twice to know whether we were on the fake page or in the test environment. Webpages at a merchant or a bank website are of course supposed to be kept far from crackers and villains 😉 But a man-in-the-middle attack (replacing on the fly the real payment page by a fake one that collects card details) is rather easy to implement (considering actual villains’ know-how) and wouldn’t be noticed until a certain number of card details had been collected …
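Incidentally, a page can at least detect that it is being framed; this is my own illustration, not something the 3-D Secure specification mandates:

```javascript
// Returns true when the given window object is displayed inside a frame:
// for a framed page, window.top (the outermost window) differs from
// window.self (the window of the frame itself).
function isFramed(win) {
  return win.top !== win.self;
}

// Guarded so the snippet also runs outside a browser environment.
if (typeof window !== "undefined" && isFramed(window)) {
  // A page could react here, e.g. break out of the frame so the user
  // sees its real URL in the address bar:
  // window.top.location = window.self.location;
}
```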

To illustrate the above, please insert your card details below.

Card number:
Expiry date:
Secure code:
 

Fake 3D Secure

Apart from the fact that this form was done in 30 seconds, doesn’t really look like a real payment form (and does nothing), how can you tell the difference? So, be careful when using 3-D Secure (with Firefox you can always right-click to see the security information about the form you are about to fill in). And always try to check the URL when possible.

Evolution of H1N1

I needed some data to test the pChart charting library, so I decided to use WHO data about swine flu (from its weekly updates). The only issue I had was that the WHO started by collecting data by country and switched to gathering data by regional office from July 27th, 2009 onwards. So the graphs below are only by regional office.
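To merge the earlier by-country figures with the later by-office ones before plotting, each country has to be mapped to its WHO regional office and the counts summed. A sketch (the country entries and numbers are made up; a real mapping covers all reporting countries):

```javascript
// Map countries to WHO regional offices (only a few hypothetical entries).
var regionOf = {
  "Mexico": "AMRO",
  "United States": "AMRO",
  "Spain": "EURO",
  "Japan": "WPRO"
};

// Sum per-country case counts into per-regional-office totals.
function byOffice(countryCounts) {
  var totals = {};
  for (var country in countryCounts) {
    var office = regionOf[country];
    totals[office] = (totals[office] || 0) + countryCounts[country];
  }
  return totals;
}
```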

Evolution of A/H1N1 cases - jepoirrier.net

Evolution of A/H1N1 deaths - jepoirrier.net

For your information:

  • AFRO: WHO Regional Office for Africa
  • AMRO: WHO Regional Office for the Americas
  • EMRO: WHO Regional Office for the Eastern Mediterranean
  • EURO: WHO Regional Office for Europe
  • SEARO: WHO Regional Office for South-East Asia
  • WPRO: WHO Regional Office for the Western Pacific

I didn’t really see such graphs elsewhere on the web, but there is the excellent FluTracker by Dr. Niman and a lot of information about the swine flu on Wikipedia. If you want to start interpreting these curves, you might be interested in reading squareCircleZ’s post about H1N1 and the Logistic Equation.