Month: September 2010

Google and the bottom search box

When Google rolled out “Instant”, they also removed the bottom search box. Bad idea.

Google Instant is a nice, web 2.0 improvement to Google “classic” where results appear as you type in the search box. Google claims that Instant can save 2 to 5 seconds per search. Maybe.

But, at the same time, they removed the bottom search box. I used this box extensively: after entering your search terms and scanning the results, you often want to refine the query, add some terms, remove or exclude others, etc. With a second search box at the bottom, you can do that right after browsing the first batch of results. Without it, you can’t: you have to remember to scroll all the way back to the top of the page and make the change in the only remaining text box. You lose 2 seconds scrolling back up, and you may lose your idea on the way (especially if you have 1001 ideas at the same time). If you perform a lot of searches per day, the time Instant saves per search is largely lost in scrolling back to the top. I’m not the only one to think it was a bad idea.

But if you want to keep Google as (one of) your search engine(s) and want to get the second search/text box at the bottom back (and optionally keep Instant too), just use “Encrypted Google SSL Beta” (URL in clear:

Happy Software Freedom Day 2010!

Today, September 18th 2010, it’s Software Freedom Day all over the world. It is an annual worldwide celebration of Free Software, a public education effort with the aim of increasing awareness of Free Software and its virtues, and of encouraging its use.

On the SFD website, there aren’t many events registered for Belgium. There is only one, in fact, in Oostende (LiLiT is doing an install party in Liège, but I can’t see any reference to SFD; still, it’s a good initiative!). Well, an SFD on September 18th in Belgium might not have been a good idea if the goal is to increase awareness of Free Software: more than half of the population is celebrating the Walloon Region or preparing for the car-free Sunday in Brussels (while others have simply been looking for a government since April 2010!). So, at a personal level, I decided to give Ubuntu a try (10.04 LTS).

In terms of user experience, you can’t beat the installation process of Ubuntu (my comparison points are Fedora 13 and any version of Windows XP, Vista or 7 that doesn’t come on a PC-specific image disc). Seven configuration screens with rather simple questions, and that’s it. There are choices you can’t make, like selecting which software will be installed and available on the next reboot. But most of the general-purpose software is there: a web browser, a word processor, some games, a rudimentary movie player and a music player. The “Software Center” is also readily visible, so you can’t miss it, and it seems the obvious choice if you want to install any other software.

New Ubuntu desktop for Freedom Software Day 2010

The real test will now be whether one can actually work with it. If I don’t post any furious comment against some feature, or anything about the installation of some software in the coming days / weeks, you’ll know I’m still working with this Linux flavour.

Browser hardware acceleration issue?

Browser hardware acceleration is meant to render websites faster by letting the graphics card (its GPU) draw “things” (videos, animations, canvas, compositing, etc.) directly on the screen. By bypassing software rendering, many websites seem to render faster. All major browsers have jumped on this: Firefox, Chrome, Internet Explorer and Opera (post from 2008!).
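From the page’s side, one visible entry point to the GPU is the (still experimental, in 2010) WebGL canvas context. A minimal feature-detection sketch; the context names are the ones browsers were experimenting with at the time, so treat them as assumptions:

```javascript
// Minimal sketch: probe for a GPU-backed canvas context.
// "webgl" / "experimental-webgl" are the context names browsers were
// trying out around 2010; they are assumptions, not a stable API.
function hasWebGL(doc) {
  if (!doc) { return false; } // no DOM, e.g. running outside a browser
  var canvas = doc.createElement('canvas');
  var names = ['webgl', 'experimental-webgl'];
  for (var i = 0; i < names.length; i++) {
    try {
      if (canvas.getContext(names[i])) { return true; }
    } catch (e) { /* some browsers throw instead of returning null */ }
  }
  return false;
}

var doc = (typeof document !== 'undefined') ? document : null;
console.log(hasWebGL(doc) ? 'GPU-backed canvas available'
                          : 'falling back to software rendering');
```

Even when the context is available, whether compositing actually happens on the GPU is up to the browser, so this only tells you the door is open.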

I understand that enhancing the user’s experience while surfing the web is worthwhile. Hardware acceleration opens the door to unseen compositions, to new types of animations, to new kinds of applications. Directly in your favourite browser.

Comment if I’m wrong, but hardware acceleration should not lead to fragmentation of the web landscape: HTML5 seems to be the standard behind which browser developers are adding their acceleration engines.

However, an issue (from the user’s point of view) is that hardware acceleration will probably further the emergence of a consumer-only web. A lot of your applications will run in your browser, with your data in someone else’s data center. You want your data safe? You need to trust your provider’s security measures. You simply want your data on your hard drive? I think you’ll have a problem here. But I agree it’s not the technical implementation that will be responsible for that.

First LaTeX Beamer presentation seen at a proteomics conference

There is another issue I see with browser hardware acceleration, and it’s very down-to-earth. As often happens with presentations containing videos, the slides are displayed via the beamer but the video isn’t (a black rectangle appears instead). You can easily disable hardware acceleration in most presentation software (if it’s not disabled by default). But with hardware acceleration fully integrated in the browser, what will the beamer display when we have to demo a website, or simply when the presentation software is the browser? A page with patches of black rectangles? I hope not.
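On the browser side, turning acceleration off by hand may remain a workaround. As a sketch, assuming the preference names from the current Firefox 4 betas (which may well change), something like this in about:config or a user.js file reportedly disables it:

```
// Hypothetical user.js fragment; pref names from Firefox 4 betas,
// not guaranteed to survive future releases.
user_pref("layers.acceleration.disabled", true);  // accelerated compositing
user_pref("gfx.direct2d.disabled", true);         // Direct2D content rendering (Windows)
```

Other browsers will presumably grow their own switches, but a per-site or per-presentation toggle would be friendlier than a global pref.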

Why do I blog this? I enjoy reading about the (technical) details of (browser) hardware acceleration. More generally, I am very interested in all the new developments in IT that use GPUs and graphics-card computational power to solve current issues or enable future developments. But I’m also using these (new) technologies every day, so I don’t want technological improvements on one hand to end up causing trouble on the other.