Category: Reading

"Why groupthink is the genius of the internet"

In the August 10th, 2006 issue of the Financial Times [1](*), Patti Waldmeir wrote a column about a new book [2] she had recently read.

In this book, Sunstein starts from a 1973 quotation from F. Hayek, a liberal philosopher and economist:

Each member of society can have only a small fraction of the knowledge possessed by all and … civilisation rests on the fact that we all benefit from knowledge which we do not possess.

While Sunstein knows the potential flaws of today’s internet collaborative projects (wikis, blogs, etc.), he argues that “sharing scientific information online would cure some of the worst problems of the U.S. patent system and foster innovation much more efficiently than costly patent litigation”.

Before the internet, we used to look for solutions by asking family or neighbours. Now, we look on the internet, where people genuinely want to communicate their knowledge. Groupthink may be “the genius of the internet”, but it was already the genius of any group, with or without computers and networks.

Hmmm … Anyway, this author seems worth reading …

[1] Waldmeir, P. “Why groupthink is the genius of the internet”. Financial Times, August 10th, 2006, p. 5 (article unavailable without subscription).

[2] Sunstein, C. “Infotopia: How Many Minds Produce Knowledge”. Oxford University Press, October 2006.

(*) I am taking advantage of a free four-week subscription to the Financial Times. That’s why this is my second post about an article published in this newspaper. But I don’t think I’ll subscribe: 1. I have other things to read; 2. business and finance are not my core business; 3. I don’t understand half of what they write (especially in the “Market data” and similar pages).

P.S. According to F. Hayek’s biography on Wikipedia, this political philosopher also made inroads into cognitive science, independently developing an alternative “Hebbian synapse” model of learning and memory. Another interesting author to read …

"Hacking the genome"

Like computer hackers who cooperate in developing and using tools to understand and manipulate the inner workings of computer software, researchers are developing sophisticated biological methods that will allow them to crack the function of the genome.

Daniel Evanko writes briefly about two methods to probe the function of the genome: cDNA sequencing and microarray hybridization. It’s in Nature Methods 3, 495 (2006): abstract / full text.

(a post just to show that biologists are also hacking their stuff)

Some quotes …

In the June/July 2006 issue of Scientific Computing World, there is an interesting article about Andre Geim, director of the Manchester Centre for Mesoscience and Nanotechnology. Do you remember the levitating frog? That was him (he even got an Ig Nobel Prize for it). Do you remember the “gecko tape”? That’s also his current group!

So, in this article, they tell Prof. Andre Geim’s story, from his early school days near the Black Sea and in Moscow to the various labs he has visited and worked in over the past few years. Although he works in nanotechnology, some of his quotes can easily be applied to the biological sciences (where I am not doing “mainstream” experiments like stem cells, genomics, fMRI, etc.) …

Many of the things I learned [at University] I never used in my professional life, but I guess it helped develop some of my axial lobes. I used those lobes to replace the lobes I lost due to the amount of alcohol we needed to wipe out after the exams.

I am so glad I am not doing astrophysics or particle physics, because what it actually means is that there are so many people involved in any experiment you are just a tooth in the wheel and, if you end up at the top of the food chain, it is more by luck than your abilities. Those at the top could just as well be journalists or politicians.

Many people choose a subject for their PhD and then continue with the same subject until they retire. I despise this approach. I changed my subject five times before I got my first tenured position and that helped me to learn different subjects.

Gecko tape was a sideline. I simply cannot keep to the same track in my research. It is very difficult to get research money for something new. You have to ask for money to continue something that is old, but I believe you should use some of the money to try something new.

When you have a thousand people researching one subject, they start acting like a single organism that acts by its own rules.

Identity 2.0

This weekend, I attended a scientific meeting and, although the content of the presentations was often interesting, they frequently lacked appeal. This reminded me of two videos I stored on my hard disk some time ago. Sébastien Lorion called them “refreshing”. And, for me, not only do these presentations look beautiful, they also address an interesting topic: who are you on the internet?

In the first presentation (a keynote at OSCON 2005), Dick Hardt talks about what identity is and how we prove who we are in the online world.

Identity is what I say about myself and what others say about me. In the real world, technical advances have enabled the separation between the acquisition and the presentation of credentials, as well as the separation between the identification process and the authorisation process.

Now, in the online world, we are still at the “Identity 1.0” level, where one has to register at a website in order to get a service. User IDs and passwords are just authentication; they only prove that you are a directory entry! At this level, it’s impossible to prove who you are because this so-called “verified identity” is not what you apparently give to the website but what this website knows about you. Dick Hardt calls those websites “walled gardens”: closed and complex identity silos, lacking transparent policies, simplicity, scalability and flexibility.

So, with his company, Sxip, he proposes a Simple eXtensible Identity Protocol (that’s the acronym behind SXIP), based on buzzwords like “Web 2.0” or “web services” (OK, I’m exaggerating a little bit). He describes some of the technological details in another presentation (given at ETech 2006). And, although I haven’t tested the websites he gave as examples, I think that, perhaps like other companies (MS InfoCard, IBM Higgins, …), they succeeded in separating acquisition from presentation of credentials, as well as the identification process from the authorisation process. The talk illustrates, among other things (a toy sketch of the claim mechanism follows the list):

Using one credential provider for many resources that need the same credential
Using many credential providers for a single resource, a bit like showing your ID, SSID and driver’s licence to a policeman
Claim acquisition
Claim presentation
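
To make “claim acquisition” and “claim presentation” concrete, here is a toy sketch (entirely my own, not SXIP’s actual protocol; the provider key, user name and “over_18” attribute are invented for illustration). The point is the separation: the claim is acquired once from the provider, then presented later to any resource, which verifies it without contacting the provider:

```python
import hashlib
import hmac
import json

PROVIDER_KEY = b"provider-secret-key"  # hypothetical key trusted by relying parties

def acquire_claim(user: str, attribute: str, value: str) -> dict:
    """Claim acquisition: the user obtains a signed statement from the provider."""
    payload = json.dumps({"user": user, attribute: value}, sort_keys=True)
    sig = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def present_claim(claim: dict) -> bool:
    """Claim presentation: a resource checks the signature on its own,
    without talking to the provider at presentation time."""
    expected = hmac.new(PROVIDER_KEY, claim["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["sig"])

claim = acquire_claim("alice", "over_18", "yes")  # done once, with the provider
print(present_claim(claim))                       # done later, by any resource -> True
```

A real system would use public-key signatures, so that resources never share a secret with the provider, but the separation of the two steps is the same.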

In his talk, Dick Hardt gave two links: his own blog and that of Microsoft’s Kim Cameron (where you can find the laws of identity, but I haven’t had time to read them yet).

Finally, to come back to the presentation aspect, I find Dick Hardt’s presentations quite surprising; sometimes the slides go too fast (that’s because I’m getting old 😉 ). But I am wondering how I can apply some of his tricks to my own presentations (the next one is on June 29th). Let’s see …

Yes, Trusted Computing is used for DRM

In this blog post, Andy Dornan takes us from a simple demonstration of Lenovo laptops’ new “abilities” to the fact that the real owner of a DRM-protected document is the software company, not the owner/creator of the document.

You can create a document and claim ownership of it with DRM systems. Unless you can open it with, or export it to, software from another company, you’ll depend on one company to open your document. Imagine you create a text file and protect it with software X. If you cannot open it in another text processor/editor and the maker of X decides that you can no longer open your document (for whatever reason: you live in a dangerous “terrorist” country, your name sounds too different, you didn’t pay your monthly fee on time, etc.), you are stuck.

Why do people need to control who can access their documents?

I have not lived a long time on this earth, but I can’t find a good reason to control access to a document. The documents themselves do not need to be protected. The protection needs to be enforced at the level of physical access (and/or at the network level).

I thought that one good reason was to restrict access to confidential data/documents about patients’ health in hospitals. But even at that level, the document is the wrong target. In the real, still paper-based world, hospitals don’t encrypt the data in medical files. They simply don’t give the key that opens the door to the archives to just anyone. Of course, a malicious key owner can give away, lend or lose his key; it’s exactly the same in the computer world: I can give you my password or lose the small sheet of paper where it’s written. You can even reproduce my fingerprint (I’m not saying it’s easy).

Finally, I think there are already enough free tools to encrypt your data without being forced to use proprietary ones (cryptoloop, dm-crypt, GnuPG, …).
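
For instance, here is a minimal sketch (my own example, assuming GnuPG is installed and using an invented file name) of protecting a document yourself with a free tool instead of handing control to a DRM vendor:

```python
import subprocess

# Encrypt: gpg prompts for a passphrase and writes notes.txt.gpg.
subprocess.run(["gpg", "--symmetric", "notes.txt"], check=True)

# Decrypt later, on any platform, with any OpenPGP implementation:
subprocess.run(["gpg", "--output", "notes.decrypted.txt",
                "--decrypt", "notes.txt.gpg"], check=True)
```

The passphrase stays with you, and any OpenPGP implementation, today or in twenty years, can decrypt the file: no company can revoke access to your own document.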

A special Belgian e-ID for foreigners? Bad Idea!

Last weekend, a Dutch-language Belgian newspaper reported that the Minister of the Interior, Patrick Dewael, is planning a special electronic ID card for foreigners in Belgium. If this becomes reality, every foreigner in Belgium will have an e-ID with his/her biometric data inside, even if he/she is officially living in Belgium with a regular permit to live, work, etc. Of course, this project is aimed at illegal foreigners (by the way, have a look at his other “brilliant” idea: heavily punishing those who help illegal foreigners to obtain asylum, regular papers, etc.). So it seems there will be two versions of this card. The official reason is “better control”.

First, I think it’s a very bad idea because people will no longer be equal before the law, the police, etc. You can keep your “special foreigner e-ID” in your pocket, but once you show it to someone (to identify yourself in a bank, at a car rental office, …), it will be humiliating. And there is a risk that some people will refuse to interact with you because of this card.

This e-ID is only a tool, I agree. But it will be created with a bad intention in mind. In a previous post, I didn’t dare to say that the “regular” e-ID was created in order to control Belgians. This “foreigner e-ID” is created expressly to control foreigners in Belgium. The technical questions about who will be able to read information on the regular e-ID, how it will be done, and within which legal framework are not yet solved. Now that it’s technically possible, nauseating ideas are appearing …

If you know French, you can read some reactions in La Libre Belgique (there is a video, but I cannot view it: an obscure MS format unreadable with free software).

And I thought I had stress before my presentations …

There is a story in the Guardian Unlimited on how Steve Jobs prepares his talks for Apple’s keynotes. Well, it doesn’t say much about the preparation itself. But I can feel the stress Mike Evangelist experienced: he had to talk in front of hundreds of people, in front of his boss, and present software that wasn’t finished yet. And I thought I had stress before my presentations …

Of course, there is another reading of this article. OK, Apple is a small technology company. But it has developed a good sense of communication. When you buy an iPod, you are not only buying a portable music player (by the way, one that includes imprisoning DRM): you are also buying a feeling (of hype, of having the latest gadget, …). When you buy a Mac computer, it’s also the feeling of being part of “another” community, … Apple has cultivated this feeling since the beginning with slogans like “Think Different”. And, of course, this article, the book Mr Evangelist is trying to write, … even this post on this blog, they all participate in the “buzz” around Apple keynotes. Finally, if this noise (*) weren’t there, Apple would only be another computer-selling company.

(*) In signal processing, noise is what interferes with the signal. Only the signal (Apple products) might be important. Noise is there to disturb, enhance, distort, … the signal.

Quaero and the quest for alternatives

An article in the French newspaper Le Monde presents Quaero (“to seek”, in Latin) as the future “European Google”. Comments on this article are divided between supporters of this alternative and detractors who predict another bureaucratic, bloated, ineffective project. My point here is not to argue for or against this project. But I would like to dwell on the American databases and search engines that serve the entire world.

When you need to look up some information on the internet (mainly the web), I am sure you use (American) tools like Google, Yahoo! or AltaVista. In the life sciences domain, we have a wonderful database, PubMed, a service of the (American) National Library of Medicine that includes over 16 million citations of biomedical articles. When you are preparing a presentation or an experiment on a subject, it’s a great tool for doing the bibliography.
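
PubMed can even be queried programmatically, through NCBI’s free E-utilities web service. Here is a minimal sketch (the search term and result count are just examples I made up):

```python
import urllib.parse
import urllib.request

# Build an esearch query against the PubMed database.
params = urllib.parse.urlencode(
    {"db": "pubmed", "term": "long-term potentiation", "retmax": 5}
)
url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params

# The service answers with a small XML document listing matching PubMed IDs.
with urllib.request.urlopen(url) as response:
    print(response.read().decode())
```

The same esearch.fcgi endpoint serves other NCBI databases if you change the db parameter.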

I insist on the fact that these tools come from American companies or government agencies because I wonder what we are going to do if, one day, for whatever reason, the U.S.A. decides to stop providing these tools to the rest of the world. Or what if they simply decide to filter the content delivered outside their country? Are you sure they are not already doing it? It’s the same problem with the satellite positioning system (GPS; that’s why Europe is launching the Galileo project), the internet’s root domain name servers, the Microsoft Windows operating system, etc.

So, if the goal of Quaero is to achieve a relative independence, I agree with it (though I still fear it will become a costly and ineffective tool). But I wonder why there isn’t any free (at least as in “free beer”) alternative to PubMed. For the moment, I only see an alternative in the cooperation and interoperability between Open Access repositories, with projects like the Open Archives Initiative, OpenDOAR, GNU Eprints and other software. But, until Open Access journals are widely used by scientists, that won’t be a PubMed replacement. And there is still no alternative for scientific literature already published in Closed Access journals.

Identification -vs- authentication

I was reading a presentation on the Belgian electronic identity card (PDF, 150 kb, in French, by a friend). Compared to the old, analogue card, this new card has an electronic chip on it. This chip contains some information that is already visible to any human eye on the surface of the card, and more (like a photo, your address, digital certificates, …). I stopped at the 5th slide, where it says that this new “e-ID” will allow someone to be identified, to authenticate (what?) and to fill in online administrative papers.

Having a “general culture” of privacy-related subjects, I have often heard these two words (identification and authentication). But I never paid much attention to their meaning. So, once and for all, I decided to have a look in a dictionary.

Identification is the act or process of identifying somebody or something or of being identified. So, it’s an act or process of showing or proving who somebody is. The identity card (ID card) is a card bearing the holder’s name, signature, etc. and often a photograph, carried or worn by somebody to show who he/she is.

Authentication is the act or process of proving something to be valid, genuine or true (act of authentication). You even have the word authenticity, the quality of being authentic.

So, why put identification and authentication means on the same card? Aren’t they redundant? The old, analogue ID card was sufficient to prove who I am to a policeman and to retrieve administrative documents. I think the idea behind this new e-ID card is to adapt the identification process to the electronic world (the internet being the most obvious part of it). It seems far easier to forge an identity based only on character strings and bits than one based on a real, physical human being. When paying on the internet with a credit card, you need your card number, your name and a “validation number” that is on the back of your card. Now, with the e-ID, you’ll have digital certificates to electronically identify yourself and to authenticate this identification.
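
To make the distinction concrete, here is a toy login sketch (entirely my own illustration, with invented names and passwords): identification answers “who are you claiming to be?”, authentication answers “can you prove it?”:

```python
import hashlib
import hmac
import os

def derive(password: str, salt: bytes) -> bytes:
    """Hash a password so the stored credential is not the password itself."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

SALT = os.urandom(16)
USERS = {"alice": derive("s3cret", SALT)}  # credential stored at registration

def log_in(claimed_id: str, password: str) -> bool:
    if claimed_id not in USERS:
        return False  # identification failed: unknown identity
    # Authentication: does the presented proof match the stored credential?
    return hmac.compare_digest(USERS[claimed_id], derive(password, SALT))

print(log_in("alice", "s3cret"))  # True: identified and authenticated
print(log_in("alice", "wrong"))   # False: identified, but not authenticated
print(log_in("bob", "s3cret"))    # False: not even identified
```

Roughly speaking, the digital certificates on the e-ID play the role of the stored credential, with cryptographic signatures instead of passwords.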

As I usually say, this is only a tool (like a hammer, a knife, an RFID tag, a video camera, etc.). But now I often add that it depends on the goals behind the creation of the tool.

A knife was first created to cut meat, branches, etc. A hammer was first created to hit a nail. A video camera was first designed to add motion to photographs. Now some people use knives to take control of planes; others use video cameras to film their children playing, or British car registration numbers. This diversion of usage, combined with an increasing “Western comfort”, leads some people (in governments or not) to feel the need to preserve this comfort, this security. They have now not only created new tools (DRM, RFID, …), they have created tools in order to keep and further increase their profits, to control identities, …

I am not saying that the Belgian government is intentionally imposing the e-ID card in order to control Belgians. But, apparently, some points are not clear … Who will control who (or which application) will have access to the information stored on the chip? And who will check whether the restrictions on information access are respected (and how)? Who will control the data mining done with information retrieved from the chip (and how)? For the moment, only information already available from different sources is being grouped on the chip, making it easier to retrieve. Who knows what kind of information could be added to the chip later? If you want more information on this topic, I suggest you follow the news on the AEL website.

P.S. I really like dictionaries: you look for one word and you end up reading the definitions of two or three. And if you have an illustrated dictionary, you’ll also look at the pictures. For example, in my Oxford dictionary, “identification” is on the same page as an illustration of an iceberg. An iceberg is just “a huge mass of ice floating in the sea”. But, because it’s related to the idiom “the tip of the iceberg”, nearly 80% of the illustration shows the part of the iceberg below sea level. By the way, while I was there, I also checked the word idiom: “a phrase or sentence whose meaning is not clear from the meaning of its individual words and which must be learnt as a whole unit” (of course, I also read the other definitions of idiom …).

2 studies, 2 different visions of innovation and competitiveness in Europe

Some days ago, IDC published a study carried out on behalf of the BSA (Business Software Alliance). In this study, they promise the creation of 4,000 new jobs and the addition of 2.6 billion US$ to economic growth in Belgium if software piracy is reduced by 10% between 2006 and 2009. For France, it is a promise of 30,000 new jobs and an addition of 13.7 billion US$ to economic growth!

Although I know which companies are “hiding” behind the BSA (Microsoft, Adobe, …), I think this continuous battle against software piracy can only benefit the free software industry. With the exception of some job-specific software, I think free software now has all the capabilities needed to be used in industry (imho; and for some jobs, the only software available is free). Obliging people to acquire licences is also, in a way, obliging them to ask themselves questions about costs and about dependence on their suppliers. And perhaps then the idea of migrating to (or staying with) free software can be taken seriously.

On the “other” side, a study on innovation and competitiveness in Europe (PDF, 1.8 MB), carried out by PricewaterhouseCoopers on behalf of the Dutch Ministry of Economic Affairs, presents the future European law on patents as a particular threat to the ICT industry in Europe. The study takes as an example the rather moderate protection set up around the IP protocol, the web, Linux, … (discoveries made in Europe), which made it possible to maintain a “competitive and innovative” software industry while limiting the entry of new actors to the market. However, according to the study, the very nature of patents could kill the rate of innovation in information and communication technologies. In conclusion, the study also identifies 10 potentially important technological advances and recommends that the European Union lower the barriers to entry on the ICT market and encourage investment and standardisation.

Linux, interoperability, standardisation, a position on patents, … I prefer this PwC study. And you?

(I also published this note in French on LinuxFr.)