Several years ago Microsoft made a rather bold decision to fund a substantial research project (now known as “Station Q”) exploring one particular approach to quantum computation, known as topological quantum computation. This project funds some top-notch physicists and mathematicians who work for Microsoft, as well as funding many other researchers at universities around the country and around the world. This has been a huge shot in the arm for my part of the physics community, which had been languishing due to insufficient government funding. Despite my loathing of Windows Vista, I have sworn not to say anything bad about Microsoft because I am eternally grateful to them for making my life as a scientist a whole lot more interesting.
Last week Microsoft hosted its biannual Station Q progress meeting at its research center in Santa Barbara. This is the sixth such meeting and, although the first few meetings were depressingly devoid of good experimental data, the last few have been extremely exciting and filled with plenty of interesting new things to think about. The experiments have been focused, to a large extent, on understanding the physics of one state of matter known as the nu=5/2 quantum Hall state. In particular, the experiments are trying to demonstrate that this state of matter is "non-Abelian" (maybe I'll blog about what this actually means some other time... see also my upcoming web page at Oxford). At any rate, if the experiments manage to show this to be true, as all the theorists already believe, it would potentially provide a new route to building an error-free (decoherence-free) quantum computer -- essentially doing an end-run around the main stumbling block that has so far stymied all attempts to do real quantum information processing.
Of course all of the experiments are insanely difficult. They are all done at temperatures between 10 and 50 millikelvin above absolute zero. That is something like 1/10000 of room temperature. And the actual interesting part of the experiment is only a few square microns in size. This is not for the faint of heart.
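(For those keeping score, the comparison is just this, taking room temperature to be roughly 300 K:

$$\frac{30\ \text{mK}}{300\ \text{K}} = \frac{0.03\ \text{K}}{300\ \text{K}} = 10^{-4} \approx \frac{1}{10000}.$$
)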
Maybe some day I will blog about all of the interesting experiments that were discussed at this meeting and how I see the status of this field. However, for now I want to talk about the two talks that were the most controversial. Both talks described research projects that I presumably have some responsibility for. One talk was given by Bob Willett, a researcher in my group at Bell Labs, although he certainly runs his show without any help from me --- and as of Thursday I am no longer at Bell anyway. The other talk was given by Woowon Kang from the University of Chicago --- I worked very closely with him, proposing many pieces of the experiment. I spent much of the conference discussing the results of these two talks, and debating whether they were "right" or not. I think most people agreed that in both cases the data was certainly interesting, but in neither case could any conclusion really be drawn yet. The problem in both cases was that the data made it very hard to tell whether there was really a signal behind some very noisy results. And in neither case had the necessary detailed statistical analysis been done. As a result the feisty audience tried to hold both speakers' feet to the fire. (Note: part of the point of this conference is to show preliminary data, so even if their feet got burned a bit, they should not be blamed.)
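To make concrete what such a statistical analysis might look like -- and this is only a toy illustration, not the analysis from either experiment -- here is a minimal Python sketch. It generates fake data with a weak oscillation buried in noise, fits both a "signal" model and a "noise-only" model by least squares, and asks via an (approximate) F-test whether the extra parameters of the signal model are statistically justified. All of the numbers (amplitude, frequency, noise level) are invented for the example.

```python
# Toy example: is there a weak oscillating signal buried in noisy data?
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
true_signal = 0.3 * np.sin(2.0 * np.pi * 0.5 * x)         # weak oscillation
data = true_signal + rng.normal(scale=1.0, size=x.size)   # buried in noise

def signal_model(x, amp, freq, phase, offset):
    """Oscillation plus a constant background."""
    return amp * np.sin(2.0 * np.pi * freq * x + phase) + offset

# Fit the 4-parameter signal model and the 1-parameter noise-only model (a constant).
popt, _ = optimize.curve_fit(signal_model, x, data, p0=[0.5, 0.5, 0.0, 0.0])
rss_signal = np.sum((data - signal_model(x, *popt)) ** 2)
rss_null = np.sum((data - data.mean()) ** 2)

# Approximate F-test: do the 3 extra parameters reduce the residuals by more
# than chance alone would?  A small p-value favors a real signal.
df1, df2 = 3, x.size - 4
f_stat = ((rss_null - rss_signal) / df1) / (rss_signal / df2)
p_value = stats.f.sf(f_stat, df1, df2)
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")
```

The point is not this particular test (real analyses of real devices are far subtler); it is that some quantitative statement of how unlikely the data would be in the absence of a signal has to accompany any claim that a signal is there.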
The discussions that ensued over this data made me think very hard about how to treat murky data. One sometimes gets the idea that science is very clear -- test a hypothesis and the result is either right or wrong. But frequently the results come out inconclusive... or barely conclusive. In some cases results get accepted by the community on tenuous data and later have to be reconsidered. Of course, in the best of all worlds, data is crystal clear and there is little room for doubt. But more often this is not the case. One has to be even more careful when there is a great driving force for a community to come to a particular conclusion. In this case, the community is predisposed to want to see results that confirm what we all expect. It is likely that the answer will turn out to be what we have predicted for years -- but we should keep an open mind that it may not turn out this way and we certainly should not jump to declare victory until the data really is incontrovertible. (On the flip side, there may be others in the community predisposed to disbelieve the data, and they should similarly agree to keep an open mind).
With this in mind, I decided it was a good idea to go back and reread the words of the great physicist Richard Feynman:
"Science is a way of trying not to fool yourself. The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you've not fooled yourself, it's easy not to fool other scientists."
I personally hope that at least one of these experiments (or perhaps a similar experiment from another group) turns out to be correct, as this will give further life to a very exciting scientific field. But I also hope that, despite all of our desire to see these experiments prove the theories, the community lives up to its scientific responsibility to accept the data as correct only when it really has been established, and to raise appropriate questions until that time.