8-bit tools of science

According to the founder of Playpower.org, more people in India have TVs at home than tap water. And there are $12 computers everywhere that use those TVs as monitors, much like many of the personal computers of old.

Now consider that this hardware, based on older 8-bit chip designs, and the software that runs on it are more or less in the public domain. We are looking at a significant portion of the entire human population poised on the verge of hackerdom. It’s not just typing education and language training. We could build an entirely new framework for education in third-world urban areas using existing tools of education and science. Imagine being able to design an 8-bit program for those machines (some of them can actually access the internet) that pulls data from research institutions of all kinds (BLAST, Wolfram Alpha, and so on) and scales it down to a form those machines, and the people using them, can understand. We already have beta versions of synthetic biology CAD programs that undergraduates regularly use for school assignments and private projects, so this future isn’t that far away.

Will a child capable of programming computers and pulling data on SNP variations to do his or her own genotyping, using soon-to-be widely available open-source PCR machines, still languish in poverty and despair? I don’t know. I’d sure like to find out, though.
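How little machinery would pulling SNP data actually take? As a minimal sketch: NCBI’s real E-utilities interface serves dbSNP records over plain HTTP GET requests, so the hard part for a low-powered client reduces to building one URL. (The endpoint and parameters below are NCBI’s documented interface; the idea of an 8-bit machine driving it is, of course, my speculation.)

```python
from urllib.parse import urlencode

EUTILS_BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

def build_snp_query(rsid: str, fmt: str = "xml") -> str:
    """Build an NCBI E-utilities URL that fetches one dbSNP record."""
    # dbSNP expects the numeric part of a reference SNP id: 'rs53576' -> '53576'
    numeric_id = rsid[2:] if rsid.startswith("rs") else rsid
    params = {
        "db": "snp",       # query the dbSNP database
        "id": numeric_id,
        "retmode": fmt,    # response format
    }
    return EUTILS_BASE + "?" + urlencode(params)

# A low-powered client only needs to issue a plain HTTP GET on this URL
# (e.g. urllib.request.urlopen(url).read()) and parse the small response.
print(build_snp_query("rs53576"))
```

Everything heavy (the database, the search) stays on the server side; the client just formats a request and displays the answer, which is exactly the scaled-down division of labor described above.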


Synthetic Biology Debate: Drew Endy and Jim Thomas

Here’s the link to the talk between Drew Endy and Jim Thomas on various aspects of synthetic biology, sponsored by the Long Now Foundation.

The talk is about two hours long and is available for download on the website. From a brief look, it is more a basic exposition of synthetic biology and its possible social and ethical implications than a discussion of technical execution. I’ll do a more detailed post on it once I finish watching.

Science in Apple?

Like most people, I was tuned into the WWDC keynote address on Monday. Most of the stuff in the keynote was more or less expected, including the iPhone/dev kit and OS X 10.6. However, the way they were presented was intriguing, to say the least… to this scientist-in-training, at least.

First, the iPhone. The inclusion of medical applications in the presentation was the real eye-catcher of the show for me (other than the $199 price point for the iPhone, but that was expected). Why go through the trouble of including such specialist applications in a presentation aimed at developers and consumer-enthusiasts? Of course, it would be nice to present applications from a variety of fields to showcase the capacity of the iPhone and ‘grow the image,’ but something tells me that a medical imaging application and a med-school study guide are probably not the most interesting of the applications submitted to Apple in time for WWDC. Based on circumstantial evidence, I think Apple planned from the beginning to include a presentation on medical applications, and I think they wanted more than one, to showcase the professional academic muscle of the iPhone. The very fact that they took the trouble to include a testimony from Genentech regarding the iPhone’s enterprise functions seems to support this assumption.

Second, OS X 10.6, also known as Snow Leopard. The primary idea of the OS seems to be out-of-the-box utilization of the multi-core processors that are mainstream these days. Most of us run dual-core processors right now, and it wouldn’t be farfetched to think that we (and by we, I mean normal computer users; I hear there are already quite a number of quad-core users in more specialized communities) might well be running quad-core systems a year or two from now. It’s a reasonable move, considering that no OS of any flavor seems to be taking noticeable advantage of the 64-bit architecture that has been around forever. Apparently Apple is calling its system for utilizing the expected slew of multi-core processors Grand Central (after the beautiful Grand Central Terminal in my hometown, no doubt), and it will no doubt form the cornerstone of OS X 10.6 when it is released a year or so from now.

Is it pushing it too far to say that this might be a move on Apple’s part to appeal to the professional scientific community, which has a real and pressing need for more computing power? Consider distributed computing projects like BOINC and Folding@home (I am an active participant in both, and I urge you to join if you have some CPU cycles to spare). My Intel Core 2 Duo 2.3 GHz processor isn’t enough to complete complex work cycles in any reasonable amount of time. What if we could run more simulations and calculations on our own laptops and desktops for faster results? It’s no secret that Mathematica and Apple seem to be on favorable terms. Apple’s ethos in this particular attempt will be simple: keep the computer out of the scientists’ way. Just plug in the numbers and get the results, with no worries about 64-bit support or any complex refitting of scientific programs. (Contrary to what most people seem to think, studying physics or any other branch of science doesn’t make you good at computer science. Those are entirely different fields! Physicists are merely proficient at the limited set of skills needed for physics computing.) Who wouldn’t want that?
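Grand Central’s internals weren’t shown at the keynote, so as a rough illustration of what “out-of-the-box multi-core utilization” means in practice, here is a sketch using Python’s standard multiprocessing module: the same script spreads work over however many cores the machine happens to have. (The workload below is a made-up stand-in for a simulation step, not anyone’s real code.)

```python
from multiprocessing import Pool
import os

def simulate(seed: int) -> float:
    """Stand-in for one expensive simulation work unit."""
    rate = seed % 7 + 1          # fake per-job parameter
    total = 0.0
    for i in range(1, 10_000):
        total += rate / i        # dummy number crunching
    return total

if __name__ == "__main__":
    jobs = list(range(32))
    # Pool() defaults to one worker process per CPU core, so this same
    # script automatically uses two cores on a dual-core machine and
    # four on a quad-core one -- which is roughly the promise being
    # made for Grand Central.
    with Pool() as pool:
        results = pool.map(simulate, jobs)
    print(f"{len(results)} jobs finished on {os.cpu_count()} cores")
```

The scientist writes `simulate` and a list of jobs; deciding how many cores to use is the runtime’s problem. That is the “keep the computer out of the scientists’ way” idea in miniature.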

Third, OpenCL (which stands for Open Computing Language). This part might as well be a dead giveaway of Apple’s company-wide strategy to woo the scientific community. OpenCL is a framework Apple is developing that would allow developers to use a computer’s GPU for tasks normally handled by the CPU. A few years ago, news of the PS3’s hardware being redirected toward mathematical calculation made some headlines, and I believe there were other cases where conventional graphics chipsets were used for complex physics calculations, with results that far surpassed what was possible using the CPU alone. It’s been such a long time that I am somewhat surprised they are only now thinking of bringing the technique to the mainstream computer market. Mind you, this method of diverting the GPU to do CPU work was originally used to provide more muscle for physics simulations on conventional computer systems and components, rather than on specialized supercomputers. I do not foresee the average Apple-toting screenwriter or web surfer needing all that computing power anytime soon. If this is coming, it’s coming for us, the scientists, who need to crunch numbers most people haven’t even heard of.
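The core idea behind this GPU-as-CPU approach is worth spelling out. OpenCL expresses work as a “kernel,” a small function run once per element of the data; the GPU executes thousands of those invocations in parallel. The pure-Python sketch below only mimics that programming model (a standard a·x + y example; nothing here is Apple’s actual OpenCL API, and a real kernel would be written in OpenCL C), but it shows the shape of the contract: the programmer writes the per-element step, the hardware supplies the parallelism.

```python
def saxpy_kernel(gid, a, x, y, out):
    """Shape of a data-parallel kernel for a*x + y: each invocation
    touches only the element at its own global id (gid)."""
    out[gid] = a * x[gid] + y[gid]

def launch(kernel, global_size, *args):
    """Stand-in for a kernel launch: on a GPU these iterations would
    run concurrently across many cores; here we just loop."""
    for gid in range(global_size):
        kernel(gid, *args)

n = 8
x = [float(i) for i in range(n)]
y = [1.0] * n
out = [0.0] * n
launch(saxpy_kernel, n, 2.0, x, y, out)
print(out)  # -> [1.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 15.0]
```

Because the kernel never looks beyond its own element, the runtime is free to run as many invocations at once as the hardware allows, which is exactly why graphics chips can beat CPUs at this kind of calculation.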

If we put the three together under the assumption that Apple might be shooting for the scientific computing community, we have a mobile computing platform with serious power (the MacBook Pro), able to run a variety of scientific programs (Mathematica, MATLAB, BLAST, etc.), with a built-in ability to sync with and be wirelessly controlled by a dedicated mobile phone that has some serious computing power of its own (the iPhone plus community apps). The actual computing could be done at home while the user receives output and sends input from the iPhone. Would this work? I think there are plenty of people doing something similar already. But there could be significant differences between a device that has essentially been hacked together and a series of devices designed from the beginning to work in conjunction. I see this as a very exciting development on the part of Apple and the computing industry in general.

Having a science-oriented Apple isn’t the only thing I’m excited about. Let me put it this way: the iPhone made people who never sent text messages on conventional phones text each other constantly. The iPhone also made people who never used the browsing capabilities of their conventional phones browse the web. This is the effect of accessibility that I mentioned in some of the other posts on this blog. When people don’t do something, it might not be because they want it that way; it might be because there is an accessibility barrier between the individual and the activity. We complain that people are no longer interested in the sciences and other higher academic pursuits. Maybe we’ve been unwittingly placing accessibility barriers on the paths to higher education? If the idea of an accessibility barrier between the public and the sciences has a grain of truth in it, maybe this new direction of Apple’s can do for the sciences what it did for telephony. Especially with community-based distributed computing projects and the DIY mentality on the rise across a variety of scientific disciplines, biology in particular (the term synthetic biology itself isn’t even new anymore, despite the immaturity of the field), maybe I can hope for some change in today’s somewhat disappointing state of affairs.