22 July 2008
Recently I took part in a genetics research experiment. The study was related to Fragile X syndrome and involved two days of mental evaluations and medical exams. I don't have Fragile X, but they needed a normal person as a control subject. Granted, "normal" doesn't really apply to me; according to the debrief I'm nearly three standard deviations away from normal. But it was fun and I got some MRI time.
And of course that meant writing a pile of software to interpret and render the 11 million voxels of data (not only is my software open source, but apparently so am I). Scanning, cropping, histograms, black equalization, white equalization, contrast enhancement, scaling, rendering -- all in 3D. In the process I discovered two (inconsequential) off-by-one errors in the MRI scanner's software.
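The black/white equalization step above can be sketched with NumPy. This is a minimal illustration, not the author's actual code: it treats black/white equalization as clipping the intensity histogram at chosen percentiles and rescaling the result to the full 0-255 range (the function name and percentile values are my own assumptions).

```python
import numpy as np

def equalize_volume(vol, black_pct=1.0, white_pct=99.0):
    """Hypothetical sketch of black/white equalization on a voxel volume:
    clip intensities at the given percentiles, then rescale to 0..255."""
    lo = np.percentile(vol, black_pct)   # "black point": darkest 1% clipped
    hi = np.percentile(vol, white_pct)   # "white point": brightest 1% clipped
    out = np.clip(vol, lo, hi)
    out = (out - lo) / (hi - lo) * 255.0
    return out.astype(np.uint8)

# Synthetic 3D volume standing in for real MRI data.
rng = np.random.default_rng(0)
vol = rng.normal(100.0, 30.0, size=(64, 64, 64))
eq = equalize_volume(vol)
```

After this step the volume uses the full dynamic range, which makes the later contrast-enhancement and rendering stages much more effective on low-contrast scans.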
Items of note:
The X/Y/Z axis animations were pretty straightforward; I'm getting good at them (I rendered a set last year). The rotating 3D view was more challenging. I wrote a Python script which generated a POV-Ray file, then rendered that file. Unfortunately my home computer didn't have anywhere close to the capacity required to even parse the POV-Ray file, let alone render it. But as the sticker on my laptop says, "My other computer is a data center". So I rendered the whole thing over a weekend using Google's hardware.
If you've had an MRI done and would like linear animations made on each axis (no 3D view), just send me either an animated GIF of one axis or a stack of image slices. I'd be happy to run my software over your data set.
[Wow, the Domino Logic video in my previous post has attracted over 70,000 views. An unexpected hit.]