Sunday, November 15, 2009
Discussion Today
Today I'll be joined by Nick Stroud and Allison Stockdale for a discussion of the firing of UK drug policy adviser David Nutt, the ethics of genetic experimentation, and the Singularity.
Tuesday, October 20, 2009
Discussion with Nobel Laureate Eric Kandel
I was able to pry my jaw from the floor and begin taking notes when the subject turned from his relationship with Austria (from which he fled at age 11 to escape the Nazi occupation) to a discussion of neural decoding. Quick to remind us that this was not his area and that expert opinions could surely be found elsewhere at the conference, he remarked on the dynamics of the neuroscience community.
"There are fads and trends in neuroscience," commented Kandel, "but we are moving from a cellular and molecular understanding towards a systems approach. What we really want to understand is the logic of the transformation of information between neurons." When asked where the most significant advances in schizophrenia research would likely be seen, he said, "Ten years ago, I would have said genetics. Anymore, I'm not so certain.
"The biggest concern with neurologic diseases like schizophrenia is public perception. Many people are struck with cancer late in life. Schizophrenia occurs in adolescence. People are stuck with this for the rest of their lives."
When asked about the challenges in neuroscience research, Kandel asserted,
"It's no secret that financial support is the biggest issue we face in this country. People don't realize that science is not just medicine. Science is the driver of economic growth. It is what allows us to move into the future. Public health is a burden on society because we have yet to fully understand nature. This is something we can overcome."
Dr Kandel is giving the Presidential Special Lecture tonight at SfN, titled "On the Perpetuation of Long-Term Memory."
Monday, October 19, 2009
This was going to happen
I attended a talk today by Steven DeVries of Northwestern University, one of the pioneers in figuring out how the retina works. Dr DeVries, an ophthalmologist and an electrophysiologist, was talking about a hot-off-the-press technique he was developing to understand how individual cells in the retina send and receive signals. Without dropping too much jargon, he was using frequency-dependent pulse modulation to measure receptor saturation in cells. To most people this means nothing, which is OK, for now. But while listening to his talk, I began to suspect that this was a technique I had actually seen before. It took me a few minutes to recall where I had encountered it, but when I did, I nearly LOLed out of my rocker.
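For the technically curious, here is a rough back-of-the-envelope sketch of what I mean by frequency-dependent pulse modulation to measure receptor saturation. To be clear, this is my own toy illustration, not Dr DeVries's actual protocol: the saturation model, the numbers, and the fake "recordings" are all invented, just to show the general idea of sweeping pulse frequency and fitting a saturation curve to the responses.

# Toy illustration only -- not Dr DeVries's protocol. We pretend to stimulate a
# receptor with pulse trains at increasing frequencies, "measure" a steady-state
# response, and fit a simple saturation curve to estimate where responses level off.
import numpy as np
from scipy.optimize import curve_fit

def saturation_model(freq_hz, r_max, f_half):
    # Michaelis-Menten-style curve: response grows with pulse frequency
    # but levels off as the receptors saturate.
    return r_max * freq_hz / (freq_hz + f_half)

def fake_recording(freq_hz, r_max=1.0, f_half=40.0, noise=0.02):
    # Stand-in for real patch-clamp data: the "true" saturation curve plus noise.
    rng = np.random.default_rng(0)
    return saturation_model(freq_hz, r_max, f_half) + rng.normal(0.0, noise, size=freq_hz.shape)

pulse_freqs = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 80.0, 160.0])  # pulse rates, Hz
responses = fake_recording(pulse_freqs)

# Fit the saturation curve to the (simulated) responses.
(r_max_fit, f_half_fit), _ = curve_fit(saturation_model, pulse_freqs, responses, p0=[1.0, 20.0])
print(f"max response ~ {r_max_fit:.2f} (a.u.), half-saturation frequency ~ {f_half_fit:.1f} Hz")

The takeaway of the sketch: by modulating how fast you deliver pulses and watching where the response stops growing, you get a handle on how saturated the receptors are.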
Let me rewind for a moment. This past spring semester, I was enrolled in a neuropharmacology class, dominated by molecular-biologists-in-training and taught by a pharmacologist. The final assignment was to write a short research proposal aimed at studying some cells we had been going over in class. Given that my knowledge of molecular biology is scant at best, I decided to use a set of techniques from electrophysiology and boost them with a touch of mathematically rigorous principles from engineering. The instructor had previously announced that, for the sake of time, 4 of the 9 students would be required to defend their research proposals in front of the class. For soon-to-be obvious reasons, I was selected to defend mine. I proceeded to map out the series of equations and graphs that described my proposal: frequency-dependent pulse modulation to measure receptor saturation in cells. Amidst blank stares from my molecular-biology peers, the instructor asked me a few tangential questions that didn't really apply to what I was trying to do, to which I offered relevant responses. At the end of the class, the instructor announced that everybody's final grade was ready except mine, because he "had to mull it over."
After meeting privately with the instructor, I came out with a B+ on the grounds that he, as a molecular-biology-rooted researcher, couldn't quite wrap his head around what I was trying to propose. Given my less-than-impeccable record on homework grades, a B+ really wasn't that bad. But the fact that the professor thought my entire proposal was crap, and made his opinion quite evident, was a little irksome. Feel free to skim back over the first paragraph for a refresher before reading on.
My inner generator of hilarity went wild when I realized that my in-class research proposal was EXACTLY WHAT STEVEN DEVRIES WAS PRESENTING AS NEW RESEARCH. Reassuringly, Dr DeVries spent a generous portion of his time describing his methods, likely owing to the many blank faces (along with one laughing one in the back) in the audience.
The fact that I saw someone talking about an idea I recently had is not surprising. This kind of thing happens all of the time in science, so it was kind of cool to see a hotshot researcher presenting and defending this idea. I hardly expect this technique to be as important as calculus, so I'll avoid bringing up a rematch of Newton and Leibniz, particularly because I never touched it after the proposal, but also because Dr DeVries has spent the last few years working on this. He clearly thought of it before me. The issue here resides in the aforementioned research proposal evaluation.
Now, I could easily be angry about this whole spiel. I was for quite some time. But I think I've learned an important lesson here. No matter how correct a professor thinks he is, or how wrong he thinks someone else is, his opinion is just as likely to be at fault as any other human being's (which is to say, pretty likely). The next time someone calls bullshit on my newest crazy idea (and trust me, there are plenty), he had better have more than just a hunch of disbelief. This might be dangerous, because I was just starting to develop a theory on overcoming the time-dependent nature of human cognition, which is likely to draw criticism from most of civilized society. Stay tuned for more, and by all means keep telling me how wrong I am. It is doing wonders for me.
Sunday, October 18, 2009
Neuroscience 2009
Benefits of exercise on the brain, gender-specific attributes of cognition, and mechanisms of facial feature processing in social situations: finally, research in neuroscience can be understood by all. Or can it? In the mid-nineteenth century, when Phineas Gage shed a frontal lobe, people started to notice that the brain had much to do with our everyday selves. A century and a half later, a field has evolved that strives to understand every last detail of the human mind. Some may recall the Star Trek: The Next Generation episode in which Captain Picard has a headache and the doctor is surprised: by that time man had mapped the brain, and headaches were shelved next to polio on the old-disease list.

Such a thought is a dream for every mind here at Neuroscience 2009, the world's largest annual meeting for all things brain. Something like 30,000 neuroscientists gather here to share their newest discoveries. But while banners boast the big ideas of how thinking in men differs from that of their better counterparts, the minutiae make up the guts of the conference. Insert Latin phrase for "Lay people stay clear." As an engineer in a field where Molecular Biology is King, I often struggle to translate findings into terms I understand (binary, preferably). And while I'm normally able to grasp the basic ideas of such topics, the vast majority of the research presented here flies under the radar of the public eye. Not because there isn't press to gobble it up, and not because the research isn't hot enough, but because most scientists struggle to show how the single molecule they study in a species no one has heard of actually applies to humans.

So I'll just stand up for everyone here: it all matters. Their research may be decades from translation into a vaccine or cure, but these are the building blocks required to reach any goal, no matter how lofty. I am reminded of a wonderful quote from Jill Tarter, who leads the search for extraterrestrial intelligence at the SETI Institute: "Are we alone? Humans have been asking [this question] forever. The probability of success is difficult to estimate, but if we never search, the chance of success is zero." Such a mindset resonates in the sciences. We don't know if we can map the human mind. We certainly don't know what to expect in the instance we do. But let's keep the comb fine and make some space on that shelf.
Friday, October 16, 2009
The Tale of Phineas Gage
Every decade or so, a new piece of information surfaces about our beloved seminal guinea pig of neuropsychology, Phineas Gage, whose frontal lobe was pierced by a tamping iron stopping through on its way into orbit. Gage, whose biosketch is constantly debated among neuro-historians (all three of them), can attribute his fame to two things: one, that he survived the blow; and two, that the event was so poorly recorded in history that wild speculation is the norm. Oh, Gage!
In the spirit of Neuroscience 2009 in Chicago, I present to you "Phineas Gage" by cult favorite buscemi.
http://www.youtube.com/watch?v=xiI7tJM9KTM
Tuesday, October 13, 2009
Blogging at Neuroscience 2009
Stay tuned for updates this extended weekend from Neuroscience 2009, the world's largest conference about all things brain. There are some exciting topics that I can't wait to discuss.
As such, frequent contributor Kristopher Klein will be filling in for me on Only Science this Sunday, so make sure you tune in to hear him talk about the latest news in Nobel-ville.
For Science,
Erik
Sunday, October 11, 2009
Stream Audio of "Motion 451"
Check out an audio recording of today's show, "Motion 451" below.
Many thanks to improv masters Dave and Greg!
Any thoughts, comments, or concerns should be forwarded to erikleenylen@gmail.com
Tune in next week at 5 PM!
Motion 451
Special thanks to David Philips and Gregory Gallagher for joining me in Rimbar, Motion 451 today!
Check out their website here:
paperbackrhino.com