XARK 3.0

  • Xark began as a group blog in June 2005 but continues today as founder Dan Conover's primary blog-home. Posts by longtime Xark authors Janet Edens and John Sloop may also appear alongside Dan's here from time to time, depending on whatever.

Xark media

  • ALIENS! SEX! MORE ALIENS! AND DUBYA, TOO! Handcrafted, xarky science fiction, lovingly typeset for your home printer!





Wednesday, December 27, 2006





There is an interesting sidebar in the new (01/07) issue of Wired in which Jonathan Zittrain, professor of Internet governance and regulation at Oxford, opines that the current love affair with the Internet as we know it may well go the way of the CB radio.

He cites the increasing threat of malicious code as what might eventually drive folks from open to closed systems.

The sidebar is on page 058, by the way.


It's the disInformation Age.

I yearn for the days when I had to get up to change the channel on the TV, if I can ever get this computer off my lap.


The cultural Rorschach test and "just taking sides" arguments ring particularly true. Just read the comments sections on any blog attended to by both sides.


Emory Study Lights Up The Political Brain (Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election)

The Political Brain

Janet Edens

So, before, we had only a little information and had to rely on someone we trusted to tell us what we needed to know. Then we had plenty of information, but not much access, so we relied on people we trusted to tell us what we needed to know. Now, we have access to too much information, so we need someone we trust to tell us what we need to know.

Pick me! Pick me!


Political Memory: Keep on typing...


While I've been taking time off, a 2D-LC-MS has been analyzing microbial samples for us - samples of a coral bacterial pathogen that is virulent only in warmer waters. Each sample 'run' consists of a microbial culture grown under a specific condition - and each run will generate ~8,500 peptides (with each peptide ranging in size from 400 to 4,000 daltons). After running adequate replications and all of your treatment conditions, you end up with a peptide library of ~120,000 peptides, which you sort into genes/proteins. This is a very small study done 'on the side' - and is by no means one of the big data-generating experiments in the lab right now.
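The library-size arithmetic above can be sketched in a few lines. The replicate and condition counts below are assumptions (the comment doesn't give them); they're chosen only so the totals line up with the ~8,500 peptides per run and ~120,000-peptide library quoted above.

```python
# Back-of-the-envelope estimate of the peptide library size described above.
# REPLICATES and CONDITIONS are hypothetical; only the per-run yield (~8,500)
# and the final library size (~120,000) come from the original comment.

PEPTIDES_PER_RUN = 8_500   # ~8,500 peptides identified per 2D-LC-MS run
REPLICATES = 2             # assumed replicates per treatment condition
CONDITIONS = 7             # assumed number of growth conditions

runs = REPLICATES * CONDITIONS
library_size = runs * PEPTIDES_PER_RUN

print(f"{runs} runs -> {library_size:,} peptides")  # 14 runs -> 119,000 peptides
```

With 14 runs, that lands at ~119,000 peptides, consistent with the ~120,000 figure in the comment.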

But in the lab we have help - good computer programs and databases and with some luck bioinformatics folks as collaborators. I get lost at home, outside of work, with information overload. It's like I have a meltdown. Thank God Janet is out there!

Anna Haynes

One solution: Lex's "listening to the people who were right." Past performance is the best predictor of future results.
