Friday, December 7, 2012

Only a hammer in your toolbox

Talk about climate change, and be sure that your analyses are rock solid, because you will get some serious backlash. What interests me most is the danger of "if the only tool in your toolbox is a hammer, everything will look like a nail" thinking.

http://www.statisticsblog.com/2012/12/the-surprisingly-weak-case-for-global-warming/ is a blog post written by a graduate statistics student, with this summary:
"TL;DR (scientific version): Based solely on year-over-year changes in surface temperatures, the net increase since 1881 is fully explainable as a non-independent random walk with no trend. 
TL;DR (simple version): Statistician does a test, fails to find evidence of global warming."
The self-correcting nature of the connected scientific community quickly came to the rescue (?) in the form of Dr. Fellows http://blog.fellstat.com/?p=304:
"What we have shown is that the model proposed by Mr. Asher to "disprove" the theory of global warming is likely misspecified. It fails to to detect the highly significant trend that was present in our simulated data. Furthermore, if he is to call himself a statistician, he should have known exactly what was going on because regression to the mean is a fundamental 100 year old concept." 
I find it really brave that a grad student puts himself out there, blogwise (see Commandment 6), especially with a subheading that reads:
"In Monte Carlo we trust"
Mr. Asher put a lot of effort into that post, judging by its length and the thought that went into it. And he clearly feels very strongly about this issue (see Commandments 1-3). Too bad he put too much trust in his hammer, as pointed out by Dr. Fellows. I do think that Dr. Fellows also plays the man in his final comment, which was not necessary given the strength of his rebuttal; but I can see where his frustration comes from.
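To see Dr. Fellows' point concretely, here is a minimal Monte Carlo sketch ("In Monte Carlo we trust", after all). It is not Mr. Asher's actual analysis or data: the warming trend (0.008 degrees/year), noise level, and record length are made up for illustration, and the year-to-year noise is kept independent, whereas real surface temperatures are autocorrelated. The sketch simulates records that contain a genuine linear trend, and compares how often a test based solely on year-over-year differences detects it versus a plain regression of temperature on year.

```python
import numpy as np

# Hypothetical parameters: a small warming trend (deg C / year) buried in
# independent year-to-year noise.
N_YEARS, TREND, NOISE_SD = 130, 0.008, 0.1
T_CRIT = 1.98  # approximate two-sided 5% critical value for ~128 df

def simulate(rng):
    """One synthetic temperature record with a real linear trend."""
    years = np.arange(N_YEARS)
    return years, TREND * years + rng.normal(0.0, NOISE_SD, N_YEARS)

def diff_test(temps):
    """Differencing-style test: is the mean year-over-year change nonzero?"""
    d = np.diff(temps)
    t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
    return abs(t_stat) > T_CRIT

def trend_test(years, temps):
    """t-test on the slope of an OLS regression of temperature on year."""
    slope = np.cov(years, temps, ddof=1)[0, 1] / np.var(years, ddof=1)
    resid = temps - temps.mean() - slope * (years - years.mean())
    se = np.sqrt((resid @ resid) / (N_YEARS - 2)
                 / np.sum((years - years.mean()) ** 2))
    return abs(slope / se) > T_CRIT

rng = np.random.default_rng(2012)
reps = 200
diff_power = np.mean([diff_test(simulate(rng)[1]) for _ in range(reps)])
trend_power = np.mean([trend_test(*simulate(rng)) for _ in range(reps)])
print(f"rejection rate, differencing test: {diff_power:.2f}")
print(f"rejection rate, regression on year: {trend_power:.2f}")
```

Under these made-up settings the regression detects the trend essentially every time, while the year-over-year-differences test almost never does: differencing inflates the noise (the variance of the differences is twice the noise variance) while the signal stays a tiny constant. A test that "fails to find evidence" can simply be a hammer with very little power.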

I was going to write that "Hopefully, I will not make the same mistake as Mr. Asher when writing for this blog", but then there is always Commandment 8 ("Thou Shalt Not Simply Trot Out thy Usual Shtick."), so luckily I can then always fall back on Commandment 5 ("Thou Shalt Not Flaunt thine Ego. Be Thou Vulnerable. Speak of thy Failure as well as thy Success.").

Wednesday, November 28, 2012

Why consistent terminology is important


Some time ago, I received this email from a grad student:
"Do you know the blog zombies ideas in ecology?? I think this is the kind of ideas that could interest you. After reading all these chase papers I just find myself completely lost in the meaning and use of words such as stochastic, random and neutral…. This text kind of help me (it’s a critic of the use of these terms in community ecology) http://oikosjournal.wordpress.com/2012/01/23/zombie-ideas-in-ecology-neutral-stochastic/ but I’m still lost…do you have any tips on that?? It seems to me that a stochastic process is a process that you can only predict in probabilistic term. But then it can of confuse me because it seems for me that neutrality is random and randomness is a stochastic process, right??.."

And shortly after that a 2nd question from the same student:
"And what to do with that : http://oikosjournal.wordpress.com/2012/01/24/zombie-ideas-in-ecology-neutral-dispersal-limitation/ . All Chase works on beta diversity assumption and works on limiting distance similarity is to throw in the garbage??..."
It took me a while indeed to figure out, first, the problem, and, second, a potential solution. I will try to break it down, and I will not use full prose, since I am still trying to work out the best way to present and think about this myself; hence the semi-concept-map style of writing below.

Observation:

  • spaghetti clump of terms in the literature:
    • neutral, drift, random, stochastic, unexplained, error term, probabilistic equivalence, dispersal limitation, noise, chance, spatial processes
    • deterministic, niche-based, selection
  • there is not a lot of confusion about niche-based processes = selection

Problem:
  • neutral and stochastic are often used interchangeably, despite having different definitions
    • stochastic - in Chase and Myers 2011: "chance colonization, random extinction and ecological drift", "indistinguishable from random chance alone"
  •  but
    • dispersal does not have to be neutral, while this is implied by the definition of Rosindell et al.
    • neutral and stochastic often used as synonyms in e.g. Chase and Myers
  • So lots of confusion around the first group of terms, hence the spaghetti clump 

Solution:
  • Fox provides some thoughts on a solution, but I think that could use some more explanation (and he will probably correct me if I am wrong ;-)
  • looking at the four community processes from Vellend 2010, I suggest identifying five, fundamentally orthogonal, questions you need to ask to classify a process (and thus the associated terminology)
    • What is the time frame of the process: ecological versus evolutionary time frame (or is it necessary/useful to include speciation as an important process?)?
    • What is the spatial scale of the process, or the metacommunity context: within- or between-site (dispersal) interactions? I would refrain from using "local vs. regional" here, because "regional" also has the connotation of "at a larger scale", while the action of these "regional" processes is sometimes at the within-site scale. For instance, climatic features such as rain or temperature are regional in nature, but each organism experiences them within a site. So in that case it is a regional process acting at a within-site scale (another quagmire of confusing terminology).
    • What is the nature of the species differences: are they niche-based (or under selection sensu Vellend 2010) versus neutral (or probabilistic equivalence of species sensu Rosindell et al.)?
    • What is the stochastic nature of these ecological processes: Do they result in a deterministic or stochastic outcome?
    • How do you detect the signature of these ecological processes? Through randomization procedures, or explained variation, or noise, or ...

Using this scheme, I think you could uniquely identify several terms and patterns in the literature, and solve some of the inconsistencies noted above:

  • drift is a specific type of end product of within-site, neutral, and stochastic species interactions
  • dispersal can be 
    • neutral: all species similar in dispersal characteristics
    • or stochastic: see limiting dispersal sensu Winegardner et al. 2012 (see e.g. Matias et al. in press), but with some deterministic patterns (see distance decay or isolation by distance in neutral metacommunities, Jeremy Fox's 2nd zombie idea mentioned above).
    • or deterministic: see efficient or high dispersal sensu Winegardner et al. 2012
  • neutral species interactions can
    • lead to dominance of one species over long time periods, and thus look deterministic
    • be deterministic (see Jeremy Fox's examples in his first zombie idea)
  • selection can be stochastic (see also Jeremy Fox's example in his first zombie idea)



Caution

  • It would be easy to fall into the trap outlined by Jeremy Fox, and treat these questions as "ends of a linear continuum". I do think they are ends of a continuum, but the continuum does not have to be linear (which is, I think, the main danger Jeremy Fox wanted to point out).
  • I reserve the right to change my mind on this above classification, based on what I hope to be lots of discussion and exchange of ideas.

Monday, November 26, 2012

A new Dr. !

Thiago successfully defended his PhD in Brazil! Congratulations! You will be able to read all the innovative research he has done over the last couple of years as it becomes part of the vetted ecological literature, no doubt about that. But I think he learned more than "ecology" during these last years. Here is an excerpt from his acknowledgments:
"Tive muita sorte de conhecer o meu co-orientador Karl Cottenie. Para minha surpresa, um quadro branco e uma caneta preta - e não um programa de estatística - me mostraram como rabiscos podem gerar ideias e muito conhecimento. Foram seis meses que mudaram minha vida! Karl, I know that you'll google it ;-)"
So after the magic of Google Translate:
"I was very lucky to get to know my co-advisor Karl Cottenie. To my surprise, a whiteboard and a black pen - and not a statistics program - showed me how scribbles can generate ideas and a lot of knowledge. Those were six months that changed my life!"
It is always satisfying to see another convert to the power of the whiteboard and concept maps. Keep Thiago on your radar, he will do fun things!

And thanks to the power of Facebook, here is a picture:

Update:
And here is a link to Thiago's amazing presentation: it not only looks nice, but also presents a succinct and clear overview of what he did: http://prezi.com/rki6rucrxlxd/defesaii/

Friday, November 23, 2012

"We are creative people"

Some insights from a completely different source through amazing photography and words: http://www.brucepercy.co.uk/blog/2012/11/21/finding-inspiration-through-concepts/

"I’ve started looking at some of my other work, to see if I can find a theme, a concept of some sort, and I think I’ve begun that process. It’s rather exciting to feel that one thing leads to another, and by simply being open and experiencing my existing work in a new way, I can see something lurking, waiting to be pulled out and developed. Maybe something new will come of it. I really don’t know, and I guess that’s what’s so great about the creative process, things often have a way of taking on a life of their own."

Thursday, November 8, 2012

The role of intuition in data analysis

Most of you readers probably followed the US elections with varying degrees of interest and passion, but as scientists you are probably also aware of the "role" of Nate Silver and his blog FiveThirtyEight.com. A lot has been written about his success, which is "just" a nice example of the strength of the scientific process, and thus not that surprising. What does surprise me is the reluctance of journalists to come to terms with what he does. See this very insightful blog post by Mark Coddington.
"Silver’s process — his epistemology — is almost exactly the opposite of this: Where political journalists’ information is privileged, his is public, coming from poll results that all the rest of us see, too. Where political journalists’ information is evaluated through a subjective and nebulous professional/cultural sense of judgment, his evaluation is systematic and scientifically based. It involves judgment, too, but because it’s based in a scientific process, we can trace how he applied that judgment to reach his conclusions."
This reluctance reflects, amongst other aspects, a classic case of an either/or role for intuition in science/data analysis. We already had a lengthy exchange about the scientific process on this blog (see for instance this and this), and Stefan's main critique of the scientific process as a method is maybe best summarized in this statement:
"I am claiming here that in order to do good science, one must know which auxiliary assumptions are reasonable to question in the face of evidence that conflicts with one’s predictions."
One way to look at this is that intuition plays that role of knowing "which auxiliary assumptions are reasonable". So it is not an either/or relationship, but really a feedback between intuition and hypothesis testing.

Or my last commandment: "Thou Shalt Listen to thy Intuition, but Follow the Data."

Thursday, October 11, 2012

For the love of whiteboard

Here is another convert to the magic of the whiteboard: http://lifehacker.com/5950957/how-a-whiteboard-helped-a-terrible-delegator-keep-a-team-on+task.

He provides a list of why whiteboards work:
"Whiteboards are big enough for everybody to see.
Whiteboards make you want to fill the space, and therefore expand and branch your thoughts.
Whiteboards inspire you to keep writing, to keep pushing on what's in your head, because it feels awesome to swing your arms that widely. 
Whiteboards feel less like you're committing to an idea than throwing it out for consideration. 
Whiteboards are nearly impossible to lose inside your backpack."
This latter point was, until recently, my major problem: I did not have a camera with me all the time to capture each session. But now that I have a cell phone, that is solved:
 Hopefully the information on this board will be featured in some later, more extensive, post.

Thursday, July 19, 2012

Colonization rates and species richness

Our first publication from the amazing Spanish pond system set up by Andy Green and his group, but especially with the hard work of Dagmar Frisch, who has the patience of a saint to keep collaborating with me: "Strong Spatial Influence on Colonization Rates in a Pioneer Zooplankton Metacommunity" in PLoS ONE.

Each published article is unique in its calvary to publication. Most often this is not visible to the reader, although sometimes you can tell that certain parts of a manuscript were clearly added to satisfy a reviewer's pet peeve. For this publication in particular, the sticking point was how to define "inundation days", the essential part of our measure of colonization rate. After some back and forth with reviewers, we agreed on the following figure:
As you can probably tell from the formatting, this figure:

  • is only available in the supplemental material
  • shows both the age of the analyses in this manuscript and my inertia, since it still uses the Lattice package in R instead of the now much more popular ggplot2 package
But I am still so proud of this figure, since it also provides me with a reminder that species richness as a concept can be useful. I am still working, if only in my mind, on a publication on why we should not use "species richness" as much as we do in community ecology. But I also believe that maybe there are only two cases for a defendable use of species richness. One is in the context of "regional" species richness, of which this figure illustrates one component: the potential species that could occur within a site because they have been detected there at least once throughout its colonization history.

The second case of defendable use of species richness is in a metacommunity with neutral (or neutral-like) species. But I will hopefully explore this in more detail down the road.

Tuesday, June 5, 2012

Inspirational stories for teachers

An oldie, but worthy of a link: http://blogs.plos.org/neurotribes/2011/10/05/whats-the-most-important-lesson-you-learned-from-a-teacher/

I have since tried to identify a specific teacher and lesson for my own story, but I am not good at recalling memories, and my university teachers were pretty old school, sage-on-the-stage types of teachers, not exactly my style.

The one person who keeps coming up is my first kayak instructor (name forgotten) when I was 16 (?), in France. It was just me and him, and he really focused on breaking kayaking down to the basics: reading the water (currents and countercurrents) and weight position in the boat (left-right and front-back), followed by exercises that internalized those basics before moving to the next step. With this as a foundation, I was ready in later years to confidently and independently tackle more challenging rivers in completely different situations.

Sounds like a life lesson for me as a teacher.

Tuesday, May 15, 2012

Indiana Jones denied tenure

http://www.mcsweeneys.net/articles/back-from-yet-another-globetrotting-adventure-indiana-jones-checks-his-mail-and-discovers-that-his-bid-for-tenure-has-been-denied

"The lone student representative on the committee wished to convey that, besides being an exceptional instructor, a compassionate mentor, and an unparalleled gentleman, Dr. Jones was extraordinarily receptive to the female student body during and after the transition to a coeducational system at the college. However, his timeliness in grading and returning assignments was a concern."

Thursday, May 10, 2012

Intuition and science

"Seems to me that there’s not really a contradiction between System1 (intuition) and System 2 (rational) modes of knowing.  Instead, one’s just faster than the other.  Both are useful, both are necessary.  But the great achievement of science has been to tell us that intuition always has to be checked.  In essence, science is the overturning of intuition by slow, painful, careful System 2 reasoning."
from http://searchresearch1.blogspot.com/2011/11/intuition-and-counterintuition.html

Thursday, May 3, 2012

Brittany defended yesterday...

... successfully, of course. Now we just have to turn it into a publication. I always give a short speech during the celebrations afterwards, and the theme was what Brittany, her personality, brought to the research project (Ryan Gregory's great standard question). She was much too modest when she answered this during her defence, so I answered it for her:

  • her organizational talents (keeping track of the extensive experiments and field observations, all at the same time)
  • her efficient and hard work (she counted more than 600 samples one summer, compared to my 90+ samples for my entire PhD)
  • her intellectual contributions to her thesis, which looks very straightforward in hindsight, but took us many iterations to understand and explain this clearly
  • knowing how to handle me (important one!)
  • her willingness to work in Churchill, given the remoteness and sometimes dangerous nature of the field work there
And the polar bear situation in Churchill was the source of my present for Brittany:

Colour-coordinated accessories ;-)
Congratulations again, Brittany.

Friday, April 27, 2012

Making fun of professors

It is so easy to make fun of professors; we even have our own comic strip. If you believe the stereotype, we have so many laughable traits that there is a never-ending list of possible punch lines. The sad part: some (most?) of them can hit really close to home.

Recently McSweeney's published a more elaborate punch line on philosophy to answer:
"UNIVERSE'S FUNDAMENTAL QUESTION"
This is a golden oldie, and apparently it has already made the internet rounds, but I became aware of it through Geekdad (what's in a name, right?): What is the difference between a geek and a nerd?


And now I don't know who I am any more:

  • My geek traits
    • A Mac
    • Can be pretentious and longwinded (see this blog)
    • A fan of gadgets
    • I fell for and married a non-geek
  • My nerd traits
    • Extreme interest or fascination with academics (see this blog)
    • Introverted (do not look at this blog)
    • Socially inept
    • Diverse and sometimes impractical skills due to broad interest in science
    • Interests might include Battlestar Galactica, computer programming
    • Reclusive professor

Thursday, April 26, 2012

Flipping education

It seems that the new verb these days in education is
"Flipping"
Kevin MacIce wrote a full piece around it in Wired, and then TED-Ed was announced today, with as one of its main selling points the open (?) teaching platform they are creating around YouTube: providing an on-line lesson around any video on YouTube, or "flipping" the video.



As an educator, I am not really good at capturing and keeping the attention of my students during a "lecture" (I really dislike that word and the activity). I am just not a good performer, not good at the edutainment part of teaching. I do think I am somewhat competent at solving the problems that students have with the material, at answering their questions.

So I have always said that I would "flip" my educational approach. And these two articles will probably form the impetus and starting point for when I start teaching again after my upcoming sabbatical. Avoiding the "just add an extra hour to every lecture hour" danger will be relatively easy. The main problem I am struggling with at this point is that I use questions in class during lectures to introduce new material. This illustrates, I hope, to the students that their intuitive notions of ecological mechanisms are often correct, and that the literature on these subjects consists of 1) a more in-depth exploration and explanation of these intuitions, 2) examples of these mechanisms, and 3) where these mechanisms (and intuitions) break down. I think it is really empowering for learners to realize that they can work their way through the basics of the material by themselves.

But how can I incorporate this in the flipped version of my current teaching practice?

  • I cannot use those questions in the video, since the student will not engage with the question, and just keep watching.
  • If I end class time with these types of questions, the students will have the engagement, but they will probably watch the video several days later, and maybe not notice the connection between their own intuition and the material.
I have a year to figure this out, and hopefully in the meantime somebody will launch an alternative to "flipping", because it really grates every time I use it, similarly to "lecturing".

Monday, April 23, 2012

Metacommunity terminology

The title of my "Teaching Philosophy" statement (that I completely rewrite every year), is "Research-Teaching-Learning Link". In it, I try to point out the obvious and maybe not so obvious connections between these different aspects of my professional life. One of the items that is hardest to prove, though, is how teaching can inform my day-to-day research focus.

And now I can finally provide evidence for this: our comment in Trends in Ecology and Evolution, "The terminology of metacommunity ecology". Nothing earth-shattering in it, but what we hope is a suggestion that removes some of the confusion about the words used in metacommunity explanations. Believe it or not, this is the culmination of 6 years of teaching community ecology and 3 years of lab discussions. The impetus was how to explain the different metacommunity frameworks from Leibold et al. 2004. I was always unsatisfied with the lack of structure in their Figure 1, and with my inability to explain it successfully to students. So now we have this comment, which explicitly identifies the 2 structuring forces in metacommunity theory, species sorting and dispersal, and shows how combining them in different magnitudes results in the commonly identified frameworks.

And the Acknowledgements have this in them:
"We thank Kyle Gillespie for comments on these early thoughts, as well as the classes of BIOL 3120– Community Ecology at the University of Guelph for showing us how to explain metacommunity ecology to a broad audience."

Sunday, April 22, 2012

End of semester

I am about to submit the grades for my two courses this semester, and this often leads to self-reflection. I am not going to do that now, but I will do something better: link to the self-reflection of a teacher that is well written, funny, insightful, and that sometimes reflects my own experiences, sometimes not at all. So I highly recommend William Bowers' "All we read is freaks" to all educators.

I could extract so many passages out of this long piece, but this is the one I emailed to myself:
I begin: “Let’s hear some gut reactions to last night’s reading from you guys, to give the illusion of open discussion before I enforce my agenda.” I admit to offering minor doses of edutainment. These kids are TV-saturated and sometimes sit there waiting for the world to break into a spectacle, and so I meet them halfway, which prevents class from becoming an academic purgatory or a filibustering standoff. I also exploit whatever residual hipness my relative youth  allows. The secret, of course, is that I feel a need to perform and to be liked; embarrassingly, in a conversation with a tenured prof, I once referred to the class as “the audience.”
How can you come up with something like "the illusion of open discussion before I enforce my agenda"? I laughed so hard, but then I started thinking about it, and I am probably guilty of this as well. And how often has my class turned into either academic purgatory or a filibustering standoff? And how often have I wished I was a better performer? All that in 3 sentences; imagine what the rest of the essay is like. Go and read it.

Friday, April 13, 2012

R-hipster confession

I sometimes think of myself as a science nerd, but then I read Benjamin Mako Hill's computing set-up, and suddenly it's like reading the uber science nerd manifesto. Amazingly consistent and hard-core, every single step of his work flow. And where our work flows overlap is probably in R:
 "R is slow and using it with big datasets gives one plenty of time to reflect on this fact. But R is also expressive, elegant, and concise for numerical and statistical work so I happily suffer through it. I make my graphs in ggplot2 which is so trendy that I feel that mentioning this is a sort of R-hipster confession."
I wonder how quickly we can make this R-hipster confession a true internet meme? Help me out here.

Thursday, April 12, 2012

A call to all aquatic nerds...

... how many species can you identify in this video?

Tuesday, April 10, 2012

Talking to...

As you probably noticed, we do lots of interdisciplinary research in this lab. A group of undergraduate students approached this question for a 4th-year class, and made a website that approaches interdisciplinary research from, you guessed it, several different perspectives. One of their activities was interviewing a wide range of people with this question in mind. Two of those persons were Ingrid and myself. And I will leave it up to you to find those (embarrassing?) video interviews.

Tuesday, March 27, 2012

The battle of zooplankton

Kaleigh Eichel, a student in Knowledge Integration, did research in Churchill for two summers. In addition to her research, she also created a web site translating some of the research in Churchill to the general public. One of the episodes features the projects done by Amanda and Brittany:

http://www.avioyak.ca/episode3.html 



Enjoy.

Tuesday, March 13, 2012

Discussing place-based education ...

... outside in the Arboretum, with our First Year Seminar class. The (only?) advantage of an early spring this year.


Although Ingrid would probably point out that our discussions, even in the Arboretum,
"... had been sitting on the land, instead of having roots within it." (Laura Piersol, Canadian Journal of Environmental Education, 2010, 15:198-209)
But how to design a university course to have its roots within the land? It is easier to do this in the context of a field course, but with a Monday-Wednesday 8:30-10 am lecture slot? I will have to give this some more thought.

Wednesday, March 7, 2012

Scientific Method continued

There once was a great chef who claimed to have a sure-fire recipe for baking great cakes. The chef constructed his recipe by carefully watching the cake-baking procedures of other great chefs in his native country of Austria. Chef Popper -- that was his name -- argued that anyone who followed his recipe would be guaranteed to bake a great cake. Amazingly, Chef Popper's recipe was also extremely simple. Just three instructions: (1) get ingredients, (2) mix ingredients, (3) place in oven. "A monkey could bake a great cake using my recipe," Chef Popper would sometimes say, "so long as it follows the recipe precisely." Soon after, other great chefs from around the world agreed that, indeed, Chef Popper had nailed it. "That is exactly the recipe I use when baking great cakes in my country," other chefs would exclaim, "more or less" (they would sometimes add).

I concocted this little story because I think it helps to illustrate where we seem to be disagreeing. Obviously Chef Popper hadn't provided a recipe, because all of the important details about how to bake cakes are missing. Let me now explain why the HD "method" is just like Chef Popper's "recipe."

Consider a simple example of science in action. Let the hypothesis be that average global temperature has increased over the past century by at least two degrees.  Let the prediction that one derives from this hypothesis be that the average space between tree rings will have increased over the past 100 years in trees that are at least 130 years old. Now, suppose that we go out and core some 130 year old trees to test this prediction. Lo and behold, we find that there is no change in the average distance between tree rings in our sample: the observation “falsifies” the hypothesis.

Should we reject the hypothesis on these grounds? Of course not!  Instead, we identify a questionable auxiliary assumption that was implicitly used to derive the prediction. Let the auxiliary (background) assumption be that, in our sample, soil quality did not change over the past century.  If soil quality did change, then we would not necessarily expect tree ring distances to increase. This leads us to conduct a new experiment on soil quality in our original sample of trees. And so the process goes.
 
“But that’s just the HD method!” you might claim. No, it is not.  If you are inclined to say this, you are like the chef who thinks that Chef Popper is describing your recipe. The reason that the process I just described is not the HD method (under any guise) is because it is not a method. Let me explain why.

Any hypothesis depends on a huge number of auxiliary hypotheses in order to derive a prediction. Importantly, by "derive" I mean deduce according to logical rules. This is important because we are trying to come up with a method (recipe) that anyone (monkey?) could follow.

To derive, logically, a prediction from a hypothesis, thousands of other (background) assumptions must be in place. For instance, take the derivation: if average temp has increased by 2 degrees, then there should be an increase in the distance between tree rings in the sample population. To do this, one must also assume that:

  • The sample had roughly stable nutrients for the last century.
  • The sample was exposed to stable atmospheric gasses.
  • The earth did not undergo a major shift in rotation around the sun.
  • Climate change deniers did not tamper with the evidence.
  • etc.

Now, some of these auxiliary assumptions strike us as ludicrous (or at least highly unlikely). But that is part of the point that I am trying to illustrate. The reason that some of these auxiliary assumptions strike you as silly is because you know better – you have what I referred to as discipline specific knowledge in my previous post.

Remember that the name of the game is to come up with a method, the analog of a recipe, that allows one to do good science (or distinguish good from bad science). I am claiming here that in order to do good science, one must know which auxiliary assumptions are reasonable to question in the face of evidence that conflicts with one's predictions. The person who tests whether the earth underwent a major shift in rotation, for example, is not doing good science. Since the list of auxiliary hypotheses is indefinitely long, one cannot simply test them all.
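This is, in standard philosophy-of-science terms, the Duhem-Quine problem, and its logical core can be stated compactly. If the prediction P is deduced from the hypothesis H together with auxiliary assumptions A1, ..., An, then a failed prediction only licenses, by modus tollens:

    (H ∧ A1 ∧ ... ∧ An) → P,  ¬P  ⊢  ¬(H ∧ A1 ∧ ... ∧ An),  i.e.  ¬H ∨ ¬A1 ∨ ... ∨ ¬An

Logic alone tells us that something in the conjunction is false; it is silent about which conjunct to reject, and that choice is exactly where discipline-specific knowledge comes in.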

Tom argues that Popper's contribution (the philosopher, not the chef) was to identify falsifiability as the mark of science. But that can't be right, because any hypothesis can be rendered unfalsifiable with the right sort of tweaking of its auxiliary assumptions. One can always cling to a cherished hypothesis in the face of conflicting evidence by raising questions about the auxiliary hypotheses instead. That is why falsifiability is not an adequate criterion for distinguishing good from bad science.

In practice, of course, we do not let people get away with indefinite amounts of tweaking of background hypotheses. Sometimes a scientist will hold onto a hypothesis in the face of conflicting evidence; other times she will reject it. The important point is that nothing about the HD method enables one to make these judgments. How one makes these judgments depends on the information available about the system in question, the range and nature of the assumptions informing one's study, the prevailing theories, etc. That is what I meant when I said that if you want to know how a science works, look at how it deviates from the HD method. In other words, to understand a science, look at which hypotheses are rejected and which are retained in the face of conflicting evidence.

In my earlier post I called this “discipline specific” knowledge. Karl seems to have taken me as saying that one has to be initiated into a discipline in some formal sense in order to acquire discipline specific knowledge. He imagines trying to test some sociological hypothesis (I think it was a hypothesis about the tendency for ecologists, unlike most other scientists, to be wedded to the HD model). Nothing is preventing him from trying. Nothing is preventing me from trying to bake a cake with Chef Popper’s instructions. Let me know how that works out for you though.
  
The important point is this. To have a method is to have a recipe. Chef Popper’s instructions were not a real recipe because they do not provide a set of step-by-step instructions for baking a great cake. Similarly, Philosopher Popper did not provide a scientific method: a set of instructions adequate for doing science well. One of the reasons why Philosopher Popper’s instructions are inadequate is that they do not tell you which background assumptions are reasonable to question when predictions are not borne out. There is NO discipline-general method for making these decisions.

I should add that the only original bit of philosophy here is the story. Philosophers of science have recognized these limitations in the HD "method" for decades. I reckon that we have just done a poor job of explaining its limitations to other disciplines. I fear that I might not be doing much better.

Friday, March 2, 2012

The fallacy of our misunderstanding of the scientific method

Ryan Norris via Tom Nudds sent me a link to this web site, and I think all three of us had the same "What the ..." reaction after reading the entry by Irene Pepperberg, titled "The Fallacy of Hypothesis Testing":
"I was trained, as a chemist, to use the classic scientific method: Devise a testable hypothesis, and then design an experiment to see if the hypothesis is correct or not. And I was told that this method is equally valid for the social sciences. I've changed my mind that this is the best way to do science. I have three reasons for this change of mind."
These three reasons are:

  • the importance of observations, without explicitly testing hypotheses
  • testable hypotheses are not interesting
  • the scientific method often leads to proving a hypothesis, not testing it
These three reasons point out some major misunderstandings of the scientific method (see this post, or this post):
  • the context leading up to the hypothesis-prediction-test (the question, background information, etc.) forms an essential part of the scientific method
  • the discussion of results (the information increase that leads to the next cycle) also forms an essential part of the scientific method. This is exactly what she points out "... the exciting part is a series of interrelated questions that arise and expand almost indefinitely".
  • this is a general misunderstanding that Dr. Pepperberg correctly identifies. But it is not a reason to dismiss the method; rather, it is a call for better education.
We always tell our students that a full understanding of the scientific method, despite its apparent simplicity, is actually more challenging than it looks. And now we can point them to this article, because here is a successful scientist from one of the best universities in the world (Harvard) with a very limited idea of the scientific method.


Thursday, March 1, 2012

Galapagos

A visual story about field work on the Galapagos Islands, and the human influence on the fauna and flora.

Friday, February 10, 2012

Scientific debate...

... rests on actually reading and understanding each other's words.

This is a difficult exercise, and not only for students. Tom Nudds and I just covered, in community ecology, the rationale for protected areas planning based not only on representativeness (making sure that all species are protected), but more importantly also on persistence (see his article in Biodiversity and Conservation for a summary of their ideas). If the target areas contain all the species you want to conserve, but lack essential components that would ensure their actual persistence through time, representativeness means nothing. Wiersma and Nudds addressed this in their article by focussing on one important component of target areas, minimum reserve area:
"We use candidate protected areas that meet an empirically-derived estimate for a minimum reserve area (MRA) above which no mammal extinctions have been detected from existing protected areas since widespread European settlement, even in parks that have become insularized from the surrounding habitat matrix (Gurd et al. 2001)."
Despite the obvious logic of this argument, not all conservation planning uses it. In the summer of 2011, Pompa and co-authors published an article on developing target areas for conserving marine mammals.


[Figure: Conservation targets covering (A) 10%, (B) 15%, (C) 20%, and (D) 25% of the marine mammal distributions, using the Marxan optimization algorithm to optimize the number of grid cells and their geographic locations.]
Yolanda and Tom recently wrote a letter pointing out that this notion of persistence was missing from the set-up of the Pompa et al. analysis:
"Our argument previously (3) has been that percentage targets do not allow for any assessment of whether species will persist in areas set aside for conservation, and this critique continues to hold for the analysis by Pompa et al. (1) Their study assessed global representation by using 1° × 1° grid cells, but does not address whether this resolution meets criteria for species persistence."
And this is where it gets ... messy, weird, strange? Pompa and co-authors wrote a reply to this criticism, but did not actually address the missing persistence at all. They somehow summarize the single point of the criticism into four separate points:
"Wiersma and Nudds (1) make four main points: (i) the area to be covered by the key conservation sites proposed is a negligible percentage of the ocean; (ii) they doubt the representativeness of our biodiversity patterns; (iii) they question the persistence of the species; and (iv) they claim we do not acknowledge the dynamic nature of ecological systems."
Of the four main points identified by Pompa et al. in their reply, (i) and (ii) were side points on the limitations of only including representativeness, and they implicitly agree that they did not address point (iv). And Wiersma and Nudds did not question the persistence of the species (iii) per se; they pointed out, rightly, that this necessary and important condition was not part of the original Pompa et al. conservation strategy.

Instead of acknowledging this directly, they use the scientist's cop-out, which is so easy to make, but also so unsatisfactory and weak:
"Despite constraints, our approach still best available."

Wednesday, February 8, 2012

Context is everything

An interesting paper in Science by Stumpf and Porter takes a hard look at "general" power laws in science:
"A striking feature that has attracted considerable attention is the apparent ubiquity of power-law relationships in empirical data. However, although power laws have been reported in areas ranging from finance and molecular biology to geophysics and the Internet, the data are typically insufficient and the mechanistic insights are almost always too limited for the identification of power-law behavior to be scientifically useful (see the figure). Indeed, even most statistically “successful” calculations of power laws offer little more than anecdotal value."
One argument that they use is that power laws are a statistical feature (or artifact, depending on your point of view) of probability theory:
"Suppose that one generates a large number of independent random variables xi drawn from heavy-tailed distributions, which need not be power laws. Then, by a version of the central limit theorem (CLT), the sum of these random variables is generically power-law distributed." 
So finding statistical evidence for power laws (which is not that straightforward, according to the authors) is not even the main problem; finding "a generative mechanism" should be the main focus of scientific pursuit. This is a roundabout way for these mathematicians to point out the importance of the scientific method, where the context (question, hypothesized causal mechanism, prediction) drives the statistical test that will be confronted with new data.
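The construction in the quote is easy to check numerically. Below is a minimal Monte Carlo sketch (my own, not from the Stumpf and Porter paper): sums of independent heavy-tailed draws, here Pareto with tail index 0.8, inherit a power-law tail, with no generative mechanism in sight. The sample sizes, the tail index, and the `hill_tail_index` helper are all illustrative choices.

```python
import math
import random

random.seed(42)

ALPHA = 0.8     # Pareto tail index; here both mean and variance are infinite
TERMS = 50      # independent heavy-tailed draws per sum
N_SUMS = 20_000 # number of sums to generate

# Each "observation" is a sum of heavy-tailed random variables.
sums = [sum(random.paretovariate(ALPHA) for _ in range(TERMS))
        for _ in range(N_SUMS)]

def hill_tail_index(data, k=500):
    """Hill estimator of the power-law tail index from the k largest values."""
    top = sorted(data)[-k:]          # ascending; top[0] is the threshold
    return k / sum(math.log(x / top[0]) for x in top)

est = hill_tail_index(sums)
print(f"estimated tail index of the sums: {est:.2f}")  # roughly ALPHA
```

A straight line on a log-log rank plot of `sums` would "look like" a power law here, even though nothing mechanistically interesting generated it, which is exactly the authors' warning.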

This is one of the biggest hurdles when teaching the scientific method to undergrad (and sometimes graduate) students (and faculty?). We get so many questions that start out with "Is this a good test?", and we always have to get back to them with our standard reply, "Context is everything": this test means nothing to us without knowing the context. And now I can refer them to this article as a nice example of the fallacy of focusing on the stats in isolation from the rest.

In addition, the authors also provide a subtle argument against the seductive power of induction. Their concluding sentence reads:
"The most productive use of power laws in the real world will therefore, we believe, come from recognizing their ubiquity (and perhaps exploiting them to simplify or even motivate subsequent analysis) rather than from imbuing them with a vague and mistakenly mystical sense of universality."

Tuesday, February 7, 2012

Tangled bank

What do these four, apparently very unconnected items, have in common?

  • Star wars: the clone wars


My life used to be easy.

I was a scientist,  climber (a long time ago), father, and a teacher. And now suddenly these separate aspects of  my life got sucked into a vortex, all thanks to reading some passages of Aldo Leopold's "The Land Ethic" for the First Year Seminar course Ingrid and I have designed. These items literally came across my desk (RSS reader, watching TV with the kids, teaching a course) in a 2 day period. And they point out ethical decisions we/I should be making on a daily basis:

  • should we/I, scientists, actually sample this unique lake that has been separated from the rest of life on earth for who knows how many years?
  • should we/I "conquer" (Leopold would love this word) a mountain by drilling and placing bolts every couple of feet?
  • should we/I study, let alone kill, the last individual of a species that could end a war, especially if society (or in this case Chancellor Palpatine, for all the wrong reasons) made this decision?

My life is now a tangled bank.



Tuesday, January 31, 2012

collaborative learning: two sides of a coin

Collaboration is on my mind these days. Ingrid pointed out this article to me, about, among other things, the potential pitfalls of group work. The author, Susan Cain, frames this in terms of the presence and importance of introverts, and how ignoring introverts has a negative effect on the performance of group work. Choice quote from the interview:
"Forty years of research shows that brainstorming in groups is a terrible way to produce creative ideas."
(After I blogged this, I found a similar article where Susan Cain explains it in her own words: http://www.nytimes.com/2012/01/15/opinion/sunday/the-rise-of-the-new-groupthink.html)

And guess what, serendipity strikes again: I had flagged this recent article in PNAS for a closer look and a potential blog article. It actually explores this tension between individual versus group learning. The starting point of the paper is the experimental result that in networks of learners (i.e., collaborators), reducing the speed of information exchange between learners (or network efficiency) actually leads to higher performance of the network as a whole. Or, translated into the Cain framework: introverts can be introverts, do not have to adjust to group thinking, and can thus lead to higher-performing networks (note: this is not necessarily the 40 years of research that Cain refers to). The mechanism provided in the PNAS article puts this in more scientific terms:
"[T]he explanation is that slowing down the rate at which individuals learn, either from the “organizational code” (3) or from each other (8, 11, 13), forces them to undertake more of their own exploration, which, in turn, reduces the likelihood that the collective will converge prematurely on a suboptimal solution."
The problem with this mechanism is that it is mainly based on computer simulations. The novelty of the article is that they tested it with real human players and a large sample size. And they found the opposite results:
  • network efficiency was positively correlated with performance
[Figure: Average points earned by players in the different networks over rounds (error bars are ±1 SE) in sessions where the peak is found. Graphs with high clustering and long path lengths are shown in dark gray; those with low clustering and low path lengths are shown in light gray.]
  • players copied less in more efficient networks
[Figure: (A) In contrast to theoretical expectations, less efficient networks displayed a higher tendency to copy; hence, they explored less than more efficient networks (numbers and colors, orange shorter and green longer, both indicate clustering coefficient). (B) Probability of finding the peak is not reliably different between efficient and inefficient networks.]
These results indicate that in networks with efficient information exchange, individual behaviour (of, for instance, introverts) is valued, and this individual behaviour helps to increase the performance of the group. The apparent contradiction between collaboration and introverts is resolved!

This blog post provides some real-world evidence for this. It is a book review of Steven Johnson's Where Good Ideas Come From. John Battelle extracts this nice figure (which of course appeals to me as a scientist), illustrating that major ideas arise more frequently in a networked environment:
[Figure: Johnson's chart of major ideas emerging during the 19th century, categorized by commercial and networked properties.]

The two sides of the collaboration coin are thus individuality and efficient information exchange.

It is not that simple, of course, but at least it provides a mechanism to increase, for instance, the efficiency of collaborations in a classroom setting:
  • it is important that everyone participates
  • it is important that everyone thinks about the issue by themselves, before sharing it with the group
  • it is important to listen to everyone
  • the flow of information between participants should be as efficient as possible

Thursday, January 26, 2012

Making the story whole again

Ingrid is featured in the At Guelph!  I repeat, Ingrid is featured in the At Guelph! It is not her whole story, of course, but it does provide key aspects of her story.


Sunday, January 22, 2012

Farewells

There is a time of arriving, and a time of leaving, and the time of leaving has arrived. Thiago and his family are going back to Brazil, after a very successful, productive, and fun 6-month visit. Now I have to find a way to visit Brazil, somehow, sometime. The lab will be a little bit quieter, again.

Saturday, January 21, 2012

"You don't have to be neutral to talk"

Ingrid and I are currently teaching a first year seminar course on "Environmental healing and deep community". Designing the course, let alone actually teaching it, has been an eye opener. One of the issues I always need to have a conversation about, often a very long one, is the notion of "bias". It can have so many different meanings, and it trips me up every time. I started watching the TED video below, since it deals with dialogue and collaboration as a means for conflict resolution. However, I was really struck by something the presenter said:
"You don't have to be neutral to talk."
And then I have to try and reconcile this with one of Tom Nudds' more spiffy quotes:
"Science takes part; it doesn't take sides."
How do I get out of this conundrum?

Tuesday, January 17, 2012

Going down the statistics rabbit hole

It all started with this post:
http://www.johndcook.com/blog/2012/01/13/interpreting-statistics/

which led to this post:
http://wmbriggs.com/blog/?p=5062

which led to this post:
http://wmbriggs.com/blog/?p=4381

which led to this post:
http://wmbriggs.com/blog/?p=4368

And finally to this hilarious, submitted, manuscript:
http://arxiv.org/abs/1201.2590

Reading them in this sequence will explain why the author, William M. Briggs, writes things like:
"When supplicants approach us for wisdom about uncertainty, because of our ardent love of computation we have developed the unfortunate habit of insisting first on the memorization of mathematical incantations, such as figuring chi-squares (by hand), or we require students look to up values in obscure tables in the backs of textbooks. In pursuit of our love, we forget why civilians come to us. Even we statisticians can forget why the math is there. Because of this, short shrift is paid to interpretation and to the limitations of the methods, to what it all means, which is all that civilians care about."
or something like this (and this is only on page 4 of the manuscript):
"This state of affairs is odd because frequentist theory tells us that the p-value is as silent as the tomb about the truth of the theory at hand. Yet when a civilian cocks his ear towards a wee p-value, he hears music. Angels sing."
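Briggs's point about wee p-values can be illustrated with a quick simulation (my sketch, not his): when the null hypothesis is true by construction, p-values are uniformly distributed, so "significant" results turn up at exactly the advertised false-alarm rate even though there is nothing to find. The `z_test_p` helper and all parameters are illustrative choices.

```python
import math
import random

random.seed(1)

def z_test_p(sample, mu0=0.0):
    """Two-sided z-test p-value for a sample with known sd = 1."""
    n = len(sample)
    z = (sum(sample) / n - mu0) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

# 2,000 experiments in which the null hypothesis is true by construction:
# every observation is drawn from a standard normal with mean exactly mu0.
p_values = [z_test_p([random.gauss(0, 1) for _ in range(30)])
            for _ in range(2000)]

frac_wee = sum(p < 0.05 for p in p_values) / len(p_values)
print(f"fraction of 'significant' null experiments: {frac_wee:.3f}")
```

About 5% of these null experiments produce a "wee" p-value, which is the method working as designed, not angels singing.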
Where is Brian Ripley when you need him? He has the same confidence of mind and writing style to potentially provide a valid counterpoint to the issues brought up by William Briggs, and watching these two alpha males have a go at each other would be very entertaining and informative.

On another note, I wish ecological manuscripts were sometimes as personal in writing style, although I would not know what to do with a similarly written manuscript as an associate editor. I would definitely smile more when reviewing papers, that is for sure. Now we ecologists seem so boring.

Tuesday, January 10, 2012

Science and uncertainty

I just started teaching Community Ecology with Tom Nudds again, and one of the main themes of this course is exposing students to the importance of uncertainty in science (or Science, if you want). Today another insightful article by John Timmer in Ars Technica appeared, and provides a real-life example of this uncertainty.

It tells the story of Peter Duesberg, a scientist with an impressive academic history:
"He did pioneering work in the characterization of retroviruses (viruses that are transmitted using RNA as a genetic material, but then copied into DNA and inserted into their hosts' genome), helping to show that they could pick up genes from their host that enabled them to induce cancer. That work, extended by others, ultimately led to the oncogene hypothesis, which suggests that cancer is caused by mutations in a limited number of host genes that control cell growth. The work got Duesberg a tenured position at Berkeley and election to the National Academies of Science."
He also has a history of challenging the status quo. He was a co-signer of a letter in Science that questioned the link between HIV and AIDS. Other co-signers did not have the same scientific credentials:
 "But one signatory is an actuary; another wrote a biography of Duesberg; two are journalists. One of the journalists was the author of the Politically Incorrect Guide to Science, which also criticized the science of climate change and evolution. The biggest surprise was the presence of Phillip Johnson, the Berkeley Law professor who had by this point founded the Discovery Institute and helped develop its wedge strategy, a plan to replace science as it's currently practiced with something he found more theologically palatable."
The reason for this article was his contribution to another peer-reviewed publication on the potential absence of this link. It makes for an interesting read, but the real scientific discussion only starts with this publication. I am pretty sure that this manuscript was rejected by several other journals before it ended up in Medical Hypotheses. So now it is just a matter of waiting for those counterarguments to show up in the scientific literature and/or blogosphere.

In the meantime, this controversy can maybe form a starting point for a discussion on the importance of uncertainty in science for undergraduate students?

Monday, January 2, 2012

The Cottenie commandments - illustrated

It's been a long time since I focused on this, but I found some nice illustrations for some of them. The illustrations in this post by Maria Popova reminded me of the 10 Commandments. I looked through the other posters on the original website Advice to Sink in Slowly, and actually found an illustration for every commandment:

1. reveal curiosity/passion
Andy J Miller
2. tell a story
Mathew Isherwood

3. great dream
Lee Basford

4. fun
Dave Bain


5. vulnerability
Rebecca Cobb

6. connection
Simon Vince



7. concept map
Ryan Morgan
8. no usual shtick
Lily Trotter
9. metadata
Ellie Cryer


10. Intuition
Carys Williams