Taking time to learn

Growing exotic and native riparian trees

By Lindsay Reynolds, PhD

Blogging is a powerful avenue for communicating ideas, but 15 years ago the word didn’t even exist. And now some people are already asking: has blogging hit its peak? The first BlackBerry smartphone was introduced in 2002, and for a long time the only people I knew with smartphones were my friends in med school. Then the iPhone emerged in 2007, and now smartphones are ubiquitous. None of us needs to be reminded how much technology has changed our world, or of the breathtaking pace at which it continues to change. It is changing not only our social world and the way we communicate, but also how we do science. How do we keep up as scientists, and how do we figure out when and where to allocate time to learning new tools?

This is a question every scientist has to ask. It’s not a new problem, either; it’s as old as science itself. In the end, it has less to do with learning the shiniest new technology and more to do with maintaining the mindset of a constant learner throughout your career. Technology is the most glaring example these days of a subject where everyone does some self-teaching, but embracing the mindset of a constant learner also includes reading current literature, teaching yourself state-of-the-art statistical techniques, and, if your career takes a turn into a field where you have little experience, getting out a basic textbook in that field and doing some reading.

While we’re in grad school, we have the opportunity to take classes to learn most of the tools we need for our science. But once we meet our course requirements, we start writing and then eventually graduate, and it becomes hard to find the time to take an entire course. Thus, we have to shortcut the process and figure out how to teach ourselves new skills that will enhance our science. Taking the time to learn a new tool requires a careful cost-benefit analysis: will this tool help advance my science in an essential way?

One of my committee members used to preach about self-teaching. At the time, I didn’t fully grasp how meaningful his message was. Self-teach? Of course, isn’t that what everyone strives for? But now that time is more precious, I realize what he was trying to demonstrate. If you recognize that a tool will make your science more powerful and relevant, or if you see your field moving in a new direction, it is worth taking the time to learn. The same professor gave me material to teach myself the statistical software R and also basic Bayesian statistics. I never took a class in these subjects, but with a little self-teaching I became proficient. I’m still struggling to fully wrap my brain around Bayesian stats, but R has become an essential part of my repertoire. ArcGIS, various citation managers, a few computer programming languages, and basic hydrology are all examples of tools that I ended up teaching myself, and they are now essential to my work.

Solar-powered data loggers for a stream gage in the field

Investing time and money can be a risk. This past summer, I was hoping to implement field data collection on iPods so that my data would be entirely digital. I wanted to eliminate the time-consuming data-entry process and finally get rid of paper data sheets. So I bought all the necessary hardware (including some cool solar chargers!) but ended up not having time to develop or learn a data-collection app for the iPods. I fell back on paper data sheets, and my field techs spent two weeks at the end of the season entering data into our database. Self-teaching fail! I’m still hopeful that I can put those iPods to use next field season, though.

As scientists, we inherently have a tremendous capacity for (and usually a love of!) learning. But it is easy to get entrenched in our work and the niche we carve out for ourselves in our field, and to get frustrated when a new technique demands learning time. In many cases, senior scientists leave it to their grad students and postdocs to do the new learning and provide an avenue for new tools to enter their research program. Judging by how many productive scientists use it, this model can be effective. Is it the best model? I’m not sure. It can leave senior scientists at the mercy of colleagues who are more proficient in new techniques, and potentially left behind when the field advances with new tools.

Taking time to learn is important. It keeps us relevant. But it’s often painful to fork over the hours necessary to gain new tools. So it becomes a delicate balance: allocating time to learn when necessary, while still keeping up with the rest of one’s regular workload. But, in my experience, it is usually worth it.

Old-fashioned paper-and-pencil field data


4 thoughts on “Taking time to learn”

  1. Lindsay makes some very good points about self-teaching and building valuable research skills. I once had a boss who said that graduate school isn’t where you go to do earth-shattering science – it’s where you go to learn how to do science. As an early career scientist the questions matter, but engaging in, and even embracing, the process matters much, much more. There will always be a first time an individual has relatively complete control over his or her experimental design, data collection, management, and analysis. Moments of uncertainty will arise, but the researcher must prevail. Usually this coming of age arrives in some form of undergraduate research or graduate school. More often than not, when this scenario arrives, the stakes are high. In any first research experience, the objective is to create knowledge. This can be done efficiently, elegantly, and with contemporary methods, or it can be done sloppily, with old-school methods. Both approaches can and do work, but eventually the goal shifts from learning the ropes of doing research to doing it transparently, consistently collecting quality data, meeting deadlines, and serving an essential role within collaborative teams. I never knew what my old boss meant until I realized that it doesn’t matter whether a researcher’s first paper runs in Science; what matters is whether they have the skills, and the attitude to keep learning, that will propel them onward.

    While Lindsay outlines the value and challenges of self-teaching and learning new, integral skills, I prefer to focus on the opposite of self-learners: those who outsource certain parts of the research process rather than growing their skill sets. These are the individuals whom we’ll call “Henny Penny collaborators.” These aren’t the people who have spent time learning how to improve their efficiency, quantitative skills, programming, or data management. These are the people who haven’t turned GIS on since its inception, still use spreadsheets for everything, and think that new lab or field techniques are for someone else. These are the folks who don’t want to figure out how to fish, they just want grilled salmon for dinner. Eventually, their favor diminishes among their peers and their scientific potential diminishes as the field moves on. Much like ol’ Henny Penny, nobody wants to share the bread with someone who can’t reproduce the work – and perhaps doesn’t even know that it’s made of wheat. When these HPCs run out of questions that a one-way ANOVA can answer, or need to use novel methods to make inference sans grad students or skilled technicians, they might just be sunk.

    Making the time to teach oneself a statistical programming language may seem daunting, but the payoff is real. Self-teaching cultivates skills that help us do our work as ecologists more effectively, helping us ask bigger questions and answer them more elegantly than we otherwise could. These self-taught abilities are also the same in-demand, marketable skills that can help shape the trajectories of our careers. In my estimation, staying methodologically current and well read, even in one’s spare time, is time well spent. After all, chances are good that we’re going to be doing this ecology stuff for a while.

  2. What a great post – you really capture the balance between embracing new paths and not getting caught down too many dead ends! As someone who doesn’t shirk self-teaching (from database design to multivariate stats), I know from experience that for some skills I really would have been better off sucking it up and making time for a class, workshop, or webinar. A fundamental trick is deciding my approach for different skills: a combined tack often works best, with some self-teaching and some supported learning. I’ve learned the latter can include scheduled tutoring from colleagues, which can be really efficient and more fun (once I get over the blow to my ego at admitting I need help and/or feeling like I’m wasting their time!).

    And since Nate addressed important technological considerations in his reply, I’ll let myself go with some philosophical thoughts I had about this post. One of the saddest moments in my grad career was when I was talking stats with my dad (who got his M.S. 30 years ago), and he reminisced about doing regressions by hand – I realized that many of our approaches will be “antiquated” in a very short time. But the sad moment didn’t last long, because I realized that being a scientist is really about the process as much as the knowledge. In order for methods – statistical, technological, or otherwise – to become more elegant and efficient, they have to be tried out, tested in different contexts, and allowed to fail too (thank you Lindsay!). So along with contributing to knowledge through my research, I am always looking to also push technology and methods along or discover their limitations – this can take the sting out of finding myself down a “dead end”.

    For my own balance, I have adopted a couple of rules of thumb, and somehow I am not worried that they contradict each other. The first I mentioned above: embracing science as an evolving process and not being afraid to admit that what I’ve done in the past might be replaced with something better, simpler, or easier. The second (contradictory) rule is to always question whether a simpler approach will work. I was really surprised when my advisor (somewhat known for complex multivariate approaches and neural networks) recently pushed me toward one-way ANOVAs and simple regression models in some work. I took home the important lesson that complex isn’t always better – but to use Nate’s “Henny Penny” analogy, it’s important to educate myself about what different approaches offer, and choose what will best answer my questions, test my theories, or move the technology forward.

  3. This is a great post on a very relevant topic. It hits particularly close to home for me as I move deeper into the world of remote sensing and ecological modeling. I no longer collect field data; instead I base my research on processed satellite data, model runs, or data compiled from the literature. I have found that staying competitive in this field requires an iterative learning process where new tools are implemented as they are developed. In part, this is because productivity (i.e., publishing papers) is often limited by how efficiently large amounts of data can be processed into meaningful results. Currently, I am struggling to strike a balance between learning new skills (skills I am certain will improve my productivity) and maintaining a comfort zone (using familiar tools that may be less efficient). Often, I find myself taking on projects that require new, unfamiliar tools and learning as I go. While this approach works, it adds unnecessary anxiety, since hang-ups with new techniques can become very frustrating when working under time constraints. In the future, a goal of mine is to reserve more time for fun, exploratory exercises with new tools that allow me to learn new techniques while not “under the gun”. Does anyone else have experience dealing with this issue?

    I think this post and the previous comments do a nice job of establishing the importance of reserving time to learn from an “evolve or perish” perspective. However, I think an important counterpoint is that we are now at a point in science where an individual cannot have an understanding of every tool, model, or statistical technique of relevance to their work. Thus, another very important and relevant topic is knowing when and how to “outsource” or bring in collaborators. In graduate school, I have found the focus is on self-sufficiency, with very little attention paid to learning how to develop healthy, mutually beneficial collaborations. I would be very interested to hear from others regarding their perspective on developing collaborations.

  4. Another delayed response, but I’ve been mulling this over for a while.

    First off, great post, Lindsay! I have three general responses to the post and resulting comments.

    1. I totally feel your pain with the digital data collection vs. paper data sheets conundrum. You may remember my struggles with Palm Pilots and the EcoNab software – in the end I think going digital was totally worth the time investment, but there is that small part of me that still wants a paper backup. I did lose some data my first year due to equipment malfunction – it wasn’t a lot of data, but it was enough to scare me.

    2. I struggle with the balance between learning new techniques and getting projects out the door. Lauren is absolutely right to remind us to keep it as simple as possible while still doing a good job. I recently submitted a paper with what I thought was a beautiful analysis, only to have the reviewers tell me I had over-complicated it unnecessarily. I think my initial approach was justified, but ultimately I went back and re-did the analysis using more traditional approaches – which probably made the paper more accessible to a wider readership. So, there are trade-offs.

    3. I want to touch on the comments about building collaborations and Nate’s idea of “Henny Penny” collaborators, because I think they are related. Yes, there are good and bad collaborators, and no one wants to collaborate with folks who don’t pull their weight. But we can’t all learn everything, as Bill points out (I think that’s Bill?), so I think we absolutely need to make use of each other’s strengths. For example, I love collaborating with researchers who are experts in their field but maybe need a little help with some statistical analysis that I already know how to do, or can learn quickly. This can be a great opportunity to participate on a paper that doesn’t require that much investment on my part, and I get to learn something new about a specific application – that’s a win-win in my book.

    Developing good collaborations is critical for success in a research-oriented field – I think the first step is identifying what you have to offer that might be unique. That might be a willingness to learn, and the ability to do it quickly and efficiently, or it might be some specific in-demand skill you can bring to a larger project (GIS, stats, programming, graphic design, etc.). There is a lot more to think and write about on collaborations and how to develop them – perhaps a follow-up post is in order?
