By Lindsay Reynolds, PhD
Blogging is a powerful avenue our society has developed for communicating ideas, yet 15 years ago the word didn't even exist. And now some people are already asking: has blogging hit its peak? The first BlackBerry smartphone was introduced in 2002, and for a long time the only people I knew with smartphones were my friends in med school. Then the iPhone emerged in 2007, and now smartphones are ubiquitous. None of us need to be reminded how much technology has changed our world, or the breathtaking pace at which it continues to change. Not only has it changed our social world and the way we communicate, it has changed how we do science. How do we keep up as scientists, and how do we figure out when and where to allocate time to learning new tools?
This is a question every scientist has to ask. It's not a new problem, either; it's as old as science itself. In the end, it has less to do with learning the shiniest new technology and more to do with maintaining the mindset of a constant learner throughout your career. Technology is the most glaring example these days of a subject where everyone does some self-teaching, but embracing the mindset of a constant learner also includes reading the current literature, teaching yourself state-of-the-art statistical techniques, and, if your career takes a turn into a field where you have little experience, getting out a basic textbook in that field and doing some reading.
While we're in grad school, we have the opportunity to take classes to learn most of the tools we need for our science. But once we meet our course requirements, we start writing and eventually graduate, and it becomes hard to find the time for an entire course. Thus, we have to shortcut the process and figure out how to teach ourselves new skills that will enhance our science. Deciding whether to learn a new tool requires a careful cost-benefit analysis: will it advance my science in an essential way?
One of my committee members used to preach about self-teaching. At the time, I didn't fully grasp how meaningful his message was. Self-teach? Of course, isn't that what everyone strives for? But now that time is more precious, I realize what he was trying to convey. If you recognize that a tool will make your science more powerful and relevant, or if you see your field moving in a new direction, it is worth taking the time to learn. The same professor gave me material to teach myself the statistical software R and basic Bayesian statistics. I never took a class in either subject, but with a little self-teaching I became proficient. I'm still struggling to fully wrap my brain around Bayesian stats, but R has become an essential part of my repertoire. ArcGIS, various citation managers, a few programming languages, and basic hydrology are all tools I ended up teaching myself that are now essential to my work.
Investing time and money can be a risk. This past summer, I was hoping to implement field data collection on iPods so that my data would be entirely digital. I wanted to eliminate the time-consuming data-entry process and finally get rid of paper data sheets. So I bought all the necessary hardware (including some cool solar chargers!) but ended up not having time to develop or learn a data-collection app for the iPods. I fell back on paper data sheets, and my field techs spent two weeks at the end of the season entering data into our database. Self-teaching fail! I'm still hopeful that I can put those iPods to use next field season, though.
As scientists, we inherently have a tremendous capacity for (and usually a love of!) learning. But it is easy to get entrenched in our work and the niche we carve out for ourselves in our field, and to get frustrated when a new technique demands learning time. In many cases, senior scientists leave it to their grad students and postdocs to do the learning and provide an avenue for new tools to enter the research program. This model can clearly work, since many productive scientists rely on it. Is it the best model? I'm not sure. It can leave senior scientists at the mercy of colleagues who are more proficient in new techniques, and potentially left behind when the field advances with new tools.
Taking time to learn is important. It keeps us relevant. But it's often painful to fork over the hours necessary to gain new tools, so it becomes a delicate balance: allocating time to learn when necessary while still keeping up with the rest of one's regular workload. In my experience, though, it is usually worth it.