I heard someone in a radio news story the other day say something like, “Well, they were government employees, so of course they were naturally very risk averse.” Statements like that, along with a lot of others I hear about government inefficiency, always grab my attention, since I’ve been a government employee for going on 23 years. I’ve been working for the Federal Government in one capacity or another since I started as a volunteer biological tech in a high school mentorship program at age 15.

I read an article a while back in GovExec magazine, and a referenced study on the most innovative agencies to work for, that got me thinking about these issues. I’m also currently in a relatively senior leadership position that has me thinking quite a bit about the challenges we face moving forward as a result of an overall risk-averse posture and outright fear of failure. So, what is it about risk and failure that is so anathema within our government employee culture, and what can we do to promote a new cultural norm that rewards risk taking and encourages innovative thinking?

The GovExec article is titled, “If It’s So Hard to Fire Feds, Why Are So Many Federal Leaders Risk Averse?” It is hard to fire Federal employees, but it’s not impossible. I’ve seen it happen. The unfortunate thing is that, because it’s so hard, the most common way to deal with a problem employee is to move them along to some other job – sometimes with a promotion just to try to get them to a place that is “less harmful.” That sort of behavior, and all the wasted energy and inefficiency it introduces into an organization, is one of the things that drives me absolutely nuts about working in government.

However, the bright spots I’m involved in with some of my government colleagues, in both my own agency and other organizations in our field of earth systems science, make me want to do something about this problem at whatever scale I can manage. I have the pleasure of working with some of the most brilliant and innovative people, many of whom are long-time government scientists and other professionals. In many cases we’re at or near the leading edge of our fields in science and technology, and because we have the long-term mandate and ethos of making sound scientific data and information available to the world, we have developed solid software and cyberinfrastructure engineering practices working with cutting-edge technologies.

I’m fortunate to work in a science agency and lucky enough to be in a position where I am being supported in working to push some boundaries and develop a new data science program. In about 1998, I put together my first rudimentary web application that made a process more efficient in a regulatory agency – making lists of threatened and endangered species by county available online for organizations needing to potentially consult with the Federal government before proceeding with a possible adverse action. This was tangential to the job I trained for, but from that point I started developing what has become a lifelong passion: a more elegant blending of science and technology that leads toward more rapid advances in understanding complex earth systems, and toward applying that understanding to more informed decision-making in natural resource management, conservation planning, risk mitigation, and societal sustainability.

My motivation and drive have always been primarily internal. As at least an almost-scientist, I pursue the things I grow interested in over time, as long as they align closely enough with the mission parameters of the organization. I’m also fortunate to be in a position where I’m being asked to help explore the edges of that mission within legislated authority and develop a new science program that is already making some strides toward new, innovative ways of doing better science with advanced technologies.

But I also look back and see that I had mentors and leaders along the way who actively encouraged appropriate risk taking, allowed failure, and gave me space to learn from my mistakes. Part of that has come from practicing science, which by its nature is a risk taking enterprise. We don’t know if our hypotheses are going to be proven or disproven going into an experiment, but the research is designed to tell us something valuable either way it goes. But part of it also came from the particular kinds of supervisors and mentors I’ve had guiding me and the particular interplay of ideas and personalities I developed with them.

My current local organization is essentially working to figure out what it wants to be when it grows up. We brought together a couple of different groups with different backgrounds and cultures, and several of us leadership types have been working to position the organization strategically to make some major advances in our ability to do big, data-driven science. Less than about 10% of what we do today is actual scientific research, and we’re trying to grow that from ideas in a 10-year science strategy published last year that I was involved in and a new 3-5 year science plan for our immediate organization. Most of our strengths lie in software engineering, information technology, and data management methodologies. We’re hoping to develop more of a scientific research program while retaining and growing our engineering and operations capacity, so that we can implement actual on-the-ground capabilities from our research.

I feel like I’ve been beating my head against the wall for about the last two years with what seem to be some pretty major personality conflicts and culture clashes as we try to form a new organization. I’m coming to see that some of the biggest challenges for our group have to do with an increasingly obvious risk aversion and fear of failure that seem to be ingrained in our current culture, which brings me back to the whole point of this post and a series of explorations in this area. In science, I’ve been trained to get beyond the “what” questions to the “why” questions. The what questions seem fairly clear. We’re generally several years, if not a decade, behind the times in our thinking about technological methods of working with scientific data. We always seem to go with well-known, proven solutions that sit comfortably within our knowledge bank instead of digging a little deeper to find what might be a better and longer-lasting approach. We don’t try unknown things that might fail, even though many of the known things have already proven their deficiencies.

But why is that? Where does this overall perception come from that government employees are by their very nature risk averse? Why do my organization and the people in it sometimes seem to embody this issue? Despite being poised for some pretty cool success and advancement in the area I’ve always been passionate about, why are we seemingly afraid to take the next step?

If I can get to some answers for these questions, then perhaps I can start digging into ideas on what to do about it. How do leaders inspire risk taking while still keeping the wheels on an organization? How do they reward success but also reward lessons learned from failure? Can leaders promote new or enhanced internal motivations to innovate as well as provide the external motivators that seem to influence the majority of employees?

And on the grand scale, how do we begin, at least at a small scale, to shift the paradigm of government as risk averse and not very innovative? These are the questions I plan to pursue in the coming days or weeks through this blog, in an effort to get my head around what is really a pretty major problem for me right now.

People I work with might run across this public but personal opinion blog and might even be offended at some of the things I’m exploring. I’m okay with that. I’ll even be delighted by it if you will engage with me in the conversation or argue an adverse position with me.