This article originally appeared in Human Resource Executive.
Throughout my career, I have usually been the interviewer, but occasionally I’ve also been the interviewee. Whichever side of the table I have sat on over the past three decades, my least favorite question is always: What keeps you up at night?
I spent the early part of my career in healthcare, first working with Olympic and elite athletes, obese children, pregnant and post-natal women, and the apparently well general community, and then running industrial medicine programs. The only thing that keeps me up at night, from either a work or personal standpoint, is someone being hurt, suffering or dead on my watch. That’s the gift working in healthcare gives you—lifelong perspective.
So, for me, the sleepless night query is a lazy question. It’s generally code for something along the lines of: “What big problem are you working on?” or “What do you believe is the most challenging issue for employers today?”
Neither of those questions, however, stimulates me to give the interesting answer the person is seeking. But this one does: What distracts you during the day?
When I am trying to solve a problem for which there is no easy solution, almost everything I read or hear causes me to consider how that information might relate to my problem.
Whether working in healthcare, for an insurance company, consulting with employers or running a nonprofit, the basic and vexing problem I’m trying to solve is behavior change and how, ultimately, human beings evaluate and respond to risk.
Here’s what I’ve learned. Essentially, we’re baked and done at approximately 18 years of age. Around that time, the brain executes an efficiency review: any neural pathways you haven’t used are trimmed away. (We have also learned, however, that if you need a pathway back after a traumatic event such as a stroke, it can regrow with dedicated rehabilitation.)
What is the impact of this spring cleaning of the brain? The person you were as an 18-year-old is, in some ways, your setpoint. If you exercised, ate seven to 10 fruits and vegetables a day, were the right weight for your height, didn’t smoke, didn’t drink more than one alcoholic beverage a day, always wore a seatbelt, saved money for a rainy day, did the healthcare and dental visits recommended for someone your age, etc., you’re set up for a reasonably good physical and fiscal life. Even if you stray from these behaviors due to changing life circumstances, it is easier to get back to these habits because the neural framework is there to support you.
But if your 18-year-old self had some room for improvement, you can still change as you grow older and wiser. It will simply be more difficult to execute, since you will be working against your neural wiring.
So what does this background have to do with employee benefits?
The longer I work in and around employee benefits, the more I’ve come to appreciate that there are enormous advantages to health- and financial-benefit programs that a nation or an employer selects and pays for.
Unfortunately, most adults evaluate hazards differently than people who weigh risk for a living, such as HR executives, actuaries or me.
When Texas cattle producers sued Oprah Winfrey for creating “a lynch-mob mentality” among viewers during a 1998 episode on beef safety at the time of the mad-cow-disease scare, a risk-communications consultant named Peter Sandman described a formula for how people evaluate risk: Risk = Hazard + Outrage. Sandman wrote (bracketed words are mine):
“To the experts, risk means expected annual mortality [or financial ruin]. But to the public (and even the experts when they go home at night), risk means much more than that. Let’s redefine terms. Call the death rate (what [many] experts mean by risk) “hazard.” Call all the other factors, collectively, “outrage.” Risk, then, is the sum of hazard and outrage. The public pays too little attention to hazard; the experts pay absolutely no attention to outrage. Not surprisingly, they rank risks differently.”
During and following World War II, when most developed nations chose to provide their residents with healthcare and financial benefits, they unconsciously acknowledged our frailty as humans in evaluating risk.
On Jan. 11, 1944, President Franklin Delano Roosevelt attempted to persuade Congress and the nation of the value of a “second bill of rights” (also known as an economic bill of rights) during his State of the Union address. Included on the list were the right to adequate medical care—and the opportunity to achieve and enjoy good health—and the right to adequate protection from the economic fears of old age, sickness, accident and unemployment. Several of FDR’s rights were addressed in programs such as Social Security. But employers continued to bear some of the onus they picked up during World War II by not abandoning employee benefits such as healthcare coverage.
As employee-benefits costs—led by healthcare expenses and poor pension investments—began to impose precipitous financial consequences on businesses during the 20th century, employers began shifting costs to employees. (It’s a trend that continues today.) The advent of this change, coupled with the rise of cafeteria plans, put workers in the risk-assessment driver’s seat. They often made selections that make me shudder.
The latest information that has me losing some proverbial sleep is this: an analysis of economic research by a New York Times reporter showed that a trip to the hospital can mean a permanent reduction in income for a substantial fraction of Americans. Some people bounce right back, but many never work as much again.