2018 Employer Health Benefits Survey
The Kaiser Family Foundation (KFF) has conducted this annual survey of employer-sponsored health benefits since 1999. KFF works with NORC at the University of Chicago (NORC) and National Research, LLC (NR) to field and analyze the survey. From January to July 2018, NR completed telephone interviews with human resource and benefits managers at 2,160 firms.
The survey includes questions on the cost of health insurance, health benefit offer rates, coverage, eligibility, plan type enrollment, premium contributions, employee cost sharing, prescription drug benefits, retiree health benefits, and wellness benefits.
Firms that offer health benefits are asked about the plan attributes of their largest health maintenance organization (HMO), preferred provider organization (PPO), point-of-service (POS) plan, and high-deductible health plan with a savings option (HDHP/SO).4 We treat exclusive provider organizations (EPOs) and HMOs as one plan type and report the information under the banner of “HMO”. Since 2013, plan information for conventional (or indemnity) plans has been collected within the PPO battery. Less than one percent of firms that completed the PPO section had more enrollment in a conventional plan than in a PPO plan. Firms with 50 or more workers were asked: “Does your firm offer health benefits for current employees through a private or corporate exchange?” Employers were still asked for plan information about their HMO, PPO, POS, and HDHP/SO plans regardless of whether they purchased health benefits through a private exchange.
Firms are asked about the attributes of their current plans during the interview. Although the survey’s fielding period begins in January, many respondents have a plan year that lags behind the calendar year [Figure M.1]. In some cases, respondents may report the attributes of their 2017 plans, and some plan attributes (such as HSA deductible limits) may not meet the 2018 calendar-year regulatory requirements.
The universe is defined by the U.S. Census’ 2014 Statistics of U.S. Businesses (SUSB) for private firms and the 2012 Census of Governments (COG) for non-federal public employers. At the time of the sample design (December 2017), these data represented the most current information on the number of public and private firms nationwide with three or more workers. As in the past, the post-stratification is based on the most up-to-date Census data available (the 2015 SUSB). We determine the sample size based on the number of firms needed to ensure a target number of completes in six size categories.
We attempted to repeat interviews with prior years’ survey respondents (with at least ten employees) who participated in either the 2016 or the 2017 survey, or both. Firms with 3-9 employees are not included in the panel to minimize the impact of panel effects. As a result, 1,529 of the 2,160 firms that completed the full survey also participated in either the 2016 or 2017 surveys, or both. In total, 150 firms participated in 2016 only, 416 firms participated in 2017 only, and 963 firms participated in both 2016 and 2017. In addition, non-panel firms are randomly selected to participate in the survey.
Since 2010, the sample has been drawn from a Research Now SSI list (based on an original Dun and Bradstreet list) of the nation’s private employers and the COG for public employers. To increase precision, we stratified the sample by ten industry categories and six size categories. The federal government and businesses with fewer than three employees are not included in the sample frame. Education is treated as a separate category for sampling purposes, rather than as a subgroup of the Service category. Education is also controlled for during post-stratification, and controlling for it in the sampling frame allows for a more accurate representation of both the Education and Service industries. For information on changes to the sampling methods over time, please consult the Survey Design and Methods Sections of prior Employer Health Benefits Surveys (https://www.kff.org/health-costs/report/employer-health-benefits-annual-survey-archives/).
Response rates are calculated using a CASRO method, which excludes firms determined to be ineligible from the calculation. The overall response rate is 32% [Figure M.2].5 The response rate for panel firms is higher than the response rate for non-panel firms [Figure M.2]. Similar to other employer and household surveys, the Employer Health Benefits Survey has seen a general decrease in response rates over time. Since 2017, we have attempted to increase the number of completes by increasing the number of non-panel firms in the sample. While this generally increases the precision of estimates by ensuring a sufficient number of respondents in various sub-groups, it has the effect of reducing the overall response rate.
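The CASRO-style calculation can be sketched as follows. The counts below are hypothetical, and `eligibility_rate` (the share of unknown-eligibility firms assumed to be eligible) is an illustrative assumption, not a figure from the survey.

```python
def casro_response_rate(completes, refusals, unknown, eligibility_rate):
    """CASRO-style response rate: completed interviews divided by the
    estimated number of eligible firms. Firms determined to be
    ineligible are excluded entirely; firms of unknown eligibility
    contribute only the fraction assumed to be eligible."""
    estimated_eligible = completes + refusals + unknown * eligibility_rate
    return completes / estimated_eligible

# Illustrative only -- these counts are not the survey's actual tallies.
rate = casro_response_rate(completes=2160, refusals=4000,
                           unknown=1000, eligibility_rate=0.75)
```

Under these made-up inputs the rate works out to roughly 0.31, in the neighborhood of the 32% reported above.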
To increase response rates, some firms with 3-9 employees were offered an incentive for participating in the survey. A third of these firms were sent a $5 Starbucks gift card in the advance letter, a third were offered both a $5 Starbucks gift card in the advance letter and an incentive of $25 in cash or as a donation to a charity of their choice after completing the full survey, and a third of firms were offered no incentive at all. Our analysis does not show significant differences in responses to key variables or participation rates among these incentive groups.
The vast majority of questions are asked only of firms that offer health benefits. A total of 1,872 of the 2,160 responding firms indicated they offered health benefits. The response rate for firms that offer health benefits is also 32%.
We asked one question of all firms in the study with which we made phone contact but where the firm declined to participate. The question was A6: “Does your company offer a health insurance program as a benefit to any of your employees?”. A total of 4,070 firms responded to this question (including 2,160 who responded to the full survey and 1,910 who responded to this one question). These responses are included in our estimates of the percentage of firms offering health benefits.6 The response rate for this question is 60% [Figure M.2].
FIRM SIZE CATEGORIES AND KEY DEFINITIONS
Throughout the report, figures categorize data by size of firm, region, and industry. Unless otherwise specified, firm size definitions are as follows: small firms: 3 to 199 workers; and large firms: 200 or more workers. Figure M.3 shows selected characteristics of the survey sample. A firm’s primary industry classification is determined from Research Now SSI’s designation on the sampling frame and is based on the U.S. Census Bureau’s North American Industry Classification System (NAICS). A firm’s ownership category and other firm characteristics used in figures such as 3.5 and 6.18 are based on respondents’ answers. While there is considerable overlap in firms in the “State/Local Government” industry category and those in the “public” ownership category, they are not identical. For example, public school districts are included in the service industry even though they are publicly owned.
Figure M.4 presents the breakdown of states into regions and is based on the U.S. Census Bureau’s categorizations. State-level data are not reported, both because the sample size is insufficient in many states and because we collect information only on a firm’s primary location rather than on where all of its workers are actually employed. Some mid-size and large employers have employees in more than one state, so the location of the headquarters may not match the location of the plan for which we collected premium information.
Figure M.5 displays the distribution of the nation’s firms, workers, and covered workers (employees receiving coverage from their employer). Among the three million firms nationally, approximately 60.1% employ 3 to 9 workers; such firms employ 7.6% of workers and account for 3.4% of workers covered by health insurance. In contrast, less than one percent of firms employ 5,000 or more workers; these firms employ 35.8% of workers and 39.5% of covered workers. Therefore, the smallest firms dominate any statistics weighted by the number of employers. For this reason, most statistics about firms are broken out by size categories. In contrast, firms with 1,000 or more workers are the most influential employer group in calculating statistics regarding covered workers, since they employ the largest percentage of the nation’s workforce. Statistics among small firms and those weighted by the number of firms tend to have more variability.
Throughout this report, we use the term “in-network” to refer to services received from a preferred provider. Family coverage is defined as health coverage for a family of four. Definitions of the health plan types are available in Section 4, and a detailed explanation of the HDHP/SO plan type is in Section 8.
The survey asks firms what percentage of their employees earn more or less than a specified amount in order to identify the portion of a firm’s workforce that has relatively lower or higher wages. This year, the income threshold is $25,000 per year for lower-wage workers and $62,000 for higher-wage workers. These thresholds are based on the 25th and 75th percentile of workers’ earnings as reported by the Bureau of Labor Statistics using data from the Occupational Employment Statistics (OES) (2017).7 The cutoffs were inflation-adjusted and rounded to the nearest thousand. Prior to 2013, wage cutoffs were calculated using the now-eliminated National Compensation Survey.
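The inflation-and-rounding step can be sketched as below; the function name and the numeric inputs are illustrative, not the BLS figures actually used.

```python
def wage_cutoff(percentile_wage, inflation_factor):
    """Inflation-adjust a percentile wage and round to the nearest
    thousand dollars, mirroring how the survey's low- and high-wage
    thresholds are derived from OES earnings percentiles."""
    return round(percentile_wage * inflation_factor / 1000) * 1000

# Hypothetical inputs: a 25th-percentile wage of $24,400 grown by 2.5%.
low_wage_threshold = wage_cutoff(24400, 1.025)  # -> 25000
```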
ROUNDING AND IMPUTATION
Some figures in the report do not sum to totals due to rounding. In a few cases, numbers from distribution figures may not add to the numbers referenced in the text due to rounding. Although overall totals and totals for size and industry are statistically valid, some breakdowns may not be available due to limited sample sizes or a high relative standard error. Where the unweighted sample size is fewer than 30 observations, figures include the notation “NSD” (Not Sufficient Data). Estimates with high relative standard errors are reviewed and in some cases not published. Many breakouts of subsets have large standard errors, meaning that even seemingly large differences may not be statistically significant.
To control for item nonresponse bias, we impute values that are missing for most variables in the survey. On average, 5% of observations are imputed. All variables are imputed following a hotdeck approach. The hotdeck approach replaces missing information with observed values from a firm similar in size and industry to the firm for which data are missing. In 2018, there were eleven variables where the imputation rate exceeded 20%; most of these cases were for individual plan level statistics. When aggregate variables were constructed for all of the plans, the imputation rate was usually much lower. There are a few variables that we have decided not to impute; these are typically variables where “don’t know” is considered a valid response option. Some variables are imputed based on their relationship to each other. For example, if a firm provided a worker contribution for family coverage but no premium information, a ratio between the family premium and family contribution was imputed and then the family premium was calculated. In addition, there are several variables in which missing data are calculated based on respondents’ answers to other questions (for example, employer contributions to premiums are calculated from the respondent’s premium and the worker contribution to premiums).
Since 2014, we have estimated separate single and family coverage premiums for firms that report premium amounts only as the average cost for all covered workers, rather than separately for single and family coverage. Fewer than one percent of covered workers are enrolled at firms affected by this adjustment; for them, the method more accurately apportions the total cost between the two coverage types.
To ensure data accuracy we have several processes to review outliers and illogical responses. Every year several hundred firms are called back to confirm or correct responses. In some cases, answers are edited based on responses to open-ended questions or based on established logic rules.
Because we select firms randomly, it is possible through the use of statistical weights to extrapolate the results to national (as well as firm size, regional, and industry) averages. These weights allow us to present findings based on the number of workers covered by health plans, the number of total workers, and the number of firms. In general, findings in dollar amounts (such as premiums, worker contributions, and cost sharing) are weighted by covered workers. Other estimates, such as the offer rate, are weighted by firms. Specific weights were created to analyze the HDHP/SO plans that are offered with a Health Reimbursement Arrangement (HRA) or that are Health Savings Account (HSA)-qualified. These weights represent the proportion of employees enrolled in each of these arrangements.
Calculation of the weights follows a common approach. The employer weight was determined by calculating the firm’s probability of selection. This weight was adjusted for nonresponse bias and trimmed of overly influential weights. Finally, we calibrated the weights to U.S. Census Bureau’s 2015 Statistics of U.S. Businesses for firms in the private sector, and the 2012 Census of Governments as the basis for calibration / post-stratification for public sector firms. Historic employer-weighted statistics were updated in 2011. The worker weight was calculated by multiplying the employer weight by the number of workers at the firm and then following the same weight adjustment process described above. The covered-worker weight and the plan-specific weights were calculated by multiplying the percentage of workers enrolled in each of the plan types by the firm’s worker weight. These weights allow analyses of all workers covered by health benefits and of workers in a particular type of health plan.
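The weighting chain described above can be sketched as follows; the values are hypothetical, and the nonresponse adjustment, trimming, and calibration steps are omitted.

```python
def employer_weight(selection_probability):
    """Base employer weight: the inverse of the firm's probability of
    selection (before nonresponse adjustment and calibration)."""
    return 1.0 / selection_probability

def worker_weight(emp_weight, num_workers):
    """Worker weight: employer weight times the firm's worker count."""
    return emp_weight * num_workers

def covered_worker_weight(wrk_weight, share_enrolled):
    """Plan-specific weight: the share of the firm's workers enrolled
    in a given plan type times the worker weight."""
    return wrk_weight * share_enrolled

# A firm sampled with probability 0.1, employing 50 workers,
# with 40% of them enrolled in its PPO:
w_emp = employer_weight(0.1)               # 10.0
w_wrk = worker_weight(w_emp, 50)           # 500.0
w_ppo = covered_worker_weight(w_wrk, 0.4)  # 200.0
```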
The trimming procedure follows these steps: first, we group observations into firm-size and offer-status strata. Within each stratum, we identify the median and the interquartile range of the weights and calculate the trimming cut point as the median plus six times the interquartile range (M + [6 * IQR]). Weight values larger than this cut point are trimmed to the cut point. In all instances, very few weight values were trimmed.
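A sketch of the trimming rule applied within a single stratum; the weight values are made up for illustration.

```python
import statistics

def trim_weights(weights):
    """Trim weights within a stratum at the median plus six times the
    interquartile range (M + [6 * IQR]); weights above the cut point
    are set to the cut point."""
    q1, median, q3 = statistics.quantiles(weights, n=4)
    cut_point = median + 6 * (q3 - q1)
    return [min(w, cut_point) for w in weights]

# A single outlying weight of 100 is pulled down to the cut point.
trimmed = trim_weights([1, 2, 2, 2, 2, 2, 2, 2, 3, 100])
```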
As in past years, we conducted a small follow-up survey of those firms with 3-49 workers that refused to participate in the full survey. Based on the results of a McNemar test, we were not able to verify that the results of the follow-up survey were comparable to the results from the original survey, and weights were not adjusted using the nonresponse adjustment process described in previous years’ methods. In 2010, 2015, and 2017, the results of the McNemar test were also significant and we did not conduct a nonresponse adjustment.
The survey collects information on primary and specialty care physician office visits for each plan type. Different plan types at the same firm may have different cost-sharing structures (e.g., copayments or coinsurance). Because composite variables (which combine data across all plan types) reflect only those plans that include a given provision, separate weights were created for the relevant variables to account for the fact that not all covered workers have such provisions.
To account for design effects, the statistical computing package R and the library package “survey” were used to calculate standard errors.
All statistical tests are performed at the .05 significance level. For figures with multiple years, statistical tests are conducted for each year against the previous year shown, unless otherwise noted. No statistical tests are conducted for years prior to 1999.
Statistical tests for a given subgroup (firms with 25-49 workers, for instance) are tested against all other firm sizes not included in that subgroup (all firm sizes NOT including firms with 25-49 workers, in this example). Tests are done similarly for region and industry; for example, Northeast is compared to all firms NOT in the Northeast (an aggregate of firms in the Midwest, South, and West). However, statistical tests for estimates compared across plan types (for example, average premiums in PPOs) are tested against the “All Plans” estimate. In some cases, we also test plan-specific estimates against similar estimates for other plan types (for example, single and family premiums for HDHP/SOs against single and family premiums for HMO, PPO, and POS plans); these are noted specifically in the text. The two types of statistical tests performed are the t-test and the Wald test. The small number of observations for some variables resulted in large variability around the point estimates. These observations sometimes carry large weights, primarily for small firms. The reader should be cautioned that these influential weights may result in large movements in point estimates from year to year; however, these movements are often not statistically significant. Standard Errors for most key statistics are available in a technical supplement available at http://www.kff.org/ehbs.
In light of a number of regulatory changes and policy proposals, we included new questions on the anticipated effects of the ACA’s individual mandate penalty repeal on the firm’s health benefits offerings, and the impact of the delay of the high cost plan tax, also known as the Cadillac tax, on the firm’s health benefits decisions. Also new in 2018 are questions asking about smaller firms’ use of level-funded premium plans, an alternative self-funding method with integrated stop loss coverage and a fixed monthly premium.
The 2018 survey also expands on retiree health benefits questions, asking firms about cost reduction strategies, whether they contribute to the cost of coverage, and how retiree benefits are offered (e.g., through a Medicare Advantage contract, a traditional employer plan, private exchange, etc.).
In 2018, we moved the battery of worker demographics questions from near the beginning of the survey to the end of the survey in an effort to improve the flow. There is no evidence that this move has impacted our survey findings and we will continue to monitor any suspected impacts.
Starting in 2018, we allowed respondents who did not know the combined maximum incentive or penalty an employee could receive for health screening and/or wellness and health promotion to answer a categorical question with specified ranges. This method is consistent with how we handle the percent of low-wage and high-wage workers at a firm. In 2018, 18% of respondents did not know the dollar value of their incentive or penalty, and 39% were able to estimate a range.
Starting in 2018, the survey began asking small firms that indicated their plan was fully insured whether the plan was level-funded. In a level-funded plan, employers make a set payment each month to an insurer or third-party administrator, which funds a reserve account for claims, administrative costs, and premiums for stop-loss coverage. These plans are often integrated, and firms may not understand the complexities of the self-funded mechanisms underlying them. Some small employers who indicate that their plan is self-funded may also offer a plan that meets this definition. Respondents offering level-funded plans were asked about any attachment points applying to enrollees. These firms were not less likely to answer this question, and including them does not substantially change the average. Prior to 2018, all firms reporting coverage as underwritten by an insurer were excluded from the stop-loss calculations.
For prescription drug coverage, similar to years past, if the firm reports that the worker pays the full cost for drugs on a particular tier and/or that the plan only offers access to a discount program, we do not consider this as offering coverage for that drug tier. Hospital, outpatient surgery, and prescription drug cost sharing were asked only about a firm’s largest plan type.
The response option choices for the type of incentive or penalty for completing biometric screening or a health risk assessment changed from 2017 to 2018.
Values below 3% are not shown on graphical figures to improve the readability of those graphs. The underlying data for all estimates presented in graphs are available at http://www.kff.org/ehbs.
Annual inflation estimates are usually calculated from April to April. The 12-month percentage change from May to May was 2.5%.8
Data in this report focus primarily on findings from surveys conducted and authored by the Kaiser Family Foundation since 1999. Between 1999 and 2017, the Health Research & Educational Trust (HRET) co-authored this survey. HRET’s divestiture had no impact on our survey methods, which remain the same as in years past. Prior to 1999, the survey was conducted by the Health Insurance Association of America (HIAA) and KPMG using a similar survey instrument, but data are not available for all the intervening years. Following the survey’s introduction in 1987, the HIAA conducted the survey through 1990, but some data are not available for analysis. KPMG conducted the survey from 1991-1998. However, in 1991, 1992, 1994, and 1997, only larger firms were sampled. In 1993, 1995, 1996, and 1998, KPMG interviewed both large and small firms. In 1998, KPMG divested itself of its Compensation and Benefits Practice, and part of that divestiture included donating the annual survey of health benefits to HRET.
This report uses historical data from the 1993, 1996, and 1998 KPMG Surveys of Employer-Sponsored Health Benefits and the 1999-2017 Kaiser/HRET Survey of Employer-Sponsored Health Benefits. For a longer-term perspective, we also use the 1988 survey of the nation’s employers conducted by the HIAA, on which the KPMG and KFF surveys are based. The survey designs for the three surveys are similar.
Additional information on the 2018 Employer Health Benefits Survey is available at http://www.kff.org/ehbs, including an article in the journal Health Affairs, an interactive graphic, historical reports, and a technical supplement. Researchers may also request a public use dataset here: https://www.kff.org/contact-us/
Published: October 3rd, 2018. Last Updated: September 27, 2018.
- HDHP/SO includes high-deductible health plans with a deductible of at least $1,000 for single coverage and $2,000 for family coverage that offer either a Health Reimbursement Arrangement (HRA) or a Health Savings Account (HSA). Although HRAs can be offered along with a health plan that is not an HDHP, the survey collected information only on HRAs offered along with HDHPs. For specific definitions of HDHPs, HRAs, and HSAs, see the introduction to Section 8.
- Response rate estimates are calculated by dividing the number of completed interviews by the sum of completes, refusals, and the share of firms with unknown eligibility estimated to be eligible. Firms determined to be ineligible to complete the survey are not included in the response rate calculation.
- Estimates presented in Figures 2.1, 2.2, 2.3, 2.4, 2.5, and 2.6 are based on the sample of both firms that completed the entire survey and those that answered just one question about whether they offer health benefits.
- General information on the OES can be found at http://www.bls.gov/oes/oes_emp.htm#scope. A comparison between the OES and the NCS is available at https://www.bls.gov/opub/mlr/2013/article/lettau-zamora.htm
- Bureau of Labor Statistics, Consumer Price Index, U.S. City Average of Annual Inflation, 1998-2018 (cited 2018 July 20). https://beta.bls.gov/dataViewer/view/timeseries/CUUR0000SA0.