When you pay an insurance company to (whatever they do), they invest the profits to hire and train more marketers and salesmen. The benefit society receives from insurance companies is lying commercials, lying campaigns, and lying salesmen. Health insurance doesn't have jack squat to do with health care. Insurance is a financial product to protect assets. It's a product to benefit people who have assets. It's ridiculous for people who have very little or no assets. The insurance companies don't spin it that way.
I never purchased health insurance, but I would have purchased it if my net worth ever warranted it. Wealthy people who need health insurance are sophisticated enough to know the profit motive that exists. The relationship got all jacked up when poor people started buying insurance and making unrealistic demands. The insurance companies got greedy and started marketing their product to people who should never have been buying it.
I hate health insurance as a product and as an industry, but I never bought it. It wasn't my concern. That doesn't mean I'm a moron who thinks health insurance exists to finance everyone's personal tragedy.