3/2/2023

Algorithms we can't understand can make insurance fairer

Insurance is the business of assessing risks and pricing policies to match. As no two people are entirely alike, that means treating different people differently. But how do you segment people without discriminating unfairly?

Thankfully, no insurer will ever use membership in a "protected class" (race, gender, religion…) as a pricing factor. It's illegal, unethical, and unprofitable. But while that sounds like the end of the matter, it's not.

Take your garden-variety 'credit score.' Credit scores are derived from objective data that don't include race, and they are highly predictive of insurance losses. Indeed, most regulators allow the use of credit-based insurance scores, and in the US these can impact your premiums by up to 288%. What's not to like?

But it turns out there is something not to like: credit scores are also highly predictive of skin color, acting in effect as a proxy for race. For this reason, California, Massachusetts, and Maryland don't allow insurance pricing based on credit scores.

Reasonable people may disagree on whether credit scores discriminate fairly or unfairly – and we can have that debate because we can all get our heads around the question at hand. Credit scores are a 3-digit number, derived from a static formula that weighs 5 self-explanatory factors.

But in the era of big data and artificial intelligence, all that could change. AI crushes humans at chess, for example, because it uses algorithms that no human could create, and none fully understand. The AI encodes its own fabulously intricate instructions, using billions of data points to train its machine learning engine. Every time it plays (and it plays millions of times a day), the machine learns, and the algorithm morphs.

What happens when those capabilities are harnessed for assessing risk and pricing insurance? Many fear that such "black box" systems will make matters worse, producing the kind of proxies for race that credit scores do, but without giving us the ability to scrutinize and regulate them. If 5 factors mimic race unwittingly, they say, imagine how much worse it will be in the era of big data!

But while it's easy to be alarmist, machine learning and big data are more likely to solve the 'credit score problem' than to compound it. You see, problems that arise while using 5 factors aren't multiplied by millions of data points – they are divided by them.

To understand why, let's think about the process of using data to segment – or 'discriminate' – as evolving in 3 phases. In Phase 1, there is no segmentation at all: everyone pays the same price. This was commonplace in insurance until the 18th century.

One problem with Phase 1 is that people who are more thoughtful and careful are made to subsidize those who are more thoughtless and careless. Phase 1 avoids discriminating based on race, ethnicity, gender, religion or anything else for that matter, but that doesn't make it fair, practical, or even legal.
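The claim that problems arising from 5 factors are divided, not multiplied, by millions of data points can be sketched with a toy simulation. Everything here is an illustrative assumption, not the article's method: a single "proxy" feature is shifted by a hidden group attribute, the remaining features are group-independent, and the "pricing model" is a deliberately simple equal-weight average. The point is only to show the dilution effect: as weight spreads over more factors, the price's correlation with the hidden attribute shrinks.

```python
# Toy simulation (illustrative assumptions, not from the article):
# one feature acts as a proxy for a hidden group attribute; as the
# number of rating factors grows, the proxy's pull on the final
# price is diluted rather than amplified.
import random
import statistics


def correlation(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def simulate(n_factors, n_people=10_000, seed=0):
    """Return |correlation| between prices and the hidden attribute."""
    rng = random.Random(seed)
    prices, groups = [], []
    for _ in range(n_people):
        group = rng.choice([0, 1])  # hidden protected attribute
        # Factor 0 is the proxy: its mean is shifted by group membership.
        factors = [rng.gauss(0.5 * group, 1.0)]
        # Remaining factors are legitimate, group-independent signals.
        factors += [rng.gauss(0.0, 1.0) for _ in range(n_factors - 1)]
        # Deliberately naive pricing model: equal-weight average.
        prices.append(sum(factors) / n_factors)
        groups.append(group)
    return abs(correlation(prices, groups))


few = simulate(5)     # a credit-score-like handful of factors
many = simulate(200)  # a big-data-like spread of factors
assert many < few     # the proxy's effect is diluted, not compounded
```

A fitted model could of course re-weight the proxy upward, which is exactly why the article's later phases (and regulatory scrutiny of black-box pricing) matter; this sketch only illustrates the arithmetic of dilution.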