Every state in the United States, with the exception of New Hampshire, requires drivers to carry some form of insurance. The specific requirements vary from state to state, but some minimum coverage is always mandatory. Car insurance is required primarily to protect others in the event of an accident, but its ultimate purpose is to protect you financially.