by Felix De Los Santos
When buying a car, you should never drive it away from the dealership without car insurance. Not only is this illegal, it also exposes you to significant personal liability. So when should you get car insurance: before you arrive at the dealership, or while you are there?