How do car dealerships make money on financing?
Buying a car in the United States often involves more than choosing a model and color. For most Americans, financing plays a major role in the purchase decision. Car dealerships are not just selling vehicles; they are also selling financial products. This surprises many buyers who assume the dealer profits only from the price of the car itself.