I'm buying a car at the weekend and deliberating over whether to take out the warranty. Is this just an 'added extra' that I don't need?
The dealer's warranty adds quite a lot to the overall cost. I'm not sure whether I need it, but I don't want to end up paying the price later for turning it down. Any thoughts?