An interesting but seldom-addressed topic here is that of tolerance analysis and calibration.
It's definitely true that, in the first-order understanding of this process, shifting and scaling your actual range of interest (12V plus or minus a few volts) to fill your ADC's full-scale range (0-3.3V) will give you the best resolution.
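Just to put numbers on that first-order picture (the 12-bit ADC and the 9-15V window below are my assumptions, purely for illustration):

```python
# Illustrative resolution arithmetic. Assumed: 12-bit ADC, 3.3 V
# reference, and a 9-15 V window of interest around the 12 V nominal.
VREF = 3.3          # ADC full-scale voltage
COUNTS = 2**12      # 12-bit converter

# Option A: plain divider scaling 0-16.5 V down to 0-3.3 V (ratio 1/5).
lsb_plain = (VREF * 5) / COUNTS     # ~4.0 mV per LSB at the battery

# Option B: AFE shifting/scaling 9-15 V onto the full 0-3.3 V span.
lsb_afe = (15.0 - 9.0) / COUNTS     # ~1.5 mV per LSB at the battery

print(f"plain divider:   {lsb_plain * 1e3:.2f} mV/LSB")
print(f"offset+gain AFE: {lsb_afe * 1e3:.2f} mV/LSB")
```

So yes, the fancier front-end buys you finer resolution. The catch is what follows.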
What's sometimes lost is that (especially for "DC" measurements) accuracy is not limited by resolution at all, but by the analog front-end doing all that shifting and scaling. AFEs like these can easily introduce orders of magnitude more gain/offset error than whatever quantization error they remove. Even a simple voltage divider made of two 0.1% resistors can introduce up to 0.2% error at the ADC pin, when the two resistors land at opposite ends of their tolerance bands. That's one part in 500; quite a lot compared to a 13-bit ADC where an LSB is 1 part in 8000. And there are worse sources of DC error here than just the resistors.
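If you want to see where a figure like that comes from, a quick sweep of the tolerance corners does it. The resistor values here are hypothetical, and note that the 0.2% bound is only approached at large division ratios:

```python
from itertools import product

# Worst-case gain error of a resistive divider, from tolerance corners.
# Hypothetical values chosen to drop ~12 V to ~3.3 V; any 0.1% pair
# behaves similarly.
R1_NOM, R2_NOM = 26.1e3, 10.0e3   # top and bottom resistors
TOL = 0.001                        # 0.1% tolerance

nominal = R2_NOM / (R1_NOM + R2_NOM)

worst = 0.0
for s1, s2 in product((-TOL, +TOL), repeat=2):
    r1 = R1_NOM * (1 + s1)
    r2 = R2_NOM * (1 + s2)
    ratio = r2 / (r1 + r2)
    worst = max(worst, abs(ratio - nominal) / nominal)

print(f"nominal ratio {nominal:.4f}, worst-case gain error {worst * 100:.3f} %")
```

For this particular ratio the corners give about 0.14%, still more than an order of magnitude above that 13-bit LSB.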
As your AFE grows in complexity, so does the DC error it introduces. Calibration after assembly can address these issues (a sketch of it follows), but the full treatment is another topic. The difference between this kind of error analysis at DC and at AC is another topic still.
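To gesture at what that looks like, here's a minimal two-point gain/offset calibration; the names and numbers are hypothetical. You apply two known battery voltages at production test, record the raw ADC codes, and store the fitted constants:

```python
# Minimal two-point calibration sketch, assuming two known battery
# voltages (from a calibrated source) can be applied at production test.

def fit_two_point(code_lo, volts_lo, code_hi, volts_hi):
    """Return (gain, offset) such that volts = gain * code + offset."""
    gain = (volts_hi - volts_lo) / (code_hi - code_lo)
    offset = volts_lo - gain * code_lo
    return gain, offset

def battery_volts(code, gain, offset):
    """Convert a raw ADC code to battery voltage using stored constants."""
    return gain * code + offset

# Example: raw codes recorded at 10.000 V and 14.000 V during test.
gain, offset = fit_two_point(2483, 10.000, 3476, 14.000)
print(battery_volts(2980, gain, offset))   # ~12.0 V
```

The nice property is that one pair of stored constants absorbs the gain and offset errors of the entire signal chain at once: divider, buffer, and reference.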
My general advice for improving performance here is to do nearly the opposite of what you're doing. Keep your AFE as simple as you possibly can, so that your ADC is measuring something as close to the real truth as possible. Maximizing full-scale range won't help anything if the hardware doing it gives you 4% error at the ADC input. Remember, you're not interested in measuring the voltage at the ADC accurately; you're interested in what that measurement can tell you about the battery voltage.
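To put the whole trade-off in numbers, a back-of-envelope budget (worst-case sums; the 0.2% and 4% figures are from above, while the 12-bit ADC and voltage spans are assumed for illustration):

```python
# Back-of-envelope DC error budgets referred to a 12 V battery.
COUNTS = 2**12

# Simple divider: 0-16.5 V mapped onto 0-3.3 V, with 0.2% gain error.
quant_simple = 16.5 / COUNTS / 2         # +/- half an LSB, ~2 mV
gain_simple = 0.002 * 12.0               # 24 mV of divider error

# Complex AFE: 9-15 V mapped onto 0-3.3 V, with 4% error at the ADC input.
quant_afe = (15.0 - 9.0) / COUNTS / 2    # ~0.7 mV
gain_afe = 0.04 * 12.0                   # 480 mV of front-end error

print(f"simple divider: +/-{(quant_simple + gain_simple) * 1e3:.0f} mV")
print(f"complex AFE:    +/-{(quant_afe + gain_afe) * 1e3:.0f} mV")
```

The fancy AFE wins on quantization by a couple of millivolts and loses on accuracy by nearly half a volt.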