Parameter estimation and model order selection for linear regression models are two classical problems. In this article we derive the
minimum mean-square error (MMSE) parameter estimate for a linear regression model with unknown order. We call the resulting estimator the Bayesian Parameter estimation Method (BPM).
We also derive the model order selection rule that maximizes the probability of selecting the correct model. The rule is denoted BOSS---Bayesian Order Selection Strategy. The estimators have several advantages: they satisfy certain optimality criteria, they are non-asymptotic, and they have low computational complexity. We also derive ``empirical Bayesian'' versions of BPM and BOSS, which require neither prior knowledge nor the choice of any ``user parameters''. We show that our estimators outperform several classical methods, including AIC and BIC for order selection.
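Since the abstract does not reproduce the BPM/BOSS formulas, the sketch below only illustrates the baseline problem being addressed: selecting the order of a linear regression model, here with the standard AIC and BIC rules and a crude BIC-based approximation to the posterior over the order. The design matrix, noise level, and flat prior on the order are assumptions made for this example; this is not the paper's BPM or BOSS estimator.

\begin{verbatim}
# Illustrative sketch (not the BPM/BOSS estimators from the paper): order
# selection for a linear regression model y = X_k theta_k + noise, comparing
# AIC and BIC over nested candidate orders. All constants are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 50                        # number of samples
true_order = 3                # true number of regressors
max_order = 8                 # largest candidate order

# Nested regressors: columns of a polynomial-style design matrix (assumption).
t = np.linspace(-1.0, 1.0, n)
X_full = np.vander(t, max_order, increasing=True)   # columns 1, t, t^2, ...
theta_true = np.array([1.0, -2.0, 0.5])
y = X_full[:, :true_order] @ theta_true + 0.2 * rng.standard_normal(n)

def rss_of_order(k):
    """Least-squares fit using the first k regressors; returns the residual sum of squares."""
    X = X_full[:, :k]
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ theta) ** 2)

orders = np.arange(1, max_order + 1)
aic, bic = [], []
for k in orders:
    rss = rss_of_order(k)
    sigma2_hat = rss / n                              # ML noise variance estimate
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2_hat) + 1.0)
    aic.append(-2 * loglik + 2 * (k + 1))             # +1 for the noise variance
    bic.append(-2 * loglik + np.log(n) * (k + 1))

print("AIC selects order", orders[int(np.argmin(aic))])
print("BIC selects order", orders[int(np.argmin(bic))])

# Bayesian-flavored touch (assumption, not the paper's BOSS rule): the standard
# Laplace/BIC approximation gives p(k | y) roughly proportional to exp(-BIC_k / 2)
# under a flat prior on the order k.
bic = np.asarray(bic)
post = np.exp(-(bic - bic.min()) / 2)
post /= post.sum()
print("Approximate posterior over the order:", np.round(post, 3))
\end{verbatim}

In this toy setup the BIC-based posterior typically concentrates near the true order as the sample size grows, which gives a rough sense of what a Bayesian order selection rule optimizes, although the paper's BOSS rule is derived exactly rather than via this approximation.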