High nonresponse rates have become the rule in survey sampling. Panel surveys suffer additional sample losses due to panel attrition, which are commonly thought to worsen the bias resulting from initial nonresponse. Under certain conditions, however, an initial-wave nonresponse bias may vanish in later panel waves. We study such a "fade-away" of an initial nonresponse bias in the context of regression analysis. Using a time series approach for the covariate and the error terms, we derive the bias of cross-sectional OLS estimates of the slope coefficient. With no subsequent attrition and only serial correlation, the initial bias converges to zero. If the nonresponse affects permanent components, the initial bias decreases to a limit determined by the size of those components. Attrition is discussed here in a worst-case scenario, where there is a steady selective drift in the same direction as the initial nonresponse. The fade-away effect is shown to dampen the attrition effect to a large extent, depending on the temporal stability of the covariate and the dependent variable. The attrition effect may be further reduced by a weighted regression analysis, where the weights are estimated attrition probabilities based on the lagged dependent variable. The results are discussed with respect to surveys with uncertain selection procedures that are used in a longitudinal fashion, such as access panels.
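The fade-away mechanism can be illustrated with a small simulation. The sketch below is not the paper's derivation but a hypothetical setup consistent with it: the covariate and the error term each follow a stationary AR(1) process, the outcome is a linear regression in each wave, and initial nonresponse selectively drops units with high wave-0 outcomes. With no subsequent attrition, the cross-sectional OLS slope bias among respondents shrinks across waves as the serial correlation with the selection wave decays (all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200_000, 8          # units, panel waves (illustrative)
beta, rho = 2.0, 0.9       # true slope; AR(1) coefficient of x and e

# Stationary AR(1) covariate x and AR(1) error e, both unit variance
x = np.empty((T, N))
e = np.empty((T, N))
x[0] = rng.standard_normal(N)
e[0] = rng.standard_normal(N)
s = np.sqrt(1.0 - rho**2)  # innovation s.d. keeping the process stationary
for t in range(1, T):
    x[t] = rho * x[t - 1] + s * rng.standard_normal(N)
    e[t] = rho * e[t - 1] + s * rng.standard_normal(N)

# Outcome in each wave: y_t = 1 + beta * x_t + e_t
y = 1.0 + beta * x + e

# Selective initial nonresponse: only units with low wave-0 outcome respond;
# no further attrition afterwards (the "serial correlation only" case)
resp = y[0] <= np.median(y[0])

def ols_slope(xv, yv):
    """Cross-sectional OLS slope of yv on xv."""
    xc = xv - xv.mean()
    return (xc * (yv - yv.mean())).sum() / (xc * xc).sum()

# Absolute bias of the respondent-only OLS slope, wave by wave
bias = [abs(ols_slope(x[t][resp], y[t][resp]) - beta) for t in range(T)]
```

Selecting on the wave-0 outcome attenuates the wave-0 slope markedly, but because the correlation between later waves and wave 0 decays like rho**t, the bias in the last wave is only a small fraction of the initial bias, which is the fade-away effect in its purest form.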