Phosphorus is a critical nutrient in freshwater ecosystems and is often the limiting nutrient, so human inputs of phosphorus from agricultural runoff or treated sewage can cause eutrophication, leading to hypoxia and food web disruptions. Managing phosphorus concentrations (measured as total P, TP) is difficult, however, because naturally occurring phosphorus can vary over orders of magnitude with changes in geology and climate. Failing to account for this natural variation can result in regional expectations that over- or under-protect individual waterbodies. Predicting TP concentrations is further complicated by the fact that in many waterbodies naturally occurring TP levels fall below limits of detection, requiring some method to estimate these “missing” data before modeling. We therefore trained two separate random forest models of temporal and spatial variation in natural TP concentrations using 16,014 observations from 2,612 minimally impacted sites across the contiguous US. The first model predicts whether TP concentrations are above a minimum detection limit of 0.001 mg/L; it correctly classified observations as above or below this limit with 97% accuracy using five predictors related to erodibility, evapotranspiration, geochemistry, and soil permeability. The second model predicts expected concentrations at sites predicted to be above the detection limit; it explained 63% of the variation in log TP values, with a back-transformed RMSE of 1.11 mg/L, using 24 predictors derived from soils, geology, climate, and vegetation data. Because both models rely on readily available spatial predictors, background TP values can be calculated for all streams. These predictions of natural background can inform managers and regulators of how much of the TP observed at a specific site is likely derived from natural sources and how much additional TP is likely from anthropogenic sources.
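
As a rough illustration of the two-stage (hurdle-style) approach described above, the sketch below pairs a random forest classifier (detect vs. non-detect at 0.001 mg/L) with a random forest regressor fit to log-transformed TP at detected sites, back-transforming predictions to mg/L. The data, predictor values, hyperparameters, the `predict_background_tp` helper, and the half-detection-limit substitution for non-detects are all illustrative assumptions, not the authors' actual configuration or pipeline.

```python
# Minimal sketch of a two-stage "hurdle" random forest like the one in the
# abstract: (1) classify whether TP is above the 0.001 mg/L detection limit,
# (2) regress log TP at sites predicted to be above it. All data and
# predictors here are synthetic placeholders, not the study's inputs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

DETECTION_LIMIT = 0.001  # mg/L, from the abstract

rng = np.random.default_rng(0)
n = 2000
# Stand-ins for spatial predictors (the paper derives its predictors from
# soils, geology, climate, and vegetation data).
X = rng.normal(size=(n, 5))
# Synthetic TP: lognormal signal so that some values fall below detection.
tp = np.exp(-6.0 + X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n))
detected = tp >= DETECTION_LIMIT

# Stage 1: detect / non-detect classifier.
clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X, detected)

# Stage 2: regression on log-transformed TP, fit only to detected samples,
# so censored (below-detection) values never bias the concentration model.
reg = RandomForestRegressor(n_estimators=500, random_state=0)
reg.fit(X[detected], np.log(tp[detected]))

def predict_background_tp(X_new):
    """Predict natural background TP: substitute half the detection limit
    (one common non-detect convention; an assumption here) where the
    classifier predicts a non-detect, otherwise back-transform (exp) the
    log-scale regression estimate to mg/L."""
    above = clf.predict(X_new)
    out = np.full(len(X_new), DETECTION_LIMIT / 2.0)
    out[above] = np.exp(reg.predict(X_new[above]))
    return out

print(predict_background_tp(X[:5]))
```

Fitting the regression only where the classifier expects a detect mirrors the abstract's division of labor: the classifier handles censoring, and the regressor models concentration on the log scale, where TP's multiplicative variation is closer to normal.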