Detecting forest decline is crucial for effective forest management in arid and semi-arid regions. Remote sensing using satellite image time series is useful for identifying reduced photosynthetic activity caused by defoliation. However, current studies face limitations in detecting forest decline in sparse semi-arid forests. In this study, three Landsat time-series-based approaches were used to distinguish non-declining and declining forest patches in the Zagros forests. Random forest was the most accurate approach, followed by anomaly detection and Sen's slope, with overall accuracies of 0.75 (kappa = 0.50), 0.65 (kappa = 0.30), and 0.64 (kappa = 0.30), respectively. The classification results were unaffected by the Landsat acquisition times, suggesting that environmental variables, rather than the remotely sensed spectral signal of the trees, may have driven the separation of declining and non-declining areas. We conclude that identifying declining forest patches in semi-arid regions using Landsat data is challenging. This difficulty arises from the weak vegetation signal produced by limited canopy cover against a bright soil background, which makes modest degradation signals hard to detect. Additional environmental variables may be necessary to compensate for these limitations.
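
Of the three approaches, Sen's slope (the Theil-Sen estimator) is the simplest to illustrate: it estimates a trend as the median of all pairwise slopes in a time series, making it robust to outliers such as cloud-contaminated observations. The sketch below is a minimal, hypothetical illustration (the function name, years, and NDVI values are illustrative and not taken from the study); a persistently negative slope over a vegetation-index series would flag a candidate declining patch.

```python
from itertools import combinations
from statistics import median

def sens_slope(times, values):
    """Theil-Sen estimator: the median of slopes over all point pairs.

    Robust to outliers, which makes it a common choice for trend
    detection in satellite vegetation-index time series.
    """
    slopes = [
        (v2 - v1) / (t2 - t1)
        for (t1, v1), (t2, v2) in combinations(zip(times, values), 2)
        if t2 != t1
    ]
    return median(slopes)

# Hypothetical annual NDVI values for a declining forest patch
years = [2015, 2016, 2017, 2018, 2019, 2020]
ndvi = [0.42, 0.40, 0.39, 0.36, 0.35, 0.33]
print(sens_slope(years, ndvi))  # negative slope suggests decline
```

In sparse semi-arid forests, however, the NDVI signal itself is weak because the bright soil background dominates the pixel, so even a robust trend estimator may fail to separate declining from non-declining patches, which is consistent with the modest accuracy reported above.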