Fluctuating timescales are ubiquitous in nature and are commonly observed in music, movies, brain activity, and speech. In human speech, semantic timescales span from single words to complete sentences and vary throughout a conversation. Similarly, the brain's intrinsic neuronal timescales (INT), reflected in temporally correlated activity, carry information across time. How are these semantic and neuronal timescales related? Our combined semantic-input and functional magnetic resonance imaging (fMRI) study, using the 7 Tesla Human Connectome Project movie-watching dataset, reveals information transfer from speech's semantic timescales to the brain's INT. We extracted two semantic time-series, sentence similarity and word depth, using Sentence-BERT (SBERT) and WordNet, respectively. The timescales of both the semantic signals and the brain's activity were quantified with the autocorrelation window (ACW) in a dynamic, time-varying analysis. This allows testing for information transfer from the continuously varying semantic timescales to the brain's simultaneously varying timescales via Transfer Entropy (TE). We report three main findings: (1) The sentence similarity and word depth time-series exhibit strong, systematic fluctuations over time. (2) Dynamic ACW analysis captures the dominant timescales in both the semantic input (sentence similarity and word depth) and the brain's continuously varying INT. (3) TE from the varying semantic timescales to the brain's simultaneously varying INT is significant. We further show that this information transfer emerges only at the level of timescales and is absent when either raw semantic input time-series is compared directly with the BOLD signal. In conclusion, we demonstrate the key role of timescales in the information transfer from semantic inputs to the brain's neural activity.
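To make the semantic time-series extraction concrete, the following is a minimal Python sketch. The SBERT model name ("all-MiniLM-L6-v2"), the use of cosine similarity between consecutive sentence embeddings, and the definition of word depth as the maximum hypernym-path depth of a word's first WordNet synset are illustrative assumptions, not necessarily the paper's exact choices.

```python
# Sketch of the two semantic time-series: sentence similarity (SBERT)
# and word depth (WordNet). Requires: pip install sentence-transformers nltk
# and a one-time nltk.download("wordnet").
import numpy as np
from sentence_transformers import SentenceTransformer
from nltk.corpus import wordnet as wn

def sentence_similarity_series(sentences):
    """Cosine similarity between each sentence and the next one."""
    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model variant
    emb = model.encode(sentences)                    # (n_sentences, dim) array
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    return np.sum(emb[:-1] * emb[1:], axis=1)        # sim(s_i, s_{i+1})

def word_depth_series(words):
    """WordNet taxonomy depth per word; NaN where no synset exists."""
    depths = []
    for w in words:
        synsets = wn.synsets(w)
        depths.append(synsets[0].max_depth() if synsets else np.nan)
    return np.array(depths, dtype=float)
```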
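The dynamic, time-varying ACW analysis can be sketched as below, assuming the ACW-0 variant (lag of the first zero-crossing of the autocorrelation function) and placeholder window and step sizes; the paper's exact ACW variant and windowing parameters may differ.

```python
# Sliding-window ACW-0: a time-varying estimate of the dominant timescale.
import numpy as np

def acf(x):
    """Autocorrelation function of a 1-D signal, normalized so acf[0] = 1."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    r = np.correlate(x, x, mode="full")[len(x) - 1:]  # non-negative lags
    return r / r[0]

def acw0(x):
    """Lag (in samples) of the first zero-crossing of the ACF."""
    r = acf(x)
    below = np.where(r <= 0)[0]
    return below[0] if below.size else len(r)  # saturate if no crossing

def dynamic_acw(x, window=100, step=10):
    """ACW-0 in sliding windows -> a continuously varying timescale series."""
    return np.array([acw0(x[i:i + window])
                     for i in range(0, len(x) - window + 1, step)])
```

Applying dynamic_acw to a semantic time-series and to a regional BOLD signal yields the two simultaneously varying timescale series that enter the TE analysis.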
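Finally, a simplified binned TE estimator (history length 1) illustrates how transfer from the semantic ACW series to the brain's ACW series could be quantified. This plug-in estimator is a stand-in for illustration only; published TE analyses typically use more robust estimators and surrogate-based significance testing.

```python
# Histogram-based transfer entropy TE(source -> target), history length 1:
# TE = sum p(x_t, y_t, y_{t+1}) * log2[ p(y_{t+1}|x_t, y_t) / p(y_{t+1}|y_t) ]
import numpy as np

def discretize(x, bins=4):
    """Map a continuous series to integer labels via equal-width bins."""
    edges = np.linspace(np.min(x), np.max(x), bins + 1)
    return np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)

def transfer_entropy(source, target, bins=4):
    """Binned TE(source -> target) in bits."""
    s, t = discretize(source, bins), discretize(target, bins)
    x, y, y1 = s[:-1], t[:-1], t[1:]              # x_t, y_t, y_{t+1}
    joint = np.zeros((bins, bins, bins))
    for a, b, c in zip(x, y, y1):
        joint[a, b, c] += 1
    joint /= joint.sum()
    p_xy  = joint.sum(axis=2, keepdims=True)       # p(x_t, y_t)
    p_y   = joint.sum(axis=(0, 2), keepdims=True)  # p(y_t)
    p_yy1 = joint.sum(axis=0, keepdims=True)       # p(y_t, y_{t+1})
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = (joint * p_y) / (p_xy * p_yy1)
        return np.nansum(joint * np.log2(ratio))   # skip empty bins
```

For example, transfer_entropy(dynamic_acw(semantic_series), dynamic_acw(bold_series)) estimates the directed information flow from the varying semantic timescales to the varying INT, which would then be tested for significance against surrogate data.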