Prior research indicates that language stimuli, when co-presented with sensory inputs, can enhance perceptual discrimination. However, it remains unclear whether this facilitation is unique to spoken language or extends to non-verbal auditory stimuli such as musical patterns. To address this question, we repeatedly paired difficult-to-discriminate tactile stimulus patterns either with specific verbal, language-like labels or with matched sequences of musical tones. Crucially, we used a within-subject learning design in which all subjects were exposed to both conditions, with well-matched stimuli counterbalanced across subjects. Participants' ability to discriminate the tactile patterns presented in isolation was evaluated both before and after associative learning. After five days of learning, only the tactile pattern sets associated with language stimuli, not those paired with musical sequences, showed significant improvement in discrimination. These results indicate that spoken language may indeed have an advantage over other forms of auditory input in facilitating perceptual discrimination. We discuss the mechanisms underlying this observed perceptual advantage.