Jensen’s inequality is ubiquitous in measure and probability theory, statistics, machine learning, information theory, and many other areas of mathematics and data science. It states that, for any convex function $f \colon K \to \mathbb{R}$ defined on a convex domain $K \subseteq \mathbb{R}^{d}$ and any random variable $X$ taking values in $K$, $\mathbb{E}[f(X)] \geq f(\mathbb{E}[X])$. In this paper, sharp upper and lower bounds on $\mathbb{E}[f(X)]$, termed ‘graph convex hull bounds’, are derived for arbitrary functions $f$ on arbitrary domains $K$, thereby extensively generalizing Jensen’s inequality. The derivation of these bounds requires investigating the convex hull of the graph of $f$, which can be challenging for complicated functions. On the other hand, once these inequalities are established, they hold, just like Jensen’s inequality, for any $K$-valued random variable $X$. Therefore, these bounds are of particular interest in cases where $f$ is relatively simple and $X$ is complicated or unknown. Both finite- and infinite-dimensional domains and codomains of $f$ are covered, as are analogous bounds for conditional expectations and Markov operators.
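
As a simple illustration of the classical inequality (an elementary example for orientation, not taken from the paper itself), take the convex function $f(x) = x^{2}$ on $K = \mathbb{R}$; Jensen’s inequality then reduces to the familiar nonnegativity of the variance:
\[
\mathbb{E}[X^{2}] \;\geq\; \bigl(\mathbb{E}[X]\bigr)^{2}
\quad\Longleftrightarrow\quad
\operatorname{Var}(X) \;=\; \mathbb{E}[X^{2}] - \bigl(\mathbb{E}[X]\bigr)^{2} \;\geq\; 0 .
\]
Note that, as in this example, the classical inequality provides only a one-sided bound; the graph convex hull bounds of the paper supply the matching upper bound as well.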