In the spirit of the Whitney Extension Theorem, we consider a function on a compact subset of Euclidean space to be ``Whitney differentiable'' if it is the restriction of a continuously Fréchet-differentiable function defined on an open superset of that compact set. Whitney differentiable functions have useful (yet possibly nonunique) derivatives even on the boundaries of their domains; these derivatives obey the usual chain rule and are valid subgradients if the function is convex. This presentation considers optimal-value functions for parametric convex programs whose constraints are translated by the parameters. Such functions may be nonsmooth; consider, for example, the mapping $p\in\mathbb{R}^2\mapsto\min\{x\in\mathbb{R}:x\geq p_1,x\geq p_2\}=\max\{p_1,p_2\}$. Nevertheless, it is shown that if the objective and all constraints are Whitney differentiable and a weakened variant of the linear-independence constraint qualification holds, then these optimal-value functions are themselves guaranteed to be Whitney differentiable. This result extends classic sensitivity results for convex programs. Examples are presented, and implications are discussed for generating continuously differentiable convex underestimators of nonconvex functions for use in deterministic global optimization methods.
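As a concrete rendering of the example above (a minimal sketch: the optimal-value symbol $v$ and the reading that each constraint $g_i(x)\leq 0$ is translated to $g_i(x)+p_i\leq 0$ are introduced here for illustration and are not part of the stated result), the parametric program and its subdifferential on the nonsmooth set $\{p_1=p_2\}$ are
\[
v(p)\;=\;\min_{x\in\mathbb{R}}\bigl\{\,x \;:\; p_1-x\leq 0,\ p_2-x\leq 0\,\bigr\}\;=\;\max\{p_1,p_2\},
\qquad
\partial v(p)\;=\;\bigl\{(\lambda,\,1-\lambda):\lambda\in[0,1]\bigr\}\quad\text{whenever }p_1=p_2,
\]
so $v$ fails to be continuously differentiable along the line $p_1=p_2$, even though the objective and both constraint functions are smooth in $(x,p)$.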