The equation x = e^(-x) has a real root in [0,1] if and only if the function
f(x) = e^(-x) - x
has a zero in [0,1].
f(0) = e^(-0) - 0 = 1 > 0
and
f(1) = e^(-1) - 1 = 1/e - 1 < 1 - 1 = 0,
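As a quick sanity check (not needed for the proof), these two values are easy to confirm numerically. Here is a minimal Python snippet; the helper name f is mine, not part of the original argument:

```python
import math

def f(x):
    # f(x) = e^(-x) - x, the function whose zero we want
    return math.exp(-x) - x

print(f(0))  # 1.0 (positive)
print(f(1))  # approximately -0.632, i.e. 1/e - 1 (negative)
```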
so f(x) undergoes at least one sign change in [0,1]. Because f(x) is continuous on [0,1], the Intermediate Value Theorem applies, and in this case it says the following:
Since f(0) > 0 and f(1) < 0, there exists some real number x_m in (0,1) such that f(x_m) = 0. This proves the existence of a real zero of f(x) in [0,1].
Because the zeros of f(x) are exactly the roots of x = e^(-x), we can conclude that x = e^(-x) has a real root in [0,1].
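The same sign-change argument also gives a way to approximate the root: bisection repeatedly halves an interval on which f changes sign, and the IVT guarantees that a zero stays inside it at every step. Below is a rough sketch (the function name bisect and the tolerance tol are my own choices, not part of the argument above):

```python
import math

def bisect(f, a, b, tol=1e-10):
    # Assumes f is continuous on [a, b] and f(a), f(b) have opposite signs,
    # so by the Intermediate Value Theorem each half-interval we keep
    # still contains a zero of f.
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m  # the sign change, hence a zero, lies in [a, m]
        else:
            a = m  # otherwise it lies in [m, b]
    return (a + b) / 2

root = bisect(lambda x: math.exp(-x) - x, 0, 1)
print(root)  # approximately 0.567143 (the Omega constant, W(1))
```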
EDIT: As pointed out by George, the function f(x) must be continuous and defined on the entire interval of interest. You can, as an exercise, show that this is in fact the case. One shortcut is to note that f(x) is a sum of two continuous functions and apply the Sum Rule for continuous functions (https://proofwiki.org/wiki/Combination_Theorem_for_Continuous_Functions/Sum_Rule).