Let f and g be two real-valued functions such that f(x+y) + f(x-y) = 2f(x)g(y) AA x,y in RR. If f(x) is not identically zero and |f(x)| <= 1 AA x in RR, prove that |g(y)| <= 1 AA y in RR.
1 Answer
Jun 24, 2018
We have:
f(x+y) + f(x-y) = 2f(x)g(y) \ \ AA x,y in RR
Putting y=0, and choosing any x for which f(x) != 0 (such an x exists because f is not identically zero), we get:
y=0 => f(x+0) + f(x-0) = 2f(x)g(0)
\ \ \ \ \ \ \ \ \ => 2f(x) = 2f(x)g(0)
\ \ \ \ \ \ \ \ \ => g(0) = 1 \ \ (because f(x) != 0)
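As a quick sanity check (an illustrative example, not part of the given data): the pair f(x) = cos(x), g(y) = cos(y) satisfies the functional equation, by the standard identity
\ \ cos(x+y) + cos(x-y) = 2cos(x)cos(y) \ \ AA x,y in RR
Here |f(x)| <= 1, and consistently with the step above, g(0) = cos(0) = 1.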
And trivially