Question #68721

1 Answer
Mar 15, 2017

int sin(a + bx) dx = -1/b cos (a + bx) + C

Explanation:

It often pays to work backwards from differentiation when doing integrals.

So we know that:

d/dx ( cos x ) = - sin x

And from the chain rule:

d/dx ( cos u(x) ) = - sin u(x) · du/dx

Apologies if I'm labouring this, but it therefore follows that:

d/dx ( cos (a + bx) ) = - sin (a + bx) · b

which implies that:

d/dx ( -1/b cos (a + bx) ) = sin (a + bx)

So:

int sin(a + bx) dx

= int d/dx ( -1/b cos (a + bx) ) dx

And as integration and differentiation are inverse operations (specifically, we are using the Fundamental Theorem of Calculus), we can say that:

int sin(a + bx) dx = int d/dx ( -1/b cos (a + bx) ) dx

= -1/b cos (a + bx) + C
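If you want a sanity check on that result, here is a minimal SymPy sketch (the variable names are my own, and I assume b > 0 only so SymPy doesn't return a piecewise answer; any nonzero b works):

```python
from sympy import symbols, sin, cos, diff, integrate, simplify

x, a = symbols('x a')
b = symbols('b', positive=True)  # any nonzero b works; positive keeps the output simple

# Differentiating the claimed antiderivative recovers the integrand,
# the same "work backwards from differentiation" check as above
F = -cos(a + b*x) / b
assert simplify(diff(F, x) - sin(a + b*x)) == 0

# Integrating directly gives the same antiderivative (up to the constant C)
print(integrate(sin(a + b*x), x))  # -cos(a + b*x)/b
```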

Just my 2 cents, but whilst that might look more arduous than a simple algebraic substitution, eg let y = a + bx, dy = b dx, I have laboured the explanation here; in practice it can be done in just a few lines (the substitution route is sketched below for comparison).
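For completeness, here is that substitution route written out (a quick sketch, using y as above). Let y = a + bx, so dy = b dx, ie dx = 1/b dy. Then:

int sin(a + bx) dx = int sin y · 1/b dy

= -1/b cos y + C

= -1/b cos (a + bx) + C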

My personal experience is also that the further you get into integration, the more pattern recognition based on what you know about differentiation can make life a lot easier.

So that's not the best answer but hopefully it helps in some way.