How do you factor (2x+1)^3 + (2y)^3?

2 Answers
May 28, 2015

For any numbers a and b, a^3 + b^3 = (a+b)(a^2-ab+b^2)

So, this expression becomes:

((2x+1)+2y)((2x+1)^2-(2x+1)(2y)+(2y)^2)

=(2x+2y+1)((4x^2+4x+1)-(4xy+2y)+(4y^2))

=(2x+2y+1)(4x^2+4y^2-4xy+4x-2y+1)

The second factor looks like it might factor further, but it doesn't: the quadratic form a^2-ab+b^2 is irreducible over the reals, so this is the complete factorization.

May 28, 2015

You can use the identity for a sum of cubes:

a^3+b^3 = (a+b)(a^2-ab+b^2)

Substitute a=2x+1 and b=2y to get:

(2x+1)^3+(2y)^3

= ((2x+1)+2y)((2x+1)^2 - (2x+1)2y + (2y)^2)

= (2x+2y+1)((4x^2+4x+1)-(4xy+2y)+4y^2)

= (2x+2y+1)(4x^2-4xy+4y^2+4x-2y+1)
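Both answers arrive at the same factorization, and it's easy to sanity-check numerically: the factored form must agree with the original expression for every choice of x and y. A short check in Python (not part of either answer, just a verification sketch):

```python
# Original expression: (2x+1)^3 + (2y)^3
def original(x, y):
    return (2*x + 1)**3 + (2*y)**3

# Factored form from the answers above:
# (2x+2y+1)(4x^2 - 4xy + 4y^2 + 4x - 2y + 1)
def factored(x, y):
    return (2*x + 2*y + 1) * (4*x**2 - 4*x*y + 4*y**2 + 4*x - 2*y + 1)

# Compare the two at a grid of integer points; any mismatch
# would indicate an algebra error in the expansion.
for x in range(-5, 6):
    for y in range(-5, 6):
        assert original(x, y) == factored(x, y)

print("factorization verified")
```

A handful of integer points isn't a proof, but since both sides are polynomials of total degree 3, agreeing on this many points means they are in fact the same polynomial.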