To establish polar coordinates on a plane, we choose a point O, called the pole (the origin of coordinates), and a ray OX from this point in some direction, called the polar axis (usually drawn horizontally).
Then the position of any point A on the plane can be defined by two polar coordinates: the polar angle φ, measured counterclockwise from the polar axis to the ray OA connecting the pole with the point A (the angle ∠XOA, usually measured in radians), and the polar radius ρ, the length of the segment OA.
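These two coordinates relate to the familiar Cartesian ones by x = ρ·cos φ and y = ρ·sin φ, taking the pole as the Cartesian origin and the polar axis as the positive x-axis. A minimal Python sketch of this conversion (the helper name `polar_to_cartesian` is ours):

```python
import math

def polar_to_cartesian(rho, phi):
    """Convert polar coordinates (rho, phi) to Cartesian (x, y).

    The pole O is taken as the Cartesian origin and the polar axis OX
    as the positive x-axis; phi is measured counterclockwise in radians.
    """
    return rho * math.cos(phi), rho * math.sin(phi)

# A point at distance 2 along the polar axis (phi = 0) lies at (2, 0).
x0, y0 = polar_to_cartesian(2, 0)

# A point at distance 1 with phi = pi/2 lies on the positive y-axis.
x1, y1 = polar_to_cartesian(1, math.pi / 2)
```

Note that the polar angle of a given point is defined only up to a multiple of 2π; adding a full turn to φ returns the same point.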
To graph a function in polar coordinates, we need its definition expressed in polar coordinates.
Consider, for example, the function defined by the formula
ρ = φ for all φ ≥ 0.
The graph of this function starts at the pole O because, when φ = 0, we have ρ = 0.
Then, as the polar angle φ increases, the distance ρ from the pole increases as well. This simultaneous growth of angle and distance produces a spiral (known as the spiral of Archimedes).
After the first full turn (φ = 2π), the point on the graph crosses the polar axis at a distance 2π from the pole. After the second full turn, it crosses the polar axis at a distance 4π, and so on.
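We can verify these crossings numerically: on the spiral ρ = φ, every full turn φ = 2πk lands the point back on the polar axis (y = 0, x > 0), each time farther from the pole. A small Python check (the helper name `spiral_point` is ours):

```python
import math

def spiral_point(phi):
    """Cartesian coordinates of the point on the spiral rho = phi."""
    rho = phi  # the defining equation of this spiral
    return rho * math.cos(phi), rho * math.sin(phi)

# After each full turn (phi = 2*pi*k) the point returns to the polar
# axis: y is (numerically) zero and x equals the distance 2*pi*k.
for k in (1, 2, 3):
    x, y = spiral_point(2 * math.pi * k)
    print(f"turn {k}: x = {x:.4f}, y = {y:.1e}")
```

The printed x-values are approximately 2π ≈ 6.2832, 4π ≈ 12.5664, and 6π ≈ 18.8496, while y stays at the level of floating-point rounding error.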