How to get coordinates of a point in a coordinate system based on angle and distance

Use Math.cos and Math.sin like this:

pointX = x + distance * Math.cos(angle)
pointY = y + distance * Math.sin(angle)


Note about radians / degrees: Math.cos and Math.sin assume the argument is given in radians. If you have the angle in degrees, convert it first, e.g. Math.cos(Math.toRadians(angle)).
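A minimal sketch of the formula above, assuming a center (x, y), a distance, and an angle given in degrees (the variable names are illustrative):

```java
public class PointFromAngle {
    public static void main(String[] args) {
        double x = 0.0, y = 0.0;    // center the angle is measured from
        double distance = 5.0;      // distance from the center
        double angleDegrees = 90.0; // angle from the positive x-axis

        // Math.cos/Math.sin expect radians, so convert first
        double angle = Math.toRadians(angleDegrees);
        double pointX = x + distance * Math.cos(angle);
        double pointY = y + distance * Math.sin(angle);

        // At 90 degrees the point lies straight "up": (0, 5)
        System.out.println(pointX + ", " + pointY);
    }
}
```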


If d is the distance and A is the angle, then the coordinates of the point will be

(x + d*cos(A), y + d*sin(A))


If r is the distance from origin and a is the angle (in radians) between x-axis and the point you can easily calculate the coordinates with a conversion from polar coordinates:

x = r*cos(a)
y = r*sin(a)

(this assumes that origin is placed at (0,0), otherwise you should add the displacement to the final result).

The inverse is obtained by computing the magnitude of the vector (since a distance plus an angle defines a vector) and the arctangent, which can be calculated with the atan2 function:

r = sqrt(x*x + y*y)
a = atan2(y,x)
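The inverse conversion can be sketched as follows (the point (3, 4) is just an example; Math.hypot is an alternative to the explicit square root):

```java
public class CartesianToPolar {
    public static void main(String[] args) {
        double x = 3.0, y = 4.0;

        double r = Math.sqrt(x * x + y * y); // 5.0; Math.hypot(x, y) also works
        double a = Math.atan2(y, x);         // angle in radians, in (-pi, pi]

        System.out.println(r + ", " + a);
    }
}
```

Using atan2 instead of atan(y/x) handles all four quadrants correctly and avoids dividing by zero when x is 0.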