proxyprochy
New member · Joined: Aug 25, 2023 · Messages: 4 · Programming Experience: 1-3
Hello, so I've got some code that works normally in production, but whenever I run the test (MSTest), the result differs by 3.2034 degrees and the test fails. Based on my calculations, the test computes the angle incorrectly. Any suggestions as to what might be wrong?
For testing purposes, I've hard-coded the point values and moved everything to a separate solution.
Main method:
static void Main(string[] args)
{
    // Results in 178.898293884794
    Console.WriteLine(Angle.NormalizeAngle(Angle.GetAngleByVectors(), 0));
    Console.ReadLine();
}
Unit testing:
[TestMethod]
public void TestMethod1()
{
    // Act
    double angle = Angle.NormalizeAngle(Angle.GetAngleByVectors(), 0);
    // Results in 181.10170611520635
    double normalizedAngle = Angle.NormalizeAngle(angle, 0);

    // Assert
    double expectedAngle = 177.8983;
    Assert.AreEqual(expectedAngle, normalizedAngle, Tolerance);
}
Methods used:
public static double GetAngleByVectors()
{
    var pt1 = new Point(757, 473);
    var pt2 = new Point(705, 472);

    const double Rad2Deg = 180.0 / Math.PI;
    double result = Math.Atan2(pt2.Y - pt1.Y, pt2.X - pt1.X);
    double resultInDegrees = result * Rad2Deg;
    return resultInDegrees;
}
public static double NormalizeAngle(double angle, double offset)
{
    angle += offset;
    angle %= 360.0;
    if (angle < 0)
    {
        angle += 360.0;
    }
    return 360 - angle;
}
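To double-check the arithmetic outside .NET, here is a quick line-for-line port of the two methods above with the same hard-coded points (Python only because it is easy to run). It reproduces both numbers: normalizing once gives the value Main prints, and normalizing that result a second time gives the value the test ends up with.

```python
import math

def get_angle_by_vectors():
    # Same hard-coded points as the C# version
    pt1 = (757, 473)
    pt2 = (705, 472)
    rad2deg = 180.0 / math.pi
    return math.atan2(pt2[1] - pt1[1], pt2[0] - pt1[0]) * rad2deg

def normalize_angle(angle, offset):
    angle += offset
    angle %= 360.0
    if angle < 0:  # Python's % is already non-negative; kept for parity with the C# code
        angle += 360.0
    return 360 - angle

once = normalize_angle(get_angle_by_vectors(), 0)   # what Main computes
twice = normalize_angle(once, 0)                    # what TestMethod1 computes
print(once)   # 178.898293884794...
print(twice)  # 181.101706115206...
```

Because NormalizeAngle returns `360 - angle`, it mirrors the angle rather than leaving an already-normalized value unchanged, so applying it twice does not round-trip back to the same number.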