Thursday, 5 February 2015

How to avoid loss of precision in .NET for converting from Unix Timestamp to DateTime and back?


Consider the following snippet

var original = new DateTime(635338107839470268);
var unixTimestamp = (original - new DateTime(1970,1,1)).TotalSeconds;
// unixTimestamp is now 1398213983.9470267

var back = new DateTime(1970,1,1).AddSeconds(1398213983.9470267);
// back.Ticks is 635338107839470000

As you can see, the Ticks value we get back differs from the one we started with.

How can we avoid this loss of precision in C# when converting from a Unix timestamp to a DateTime and back?

Answer

http://msdn.microsoft.com/en-us/library/system.datetime.addseconds.aspx

Per the documentation, DateTime.AddSeconds() rounds the value to the nearest millisecond (10,000 ticks).

You could do this with straight math, converting everything to ticks yourself.
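That suggestion can be sketched as follows; the helper names ToUnixTicks/FromUnixTicks are illustrative, not part of the framework. By keeping the Unix time as a long tick count instead of a double of fractional seconds, the round trip is exact:

```csharp
using System;

public static class UnixTimeRoundTrip
{
    // Unix epoch as a DateTime (its Ticks value is 621355968000000000).
    private static readonly DateTime Epoch =
        new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    // Unix time in ticks (100-nanosecond units) rather than fractional
    // seconds, so no double-precision rounding ever occurs.
    public static long ToUnixTicks(DateTime dt) => (dt - Epoch).Ticks;

    public static DateTime FromUnixTicks(long unixTicks) => Epoch.AddTicks(unixTicks);

    public static void Main()
    {
        var original = new DateTime(635338107839470268);
        long unixTicks = ToUnixTicks(original);   // 13982139839470268
        var back = FromUnixTicks(unixTicks);
        Console.WriteLine(back.Ticks == original.Ticks); // True
    }
}
```

If another system requires fractional seconds, convert only at the boundary (unixTicks / (double)TimeSpan.TicksPerSecond) and accept that the double cannot represent sub-millisecond precision exactly.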

Answer 2

There is no loss in your TimeSpan; don't compare the results of TotalSeconds and AddSeconds. You need to work with Ticks instead:

var original = new DateTime(635338107839470268);
var ticks = (original - new DateTime(1970,1,1)).Ticks;
// ticks is now 13982139839470268

var back = new DateTime(1970,1,1).AddTicks(13982139839470268);
// back.Ticks is 635338107839470268
