Dates and times are notoriously difficult to get right in software. Many of the difficulties come from timezones and how unintuitive they are to think about. This is made worse by timezone offsets that change during the year for the same place, such as British Summer Time.
It’s easier to get timezones right when you specify a region/city timezone, such as Europe/London. Trying to think about and specify timezone offsets such as +01:00 is more likely to go wrong.
A common example is a breakage when Britain begins or ends British Summer Time. The timezone offset changes between +00:00 and +01:00, but the timezone is still Europe/London. If you’ve specified either offset as your timezone, your application is going to break twice a year. Specifying Europe/London gets it right automatically, as whatever datetime library you’re using should know about the changing offset for that region/city identifier.
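As a minimal sketch of this in Python’s standard zoneinfo module (the specific dates are illustrative), the same wall-clock time in Europe/London resolves to different UTC offsets in winter and summer, without the code ever mentioning +00:00 or +01:00:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library since Python 3.9

london = ZoneInfo("Europe/London")

# The same wall-clock time at two points in the year
winter = datetime(2024, 1, 15, 12, 0, tzinfo=london)
summer = datetime(2024, 7, 15, 12, 0, tzinfo=london)

# The library looks up the correct offset from the tz database
print(winter.utcoffset())  # 0:00:00 (GMT)
print(summer.utcoffset())  # 1:00:00 (BST)
```

The code never hard-codes an offset, so when the rules change, updating the system’s tz data is enough.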
It’s common to try to use a model of “everything is UTC”, but this breaks in the same way: it’s equivalent to specifying a timezone offset of +00:00. At some point you’ll need to deal with things happening and time passing in other locations. Again, location-based timezones such as America/New_York avoid this problem and get it right naturally.
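A sketch of why America/New_York is more than “UTC -05:00”, again using Python’s standard zoneinfo (the dates are chosen to straddle the US spring-forward transition): one hour of real UTC time passes, but the local clock jumps by two hours and the offset changes underneath you.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

new_york = ZoneInfo("America/New_York")

# One hour of real (UTC) time passes across the US spring-forward
# transition on 2024-03-10, when local clocks jump from 02:00 to 03:00.
before = datetime(2024, 3, 10, 6, 30, tzinfo=ZoneInfo("UTC"))
after = datetime(2024, 3, 10, 7, 30, tzinfo=ZoneInfo("UTC"))

print(before.astimezone(new_york))  # 2024-03-10 01:30:00-05:00 (EST)
print(after.astimezone(new_york))   # 2024-03-10 03:30:00-04:00 (EDT)
```

With a fixed-offset model, either the “before” or the “after” conversion would be an hour wrong; the region/city identifier handles both.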
Using region/city identifiers avoids bugs by being easier to think about. It’s more intuitive to think about “users in London”, “a stock exchange in New York” or “a warehouse in Rotterdam” than “the client is at an offset of UTC -01:00 and the warehouse is at an offset of UTC +02:00, therefore…”. Use the real locations and let your datetime library get it right.
Note that not all timezone offsets are whole hours, such as Asia/Tehran (+03:30/+04:30) or Asia/Kathmandu (+05:45). Your first reaction to that might be “well, my application will never need to deal with those timezones”. It’s better to be alive than right, though, and your datetime library can handle the region/city timezones for you automatically, so you don’t have to think about it more than that.
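A quick sketch with Python’s standard zoneinfo shows the library handling a fractional offset with no special effort on your part:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Nepal's offset is +05:45 — any whole-hour assumption breaks here
kathmandu = datetime(2024, 1, 1, 12, 0, tzinfo=ZoneInfo("Asia/Kathmandu"))
print(kathmandu.utcoffset())  # 5:45:00
```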