I have a program which saves some data to an NFC tag. The NFC tag only has a few bytes of memory, and because I need to save a date and time in minutes (decimal) to the tag, I need to store this in the most memory-efficient way possible. For instance, the decimal number 23592786 requires 36 bits, but if the decimal number is converted to a base36 value it only requires 25 bits of memory.
The number 23592786 requires only 25 bits, because the binary representation of this number is 25 bits long. You can save some bits if the date range is limited. One year contains about 526,000 minutes, so an interval in minutes counted from 0:00 on 1 Jan 2000 (an arbitrary start date) fits in 24 bits (3 bytes) and covers dates up to the year 2031.
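A minimal sketch of this 3-byte scheme in Python (the question doesn't name a language, so the epoch choice and function names here are illustrative, not from the original program):

```python
from datetime import datetime, timedelta

EPOCH = datetime(2000, 1, 1)  # arbitrary start date, as suggested above

def encode_minutes(dt: datetime) -> bytes:
    """Pack minutes elapsed since EPOCH into 3 bytes (big-endian)."""
    minutes = int((dt - EPOCH).total_seconds()) // 60
    if not 0 <= minutes < 1 << 24:
        raise ValueError("date outside the 3-byte range")
    return minutes.to_bytes(3, "big")

def decode_minutes(raw: bytes) -> datetime:
    """Recover the datetime (minute precision) from the 3-byte value."""
    return EPOCH + timedelta(minutes=int.from_bytes(raw, "big"))

# The number from the question really does fit in 25 bits:
print((23592786).bit_length())          # 25
packed = encode_minutes(datetime(2024, 6, 1, 12, 30))
print(len(packed), decode_minutes(packed))
```

Note that 2^24 minutes ≈ 31.9 years, which is where the "until 2031" limit comes from; moving the epoch forward extends the usable range accordingly.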
The simplest option might be to use Unix time, which gives the number of seconds since 1 Jan 1970; this typically takes 32 bits. As MBo has said, you can reduce the size by about 6 bits by counting minutes instead of seconds, or further by choosing a more recent start date. However, there are advantages to using an industry standard. Depending on your application you might be able to get it down to 2 bytes, which can represent about 45 days.
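A quick sketch of the minute-resolution Unix-time idea in Python (the reference timestamp and function name are illustrative assumptions):

```python
import time

def minutes_since(reference: int) -> int:
    """Minutes elapsed since a reference Unix timestamp (in seconds)."""
    return int(time.time() - reference) // 60

# Two-byte variant: minutes relative to a recent, application-chosen
# reference fit in 2**16 values, i.e. roughly 45.5 days of range.
print(round((1 << 16) / (60 * 24), 1))  # 45.5

# Example: minutes elapsed since one hour ago.
print(minutes_since(int(time.time()) - 3600))  # 60
```

The trade-off is that a 2-byte field only works if the application can guarantee the stored time is always within ~45 days of the reference point (e.g. a rolling epoch the reader also knows).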