• nesc@lemmy.cafe
    link
    fedilink
    English
    arrow-up
    0
    ·
    8 months ago

    This pyramid visualisation doesn’t work for me, unless you read time starting with seconds.

  • Gork@lemm.ee
    8 months ago

I often have to stop myself from using ISO-8601 in regular emails. In a business context MM/DD/YYYY is so much more prevalent that I don’t want to stand out.

    Filenames on a share drive though? ISO-8601 all the way idgaf

  • Maggoty@lemmy.world
    8 months ago

    Mmm US military date and time is fun too.

    DDMMMYYYYHHMM and time zone identifier. So 26JAN20251841Z.

    So much fun.
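For what it’s worth, that compact form parses directly with Python’s stdlib; a quick sketch using the example timestamp above (`%b` matches the month abbreviation case-insensitively, and `%z` accepts a bare “Z” since Python 3.7):

```python
from datetime import datetime

# "26JAN20251841Z" -> day, month abbreviation, year, HHMM, zone
dt = datetime.strptime("26JAN20251841Z", "%d%b%Y%H%M%z")
print(dt.isoformat())  # 2025-01-26T18:41:00+00:00
```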

      • boonhet@lemm.ee
        8 months ago

Honestly looks very readable to me, though I’m not sure about the timezone bit. Maybe they left it out? Otherwise it’s the 26th of January 2025, 18:41

It’s gonna be problematic when there are 5-digit years, but other than that it’s… not good, but definitely less ambiguous than any “normally formatted” date where DD <= 12. Is it MM/DD or DD/MM? We’ll never fucking know!

        Of course, YYYY-MM-DD is still the king because it’s both human readable and sortable as a regular string without converting it into a datetime object or anything.
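A minimal illustration of that property, contrasting it with a day-first format:

```python
iso = ["2025-01-28", "2024-12-31", "2025-01-02"]
print(sorted(iso))   # ['2024-12-31', '2025-01-02', '2025-01-28'] -- chronological

ddmm = ["28/01/2025", "31/12/2024", "02/01/2025"]
print(sorted(ddmm))  # ['02/01/2025', '28/01/2025', '31/12/2024'] -- not chronological
```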

        • jagungal@lemmy.world
          8 months ago

          All you’d have to do to make it much more readable is separate the time and the year with some kind of separator like a hyphen, slash or dot. Also “Z” is the time zone, denoting UTC (see also military time zones)

          • boonhet@lemm.ee
            8 months ago

            Oh, duh. It’s why all my timestamps have Z’s in the database lmao

Thing is, you’re right that the separation would help, but this is still way less ambiguous than MM/DD vs DD/MM if you ask me.

  • PeriodicallyPedantic@lemmy.ca
    8 months ago

    I just use millis since epoch

    (Recently learned that this isn’t accurate because it disguises leap seconds. The standard was fucked from the start)
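A sketch of what “millis since epoch” means in practice: Unix time counts as if every day had exactly 86 400 seconds, which is why leap seconds vanish from the arithmetic:

```python
import time
from datetime import datetime, timezone

ms = int(time.time() * 1000)  # milliseconds since 1970-01-01T00:00:00Z

# Converting back assumes 86_400-second days throughout, so any leap
# second that actually occurred is invisible in this round trip
dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
print(dt.isoformat(timespec="milliseconds"))
```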

      • namingthingsiseasy@programming.dev
        8 months ago

Nah, ISO is a shit organization. The biggest issue is that all of their “standards” are locked behind paywalls and can’t be shared. This creates problems for open source projects that want to implement them, because it inherently limits how many people can actually look at the standard. Compare that to the RFCs, which have always been free. And not only that, they cover most of the standards the internet is built on (like HTTP and TCP, just to name a few).

        Besides that, they happily looked away when members were openly taking bribes from Microsoft during the standardization of OOXML.

        In any case, ISO-8601 is a garbage standard. P1Y is a valid ISO-8601 string. Good luck figuring out what that means. Here’s a more comprehensive page demonstrating just how stupid ISO-8601 is: https://github.com/IJMacD/rfc3339-iso8601
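To make the point concrete: `P1Y` is an ISO-8601 *duration* (one year), yet most “ISO date” parsers, including Python’s stdlib, reject it outright:

```python
from datetime import date

for s in ("2025-01-27", "P1Y"):
    try:
        print(s, "->", date.fromisoformat(s))
    except ValueError:
        print(s, "-> not a date this parser understands")
```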

          • Derpgon@programming.dev
            8 months ago

Sure, it means something, and the meaning is not stupid. But since it’s all one standard, you’d expect a single implementation to be able to represent and exchange the same data. Which it can’t.

            • groet@infosec.pub
              8 months ago

              I think it is reasonable to say: “for all representation of times (points in time, intervals and sets of points or intervals etc) we follow the same standard”.

              The alternative would be using one standard for points in time, another for intervals, another for time differences, another for changes to a timezone, another for …

              • lad@programming.dev
                8 months ago

                The alternative would be

More reasonable, if you ask me. At least I’ve come to value modularity in programming; maybe with standards it doesn’t work as well, but I don’t see why it wouldn’t.

                • groet@infosec.pub
                  8 months ago

Standards are used to increase interoperability between systems. The more different standards a single system needs, the harder it is to interface with other systems. If you have to define a list of 50 standards you use, chances are the other system uses a different standard for at least one of them. Much easier if you rely on only a handful instead.

    • ByteJunk@lemmy.world
      8 months ago

That’s an ISO date, and it’s gorgeous. It’s the only way I’ll accept working with dates and timezones, though I’ll make an exception for end-user-facing output and format it according to locale if I’m positive they’re not going to feed it into some other app.

  • Miles O'Brien@startrek.website
    8 months ago

    In one work report, I recorded the date as “1/13/25”, “13/1/25” and “13JAN2025”

    I have my preference, but please for the love of all that is fluffy in the universe, just stick to one format…

  • istdaslol@feddit.org
    8 months ago

    My stupid ass read this top to bottom and I was confused why anyone would start with seconds

  • Bo7a@lemmy.ca
    8 months ago

    I don’t know why anyone would ever argue against this. Least precise to most precise. Like every other number we use.

    (I don’t know if this is true for EVERY numerical measure, but I’m sure someone will let me know of one that doesn’t)

      • Bo7a@lemmy.ca
        8 months ago

        You misunderstand my comment.

        I’m saying the digits in a date should be printed in an order dictated by which units give the most precision.

        A year is the least precise, a month is the next least, followed by day, hour, minute, second, millisecond.

        • Umbrias@beehaw.org
          8 months ago

You are looking not for precision but for largest-to-smallest, descending order. This is distinct from precision, which measures how finely something is specified: 2025.07397 is actually more precise than 2025/01/27, but it leads with the largest increment.

          • Kacarott@aussie.zone
            8 months ago

            Largest to smallest is also wrong. In 2025/01/28, the 28 is larger than the 01.

            It should be “most significant” to “least significant”

            • Umbrias@beehaw.org
              8 months ago

Largest to smallest is correct. 1 mile is larger than 20 meters. If I had specified numerical value or some such, maybe you’d be correct. Though significance works as well.

              • Kacarott@aussie.zone
                8 months ago

                Largest to smallest is at best ambiguous. It can refer to the size of the number itself, or the size of the unit.

                There is a reason this exact concept in maths/computer science is known as the “significance” of the digit. Eg. The “least significant bit” in binary is the last one.
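Same idea as bit significance in any positional notation; a tiny sketch:

```python
n = 0b1011  # 11
# Flipping the least significant bit changes the value by 1;
# flipping the most significant bit here changes it by 8.
# "Significance" is the positional weight, not the digit's own size.
print(n ^ 0b0001)  # 10
print(n ^ 0b1000)  # 3
```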

          • Bo7a@lemmy.ca
            8 months ago

And to address the argument of precision versus descending order: I disagree. An instrument counting seconds is more precise than a machine counting minutes, hours, days, weeks, or months, and that holds true through the chain. The precision is in the unit.

          • Bo7a@lemmy.ca
            8 months ago

            We can debate this all day. And I can’t honestly say that I would take either side in a purely semantics argument.

            But the wording comes directly from RFC3339 which is, to me, the definitive source for useful date representation.

            https://www.ietf.org/rfc/rfc3339.txt

            5.1. Ordering

            If date and time components are ordered from least precise to most precise, then a useful property is achieved. Assuming that the time zones of the dates and times are the same (e.g., all in UTC), expressed using the same string (e.g., all “Z” or all “+00:00”), and all times have the same number of fractional second digits, then the date and time strings may be sorted as strings (e.g., using the strcmp() function in C) and a time-ordered sequence will result.
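The property the RFC describes, sketched in Python (plain string comparison behaves the same as C’s strcmp() here):

```python
ts = [
    "2025-01-27T09:00:00Z",
    "2024-11-05T23:59:59Z",
    "2025-01-27T08:15:00Z",
]
# Same offset ("Z") and same fractional-second digits everywhere,
# so lexicographic order == chronological order
print(sorted(ts))
```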

        • millie@beehaw.org
          8 months ago

          Sorting with either the month or the day ahead of the year results in more immediately relevant identifiable information being displayed first. The year doesn’t change very often, so it’s not something you necessarily need to scan past for every entry. The hour changes so frequently as to be irrelevant in many cases. Both the month and the day represent a more useful range of time that you might want to see immediately.

Personally, I find month first to be more practical because it tells you how relatively recent something is on a scale that actually lasts a while. Going day first means that if you’ve got files sorted this way, days of the month are listed more prominently than the months themselves, so the first of January through the first of December will all sort closer together than the first and second of January. Impractical.

          Year first makes sense if you’re keeping a list around for multiple years, but the application there is less useful in the short term. It’s probably simpler to just have individual folders for years and then also tack it on after days to make sure it’s not missing.

          Also, like, this format is how physical calendars work assuming you don’t have a whole stack of them sitting in front of you.
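A sketch of that day-first sorting problem, with hypothetical DD-MM filenames from a single year:

```python
# DD-MM named files: 1 Jan, 1 Dec, 2 Jan
files = ["01-01.xls", "01-12.xls", "02-01.xls"]
print(sorted(files))
# ['01-01.xls', '01-12.xls', '02-01.xls']
# 1 Jan and 1 Dec end up adjacent; 2 Jan sorts after both
```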

          • Kacarott@aussie.zone
            8 months ago

By keeping years in different folders you are just implicitly recreating the ISO format: e.g. 2025/"04/28.xls"