2020 is a leap year

2020 is a leap year.  What got me thinking about this is a sentence that I just wrote in an email message to instructing counsel in Turkey for a US case that our firm is handling:

We now have an Office Action and it is attached. To avoid abandonment a response must be made by February 29, 2020.

Which prompts a discourse on the inadequacy of integers.  All but the most diehard blog readers are invited to skip the following.

In childhood the first math that we learn is integer math.  Counting numbers.  One, two, three.  Sometimes we call them “cardinal numbers”, which are numbers that denote quantity (“one”, “two”, “three”).  We also sometimes refer to “ordinal numbers”, which are numbers that denote a position in a list (“first”, “second”, “third”).  And yes, the first math we all learn as children is integer math.

Throughout human history, many artificial and awkward things about how humans interact with each other, and about how we lead our lives, may be correctly understood as ways of coping with the simple fact that integers are inadequate.  By this I mean that almost nothing that matters in life ever turns out to be a convenient or neat integer count, and no two things in life that matter ever turn out to be reliably connected by integer counts.  The main point of today’s blog article being:

Integers are inadequate, and indeed are woefully inadequate.

The fact that there are leap years is an example of this.  Leap years came into existence in 46 BC with the establishment of the Julian calendar.  The point of leap years is that if you set yourself the task of keeping close track of two events — the rotation of the Earth on its axis, and the orbit of the Earth around the Sun — you find that they are not a convenient integer multiple of each other.  Prior to 46 BC, the Roman world had been following a calendar which handled the relationship between those two things in a simple-minded way, and by 46 BC the discrepancy between the true mathematical relationship and the oversimplified one had gotten bad enough that Julius Caesar felt the need to insert 67 days as a one-time correction and to start having “leap years”.   The idea was that starting in 46 BC, every four years an extra day would be inserted.  This was tied to an understanding that although the two events of interest were unfortunately not connected by the simple integer 365, it was hoped that 365¼ would do the job.  This is only slightly more complicated than integer math:  it merely requires keeping track of whether the year that we are in happens to be divisible by four.
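A quick way to see how close 365¼ comes, and why it eventually fails, is to compare it against the modern figure of about 365.2422 days for the tropical year (a figure assumed here, not given in the original discussion):

```python
# Julian rule: a year is a leap year whenever it is divisible by 4.
def is_julian_leap(year: int) -> bool:
    return year % 4 == 0

# Average Julian calendar year versus the actual tropical year
# (365.2422 days is the modern measured figure, assumed here).
julian_average = 365.25
tropical_year = 365.2422

drift_per_year = julian_average - tropical_year
print(f"Drift: {drift_per_year:.4f} days per year")                 # ~0.0078
print(f"A full day of error every {1 / drift_per_year:.0f} years")  # ~128
```

So the Julian scheme runs about a day ahead every 128 years or so, which is exactly the error that accumulated into the crisis described next.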

The most alert of our readers are of course getting ready to smack the buzzer and post a comment that the failure of integers to accurately link up these two things (Earth rotating on its axis, and Earth orbiting the Sun) is so bad that even the shift from the number 365 to the number 365¼ eventually proved to be inadequate.  By 1582 the failure of the number 365¼ to fulfill its responsibility had accumulated an error of some ten days (the arithmetic is sketched after this list), and Pope Gregory imposed the Gregorian calendar upon humankind.  The imposition of the Gregorian calendar affected humankind in two ways:

  • a recurring correction:  some years that you might think would be leap years would henceforth not be leap years after all;  and
  • a one-time correction subtracting ten days from the calendar.
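The ten-day figure checks out with simple arithmetic.  Gregory’s reform aimed to put the spring equinox back where it stood at the Council of Nicaea in AD 325 (that reference point is assumed here; the post itself just says “some ten days”):

```python
# Rough check of the accumulated error between AD 325 and 1582.
drift_per_year = 365.25 - 365.2422   # Julian excess over the tropical year
years_elapsed = 1582 - 325           # Council of Nicaea to Gregory's reform
print(f"{years_elapsed * drift_per_year:.1f} days")   # ~9.8 days
```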

The recurring correction of the Gregorian calendar is sort of a snoozer, not the sort of thing that people would ever get all wound up about.  It is that if a year is evenly divisible by 100 but is not evenly divisible by 400, then it will not be a leap year even though it is divisible by 4.  Yeah, a snoozer.  So the year 2000 (the only year divisible by 100 during which any reader of this blog will have been alive) was a leap year just as normal (because 2000 is evenly divisible by 400), despite the shift from the Julian calendar to the Gregorian calendar.  As the year 2100 approaches, many will doubtless assume that its divisibility by four will mean that it will be a leap year, but it will not in fact be a leap year.  But none of us blog readers will be alive then to remark upon that oddity.
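The whole rule fits in one line of code.  Here is a minimal sketch (the function name is my own) of the Gregorian leap-year test just described:

```python
def is_gregorian_leap(year: int) -> bool:
    # Divisible by 4, except that century years must also be divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_gregorian_leap(2000))   # True  -- divisible by 400
print(is_gregorian_leap(2020))   # True  -- an ordinary leap year
print(is_gregorian_leap(2100))   # False -- divisible by 100 but not by 400
```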

The one-time correction of the Gregorian calendar was, however, extremely controversial.  Not until 1923 did the last holdout in Western culture on this point (the country of Greece) give up and accept the correction, which by then had grown to thirteen days.  You can read about believers in an afterlife who complained that ten days of their Earthly lives were being taken away by Pope Gregory’s edict.  And imagine a lender that had been promised a payment of some amount of interest per day on a loan taken out before Pope Gregory’s edict, complaining when the borrower paid back the loan ten days “early” on a date after Pope Gregory’s edict.  Or a tenant paying a year’s rent on a property on a date before Pope Gregory’s edict, and then only getting enjoyment of the property for 355 days instead of 365.

The inadequacy of integers shows itself in other ways.  Consider the pesky failure of the Moon to orbit the Earth in a way that adds up to an integer count over the course of a year, meaning that the “month” (defined as one “moon” cycle) is awkward.  Some 2200 years ago it got fairly settled that there ought to be a thing called a “month” which was tied to the “moon”, and everybody agreed there were (or ought to be) twelve of them in a “year”.  But once you somehow agree as a society that 12 is the correct number of months in a year, and then you somehow agree as a society that you care about “years”, and then you notice that the number of days in a year is 365 (or 365¼, or the Gregorian number that turns out to be 365.2425), you end up with no choice but to make the month of a very awkward duration:  sometimes 30 days, sometimes 31 (and 28 or 29 for poor February).
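For the curious, the Gregorian number 365.2425 mentioned above is just the leap-year bookkeeping written as arithmetic:  one extra day every 4 years, minus one every 100, plus one back every 400.  The same arithmetic shows why no single integer month length can work:

```python
from fractions import Fraction

# The Gregorian rule as arithmetic: +1 day per 4 years, -1 per 100, +1 per 400.
mean_year = 365 + Fraction(1, 4) - Fraction(1, 100) + Fraction(1, 400)
print(mean_year, float(mean_year))   # 146097/400  365.2425

# Twelve months cannot all share one integer length:
print(float(mean_year) / 12)   # ~30.44 days per month
print(7 * 31 + 4 * 30 + 28)    # 365 -- the familiar 30/31 (and 28) patchwork
```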

This pesky failure of the Moon to orbit the Earth in a way that works as an integer multiple of the way that the Earth orbits the Sun also leads to things like Easter and Ramadan and Chinese New Year falling on different days of the year from one year to the next.

By the time of Pythagoras there was at least a grudging acknowledgment that probably nothing that matters is ever a convenient integer multiple of anything else, but it was hoped that anything that mattered would at least turn out to stand in a ratio to anything else that mattered.  The idea being that you could take one thing that you care about and multiply it by some first integer, take any other thing that you care about and multiply it by some second integer, and at least you could get those two products to match.  Another way to say this is that maybe fractions (ratios of whole numbers) taken together with integers would turn out to be enough to link up any two things that matter in life.  This hope, that fractions would do the job, is again tied to the simple bit of psychology that counting is easy compared with lots of other things that are hard, and of course multiplying two integers is fairly easy.  Again the alert reader knows where this is going — it turns out that all sorts of things that matter to humans cannot be adequately described or covered or calculated by means of fractions.  Consider the relationship between the area of a circle and the area of the square into which it fits:  the ratio is π/4, and no fraction will ever fully set forth the relationship between those two areas.  Counting is easy, but counting things, and even writing down a fraction that you get when you count two things, turns out to be woefully inadequate.
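To make the circle-and-square example concrete:  because π is irrational, every fraction is doomed to be merely an approximation of that π/4 ratio.  A small illustration:

```python
from math import pi
from fractions import Fraction

# A circle inscribed in a square covers pi/4 of the square's area.
# pi/4 is irrational, so every fraction merely approximates it.
ratio = pi / 4

for limit in (10, 100, 1000, 100000):
    approx = Fraction(ratio).limit_denominator(limit)
    print(f"{approx}  (error {abs(float(approx) - ratio):.2e})")
```

The approximations improve without end, but none is ever exact.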

The fact that counting is easy (compared with other kinds of math) also leads to completely unjustified fixation on, or even obsession with, things that, objectively, do not matter at all.  It happens that most humans have ten fingers.  And as just mentioned, counting is easy.  So it is natural that humans would count their fingers (arriving at a count of “ten”).  Having carried out this task of finger-counting, humans then started using the number “ten” as a convenient grouping for other kinds of counting, like decades and centuries.  The observed behavior is that humans then give enormous amounts of attention to, say, the fact of a nation or a person reaching an age of one hundred.  But there is objectively nothing even remotely interesting or important about (for example) a “centennial event”.  A moment’s reflection makes us realize that if humans had evolved with eight fingers instead of ten, the super-important number of years that everybody would be celebrating would be 64 years, not 100 years.  Of course we would probably be using octal notation, not decimal, so the number would be expressed as 100 (octal).  If humans had evolved with sixteen fingers instead of ten, the super-important number of years that everybody would be celebrating would be 256 years, not 100 years.  In that case we would probably be using hexadecimal notation, not decimal, so the number would be expressed as 100 (hex).  All of this happens because counting happens to be easy to do, and so it’s the thing that people do when they cannot think of anything else to do.  And of course counting is (by definition) carried out with integers, and the bad luck about life is that integers almost never turn out to be adequate to describe anything that actually matters about life.
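Those octal and hexadecimal renderings are easy to verify; most programming languages accept base-8 and base-16 literals directly:

```python
# "100" read in three different bases:
print(0o100)             # 64  -- octal, the eight-fingered centennial
print(0x100)             # 256 -- hexadecimal, the sixteen-fingered centennial
print(format(64, "o"))   # '100'
print(format(256, "x"))  # '100'
```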

All of this because I was reporting an Office Action for which the response date falls on a leap day.  That’s what you get if you read this blog.  

5 Replies to “2020 is a leap year”

  1. A leap year in the Hebrew calendar (which is a lunisolar one) has 13 months instead of 12 and occurs 7 times in a 19-year cycle (a sketch of this cycle arithmetic appears after the links below). This ensures that the holidays do not “float” and that Passover, for instance, is always in the spring, as per the Torah.

    10 of the Hebrew months (or 11 in a leap year) have fixed lengths of 29 or 30 days, and 2 are either 29 or 30 days, depending upon certain factors:

    https://en.m.wikibooks.org/wiki/Mathematics_of_the_Jewish_Calendar/The_lengths_of_the_months

    See also:
    https://en.m.wikipedia.org/wiki/Hebrew_calendar?wprov=sfla1
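    A minimal sketch of the 7-in-19 cycle arithmetic mentioned in this comment, using a standard closed-form test (the formula is an assumption on my part, not something the commenter supplied):

    ```python
    # Hebrew leap years fall in positions 3, 6, 8, 11, 14, 17, and 19 of
    # each 19-year (Metonic) cycle.  A standard closed-form test:
    def is_hebrew_leap(year: int) -> bool:
        return (7 * year + 1) % 19 < 7

    # Sanity check: any 19 consecutive years contain exactly 7 leap years.
    print(sum(is_hebrew_leap(y) for y in range(5780, 5780 + 19)))   # 7
    ```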

  2. Another example of the arbitrary calendar markers of time is the birth of my twin children five minutes apart yet on different sides of midnight. Legally one twin is considered a day older than the other twin.

  3. Counting using integers is also essential to modern digital computing, which we all now take for granted. Ones and zeroes. And which many of us can understand from our basic math classes in grade school and perhaps high school. However, in something less than 100 years, quantum computers will be essential to computing. I wonder how many of the then-living human beings will understand how a quantum computer works. Will they, for example, use roulette-wheel analogies to explain how the computing occurs? Or will everyone necessarily need a course in quantum mechanics?
