Let's stop counting centuries
Counting centuries as "the 1700s" rather than "the 18th century" avoids the off-by-one confusion and the ambiguity with decades; language conventions should be able to evolve toward practical ways of referring to time periods.
The article discusses the convention of counting centuries and suggests using "the 1700s" instead of "the 18th century" to avoid confusion. It highlights how our everyday interaction with dates differs from counted centuries and proposes this simple change to make time periods easier to understand. The author also addresses the ambiguity between decades and centuries, proposing a convention where trailing zeros indicate ranges. The article touches on language evolution and the importance of adapting conventions for clarity, and concludes by emphasizing that language needs to be able to evolve easily, suggesting practical ways to refer to specific time periods. The piece also includes sections on travel advice, survey results, things that may not work well for most people, and various short topics like jokes and observations.
Related
Identifying Leap Years (2020)
David Turner explores optimizing leap year calculations for performance gains by using bitwise operations and integer bounds. He presents efficient methods, mathematical proofs, and considerations for signed integers, highlighting limitations for pre-Gregorian dates.
Y292B Bug
The Y292B bug is a potential timekeeping issue in Unix systems due to a rollover in the year 292,277,026,596. Solutions involve using dynamic languages or the GNU Multiple Precision Arithmetic Library in C, emphasizing the need for kernel-level fixes.
Rounding Percentages
Users face frustration with misleading progress indicators, especially rounding percentages in UI design. Clear rules proposed: 0% for no progress, 100% for completion, and interpolation for in-between values. Examples show accurate rounding methods for better user understanding.
The Byte Order Fiasco
Handling endianness in C/C++ programming poses challenges, emphasizing correct integer deserialization to prevent undefined behavior. Adherence to the C standard is crucial to avoid unexpected compiler optimizations. Code examples demonstrate proper deserialization techniques using masking and shifting for system compatibility. Mastery of these concepts is vital for robust C code, despite available APIs for byte swapping.
All I want for Christmas is a negative leap second
The author explores leap seconds, discussing the potential for a negative leap second due to Earth's rotation changes. They argue against abolishing leap seconds, emphasizing the rarity and complexity of a negative leap second.
https://www.youtube.com/watch?v=KDTxS9_CwZA
The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.
We do the same with people's ages. For the entire first year of your life you were zero years old. Likewise, during the years 0–99, zero centuries had passed, so we should call it the zeroth century!
At least this is how I justify to my students that zero-indexing makes sense. Everyone has fought the x-th-century vs. x-hundreds confusion before, so they welcome the relief.
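For what it's worth, the off-by-one everyone trips over is easy to see in code. A minimal Python sketch (the helper names are mine; real calendars have no year 0, so this only applies to AD years):

    def hundreds_label(year: int) -> str:
        # The article's convention: 1776 -> "the 1700s"
        return f"the {year // 100 * 100}s"

    def counted_century(year: int) -> int:
        # Traditional ordinal century: years 1-100 -> 1st, 1701-1800 -> 18th
        return (year - 1) // 100 + 1

    print(hundreds_label(1776), counted_century(1776))  # the 1700s 18
    print(hundreds_label(1700), counted_century(1700))  # the 1700s 17  <- the trap

Year 1700 itself belongs to the 17th counted century but to "the 1700s", and the same boundary is why the 20th century began on 1901-01-01, the answer the Jeopardy contestants missed.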
Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643
> There’s no good way to refer to 2000-2009, sorry.
This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.
People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".
There is the apostrophe convention for decades. You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context. (The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.) If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)
There is also the convention of replacing parts of a date with "X" characters, an em dash ("—"), or an ellipsis ("...") in fiction, like "in the year 180X". It is less neat, but unambiguous about the range when it's one "X" per digit. (https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples. A few give you the century, decade, and year and omit the millennium.)
Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.
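A toy sketch of the one-"X"-per-digit idea in Python (the function name is hypothetical, and the contiguous-range reading only holds when the unspecified digits are the trailing ones, as in the examples above):

    def expand_year_pattern(pattern: str) -> range:
        # "180X" -> years 1800..1809, "18XX" -> years 1800..1899
        lo = int(pattern.replace("X", "0"))
        hi = int(pattern.replace("X", "9"))
        return range(lo, hi + 1)

    assert list(expand_year_pattern("180X")) == list(range(1800, 1810))
    assert expand_year_pattern("18XX")[-1] == 1899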
I will resent it till I die.
* Sorry, I don't know how to write that in the past tense, like "haber sido" in Spanish, my main language.
I've been running into a similar issue recently. It turns out that many people who say they are "7 months pregnant" actually mean they are in the 7th month, which starts after 26 weeks (6 months!)
I’ve no idea. When did the American revolution happen?
Not everyone’s cultural frame of reference is the same as yours. I can tell you when the Synod of Whitby happened, though.
In spoken conversation, I dunno, it doesn’t seem to come up all that often. And you can always just say “20 years ago” because conversations don’t stick around like writing, so the dates can be relative.
I have heard people use "the aughts" to refer to this time range [1]. I guess if I was trying to be specific about which century, one could say "the two thousand aughts" or "the eighteen hundred aughts". But I think in that context I'd be more likely to say "in the first decade of the 1800s".
Well, there are a few longtermists who do. Do you mean the 01700s? Or some other 1700s that have been cut to four digits for conversational convenience?
:)
When someone's salary depends on being confused, there are ways to dissuade them. But when it's their hobby, forget it.
The author misses the mark here. The author would like to skip some significant interpretive steps, chiefly the way our dynamically typed natural language uses number words or characters in different contexts. Suggested reading: the differences between, and use cases of, nominal, ordinal, cardinal, interval, and ratio scales.
https://www.statisticshowto.com/probability-and-statistics/s...
That said, in Finnish people never count centuries. It's always "2000-luku" and "1900-luku", never "21st" and "20th".
The kind of person who cares about and reads about "the Xth century" can also trivially work out the date range involved.
The kind of person who can't tell that the 18th century is the 1700s and the 21st century is the 2000s would get little out of reading history anyway, until they get the basics of counting, calendars, and so on down.
I deal with the "2000s-problem" by using "00s" to refer to the decade, which everyone seems to understand. Sometimes I also use "21st century"; I agree with the author that it's okay in that case, because no one is confused by it. For historical 00s I'd probably use "first decade of the 1700s" or something along those lines. But I'm not a historian and this hasn't really come up.
ISO 8601-2:
> Decade: A string consisting of three digits represents a decade, for example “the 1960s”. It is the ten-year time interval of those years where the three specified digits are the first three digits of the year.
> Century: Two digits may be used to indicate the century which is the hundred year time interval consisting of years beginning with those two digits.
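Under those definitions, going from a year to its ISO 8601-2 decade or century string is plain digit truncation. A quick Python sketch for four-digit years (the helper names are mine):

    def iso_decade(year: int) -> str:
        # First three digits: any year 1960-1969 -> "196" (i.e. "the 1960s")
        return str(year // 10)

    def iso_century(year: int) -> str:
        # First two digits: any year 1900-1999 -> "19" (i.e. the 1900s)
        return str(year // 100)

    assert iso_decade(1969) == "196"
    assert iso_century(1914) == "19"

Notably, this is exactly the zero-indexed convention the article argues for: century "19" means the 1900s, not the 19th century.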
I believe this time is called 'the aughts', at least online. I say it in person but I might be the outlier.
How about Americans stop dropping the "and" in "nineteen hundred AND twenty"?
- 1500s: The Columbian Century
- 1600s: The Westphalian Century
- 1700s: The Century of Enlightenment
- 1800s: The Imperial Century
- 1900s: The Century of Oil
- 2000s: The Current Century (to be renamed in 2100)
You might think it's Eurocentric, and you'd be right. But every language gets to name them differently, according to local history.
I like the German Nullerjahre (roughly, "the nil years"). Naught years or twenty-naughts works pretty well too, imho.
1 BC should be renamed year 0. Then the years 0-99 are the 0th century, the years 1900-1999 are the 19th century, etc.
To avoid confusion between new style and old style centuries, create a new word, "centan", meaning "100 years" and use cardinal instead of ordinal numbers, for conciseness. Then the years 1900-1999 are the 19-centan.
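The mapping for the proposal is trivial. A Python sketch under the comment's assumptions (the "centan" coinage and the year-0 renumbering are the commenter's; the function name is mine, using astronomical numbering where 1 BC is year 0):

    def centan_label(year: int) -> str:
        # Cardinal, zero-indexed centuries: 1900-1999 -> "the 19-centan"
        return f"the {year // 100}-centan"

    assert centan_label(1969) == "the 19-centan"
    assert centan_label(42) == "the 0-centan"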
I understand the difficulty, but I don't think it is too terrible for us to get used to it, and we aren't gonna change the past 500 years of literature that already did this.
Also, it's ironic for a bunch of people who literally count arrays from zero to be complaining about this... :-P
But oh, dear writer, slightly irksome that you learned copyediting but do not use en-dashes for your date ranges!
> starting in 1776, not in the 76th year of the 18th century.
"Noughty", Naughty!
Not sure what a good word for this would be, but maybe just use what we already say — “hundreds”.
So, in the late 17th hundreds, …
The author is wrong here. The correct term (at least in spoken West Coast American English) is the twenty-aughts. There is even a Wikipedia page dedicated to the term: https://en.wikipedia.org/wiki/Aughts If you want to be fancy you could spell it "the 20-aughts". I suppose there is no way to spell it with only digits plus an "s", though, which may be what the author was looking for.
* You know, the ones you weirdos call "normies"
In terms of music this is true.
I realize this is counting by decades/centuries again, but if we just do it for the first decade/century under a larger span, it's easy to read.
Well now I have to
We still say “20th century” though because that’s idiomatic.
Also, in English it sounds weird: you have to pronounce it "seventeen-hundreds", whereas the correct pronunciation is "one-thousand-seven-hundred". So "1700s" is unsuitable for formal writing or speaking and doesn't map naturally to most languages of Western civilization.
But yeah, I guess the author finds it hard to subtract 1 in his mind :) I could go off about the typical US-centric arrogance that I see on this site, but I think it's already pretty funny as it is.
I wish we had some calendar with a far less anthropocentric departure point. Then, instead of all the genocides of the Roman empire, each look at a calendar would be an occasion to connect to the vastness of the cosmos and the vacuity of all human endeavors in comparison to that.