Ok, so I'm working on a new time tracking system for work. We're paid bi-weekly, and this time instead of having the system enter the pay periods into the database "on the fly", I'm going to just enter 30 years worth of them all at once. Makes things simpler.
But in doing this I stumbled onto something probably most math majors know about, but which I had never encountered before:
26 pay periods x 14 days = 364 days
So where does that damned extra day go? Are we all getting shortchanged somehow? This is such a common thing I can't help but think it must be in a computer programming textbook somewhere. Anyone ever come across this? What's the answer?
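The shortfall is easy to see in a few lines of Python (the years here are just for illustration):

```python
from datetime import date

# 26 bi-weekly pay periods cover exactly 26 * 14 = 364 days.
covered = 26 * 14

for year in (2023, 2024, 2025, 2026):
    # Length of the calendar year, computed from actual dates.
    days_in_year = (date(year + 1, 1, 1) - date(year, 1, 1)).days
    # Leftover is 1 day in a common year, 2 in a leap year.
    print(year, days_in_year, "leftover:", days_in_year - covered)
```

So it's not just one stray day: every common year leaves one day uncovered, and a leap year leaves two.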
3:45 pm, Update: Solved it. The trick was not to look at the pay periods themselves, but at the fiscal years they belong to. To wit:
The solution is to keep an eye on the pay period dates. When the start and end dates of the first pay period of a new fiscal year fall completely inside the second quarter, assign that pay period to the previous fiscal year and assign the next pay period to the "new" one.
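The quarter rule above depends on exactly when your fiscal year starts, so here's a sketch of the bookkeeping using a simpler stand-in convention: assign each pay period to the fiscal year its end date falls in. The July–June fiscal year and every date below are assumptions for illustration, not from my actual system:

```python
from datetime import date, timedelta

def fiscal_year(d, fy_start_month=7):
    # Assumed convention: a July-June fiscal year, labeled by the
    # calendar year in which it ends (July 2024 belongs to FY 2025).
    return d.year + 1 if d.month >= fy_start_month else d.year

def build_periods(first_start, n_periods, fy_start_month=7):
    # Generate consecutive 14-day pay periods and tag each one with a
    # fiscal year based on its end date -- a common payroll convention,
    # and a simpler stand-in for the quarter test described above.
    periods = []
    start = first_start
    for _ in range(n_periods):
        end = start + timedelta(days=13)  # 14 days inclusive of start
        periods.append((start, end, fiscal_year(end, fy_start_month)))
        start = end + timedelta(days=1)
    return periods
```

Generate 30 years of periods this way (26 × 30 = 780 of them) and every complete fiscal year comes out with either 26 or 27 pay periods.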
Effectively, each year leaves one extra day over (two in a leap year), and once fourteen of them have piled up, roughly every eleven years, a fiscal year comes along with 27 pay periods in it instead of 26. A Google search reveals this to in fact be the case. Sort of like an old mechanical typewriter whose bell "rings" after fourteen key presses.
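You can watch the bell ring with a quick simulation: count how many bi-weekly period start dates land in each calendar year over 30 years. The 2024-01-05 start date is arbitrary:

```python
from collections import Counter
from datetime import date, timedelta

start = date(2024, 1, 5)          # arbitrary first pay period start
counts = Counter()
d = start
while d.year < start.year + 30:   # tally period starts per calendar year
    counts[d.year] += 1
    d += timedelta(days=14)

years_27 = sorted(y for y, n in counts.items() if n == 27)
print(years_27)                   # -> [2027, 2038, 2049]
```

Every year gets 26 starts except the 27-period years, which turn up eleven years apart.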
Sometimes this job seems to be nothing more than the computer helpdesk equivalent of wiping grown-ups' bottoms (true, it doesn't smell as bad, but it's a lot more frustrating). But every once in a while, it's really cool.