Still wrong...
Nothing complicated about it... it's SIMPLE... you have to make a choice... based on the best available information. Pick one:
Death rate = number of deaths today / number of cases today
Death rate = number of deaths today / number of cases 5 days ago
Death rate = number of deaths today / number of cases 7 days ago
Death rate = number of deaths today / number of cases 14 days ago
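To make the choice concrete... here's a minimal Python sketch, assuming made-up daily counts in which cases double every day and 10% of each day's cases die exactly 7 days later (both numbers are illustrative, not measured):

    # Toy data: cases double daily; 10% of cases die 7 days after infection.
    cases  = [100 * 2**d for d in range(15)]
    deaths = [0] * 7 + [round(0.10 * c) for c in cases[:8]]

    def death_rate(day, lag):
        # Deaths today divided by cases `lag` days ago.
        return deaths[day] / cases[day - lag]

    day = 14
    for lag in (0, 5, 7, 14):
        print(f"lag = {lag:2d} days -> {death_rate(day, lag):8.2%}")

Only the lag matching the data's true onset-to-death delay (7 days here, by construction) recovers the real 10%... too short a lag understates it, too long a lag overstates it.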
The variation you get in the computation is, in part, error... to the degree you choose the wrong time lag.
But NO ONE dies the instant they become ill. There is ALWAYS some lag to account for.
You are killed by the disease you caught... not by the disease someone else caught in the 5, 7 or 14 days after you became ill... so it makes no sense at all to count your death as a percentage of a total that includes infections which had not even occurred yet when you became ill.
It's like filing your taxes... you do that in April... for the prior tax year... and the tax you owe for that accounting period stays tied to ONLY the income earned in the specific period in which you earned the opportunity to volunteer to pay that tax. The accounting lags current history... and the accounting stays tied to a prior time period.

Imagine the IRS saying "no, by our accounting you owe us all the taxes we want right now... right up until today"... because the government is "confused" by this concept of lagging the data while fixing the accounting to a particular prior period. Or imagine them demanding you pay up for "income" six months into the future, based on a rate calculation and projections of future earnings... "because we're not tying this accounting for events that have already occurred down to any particular accounting period". I suspect the "complexity" of not suspending the logic of causality... of lagging the data while fixing the accounting to the particular prior period that matters... would suddenly not seem that challenging.
So the known least accurate choice is to use "number of cases today".
More accuracy depends on getting closer to the right lag period... using the best available information on what that delay actually is.
When you do use that known wrong choice of zero delay, you impose an easily avoidable error, the magnitude of which varies... it depends both on how wrong you are in the lag period used, and on the actual R0... the rate at which the disease is spreading. As R0 approaches 1... meaning a constant 1-person-to-1-person spread... the magnitude of the error diminishes. As R0 rises... as 1 person infects more than 1 other... the faster the disease spreads, the shorter the time it takes to double the number infected... and the more new cases pop up in the window between when you became ill and when you die.
The magnitude of the error imposed depends on the time period during which the error is imposed, as the error compounds based on the rate of change in that time period.
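A sketch of that compounding, assuming a true 10% fatality rate and a true 7-day onset-to-death lag (both assumed for illustration): with exponential spread, cases grow by a factor of 2^(lag / doubling period) during the lag, and the zero-lag estimate gets diluted by exactly that factor.

    TRUE_RATE = 0.10   # assumed true fatality rate
    LAG_DAYS  = 7      # assumed true onset-to-death lag

    def naive_rate(doubling_days):
        # The zero-lag estimate divides deaths by a case count that has
        # grown 2**(LAG_DAYS / doubling_days)-fold since infection.
        return TRUE_RATE / 2 ** (LAG_DAYS / doubling_days)

    for td in (1, 2, 5, 14, 1000):   # 1000 days ~ near-constant spread
        print(f"doubling every {td:4d} days -> naive estimate {naive_rate(td):.3%}")

The slower the spread, the closer the naive estimate gets to the true 10%... the faster the spread, the more it understates.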
In South Korea... the disease is doubling DAILY... in other places the doubling period has varied from two days... to five days...
Say 10 out of 100 cases will die, as a fixed function of the virus given X level of care. What's the error if you don't lag the data at all... and wrongly use deaths today vs. cases today? (The sketch after this list checks the arithmetic.)
1. At R0 = 1 the constant transmission means not lagging imposes little to no error: 10% mortality
2. As R0 increases, if the doubling period equals the lagging period: 5% mortality... understates the risks.
3. Daily doubling with a 7 day lag: 10 deaths logged against 12,800 cases: roughly 0.078% mortality.
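Checking those three cases with the same assumed numbers (10 true deaths per 100 cases at infection time, 7-day lag):

    deaths, cases_then = 10, 100

    # Case 1: R0 ~ 1, case count roughly constant over the lag.
    print(deaths / cases_then)               # 0.10  -> 10%

    # Case 2: cases double exactly once during the lag.
    print(deaths / (cases_then * 2))         # 0.05  -> 5%

    # Case 3: daily doubling over 7 days -> 2**7 = 128x more cases today.
    print(deaths / (cases_then * 2**7))      # ~0.00078 -> ~0.078%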
You will say that error you are making, in case 3., means 10 people out of 12,800 will die... no big deal... that's just like the normal flu...
I will say that error you are making, in case 3., means 10 people out of 100, and 1,280 people out of 12,800 will die... which is a big deal...
Changing the infection rate... or the time period in which the disease doubles... doesn't change your odds of dying if infected. Those changes don't make it more likely you will die... nor less likely. But getting the time lag wrong CAN dramatically change the number of cases used in the computation... making the math error have real impact... if you don't lag the data properly.
The higher the mortality rate is in fact... the higher the rate of transmission is in fact... the longer the lag period is in fact... the larger the error in the computation.
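Holding the doubling period fixed at an assumed 2 days and the true rate at 10%, a quick sweep over the lag shows how fast the understatement grows:

    TRUE_RATE, DOUBLING_DAYS = 0.10, 2   # both assumed for illustration

    for lag in (0, 5, 7, 14):
        dilution = 2 ** (lag / DOUBLING_DAYS)   # case growth during the lag
        print(f"ignoring a {lag:2d}-day lag -> estimate {TRUE_RATE / dilution:.3%}"
              f" (understated {dilution:.0f}x)")

At a 14-day lag the naive figure is 128 times too small... exactly the kind of error that makes a serious disease look "just like the flu".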
Apply better health care... and note that extending the lag period between you becoming infected and you becoming dead doesn't change the risk that you're dead at day 14... but it will make your computed % mortality more wrong... IF you don't account for the longer lag period.
Those changes, not accounted for properly, will change the MATH applied in computing the percentage mortality. And if you don't lag the data properly... you will UNDERSTATE the risk... more dramatically the worse the reality is...
Worst case is high mortality rates, high rates of transmission, and long lag times before dying... that has your error make you say it is LESS of a problem...
That's the case here...
Understating the risks leads people into serious errors of judgment that can keep them from responding appropriately in a timely manner... it delays their perception that there is a window of time in which preparation matters... perhaps resulting in them not figuring out how big a problem it is until there is no more food on the shelves.