Still planning to do those two new columns for "total SoPF>3" and "total abs3exp" soon.
In the meantime, I performed a regression analysis on our thresholds for several of these metrics across the existing precision levels, to validate where we've set them.
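Since those columns are coming anyway, here's a quick sketch of how I compute the two metrics, as I understand their definitions (the helper names are my own): SoPF>3 is the sum, with repetition, of the prime factors greater than 3 of n·d, and abs3exp is the absolute value of the 3-exponent in the ratio's monzo.

```python
from fractions import Fraction

def prime_factors(n):
    """Prime factorization of n, with multiplicity, as a list."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def sopf_gt3(ratio):
    """Sum of prime factors greater than 3 (with repetition) of n*d."""
    n, d = ratio.numerator, ratio.denominator
    return sum(p for p in prime_factors(n * d) if p > 3)

def abs3exp(ratio):
    """Absolute value of the 3-exponent of the ratio's monzo."""
    exp = (prime_factors(ratio.numerator).count(3)
           - prime_factors(ratio.denominator).count(3))
    return abs(exp)

# e.g. the syntonic comma 80/81: SoPF>3 = 5, abs3exp = 4
print(sopf_gt3(Fraction(80, 81)), abs3exp(Fraction(80, 81)))
```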
We set our threshold for SoPF>3 at 51. But if you look at the maximum SoPF>3 for each precision level and consider it in proportion to its EDA size ((21,18), (47,30), (58,32), (233,46), (809,??)), you get a logarithmic fit, which suggests that perhaps we should consider commas with SoPF>3 up to 61.
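For reproducibility, here's a sketch of that fit in NumPy (my assumption: a logarithmic fit of the form y = a·ln(x) + b over the four established (EDA, max SoPF>3) pairs, extrapolated to 809):

```python
import numpy as np

# (EDA size, max SoPF>3) for the four established precision levels
edas = np.array([21, 47, 58, 233])
max_sopf = np.array([18, 30, 32, 46])

# Logarithmic fit y = a*ln(x) + b, done as a linear fit in ln(x)
a, b = np.polyfit(np.log(edas), max_sopf, 1)

# Extrapolate to the 809-EDA precision level
predicted = a * np.log(809) + b
print(round(predicted))  # 61
```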
For the abs3exp threshold we started with 12, and Dave has asked to try extending to 13. Well, the maximum abs3exp is currently 14, and the progression by precision level goes 8, 11, 14, 14. For intervals as important as the very small ones that will recur throughout the notation, though, we should aim much lower: if we consider only the JI intervals corresponding to the single steps of 21-EDA, 47-EDA, 58-EDA, and 233-EDA, their abs3exps are 6, 3, 8, 2, for which there is no good line of fit, but they are overall quite a bit lower. And I also don't really see this metric as naturally growing with each more precise level (as makes sense for SoPF>3) so much as leveling out at 14. But a logarithmic fit for ((21,8), (47,11), (58,14), (233,14)) would give closer to 15. So... I'll give up to 15 a shot.
For prime limit, @Ash9903b4 stuck with 19 and I went up to 37. A logarithmic fit to ((21,17), (47,23), (58,23), (233,37*)) says that 47** should be the limit with 809-EDA. Of course we are extremely unlikely to find useful n-tinas with SoPF>3 ≤ 61 but prime limit 47. But we could try.
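The same kind of check for the prime limit (again assuming a fit of the form y = a·ln(x) + b):

```python
import numpy as np

# (EDA size, prime limit) for the four established precision levels;
# the 37 assumes the 1/47S is eliminated from the Extreme level (see * below)
edas = np.array([21, 47, 58, 233])
limits = np.array([17, 23, 23, 37])

# Logarithmic fit y = a*ln(x) + b, extrapolated to 809-EDA
a, b = np.polyfit(np.log(edas), limits, 1)
print(round(a * np.log(809) + b))  # 47
```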
For error, it only really makes sense to consider the errors for the primary commas for the single steps of each EDA. For 21-EDA, e.g., it's the comma which at 5.758¢ is 1.063 steps of 21-EDA (where 1 step of 21-EDA is 5.414¢) and thus has an error of 0.063. The errors for the 4 established precision levels are thus 0.063, 0.396, 0.003, and 0.153, which don't really have much of a pattern, but if you ask Wolfram for a linear fit it'll tell you the next one should be about 0.125, which is half the threshold we've been using so far. All our yellow commas so far are within that (well, except the 5-tina, which is on the cusp at 0.13). So perhaps we should be stricter about the error.
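Here's a sketch of both computations (hypothetical helper name; one step of n-EDA is the apotome 2187/2048 ≈ 113.685¢ divided into n parts, and I'm assuming the linear fit is against the precision-level index 1–4):

```python
import math

# One step of n-EDA: the apotome (2187/2048, ~113.685 cents) divided into n
apotome_cents = 1200 * math.log2(2187 / 2048)

def step_error(comma_cents, n):
    """Distance of a comma from the nearest whole number of n-EDA steps."""
    steps = comma_cents / (apotome_cents / n)
    return abs(steps - round(steps))

print(round(step_error(5.758, 21), 2))  # ~0.06

# Linear fit of the four errors against precision-level index 1..4,
# extrapolated to the fifth (809-EDA) precision level
errors = [0.063, 0.396, 0.003, 0.153]
n = len(errors)
xs = range(1, n + 1)
x_mean = sum(xs) / n
y_mean = sum(errors) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, errors))
         / sum((x - x_mean) ** 2 for x in xs))
predicted = y_mean + slope * (5 - x_mean)
print(round(predicted, 3))  # ~0.123
```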
By the way, I realized that I, at least, have not been calculating tina error quite correctly. I've simply been taking the absolute value of the difference between the tina column and the nearest whole tina, so that e.g. the 7:221n, which is 0.83, has an error of 0.17. But actually I should have been taking the error of the absolute value, or as Dave and I call it the undirected value, of the ratio between the comma and one step of 809-EDA. So I'll get that right in my next chart.
In any case, I'm approaching the results of this analysis not in terms of where we should move a yes/no cutoff, but in terms of how we should weight these factors in a forthcoming consolidated tina badness metric.
*Assuming we do eliminate the 1/47S from the Extreme precision level.
**But we may see the 1/47S come right back in the Insane precision level, then!