> One second is multiple billions – billions! – of executed instructions. One second is an eternity for a computer.
> Yet I sometimes wonder whether one second is the smallest unit of time most programmers think in. Do they know that you can run entire test suites in 1s and not just a single test? Do they know that one second is slow?
https://registerspill.thorstenball.com/p/allergic-to-waiting
#perf perspective
> ... Or do they think one second is acceptable, because my generation of programmers grew up on the Internet? On the Internet, Jeff Dean’s Numbers Everyone Should Know seems archaic if not irrelevant: why worry about nanoseconds when pinging the nearest Google server takes 13 milliseconds? On the Internet, one second doesn’t seem too bad maybe.
Good tirade that meanders into how time scales can anchor thinking.
It also makes me reflect on how beneficial explicit time budgets are in a project. 1 ms means a lot when targeting 60 fps, where the entire frame budget is ~16.7 ms, but how much does 1 ms of server time cost? In a page load it seems small, yet if you budget 1 s total, you can only afford about a thousand 1 ms tasks.
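
A minimal back-of-the-envelope sketch of that budget arithmetic (the numbers are illustrative assumptions, not measurements):

```python
# Rough time-budget arithmetic; all figures are illustrative.

MS = 1 / 1000  # one millisecond, in seconds

frame_budget = 1 / 60   # ~16.7 ms per frame at 60 fps
page_budget = 1.0       # a hypothetical 1 s page-load budget
task_cost = 1 * MS      # a single 1 ms task

print(f"60 fps frame budget: {frame_budget * 1000:.1f} ms")
print(f"1 ms tasks per frame: {int(frame_budget / task_cost)}")   # ~16
print(f"1 ms tasks per 1 s page load: {int(page_budget / task_cost)}")  # 1000
```

The point being: a budget turns "1 ms is nothing" into "1 ms is 6% of my frame".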