New ask Hacker News story: How might software development have unfolded if CPU speeds were 20x slower?
4 by EvanWard97 | 8 comments on Hacker News.
I was pondering how internet latency seems just barely sufficient for a decent fast-paced online multiplayer gaming experience. If human cognition were, say, 20x faster relative to the speed of light, we'd be limited to playing many games only with players from the same city. More significantly, single-threaded compute performance relative to human cognition would effectively be capped at the equivalent of 300 MHz (6 GHz / 20), which I suspect would make it a challenge to run even barebones versions of many modern games.

This led me to wonder how software development would have progressed if CPU clock speeds were effectively 20x slower. Might the greater pressure for performance have kept us writing lower-level code with more bugs while shipping fewer features? Or could it be that having all this free compute to throw around has actually gotten us into trouble, because we've been able to rapidly prototype and eschew more formal methods and professionalization?
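To sanity-check the latency intuition, here is a minimal back-of-the-envelope sketch in Python. Every constant in it is my own assumption, not something from the post: signals in fiber travel at roughly 2/3 the speed of light, real routes run about 2x the straight-line distance, and a ~100 ms perceived round trip is a common rule-of-thumb playability threshold for fast-paced games.

    # Back-of-the-envelope check: if cognition ran 20x faster, every
    # millisecond of network delay would feel like 20 subjective ms.
    # All constants below are assumptions, not measured values.

    FIBER_KM_PER_MS = 200.0   # ~2/3 of c, expressed in km per millisecond
    ROUTE_OVERHEAD = 2.0      # cables rarely follow the straight-line path
    COGNITION_SPEEDUP = 20.0
    PLAYABLE_MS = 100.0       # rough threshold for fast-paced play

    def perceived_rtt_ms(distance_km: float) -> float:
        """Round-trip time as felt by a 20x-faster mind, in subjective ms."""
        one_way_ms = distance_km * ROUTE_OVERHEAD / FIBER_KM_PER_MS
        return 2 * one_way_ms * COGNITION_SPEEDUP

    for label, km in [("same city", 30), ("same region", 300),
                      ("cross-country", 3000), ("intercontinental", 9000)]:
        rtt = perceived_rtt_ms(km)
        verdict = "playable" if rtt <= PLAYABLE_MS else "too laggy"
        print(f"{label:>16} ({km:>5} km): {rtt:8.1f} ms perceived -> {verdict}")

Under these assumptions a 30 km same-city hop still feels like ~12 ms, while a 300 km regional hop already feels like ~120 ms, which roughly matches the same-city claim. And since this ignores routing, queueing, and processing delay entirely, real numbers would only be worse.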