After seeing this a couple of years ago I started taking more responsibility for my PHP code, applying even the smallest micro-optimizations possible, since the more people run my code (especially given that PHP is executed on every HTTP request), the more energy they waste because of my laziness and the more they pollute the planet.
The JIT was NOT included in this experiment, so the number could be much lower now.
I hope you will be responsible with your code too (I mean people using interpreted languages in general).
OK, but are they taking into account the energy expenditure of the programmer's brain while writing the program? The amount of calories his/her brain has to burn in order to produce & debug the code?
Good question. The transpiled code probably doesn't match the optimized JS. Maybe if they targeted the same JS version as the one they're benchmarking, the results would be equivalent?
Also, if they're using a Node version with TS support, it will compile the TS before execution, which means they're also measuring the cost of the compiler, and that can be significant for small snippets.
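To make that concrete, here's roughly the kind of harness I'd use (file name, loop body and numbers are made up, not the actual benchmark code): compile it ahead of time with tsc at the same target as the plain-JS entry, then time only the workload so the compiler never shows up in the measurement.

    // bench.ts -- placeholder workload, not the study's code.
    // Compile ahead of time, e.g. `tsc bench.ts --target ES2020`,
    // then run the emitted JS with `node bench.js`, so TS compilation
    // is not part of what gets measured.
    function workload(n: number): number {
      let acc = 0;
      for (let i = 0; i < n; i++) {
        acc += i % 7; // stand-in for real work
      }
      return acc;
    }

    const start = process.hrtime.bigint(); // needs @types/node to type-check
    const result = workload(5e7);
    const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
    console.log(`result=${result} elapsed=${elapsedMs.toFixed(1)}ms`);

Running it via ts-node (or a Node build with built-in TS support) instead would fold the compile step into whatever the energy meter sees.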
The JS one is not surprising at all. There's no other loosely typed multi-paradigm scripting language where such insane shitloads of money and developer time have been spent on optimizing its execution (by some of the largest tech companies). Kinda funny considering that the language design is complete horse shit.
Your 2nd point is really quite surprising. I also wouldn't have thought that Java would beat Go in both energy and time by that margin!
Without any information about how these results were obtained, it's kind of useless. It's like reading only the headline of a paper. So sure, you should question the accuracy of this image. But I would agree it's fun to look at.
Pascal is a simpler and more limited language, so it's not entirely surprising. It also has fewer and smaller standard libraries to link in.
As for C# and F#, what's wrong with the difference? The functional coding style of F# prefers immutable data over the possibly mutable data of C#, and that requires more allocations and garbage collection.
I haven't looked into the details of the actual code, but I would expect the compiler optimizations and JIT to figure it all out and end up with very similar native code, especially since both languages are mature and have had enough time to reach such goals. But it's quite possible my assumptions are incorrect.
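For what it's worth, the shape of the difference is easy to see even outside .NET. Here's a rough JS/TS analogy (made-up types and numbers, nothing to do with the benchmark code): the immutable-style loop allocates a fresh object on every iteration, the mutable one reuses a single object, and whether the JIT can elide those allocations is exactly the open question.

    // Immutable-style update: one allocation per step.
    type Point = { x: number; y: number };

    function moveImmutable(p: Point, steps: number): Point {
      for (let i = 0; i < steps; i++) {
        p = { x: p.x + 1, y: p.y + 1 }; // new object each iteration
      }
      return p;
    }

    // Mutable update: the same object is modified in place.
    function moveMutable(p: Point, steps: number): Point {
      for (let i = 0; i < steps; i++) {
        p.x += 1;
        p.y += 1;
      }
      return p;
    }

    console.log(moveImmutable({ x: 0, y: 0 }, 1000000));
    console.log(moveMutable({ x: 0, y: 0 }, 1000000));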
The commonly used PHP-FPM, for example, allows persistent PHP processes to handle multiple requests, reducing the overhead of creating a new instance for each request.
Also, there was no stable JIT at the time of this study, so the situation has probably improved a lot, but writing efficient code will always be a requirement.
That allows processes to be reused, but the interpreter state must still be set up cleanly for each request and torn down afterwards. That includes things like open files, database connections and application configuration.
99% of my code is in perl. My local power source is 100% hydroelectric. I therefore choose to believe that nuclear energy would result in my code quality improving.
"Modern Perls are supposedly faster" I thought, until I checked and apparently they used a very recent Perl.
So now my denial is along the lines of "Well they're asking Perl to do things it doesn't need to, like implementing merge sort and binary trees, and, and!, TIMTOWTDI! They're probably choosing a slow way to do things too!"
The other denial idea was: "Interpreted languages offer rapid prototyping and easier debugging, which saves energy during the development process, and that isn't being taken into account here."
...but then I see the ridiculously low scores for JavaScript. I wonder whether, if Perl (or other interpreted languages) had received the amount of scrutiny and attention that JS has had so that browsers stay relatively fast, it would be any faster now.
That, and they might be using Object Pascal. I don't know (I haven't touched Pascal in thirty years) but I would not be surprised if there's some overhead there.