Free-threaded CPython is ready to experiment with!

labs.quansight.org

An overview of the ongoing efforts to improve and roll out support for free-threaded CPython throughout the Python open source ecosystem

36 comments
  • > The only interpreted language that can compete with compiled for execution speed is Java

    "Interpreted" isn't especially well defined but it would take a pretty wildly out-there definition to call Java interpreted! Java is JIT compiled or even AoT compiled recently.

    > it can be blazingly fast

    It definitely can't.

    > It would still be blown out of the water by similarly optimized compiled code

    Well, yes. So not blazingly fast then.

    I mean it can be blazingly fast compared to computers from the 90s, or compared to humans... But "blazingly fast" generally means fast in the context of what is possible.

    > Port component to compiled language

    My extensive experience is that this step rarely happens, because by the time it makes sense to do this you have 100k lines of Python, performance is juuuust about tolerable, and we can't wait 3 months for you to rewrite it; we need those new features now now now!

    My experience has also shown that writing Python is rarely a faster way to develop even prototypes, especially when you consider all the time you'll waste on pip and setuptools and venv...

    • "Interpreted" isn't especially well defined but it would take a pretty wildly out-there definition to call Java interpreted! Java is JIT compiled or even AoT compiled recently.

      Java is absolutely interpreted, supposing that AoT compilation isn't being used. The code must be interpreted by the JVM (an interpreter and JIT compiler) in order to produce binary code the system can actually run, the same as any interpreted language. It is a pretty major stretch, in my mind, to claim that it's not. The simplest test would be: "Does the program require any additional programs to provide the system with native binaries at runtime?"

      > It definitely can't.

      > Well, yes. So not blazingly fast then.

      > I mean it can be blazingly fast compared to computers from the 90s, or compared to humans... But "blazingly fast" generally means fast in the context of what is possible.

      I find that context only marginally useful in practice. In my experience it is prone to letting perfect be the enemy of good, and to premature optimization.

      My focus is more on tooling, however, so I might be coming from a very different place. In my context, things are usually measured against existing processes and tooling, and frequently on a human scale. Do something in 5 seconds that usually takes a human 15 minutes and that's an improvement of more than two orders of magnitude.

      > My extensive experience is that this step rarely happens, because by the time it makes sense to do this you have 100k lines of Python, performance is juuuust about tolerable, and we can't wait 3 months for you to rewrite it; we need those new features now now now!

      You're not wrong. I'm actually in the process of making such a push where I'm at, for the first time in my career. It helps a lot if you can architect it so that you have separate runner and coordinator components, since those are, at their most basic, simple to implement in most languages. Then things can be iteratively ported over time (see the sketch after this comment).

      > My experience has also shown that writing Python is rarely a faster way to develop even prototypes, especially when you consider all the time you'll waste on pip and setuptools and venv...

      That's... an odd perspective to me. Pip and venv have been tools that I've found to greatly accelerate dev setup and application deployment. Installing any third-party dependencies in a venv with pip means that one can later run pip freeze and dump the output directly to a requirements.txt for others (including deployment) to use (that workflow is also sketched below).
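
For the runner/coordinator split mentioned in the comment above, here is a minimal sketch of what that boundary can look like in Python. All names are hypothetical and not from the thread: the idea is only that the coordinator depends on a narrow runner interface, so an individual runner can later be reimplemented in a compiled language (for example, invoked as a subprocess) without touching the coordination logic.

```python
# Hypothetical sketch: coordination logic depends only on a small Runner
# interface, so runners can be ported to a compiled language one at a time.
import subprocess
from typing import Protocol


class Runner(Protocol):
    def run(self, task: str) -> str: ...


class PythonRunner:
    """Initial pure-Python implementation of the work."""
    def run(self, task: str) -> str:
        return task.upper()  # stand-in for the real computation


class NativeRunner:
    """Later drop-in replacement that delegates to a compiled binary."""
    def __init__(self, executable: str) -> None:
        self.executable = executable

    def run(self, task: str) -> str:
        result = subprocess.run([self.executable, task],
                                capture_output=True, text=True, check=True)
        return result.stdout.strip()


def coordinate(runner: Runner, tasks: list[str]) -> list[str]:
    # The coordinator never changes when a runner is swapped out.
    return [runner.run(t) for t in tasks]


if __name__ == "__main__":
    print(coordinate(PythonRunner(), ["alpha", "beta"]))
    # Once a component is ported:
    # coordinate(NativeRunner("./runner-bin"), ["alpha", "beta"])
```

How tasks are actually distributed (threads, processes, a queue) is orthogonal; the point is only that the swap happens behind a stable interface.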
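And for the pip/venv workflow described in the last reply, a small illustration driven from the standard library rather than typed at a shell. The underlying steps (`python -m venv`, `pip install`, `pip freeze > requirements.txt`, `pip install -r requirements.txt`) are the ones the comment refers to; wrapping them in Python here is just for illustration, and the `requests` dependency is a placeholder.

```python
# Sketch of the venv + pip freeze workflow, using only the standard library.
import subprocess
import sys
import venv
from pathlib import Path

env_dir = Path(".venv")
venv.create(env_dir, with_pip=True)  # equivalent to: python -m venv .venv

# The environment's pip lives in bin/ on POSIX and Scripts/ on Windows.
pip = env_dir / ("Scripts" if sys.platform == "win32" else "bin") / "pip"

# Install a placeholder third-party dependency into the venv.
subprocess.run([str(pip), "install", "requests"], check=True)

# Equivalent to: pip freeze > requirements.txt
frozen = subprocess.run([str(pip), "freeze"], check=True,
                        capture_output=True, text=True).stdout
Path("requirements.txt").write_text(frozen)

# Others (or a deployment step) can then recreate the environment with:
#   pip install -r requirements.txt
```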
