(comments)

Original link: https://news.ycombinator.com/item?id=39286458

While Python does have real practical advantages in ease of learning and scripting, limitations in performance and scalability mean it cannot replace other high-level general-purpose languages such as Java, C++, or Rust. Those languages have strengths in handling resource-intensive, large-scale applications that Python typically lacks. While Python remains useful and valuable in its own domain, limiting oneself to it for more advanced programming needs restricts flexibility and innovation. Ultimately, the choice of language depends on individual needs and preferences.

Related Articles

Original Article
RustPython (rustpython.github.io)
581 points by BerislavLopac 1 day ago | 208 comments

> "it can be compiled to WebAssembly in order to run Python in the browser."

I have seen this approach with CPython and NodeJS already, and I think it is simply not viable. What they are suggesting is compiling the runtime (the same one you use in non-wasm projects) to wasm and then running your python code on top of it.

This is a double-whammy of performance degradation: you basically have two JIT-compilation steps happening (once by the wasm runtime to compile the rust-python wasm, and again by the rust-python code compiling your python code). And this is on top of the normal performance degradation from using a dynamically typed language compared to a statically typed one.

To make dynamic languages (even JS) viable to run in a wasm runtime, the language must be compiled directly to wasm.

Project still looks pretty cool and useful though, there is plenty of python code that could be useful to use in the browser no matter how badly it runs. Just don't try to build a web framework on top of this kind of approach.

Edit: Let me reframe this a bit, this is what I think, I haven't really benchmarked anything and rust python might be doing some tricks I am not aware of.



The reality is that the "dark" majority of preexisting code has essentially no performance requirements/concerns; they're business scripts that could literally run on a toaster with no problem if you could get the code onto it.

So really most business logic can easily be satisfied by "compile the interpreter to wasm and then run the dynamic language on that". Doing it this way can move existing "learned the hard way special cases" byzantine business code to something that runs on a web server and can be accessed by the company's employees rather than passing around scripts for them to run, with a lot of benefits including instant upgrades for everyone when bugs are fixed.

That said, this specific impl claims to only support half of the standard library, so I kinda doubt it's ready for any 'serious' business use cases yet anyway.



I don't think you quite grasp the implications of what I was saying: this kind of approach could take _seconds_ to even start running your python application. Large python codebases could take like a minute to start if loaded that way.

Once it does start then your arguments can make sense, but even so it would still make it impractical for most things.

Trust me, when the Javascript dev tells you something will be slow, it WILL be very slow



Pyodide (standard cPython in WebAssembly) loads surprisingly quickly.

My https://lite.datasette.io application usually starts up in less than 10s - most of that is downloading about 10MB of WASM blobs, and 10MB isn't actually that big these days (many sites serve more than that in image headers).

When I built Datasette Lite I did it as a research project, assuming it would be far too slow loading to be useful. I've since changed my mind on that.



Your project seems very cool, and good on you for it.

10 seconds is absurdly slow, though. That's like time to install Mathematica from a disk image level slow.



> 10 seconds is absurdly slow, though.

I hate to say this, but have you used any $ModernWebApp with $HotJSFramework recently? I thank the gods when those pages load without 5-10 seconds of fancy spinning animation. Really thought we would be in a better place by 2024 but nope.



I haven't used any such slow thing yet (am mostly a commandline world person).


Don't work for a company where you need to turn in receipts or fill out info in crappy HR software? I mean, good for you and knowing the command line, but doing the less-fun, less-specialized parts of jobs usually involves regressing to the mean of what interfaces people know how to use.


I started a job in an unusual way (first week, I deployed to the Greenland ice sheet). Workday wouldn't work over the high-latency connection so I couldn't fill out my HR paperwork without VNCing into a computer in CONUS...


I don’t. But I understand what you’re describing


I thought it was too slow as well, but apparently it's not - plenty of people are using it now, and I myself use it way more than I thought I would.


Curious - could these blobs be cached? I'm assuming they change only on upgrade?


Yes, they're cached using standard browser HTTP caching.

A very rough estimate just now on my iPhone over 5G: 12s for the first load, then when I visited again took around 3s.



> I don't think you grasp quite the implications of what I was saying, this kind of approach could take _seconds_ to even start running your python application.

Indeed, the web demo takes about 5 seconds to cold-start on my beefy PC, between downloading the 22MB WASM blob and compiling it. It also grows the WASM heap to 160MB after running the simple Fibonacci example, and WASM heaps can't (yet) be shrunk, so the only way to reclaim any of that memory is to discard the whole instance and start over.

It's cool that it works, but not very practical.



Depends a little on whether you're going to a website to use an app, or running something on an always-on PC on, say, a production floor where the app never gets exited, I'd think.


If you're not targeting the web, what would be the point of running a Python runtime on top of a WASM runtime?

You could just run RustPython as a native binary, or use ol' reliable CPython.



As far as I'm aware, even discarding the instance isn't good enough, since v8 doesn't seem to reclaim the Wasm Linear Memory ever. I think the only thing you can do is start it in a worker and then terminate the entire worker.


I guess parts of this can be cached, so if you use the app more than once it will be faster (or at least it has the potential for it).


What do you envision these scripts to be doing that it would matter if it takes minutes, hours even, to start? Fire and forget.

Granted, the environmental cost of all that extra energy consumption may not be palpable. Then again, you're not exactly choosing Python in the first place if you care about the environment.



> this kind of approach could take _seconds_ to even start running

Only seconds? Today it takes 45 minutes to do the update by hand, but there's this handy 20-line Python script that does it instantly.

(Hypothetical example only, but it's not uncommon.)



>> The reality is that the "dark" majority of preexisting code has essentially no performance requirements/concerns; they're business scripts that could literally run on a toaster with no problem if you could get the code onto it.

Which means this whole thing is pointless from an end user point of view. The technology stack is getting very deep - Python, Rust interpreter, WASM, in a browser. I'd love to get back to running things on a toaster with no dependencies.



Technically this approach puts the burden on the build toolchain, the toaster only needs a WASM runtime which is actually not that big of an ask (it is far easier to put a standalone WASM runtime in a toaster than a full browser)


Use go for that :)


> can move existing "learned the hard way special cases" byzantine business code to something that can run on a web server and be accessed by the companies employees rather than passing around scripts for them to run

Or… you could just use Django. The framework built for running python on a web server.



That's totally different, at least out of the box. The use-case seems to be running user-generated scripts that aren't known in advance and can be added/edited/ran in a self-service manner.

The usual way to do this is get a Python interpreter, sandbox the hell out of it on your server, and then run the untrusted code but this obviates the need for security paranoia quite a bit since it's running in the user's environment.



Sounds like a use case for simple Python rather than RustPython?

If you go full RustPython, surely looking to eke out performance must be a major reason to even go there, rather than just scripting with CPython like it was 1999?



> essentially no performance requirements

In my experience this is very rarely actually true, even though people think it is.



> And this is on top of normal performance degradation from using a dynamically typed language

You already counted that, that's the Python interpreter (in this case RustPython) overhead.

---

The steps are (1) RustPython interpreter (2) Rust compiled to WASM (3) WASM runtime.

"Normally" you'd have (1) CPython interpreter (2) C compiled to machine code.

I'm not sure how much overhead WASM runtime really is...I'm curious, but I doubt this is awful compared to CPython.

---

FYI, for CPython as WASM, https://pyodide.org/.



I can't speak for rustpython, but you can partially evaluate dynamic languages in wasm with something like wizer https://github.com/bytecodealliance/wizer. So you can let the runtime do all the parsing and compiling ahead of time. We do this with javascript (quickjs). It does have a few downsides regarding memory size but it is pretty fast.


Ok wow, wizer is really really cool. I was thinking a few months back it would be great if you could dump out the jitted version of a wasm program so you could load it faster and... here it is, someone has built it.

That's wonderful.

Next up, wasm supercompiler...



> once by the wasm runtime to compile the rust-python wasm

I'm not sure what you mean by that. The runtime doesn't compile WASM, it simply executes it.

There are tools for dealing with this interpreter runtime overhead by pre-initializing the environment, like Wizer[0]. ComponentizeJS[1] uses it to pre-initialize the SpiderMonkey engine it packages to gain fast startup times (and you can then prune the initialization-only code with wasm-opt). As techniques like ComponentizeJS are also being applied for a specific set of interpreted files, you can even prune parts of the interpreter that would never be used for that specific program. If you want to go even further you could record specific execution profiles and optimize based on those.

[0]: https://github.com/bytecodealliance/wizer

[1]: https://github.com/bytecodealliance/ComponentizeJS



> The runtime doesn't compile WASM, it simply executes it.

That's not necessarily true. For example, Wasmtime uses Cranelift to compile the WASM binary into native code. https://docs.wasmtime.dev/contributing-architecture.html



You are right for the most part. I attended a talk about pyscript[1] (runs python in the browser using wasm which is similar) and there is a 2x performance hit.

[1] https://pyscript.net



PyScript has actually been massively refactored in the last few months and is much faster now. You can check the performance of PyScript running MicroPython here: https://laffra.github.io/ltk/

If you need/choose Pyodide, load time will of course be much more, but that's mostly because of the size of full core Python and the dependencies you might have.



> > "it can be compiled to WebAssembly in order to run [..] in the browser."

> and NodeJS

Wait... what? Why?



So you can run NodeJS code in the browser. Even though both are JS-based, NodeJS has a bunch of APIs that deal with things like file systems that the browser doesn't have.

For example, imagine you have a lib that converts markdown to html, but the lib happens to write the files directly to the disk, hence it can't be used in the browser. If you compile the nodejs runtime to wasm with a WASI that maps the file system to local storage, then you can just read the file from local storage after invoking the lib.

Just saying it is a technical possibility, this kind of approach is really only meant to be used if you _really_ just want to run some lib in the browser, no matter how slow it gets.

Also technically if you could compile JS to wasm (without nodejs, so more like how you run C through wasm right now) then you don't need to care about browser versions and JS api polyfills while still using JS.



Just putting my hand up to say that MicroPython is awesome (and runs on the RP2040). https://micropython.org


At a previous job I was checking out MicroPython due to its support on LEON4 RAD-hardened CPUs like GR740. It was appealing as a possible design path from proof-of-concept implementations with desktop python/numpy (etc) to space-certified platforms, ideally reducing the quantity of code to reimplement in C. https://essr.esa.int/project/micropython-for-leon-pre-qualif...


I agree. Incredible development environment for a huge number of MCUs. Super fast.


Being able to run REPL session directly on MCUs and interact with the hardware live with short development loop is such a great idea.


Your examples in particular have been excellent lately.


Appreciated.


I thought RustPython is what IronPython became after being abandoned for a while.


I feel smart because I got your very funny joke.

Silliness aside, I actually had almost the same thought when I was reading the post lol. Great minds think for themselves so I guess maybe we're only really good minds then. Ahh well....



I thought it was renamed after Oxide picked up the sponsorship.


Ha ha.

For those interested, Iron* is .NET.

IronPython

IronRuby

IronJS



Just wait until we get IronRust through a clr rustc backend :)




Related:

RustPython – A Python-3 (CPython >= 3.11.0) Interpreter written in Rust - https://news.ycombinator.com/item?id=35056586 - March 2023 (136 comments)

A full Python interpreter written in Rust - https://news.ycombinator.com/item?id=31086317 - April 2022 (7 comments)

Python interpreter written in rust reaches 10000 commits - https://news.ycombinator.com/item?id=29094323 - Nov 2021 (93 comments)

RustPython: A Python interpreter written in Rust - https://news.ycombinator.com/item?id=28280790 - Aug 2021 (49 comments)

RustPython – Python Written in Rust - https://news.ycombinator.com/item?id=27060802 - May 2021 (2 comments)

A Python interpreter rewritten in Rust, that can run pip - https://news.ycombinator.com/item?id=26030269 - Feb 2021 (48 comments)

A Python Interpreter Written in Rust - https://news.ycombinator.com/item?id=19064069 - Feb 2019 (194 comments)



I think it's really cool that Python has a healthy third-party interpreter community. PyPy, IronPython, Jython, and now RustPython expand Python's accessibility. It's very cool to see how many people are working on this, and I wonder what kind of problems with CPython this has exposed also.


I wonder if anyone actually uses those third-party interpreters for anything serious. I've never come across anyone that did.


PyPy is one of the most underrated Python implementations. It just makes your pure Python code 20x faster without needing to change anything (if you don't have C/C++ extensions; it depends). We have used PyPy in production, especially on real-time/asynchronous web APIs that don't need the machine learning stack.


When I wrote an asset processing script for a video game in Python, the initial run time was 14 minutes. I was not happy with it because it made the CI times long as hell, so my next step was to rewrite it in C++ and try to sprinkle it with some SIMD magic to speed it up. But out of curiosity, I ran it through PyPy - that brought down the run time from 14 minutes to just 18 seconds (!). Needless to say, I was happy enough with the result that I figured these 18 seconds were good enough to not make me waste my time rewriting the entire tool.


If the Python community and CPython developers were more open to PyPy, many things could have changed. People talk about how good it is for the Python community to have third-party interpreters, while the whole Python ecosystem heavily relies on the old (maybe good-enough) Python C API.

Whenever I hit a C API issue on the RustPython project (yes, I contribute to RustPython), I think about PyPy, check what's going on with the C API, and realize 15 years was not enough to change things. Now the momentum of PyPy is not as strong as before. I believe the Python community lost a huge chance.

I hope HPy or something can save PyPy. Maybe RustPython will also get a chance there.



CPython is finally getting a JIT, although this change in mindset was mostly due to Microsoft and Meta.

So much time lost by not embracing PyPy.



Absolutely

It is really underrated.

Maybe it's the extension issue, maybe it's something else or maybe it's the fact that it was born as an experimental platform more than anything

But it should have been more popular



Python's user base is now dominated by ML and scientific computing in general, isn't it? Both of those communities rely on extensions completely. I am quite possibly biased, as I haven't written any Python that didn't use at least numpy.


> Python's user base is now dominated by ML and scientific computing in general, isn't it?

No?

Python may dominate those fields vs other languages, but those fields don't dominate Python use, in my experience.



If you do a job search for "Django", "flask" or "fastapi" you'll see that there are a lot of web dev jobs using it.


I still use Python for scripting simple things and for writing web services. I use fastapi for simple things, but I will use Django for anything that needs a database because it has the best migration tooling that I have used.


I guess I could have stated my bias up front as well: mostly a scientific user/dev here. I did enable pypy wheel builds for a pybind11 project and have things magically work, and I've had 'pypy-days' to see if there was any breaking for me (none so far, so kudos!), but other than that never actually used it/found uses for it.


Pypy supports numpy just fine.


The release manager for PyPy posted "Ask HN: Is anyone using PyPy for real work?" last year and got many responses: https://news.ycombinator.com/item?id=36940871.


We used Jython very extensively at a former employer, a high frequency trading firm. Originally it was to be used as a configuration language, to allow traders to easily write scripts that configured trading strategies written in Java. The scripts grew into a monstrous ecosystem of applications and analytics.

Jython is stuck on Python 2 and doesn't have great interop with the rest of the Python ecosystem, so I wouldn't recommend it for a new project, but it was a big force multiplier in that use case.



there are JEP, JPype and PyJNIus these days...


Those are bridges to native CPython. There is also GraalPy (https://github.com/oracle/graalpython), which is a standalone implementation more similar to Jython. It seems already quite compatible with Python 3.

I have tested the latest version graalpy-community-23.1.2 as a regular Python interpreter from the command line (i.e., not through a Java program). It was able to run my standard library-only Python scripts, as well as a script that made async HTTP requests with the external library HTTPX. It couldn't run a TUI program that used the Python Prompt Toolkit (`AttributeError: module 'signal' has no attribute 'siginterrupt'`).

I am curious about your experience if you have used GraalPy more than I have.



wow oracle is really nervous about python and Microsoft's support for it, huh?

I just now learned about this project, thanks for pointing it out. I'll be curious how they go about shared data, especially shared str/String.

also I personally happen to depend on Windows support, I'll go popcorn on the future of this project...



The infamous MMO game EVE uses Stackless Python.. and the players certainly seem to take it very seriously.


why infamous?


I think it might be in reference to the semi-frequent heists that happen, that have real monetary value behind them. EVE players seem to be a special breed. Some of the stories are pretty incredible.

- https://www.pcgamer.com/inside-the-biggest-heist-in-eve-onli...

- https://www.mmorpg.com/news/an-eve-online-player-has-pulled-...

- https://www.escapistmagazine.com/the-infamous-eve-online-ban...



It’s been quite a few years since I worked on this, but spent some time implementing and administering an enterprise infrastructure auto-discovery tool (trying to remember the name - it was a very expensive product from HP, who acquired it from Mercury I think) that was built in Java and exposed a scripting layer to admins via Jython.

I haven’t seen much Jython since, but encountered a few enterprise tools doing this. It was an effective way of enabling advanced use cases and extensions of the existing Java stack without requiring the admin/developer to touch Java.



I've used pypy in anger a few times. I haven't needed to do reasonably performant webdev in python for awhile, but there was a time when PyPy and the cffi-based database drivers could not be beat by CPython interpreters for non-trivial code. This still might be the case, but I haven't profiled it at all (recently). This was back in python 2.7 days in the mid 2010s.


Oh yes! We use pypy for many batch jobs deployed in production.

We switched to pypy several years back and that allowed us to push a great deal more batch jobs through per window with zero code changes.



I think RustPython gets used in ruff (the linter), or at least I saw they were experimenting with that idea. Which is mainly what it's built for - running python code in rust - since it's still a fair bit slower than cpython.

Otherwise, for normal job running, cloud infrastructure makes this pretty tricky unless you're a big company. I work for a small company and we run python code on things like Azure Functions; until they support other python interpreters, we'd need a massive expected benefit to justify that sort of self-building of architecture.



At least PyPy sees real production use.


I have seen them used extensively, especially pypy & jython by big companies. I have seen ironpython twice, but I rarely venture into windows/.NET land.


Pypy is nice if you want more performance. From my experience it seems underused. Never seen jython and ironpython used - yet.


Pypy gives you at best a single-digit speedup, and requires carefully going through your dependencies and checking all your codepaths. That's rarely a good value proposition for a business - if you're growing fast enough that performance is an issue, you're going to need a bigger speedup than that pretty soon, and if you're taking the time to go through all your codepaths, it doesn't cost a lot more to make a bigger change like switching languages entirely.


Around 4-5 years ago, I saw a lot of IronPython used by data analysts using Spotfire[0]. I've seen Jython used to write an IDE on top of Eclipse.

I'm guessing almost nobody uses it for their SaaS, but outside of that, these runtimes do see some use.

[0] https://www.spotfire.com/



I worked on a project (15+ years ago) that used Jython to put a web front end on GDS systems (airline reservation mainframes). It was in production and being used in some of the largest travel call centers at the time. The whole thing was built in Jython.


I think eve online famously uses a third party interpreter. Stackpython maybe?


Ghidra uses Jython, not sure if that counts as serious.


No.. because the only reason to use python is the ecosystem of packages and that's painful enough on the official implementation.


Python has been my go-to for prototyping since 2003. Maybe I’ve just adjusted to its quirks, but I keep returning to it despite picking up another dozen languages since. I’ve encountered other people who hate it, but never understood why; what issues are you experiencing?


If you don't mind me chiming in, mostly looking for advice if you've got any. I tend to run into a lot of issues when trying to play with projects that use TensorFlow. I have an Apple Silicon laptop and I seem to always get stuck resolving circular, conflicting dependencies. The worst offender is https://github.com/magenta/magenta. It's such a cool project and I got it running once a few years ago but recently lost a few solid weekends trying to get it up and running again.


I will say that the most issues I’ve encountered with Python happened on macOS machines. The default installation was always old, and success using a modern version depended highly on the techniques used to install it.

For anything that involves dependencies, I rely on venv or pyenv to create a clean environment. When on macOS, I tended to use docker/containers as well, but primarily because 99% of my Python work has been in a Linux environment and I wanted like-for-like as much as possible. But a good version manager directly on macOS should help quite a bit.



> I will say that the most issues I’ve encountered with Python happened on macOS machines.

Same here. It's probably somewhat to do with my only partial MacOS/Homebrew knowledge, but every time I'd resurrect a Python project on MacOS, Homebrew would end up screwing up / confusing / munging the system Python and its own ones.

Never ran into problems like that on Linux (except for that one time I tried Linux homebrew). Just something about the way homebrew does stuff seems incompatible with me understanding it. Seems to be a "just me" thing though.



I just want to add that using the tool "asdf" for managing python versions on your Mac is very handy. And "direnv" is also great for auto-loading your environment when going into a project directory.


I honestly don’t think I’ve ever been able to git clone any python project (that isn’t hello world equivalent), follow the readme and have it just run first shot.

I’m always having to dive in and figure out which packages are missing, wrong version, etc.

As I do this I find myself wondering if the repo maintainers make a habit of actually trying to set up their project from scratch just by following their readmes.

I’m assuming they just get out of date and then leave it to the community to troubleshoot their own installs.

For whatever reason I don’t seem to have nearly the same difficulties trying to clone and run rust or node projects for example.

More generally, the fact that there are about 20 different ways to manage your venvs, and that you seemingly tend to accrete every version of python released in the last 20 years with no clear way of managing all those installations, makes it quite confusing for newbies.



Do you have any examples? I've been out of the Python game for _ages_, but can't you pretty much always just:

  git clone ...
  python -m venv venv
  source ./venv/bin/activate
  pip install -r requirements.txt
Then work on that repo? Everything after "venv" is too new for me, and I've ignored it and somehow not had any issues. If there are some packages that rely on C code and don't have wheels or whatever, you need to deal with the C ecosystem which is the real hell in my opinion.

All the autotools/cmake/scons, library paths, header paths, PKG_CONFIG, etc. I've had so many issues building C projects that I can't even begin to remember all the issues.



> can't you pretty much always just:

> ...

> pip install -r requirements.txt

No. Usually that will pick up newer versions of the project's transitive dependencies, which will have breaking changes (because these days breaking changes in minor versions is what all the cool kids are doing). Since it's Python you won't find out until you hit the wrong codepath while using the program and get a runtime error.



Curious what trouble you've had? The Python package ecosystem has always "just worked" for me. That's true regardless of the interpreter implementation I use. I generally test my projects against both CPython and Pypy.


I wish IronPython ever gets over the Python 3.x hump. It's been "dead" for a while. Every now and then rumors that some developer has started up on it again, but never seems to get to a release state.

To be fair, F# filled most of its niche after it "died" so it's not a need in any way, just a wish for nice things because there were a few years where IronPython was a very nice thing to have.



IronPython3 seems to be in pretty active development. Unfortunately, there are a lot of features that still need to be added.


I tried the latest django with pypy recently and couldn't get it working, which sucked.


Happy to see RustPython making awesome progress!

Note: I just realized that they mention wapm in their homepage. We phased out the WAPM CLI in favor of Wasmer (https://wasmer.io), so you can simply run in your shell:

  wasmer run rustpython
Or, if you want to try it using the Wasmer JS SDK:

  import { Wasmer } from "@wasmer/sdk";

  let rustpython = Wasmer.fromRegistry("rustpython");
  let instance = await rustpython.entrypoint.run({
    args: "-c \"print(1)\""
  });
  let output = await instance.wait();

Will send a PR soon so it can be updated!


Probably worth a mention this is the creator of wasmer, for anyone interested. (I was curious who 'We' was to see what companies have experience with multiple wasm runtimes and their learnings)


I'm not convinced that projects like this can really have broad application. The value of Python is interfacing to native libraries, but as soon as you use something like PyPy, you lose access to all of that. It's the same story with the performance-oriented forks of CPython.


As some popular libraries begin being authored in Rust (e.g. Polars), I'm wondering whether that could actually be exploited in this approach, by e.g. shipping the "native" libraries as WASM components that can then be called instead of the actual native libraries.


Given how the Python community has failed to converge on a good set of packaging and bundling tools, I highly doubt this will happen.

Perhaps this kind of thing needs to be baked into a language offering from the start?



Pretty much. The ecosystem brings more of the value than the programming language itself.


It's particularly acute with Python. The language itself is poor - the value is in the massive ecosystem (particularly around ML).

Compare this to say, Rust, where the safety guarantees are useful in their own right.

There would be value in Rust even with zero packages, but I couldn't say the same for Python.



i'm incredibly suspicious of anyone who compares a language with a repl to one without.

as for your 3rd point about packages, just try this: `grep -l '#!.*python' /usr/bin/*` and then run that same command piped into `xargs grep import`. i.e. measure how many applications on your system are importing anything besides the builtins/standard libraries shipped in the base interpreter.

for extra fun, try porting the matches above to Rust and see how fun that is without `use clap`/`use structopt` and `use envlogger`. i think you've got the last point completely backward, not that i think it's a particularly important measure of a language anyway.



The language is easy to get started in. A half-an-hour, two-page program is easier to write and read in Python than in almost any other language. And unfortunately most programmers evaluate programming languages by writing a half-an-hour, two-page program in them.


> The language itself is poor

The language is a joy and one of the main reasons it became popular. How do you think it got its ecosystem?



"worse is better", the fact that a great ecosystem has grown does not necessarily mean the language and its tools are great.

Personally, I find the lack of static types makes maintenance a nightmare, and think the build and deployment situation is miserable.



It became popular in the era of Java, Perl and object-oriented C++.


> Each of these implementations offer some benefits: Jython, for example, compiles Python 2 source code to Java byte code... IronPython is well-integrated with .NET, which means IronPython can use the .NET framework and Python 2 libraries or vice versa.

Python 2 is dead and that's why Jython and IronPython, who have failed to catch up with Python 3, are dead as well and are not worth mentioning, are they?



We've been using RustPython as the Python interpreter for our project Kybra, which is a Python environment for the Internet Computer Protocol (decentralized cloud, where all programs are automatically replicated across 13-40 nodes). Wasm is the runtime environment on ICP.

It's been working quite well, though lack of C extensions is a problem. We're hoping to move to CPython once the wasi and C extension support is there.

But the project works, compiles to wasm32-wasi, and can execute on the live ICP network: https://github.com/demergent-labs/kybra



What has your experience been with Pyodide? Why did you opt for RustPython instead?


Thank you for working with RustPython. I believe kybra made RustPython wasi support a lot more stable.


I was curious about how slow (or fast) it is compared to cpython. On fibonacci.py rustpython is about 11x slower than cpython.

    def fib(n):
        if n == 0 or n == 1:
            return 1
        return fib(n-1) + fib(n-2)

    print(fib(35))

    time python3 ~/code/fibs.py
    14930352

    ________________________________________________________
    Executed in    1.18 secs    fish           external
       usr time    1.14 secs  180.00 micros    1.14 secs
       sys time    0.01 secs  616.00 micros    0.01 secs

    time ./target/release/rustpython ~/code/fibs.py
    14930352

    ________________________________________________________
    Executed in   13.44 secs    fish           external
       usr time   13.32 secs  175.00 micros   13.32 secs
       sys time    0.02 secs  776.00 micros    0.02 secs


Using the .__jit__() method in your rustpython version might get you a speedup (I assume they intend for this to be automatic eventually, and it's only explicit right now while the feature is under construction).


How do extensions work with interpreters written in other languages? Does the interpreter still expose a C API?


PyPy exposes a subset of the CPython C API, which is good enough for many C extensions, and some others can be ported with little effort.

Most other Python implementations don't provide any way to use CPython C extensions.

There is also the HPy API and ABI, which some C extensions target and multiple Python implementations can compile or load.



There is a merge request up to add autogen rust bindings to hpy

https://github.com/hpyproject/hpy/pull/457



No.

…and this one is no exception -> https://github.com/RustPython/RustPython/issues/1940

Packages that rely on c dependencies like numpy, etc. only work if you write a custom implementation by hand; the “normal” package flat out doesn’t (and cannot) work.

So basically you get no packages that use native extensions, unless the project explicitly implements support for them.

Pypy is the only implementation I’m aware of that has implemented a c api that is mostly compatible (see https://doc.pypy.org/en/latest/faq.html#do-c-extension-modul...)

For eg. Python running in a .net host or a Java host… I think this kind of compatibility would just be flat out impossible.

For rust… hm… probably possible. For wasm? Definitely not.



Would there be a way to write a bridge between the Rust interpreters and C API where calls to existing extension API could work to/from the Rust interpreter?


IIRC, PyPy does this and the translation between the native and C API is the main reason it is slow.


Does it allow sandboxing, or is Lua still the only good option for that?

There are so many cases where you'd rather not give extension scripts unlimited access to the OS and file system, you'd think this option would be more common...



Depending on if it supports the various CPython APIs, it'd be non-trivial to sandbox it. On the other hand, if it doesn't expose a CPython API, is it really python?


The CPython API is the main source of overhead in FFI. It is why using numpy to multiply lots of small matrices is so much slower than one large one. This has prompted people to support the simpler and faster HPy API. https://docs.hpyproject.org/en/latest/overview.html


Note that Lua is not sandboxed by default. You need to tinker a bit with it or use Luau to actually make it sandboxed.


I’ve been going the other way, adding rust bindings for some performance-critical code, and calling it from Python. It’s worked wonders, speeding things up 20-30x in some places. This also has the benefit of compiling to WASM if I need to, so it’s a breeze to run my code in the browser.


not sure if you use pyo3 but it works wonders

(except for some minor parts as well as the python in rust cross compilation case (last I tried ~1y ago), the rust in python cross compilation is fine tho)



Yeah pyo3. It also plays nicely with rust’s numpy so for a lot of deep learning applications you can load a file in rust and build your data, then pass it to python ML libraries.

Only issue I’ve had so far are some weird issues with conda versions and maturin. But those were basically my fault.



I'm curious how large the WASM target is.

One of the things that turned me off of C#'s Blazor WASM was the payload size. I've found some of Rust's offerings like yew and leptos more interesting.



It's okay-ish in size, ~10 MB, I made a small playground with an older version a while back: https://playground.cobalt.rocks/interactive


Last blog entry being from "Dec 1, 2021" doesn't make it sound like there's much action.




Does seem to have died down since June of last year.


Last release (0.3.0) was on 2023-09 and last commit was 2 weeks ago


The GitHub repo is more relevant for “action”, no?


Our team tends to drill into code more than write about it. I know that's not a good strategy, though. Any idea what you'd expect to see on the blog?


talk about fun technical challenges and "things you wish you had known". endless appetite for that stuff


I've been using RustPython for my side project (scripting purposes) and it's been an amazing ride so far! (albeit with minimal docs, which is understandable at this stage). AFAIK, the parser has also been used in the awesome ruff project.


> RustPython is in a development phase and should not be used in production or a fault intolerant setting. Our current build supports only about half of the Python standard library.


For a dum dum like me who only dabbles in programming:

What would be a use case for this?



A big use case is Rust's web assembly support. There are lots of people who'd probably want to run python in the browser, but doing that at the moment is a little shaky. RustPython is maybe a good route for doing this (they give this in "reasons to use" but I can't say I've tried it)


RustyPython is more memorable and has a catchy flair; my $0.02


Rust y Python is what a Spanish speaker calls a project that uses both (:


RustyPy


How does their garbage collection work? Especially in Rust it would be cool to see a concurrent collector.


Wonder how difficult it would be to write something akin to PyO3 but with python syntax for writing rust programs, just like a subset


Cool. Can I use NumPy, SciPy, PyTorch, Shapely, ...?


What about safety then? Are all the parts of the Rust program scripted using Python unsafe?


Pretty cool, especially the potential for using python as a scripting language embedded in rust programs. That said, python makes me wince.


Working on a significant C++ code base (shipped as deb, rpm and windows binaries) and wanting to allow some means to customize business logic, we decided to integrate Python hooks a couple of years ago. In hindsight, I'd call that decision a mixed blessing.

The code-level integration (pybind11) was nice and easy; the issues came later. We found operational problems (multithreading), performance problems (in particular initialization) and, worst of all, memory leaks and crashes on Python VM teardown. Another problem was that Python has its own particular ideas about package management that generally don't rhyme with packages from the OS vendor.

I'm not sure what the angle is in this project, but I don't see a happy ending for projects that rely on both pip and cargo when it comes to maintenance and long-term support, except for source-level integration. That is, unless there is some well designed and well thought out strategy to combine the Rust and Python worlds that doesn't conflict with the RPM or apt view of doing things.



Rust bindings into the OS packages aren't nearly as common as in C++, so that could be a saving point. With MUSL builds there can easily be no shared dependencies at all, and with glibc builds it would just be glibc and a couple friends.


Yes, C++ definitely tends to rely more on binary dependencies and shared objects. Rust will probably have to as well, once frameworks and libraries reach a size and maturity where you want your OS vendor to take over responsibility (and compile time!). And note that even MUSL doesn't help create a package that works seamlessly on any Linux OS/release — maybe WASM will (or more accurately: WASI).

But I think the point GP is making is more about the dependencies Python adds to your C++ (or Rust). I.e.: build a RustPython project to produce an artifact that works on my machine, but relies on my Python interpreter and my installed packages (which may have C dependencies), as well as a Rust edition and cargo. How can I bundle this into a package for a different Linux? Preferably following the existing packages and lore of that distro, rather than some python and some rustup version?



> Preferably do it following the existing packages and lore of that distro, rather some python and some rustup version?

Exactly! I want customers to be able to install my software on stock Ubuntu 24 / Rocky 9 / .. and seamlessly integrate with existing stock Apache, libcurl, libssl without any recompilation etc.

Having a bunch of pip dependencies tag along makes this quite complicated.



Curious why your team didn't opt for Lua instead? It's the more typical choice for the use case you're describing.


Our pre-sales engineers, devops people were familiar with Python. It's also a popular language outside our company, though I'm sure some of our customers use Lua (e.g. in nginx tool).

Lua being designed specifically for embedding makes it a good choice.

I wonder, though whether we might have ended up with similar fights maintaining LuaRocks dependencies in rpm/apt ecosystem on various platforms as soon as those customizations start pulling in dependencies like XML parsers, crypto, etc.

Do you have thoughts on that? (Note, we target linux, but sell proprietary closed source binaries only, except for the customizations)



As a counterpoint, Blender uses python for non-trivial tasks quite successfully. Though...it is quite easy to get blender to crash because python references to C objects become invalidated or whatever.

I've spent tons of time tracking down python crashes from C extensions -- not how a company wants their devs spending their time -- and 99% of the time it's just getting the reference counts right so C and python are on the same page. Dependency management of C pointers gets tricky sometimes...

My usual workflow when I'm wrapping a C/C++ library (which I do for 'fun' quite often) is to generate a skeleton of the python module with pybindgen and then hand-tune it until it works. I could write a bunch of custom wrapper code (which I do use when it makes things easier but it gets thrown away after the skeleton is generated so doesn't have to be very robust) and just use the output from pybindgen but that takes a lot more work unless the library falls into pybindgen's happy path. Plus, as I'm doing it for fun, I don't mind spending time to prettify the generated code and add some py-sugar. And the way pybindgen generates docstrings isn't the best so those would have to get handwritten either way.

Admittedly, I just do this kind of stuff as a hobby and industry has different goals so "pinch of salt" and all that.



I think it's very naive for a programming language to make you "wince", tools are tools.


I've used all kinds of hand tools and cooking utensils that were terrible and made me wince or experience other discomfort as I used them


It's completely fair to argue that a language/tool is bad at what it is made to do, but this needs to be separated out from tool choice.

A baking spatula makes a bad cooking spatula. It's pliable and can easily scrape the curved surface of a bowl, but for flipping a fried egg it is hopeless - in a pinch it will work - but it would be nice to have something rigid.

Further, describing it as making you "wince" is a recipe for a flame war, bordering on deliberate baiting. Just describe your issues with the language in more than a single charged word. You don't have to write an essay, just "I don't like the package management environment". That way people can have a constructive discussion.



> Tools are tools

Are you trying to say that all tools are the same? There are no bad tools?

Why is it naive for a painful tool to make you wince?



Have you ever used pip?


Agree, would like to know more context


Because you have all the power to change the tool or invent a new one.


Python is a great language to Just Get Things Done™.

Rust is great if you like puzzles and want to spend your precious time solving the same memory management riddle over and over again. You do get faster and often more robust code though, although in practice the difference is often not meaningful and the extra time invested doesn't pay off.



Sometimes, Things just can't be Done with Python. E.g., you're parsing dozens of terabytes of data, or need response times within a microsecond.

So, you're left with a choice of C++ or Rust (what else - C#, possibly Zig and a few others).

Rust stops being a "puzzle" once you've written enough of it and you just know how to do things. It has one major disadvantage though - returning back to write in languages like C++/Python makes you cringe because they don't have basic language tools like proper sum types or traits that get things done.



Turns out 99% of the time you don't need to parse terabytes or have μs-response times. Meaning you should be programming just about 99% of your time in Python.

The Rust memory management puzzles are not really complicated. They just get in the way.

> they don't have basic language tools like proper sum types or traits that get things done

Sum types and traits are not getting things done.



> 99% of the time

99% of whose time? Where does this estimate come from?

Definitely not my case, and definitely not the case for lots of programmers I know and/or work with. Python has its own place (for example, in DS/ML) and while I use it myself on daily basis for what's it's best at, statements like "you should be programming just about 99% of your time in Python" make no sense at best.

And, note, it's not only about speed/performance; after writing lots of Rust (or even F#), it's hard to get back to Python because of missing expressiveness and basic language tooling.

Finally, speaking of Python (CPython in particular), part of why it's so attractive is all the fast binary extensions libraries written for it... in C++/Rust, not Python.

> Sum types and traits are not getting things done.

Disagreed, sum types are exactly what gets things done, regardless of language performance and other features. This kind of logic exists everywhere. That's why languages with proper (or even semi-proper) sum types like Rust, F# are a lot easier to express 'business logic' in. And e.g. that's exactly why serialization/deserialization can get so weird in Python where values can be of multiple types and you have to invent crutches to work around that, to simulate sum types in this way or another.



Since 3.10, Python has a "semi-proper sum type tool belt" with PEP 634's pattern matching, typing.Union and the | operator. It requires a different approach, and it still isn't as seamless and doesn't enforce type safety as strictly at compile time as languages like Rust or F#, but it's doable and usable.


> Meaning you should be programming just about 99% of your time in Python.

No, not me, because I’m less productive in Python than in Rust, assuming the same level of final product quality, even excluding runtime performance.

> Turns out 99% of the time you don't need to parse terabytes or have μs-response times.

Performance is not the primary reason to use Rust. It has way more to offer than performance.



> Performance is not the primary reason to use Rust. It has way more to offer than performance.

So much this. I've been doing mostly Scala (and some Haskell) for the last 10 years, and now I'm really enjoying Rust precisely because of its language features. Performance is just a (really) nice bonus.

Other bonuses include simple deployments (single binaries versus massive JVM and fat jars), low memory footprint, simple builds that just work (Cargo is an absolute joy), great concurrency, great ecosystem, etc. etc.



I really, really get interested in rust sometimes and then I read comments like yours and remember the warnings about premature optimization and put it off again.


Well, it shouldn't be a surprise to anyone that there's many use cases out there where you just have no other choice but to use a lower-level language (where, unfortunately or not, the common tool of choice these days is still C/C++) since the application/library is either dealing with too much data or has to be very responsive, or both. Since this thread is about Python, some good well-known examples would be libraries like numpy/pandas; but there's also some like polars that are written in Rust.

Re: your point, I'd still recommend learning Rust even if it's just solving last year's AoC, to see how various data structures, enums, traits, iterators etc work. Even if you don't end up using Rust on daily basis, it will probably affect the way you structure non-Rust code and provide you with a few neat design ideas.



Don't let catchy maxims discourage you from following your curiosity.


Why does it make you wince?


From a non-python dev perspective: I always struggle with dependencies and versions. I have a script in front of me that I want to run, and am often just frustratingly brute-forcing commands to make it work.

Do I:

python? python3? pip install? pip3 install? python pip install? python pip3 install? python3 pip install? python3 pip3 install?

And then everyone mentions "oh just use venv" or "conda" or docker or... It just never ends or it never clearly explains what to do. And all I want to do is run a relatively simple script. Or course I would use a docker image to run a more complex app. But it's frustratingly hard to manage for people who don't use it often.

I'm from a java/Kotlin world where things aren't perfect either. But at least dependencies are clearly defined, and the only thing you need is Gradle that updates itself through a wrapper script if needed and pulls all dependencies. And a JVM/JDK which is also configured/installed quite clearly in a specific location.



Part of the problem is that this used to require third-party tools, which gave rise to lots of different tools, but nowadays everything you need is included in Python itself.

The simplest way (in the sense of having the fewest components required) is this, using only built-in tools:

  $ python3 -m venv --upgrade-deps my-virtual-environment
  $ my-virtual-environment/bin/pip install whatever-third-party-package
  $ my-virtual-environment/bin/python3 my-script.py
This creates an isolated virtual environment in "my-virtual-environment". Use the pip and python3 binaries inside that directory to install packages and run scripts. Done. (The "activate" step suggested in other comments is just a convenience for setting your PATH to refer to the virtual environment implicitly, so that's completely optional.)

Note that Debian's python3 package leaves out some parts of the standard Python distribution, but you can install the python3-full package for a complete installation.



This is a fairly good answer but on windows, there’s no python3, so that’s your first command being broken.


Right, on Windows the recipe needs a few alterations but is essentially the same. This should work:

  > py -3 -m venv --upgrade-deps my-virtual-environment
  > my-virtual-environment\Scripts\pip install whatever-third-party-package
  > my-virtual-environment\Scripts\python my-script.py
This assumes that the Python launcher (py.exe) was installed when installing Python, which it is by default.


Rye[1] is an all-in-one manager for python projects, including the python versions, virtualenv, pip etc... It separates tool deps from app deps. It's all configured through a pyproject.toml config file.

It's still new but works well. I'm transitioning to it from an unholy mess of pyenv, pip installs and other manual hacks.

If you're starting a new python project that is more than just a straightforward script I'd use Rye from the get go.

[1]https://rye-up.com/



I feel you. python itself has its warts but they're mostly minor. The dependency hell though.. god damn it confuses me every time. I don't understand why they can't get it together with one consistent dependency system with good ergonomics. ruby does it with gem. elixir handles things reasonably well with mix. rust does it REALLY well with cargo. hell even ocaml has opam.

but python? it's split between conda, pip, pip3, virtualenv and who knows what else. it's hard to make something that I know will just work on everything.



To highlight another dynamically typed, interpreted language ecosystem, npm has been providing a remarkably smooth and uniformly adopted service for the JavaScript community.


Yeah, it's definitely painful. Over the years, I've probably spent in excess of 100 hours on understanding pip, venv, PYTHONPATH, poetry/pdm, pipfile, pyenv, ... still barely understand how it all works together.

https://peps.python.org/pep-0723/ might make life much simpler for single-script applications.



for simple scripts, this is the easiest and works.

  python3 -m venv venv
  source venv/bin/activate
  pip install whatever
  echo "source path/to/venv/bin/activate && python3 path/to/app.py" > myscript
  mv myscript /usr/local/bin
and just use myscript. I use this for a lot of little scripts


I haven't tried this myself, but you could probably save yourself the wrapper script and execute your script directly by adding a shebang pointing to the Python executable from your virtual environment.

  printf '%s\n%s\n' "#!path/to/venv/bin/python3" "$(cat path/to/app.py)" > /usr/local/bin/myscript



Why not just

path/to/venv/bin/python path/to/app.py?

Neat trick though, gonna adopt it!



Thankfully, most distros/OSs don't ship with Python 2 anymore, so `python` will just link to `python3`.


You seriously consider Gradle - a kitchen sink and a monstrosity scripted in obscure, extremely dynamic language - a better solution just because it has a... wrapper script?!

With Gradle, where do you define dependencies? All with version numbers in the `dependencies` block? In a separate file, declaring a hash map of versions and then a hash map of dependencies, and then putting the latter into `ext`, which magically propagates to just about everywhere in the build silently? In a separate file using some plugin that tries to replicate the lockfile functionality baked into most other dependency systems? In a separate TOML file, using the Catalog?

...and that's just about dependencies, I could go on for days about other Gradle features...

It's not any better than Python. You're just used to it. The same complexity is there in both Python and Kotlin, you just learned to cope with one better than the other.

(Gradle with Kotlin DSL and good IDE support is a little better, mostly because Kotlin devs are not as afraid of touching the build as when it's done in Groovy. It's still 2 orders of magnitude more complex than whatever you use for Python - unless you use zc.buildout or something like that.)



Personally, I wince at Python because, while I view it as a quick-and-dirty scripting language, some people write production code in Python. And here we are ten years later, stuck with code that reads a few metrics from the system, packages them into JSON, and sends them to an API for monitoring. The code takes 7 seconds at 100% CPU to resolve the imports, every time it runs.


You can write bad code in any language, I fail to see how that's Python's problem.


Installing python software is hard. Golang is much easier for an end user.


Go is what I reach for when I might otherwise write Python but need to actually, like, distribute my program.


I was recently looking for some ways to make a rust project dynamically configurable. Yaml/toml/etc are too static or are terrible for describing logic (yet we do it all the time for CI, infra etc, ugh).

WASM would be an option, but overengineered for my case. Ruby (through Artichoke) or Python (through RustPython) would work, but they come with the downside of introducing Ruby or Python. I haven't decided yet, but would prefer Lua for its simplicity. Or JavaScript (a subset) for how easily it can be limited in scope.

Are there any (example) rust programs out there that have embedded scripting, or runtime plugins or addons in a scripting language?



There's a bunch of scripting languages written in rust that target "embed-in-rust", for example

https://crates.io/crates/rhai

https://crates.io/crates/dyon

https://crates.io/crates/duckscript

You can use the above links to browse crate "dependents", that is other crates using these dependencies if you are looking for example usage.
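
As a flavor of what embedding one of these looks like, here is a minimal Rhai sketch (based on its published quick-start; verify the exact API against current docs):

  use rhai::Engine;

  fn main() -> Result<(), Box<rhai::EvalAltResult>> {
      // The host owns the engine and decides which functions and types
      // are exposed to scripts, which is what makes this easy to sandbox.
      let engine = Engine::new();

      // Evaluate a configuration-style expression and get a typed result back.
      let answer: i64 = engine.eval("40 + 2")?;
      println!("answer = {answer}");
      Ok(())
  }

The same host-side pattern applies to the other crates above; they differ mainly in script syntax and how much of the host you choose to expose.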



Your requirements are very close to Starlark - an embeddable programming language intended for configuration, with an available Rust implementation.

https://docs.rs/crate/starlark/latest



For JavaScript, all the underlying work used for Deno is pretty much done. I'm not sure if there's a more complete example.

Though Lua probably has lots of good and feature rich examples as well.

Can't speak at all to Ruby or Python integrations.

In the end, I'd suggest it depends on your audience and what existing skills or biases they're likely to bring to the table.

My one warning is that I've used software that integrated JS and it hasn't aged well. From the Adobe products with ExtendScript to other, older and incompatible JS engines before ES5 even, don't get locked into something difficult to update and keep current.





There's Rhai - https://rhai.rs/




why not lua?


> That said, python makes me wince.

O lord of the compiler, lend us thine wisdom.



>RustPython can be embedded into Rust programs to use Python as a scripting language for your application

How big are my binaries going to be if I embed a whole Python interpreter in it?



Hello world in pure Rust: 400832 bytes

Hello world w/ rustpython interpreting Python[1]: 15459264 bytes

(Rust 1.75.0; rustpython 0.3.0; MacOS / Apple Silicon)

[1] https://github.com/RustPython/RustPython/blob/main/examples/...
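
For anyone curious what the embedding itself looks like, here is a rough sketch modelled on the project's embedding examples (module paths and method signatures vary between rustpython-vm versions, so treat this as illustrative rather than copy-paste ready):

  use rustpython_vm as vm;

  fn main() -> vm::PyResult<()> {
      // Start an interpreter (here without the full stdlib) and run a snippet in it.
      vm::Interpreter::without_stdlib(Default::default()).enter(|vm| {
          let scope = vm.new_scope_with_builtins();
          let source = r#"print("hello from embedded Python")"#;
          let code_obj = vm
              .compile(source, vm::compiler::Mode::Exec, "<embedded>".to_owned())
              .expect("failed to compile embedded source");
          vm.run_code_obj(code_obj, scope)?;
          Ok(())
      })
  }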



Huh, that seems extremely reasonable even if it's not a finished product yet.


That's cool, but now for REAL bare-metal performance someone should rewrite Python in an even lower-level language. That's right, Python written in C is gonna be hella fast! Oh, wait…


Yeah, how crazy is it that a python interpreter written in python is so much faster than the one written in C? My guess is that CPython is constrained by not breaking the API, whereas PyPy did break it, and that is probably why PyPy hasn't gotten more use. That makes me think any python interpreter that fully supports the CPython API will be slow.


Rust can be even faster than C in some cases, so it's not self-explanatory.

The Rust compiler can sometimes make better performance optimizations because there are more guarantees that the code works as it is supposed to.

You can bypass these with C of course, but you need more manual (and less safe) operations.
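
One concrete example of the kind of guarantee meant here (a sketch; whether the optimizer actually exploits it depends on the compiler and version): Rust's reference rules say a &mut parameter cannot alias a & parameter, information a C compiler only gets if the programmer writes restrict by hand.

  // `dst` is a unique (&mut) reference and `scale` a shared one, so Rust
  // guarantees they cannot alias; the optimizer is free to keep `*scale`
  // in a register for the whole loop instead of reloading it each iteration.
  fn scale_all(dst: &mut [f64], scale: &f64) {
      for x in dst.iter_mut() {
          *x *= *scale;
      }
  }

  fn main() {
      let mut values = vec![1.0, 2.0, 3.0];
      let factor = 2.5;
      scale_all(&mut values, &factor);
      println!("{values:?}");
  }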



Also, as with C++, having a more expressive language means that while you could have done it in C, in practice you won't, because it sucks to do all this extra labour, whereas in Rust it's fine because the machine did all the hard work.

Monomorphization is an example where that happens, in C if we're sorting and de-duplicating Geese, Customers and BugReports, we're either writing three separate functions dedup_geese, dedup_customers and dedup_bug_reports, or we're using function pointers and we incur the function call overhead when our functions get called. Ouch.

In Rust (or C++) the monomorphization step is going to turn sort & dedup for Geese, Customers and BugReports into separate functions†, and yet we only wrote the code once.

To some extent you can try to mimic this in C via the "X macro" strategy, but now you're not even just writing C any more, you're writing the macros and maybe running them through pre-processing and trying to understand if the result does what you meant, it's a pretty horrible way to work, so again you're discouraged from doing it.

† However the compiler may spot that actually the machine code implementation for say, BugReport and Goose is identical and so it only emits one in the eventual binary with the other just aliased - which may confuse a debugger and thus a human trying to debug it.
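
A minimal sketch of the monomorphization point: the generic is written once and instantiated per type (Goose and Customer are placeholder types standing in for the examples above):

  #[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
  struct Goose { ring_id: u32 }

  #[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
  struct Customer { id: u64 }

  // Written once; the compiler emits a specialized copy of this function for
  // each concrete T the program uses, with no function-pointer indirection.
  fn sort_dedup<T: Ord>(items: &mut Vec<T>) {
      items.sort();
      items.dedup();
  }

  fn main() {
      let mut geese = vec![Goose { ring_id: 2 }, Goose { ring_id: 2 }, Goose { ring_id: 1 }];
      sort_dedup(&mut geese); // instantiates sort_dedup::<Goose>

      let mut customers = vec![Customer { id: 7 }, Customer { id: 7 }];
      sort_dedup(&mut customers); // instantiates sort_dedup::<Customer>

      println!("{geese:?} {customers:?}");
  }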



python is already written in plain and simple C; it is sad to make it depend on such a complex language as rust (no less bad than c++).

Actually, I would not mind a python interpreter in rv64 assembly (near 0 SDK) instead.



please, just stop with Python already


We can't do that as Python is the programming language that meets business needs the best and they pay the bills.








