Thanks for sharing. However, after watching this demo, I am not convinced. They deliberately picked an algorithm (iteration over lists) that is well known to be slow in Python as the benchmark. They did not compare performance against numpy or Numba. Moreover, given that they use LLVM under the hood, I expect the boost to be comparable to existing solutions like Numba's JIT, which also leverages LLVM. They also mentioned training DL models but did not show any backward-pass (gradient) performance numbers vs PyTorch or JAX.
@howardjeremyp - this was my thought too. The base case here is kind of a straw man - no one needing performance from a matrix-like data structure is going to use vanilla Python. What do the benchmarks look like compared to what practitioners are realistically using (numpy, pandas, or libs that leverage LLVM)? I'd be curious to get a better understanding of the unique value add (fusion, et al) and a more apples-to-apples performance comparison.
@brandoconnor I think the value here is that it's going to be Python all the way down. With any of the tools you mentioned, like numpy or pandas, if you try to modify anything or look any deeper, you hit a wall and have to switch to another language.
This is what I was thinking. I only have a little exposure to this, but one of the courses I took basically said to always avoid direct loops and use numpy for all matrix calculations. Thanks for clarifying. This still seems to make it easier for people to modify algorithms, handle memory and memory alignment, and work with lower-level paradigms. But the comparison wasn't exactly fair.
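That course advice can be illustrated in a few lines. A minimal sketch, assuming numpy is installed; the array size is made up for illustration:

```python
import numpy as np

a = np.random.rand(100_000)
b = np.random.rand(100_000)

# Slow: an explicit Python loop, one interpreted iteration per element
slow = [a[i] * b[i] for i in range(len(a))]

# Fast: the same elementwise product, done in numpy's compiled C internals
fast = a * b

assert np.allclose(slow, fast)  # identical results, very different speed
```

The results match; only where the loop runs (interpreter vs compiled code) differs, which is exactly the gap the demo's benchmark leans on.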
From the FAQ: "We don’t have an established plan for open-sourcing yet." I may get interested once they have open-sourced it, and definitely not sooner.
@Srikanta Prasad SV I see your point, but wouldn't you agree it would at least benefit users by being a more general-purpose programming language than Python?
Mojo looks very promising, and I can't wait until it reaches a level of maturity where I can put its power into production. Making it a superset of Python was an excellent design decision, as was taking the best ideas from Rust, Swift, C, and Haskell. Very exciting! Chris Lattner is an absolute genius; I've been a huge fan of LLVM since it became the default in FreeBSD. Suffice to say, Mojo on paper satisfies all my requirements for the language I've always wanted but that didn't exist.
This is absolutely insane. I honestly can't believe this exists, for some reason 😂 I can't wait to be using it! This looks like it'll be useful for anything Python can be used for, not just data science.
You don't use GPU drivers? You don't use any proprietary software at all? Even your mouse and printer drivers are open source? In other words: you're talking bullshit.
This looks incredible. I have 2 questions: 1. Can Mojo be used as a drop-in replacement on any Python script to improve performance (where possible), and not just for machine-learning-related functions? 2. Will the autotuner have context about the resource split, like running inside a 4-core container on an 8-core CPU, so that it leaves the other 4 cores to be utilized by other programs?
So I know nothing about Mojo but can still answer your 2nd question: software running in a 4-core container only sees 4 cores, so it will tune for 4. Usually software also lets you do this without a container; I see no reason why Mojo couldn't support an X_cores compile-time or runtime flag.
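For what it's worth, here's roughly how a program can see its core budget from inside a container on Linux, sketched in Python. This is a simplification: `sched_getaffinity` reflects CPU pinning, not cgroup CPU quotas, so it won't catch every kind of limit.

```python
import os

# Total logical CPUs the OS reports (the host's 8 in the example above)
total = os.cpu_count()

# CPUs this process is actually allowed to run on (the container's 4).
# sched_getaffinity is Linux-only, so fall back to cpu_count() elsewhere.
try:
    usable = len(os.sched_getaffinity(0))
except AttributeError:
    usable = total

print(f"host reports {total} CPUs, {usable} usable by this process")
```

A tuner reading `usable` instead of `total` would do exactly what the question asks: tune for the container's 4 cores and leave the rest alone.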
@abhishekjm8611 Wait, did Go figure out how to use 8 cores from a 4-core container? ;) There is most likely a setting for this, but I don't know Go, and Google didn't help me within a few seconds.
@abhishekjm8611 Ah, ChatGPT-4 to the rescue: runtime.GOMAXPROCS(4). It seems to be something that needs to be done in the code, so proper Go tools will expose it as a parameter, I'd say (or use numCPUs - 1/2 to allow interactivity).
Up to 4:05, everything could have been done with Numba. I also don't see a reason why the "plain" code on numpy arrays, if given type annotations for "numpy array", would have to be as slow as a couple of megaflop/s. I do like that it goes *a little* further than that... but Numba could easily pick up those tricks too.
This is very cool, but I think I will need a good project to learn to use it. Do you have any plans to use this in future fastai courses/libraries?
I would like to see the performance of PyTorch's matmul vs. Mojo's; tbh I'm not entirely convinced that it's faster than using stuff made in C, C++, and Rust for Python. Big if true, though.
Can Mojo code be compiled into a relatively small self-contained executable? If I remember correctly, that was the main limitation of the Julia programming language, according to Jeremy himself. Does Mojo solve that issue?
That's actually one of the worst things about Python: the fact that you have to use a 3rd-party tool to generate an exe that will run on all systems (and it doesn't always work). Other things that frustrate the hell out of me with Python are the project structure and deployment. They're tedious and quite frustrating for no particular reason. Hopefully they can resolve these things with Mojo, but I'm not going to hold my breath :)
No, it does not. Its performance benefits are minor compared to any language that isn't Python or similar (~4000x speedup over CPython, using most tricks in the book, is hardly impressive). It does not help with creating scalable, reliable software, as at its core it shares the same pitfalls as Python. It will also not make your existing Python code magically faster unless you're programming in the superset; the Python ecosystem won't migrate to Mojo overnight. This is just a typical Chris Lattner move. He didn't bother contributing to an already-existing mature platform, because it's not his creation. He'll get a lot of cash to flow into Mojo (from these misleading hype advertisements), only for it to end up like swift4tf in a few years, and then we'd all still be using the sorry excuse of a language that is Python for the groundbreaking tech of this era! (Also, seriously, do they have any language designers/PL theorists on their team? Smacking features from everywhere else on top of Python does not make a good language.)
@balen7555 As far as what they have shown, you can take your Python script to Mojo and it will be 300x faster without doing anything. And contribute what to Python? Python won't ever get rid of the global interpreter lock. You won't ever get a merge this big, one that completely changes Python's semantics, into the Python codebase. The only way something like this works is as a separate project.
@erenbalatkan5945 Your Python script is not the Python ecosystem. If you use X packages, you need to get X packages to compile with Mojo. Even then, copy-pasting verbatim will only yield ~8x performance increases. You need to rewrite with static types for more performance (and even then, it's hardly as performant as even JS, which is dynamically typed...). Who said anything about contributing to Python? Stop investing in that garbage; other solutions exist that need money.
It would be really cool to see if it’s possible to make a game with decent performance in Mojo because Python is not an option for any form of complex game development.
Huge respect, but who writes matmul in Python by hand? All the low-level stuff is delegated to C anyway. Is Mojo targeting the low-level stuff, which affects 1% of people? How does it compare to C, then?
The point is that you no longer need C or C++ for performance. So if you implement some new algorithm that is not provided by numpy, scipy, etc., you could write it in plain Mojo and combine the flexibility and convenience of Python with the performance of C/C++ in one unified language. It's all about unification: having one complete language instead of jumping between Python and C/C++ and then interfacing between them.
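To make "matmul in Python by hand" concrete, here's the kind of kernel the demo benchmarks, sketched in Python and checked against numpy's compiled matmul. The sizes are arbitrary, and this is only a naive reference version, not what the demo's optimized Mojo code looks like:

```python
import numpy as np

def matmul_naive(A, B):
    """Triple-loop matmul: the 'by hand' version that is slow in Python."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            for p in range(k):
                C[i, j] += A[i, p] * B[p, j]
    return C

A = np.random.rand(32, 32)
B = np.random.rand(32, 32)
assert np.allclose(matmul_naive(A, B), A @ B)  # same answer as the compiled path
```

Today the fast path (`A @ B`) only exists because someone wrote it in C/Fortran; the pitch above is that the same kernel could live in one Mojo codebase instead.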
@michal3141 That is correct, thank you. After watching Lex's episode with Chris L., I have much more context, and I agree that the future looks very exciting ;-)
Because you don't get the benefits of fusion, amongst many other things. That's why PyTorch, JAX, and TensorFlow all use separate compilers nowadays and can only support a subset of Python.
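A rough illustration of what fusion is about, in numpy terms. This only shows temporary elimination via `out=`; a real fusing compiler (like the ones behind PyTorch/JAX/TensorFlow) goes further and emits a single loop doing both operations in one pass over memory:

```python
import numpy as np

a, b, c = (np.random.rand(100_000) for _ in range(3))

# Unfused: two separate passes, with a throwaway temporary array
unfused = a * b + c          # tmp = a * b is allocated, then tmp + c

# Closer to fused: reuse one buffer, so no extra temporary is allocated
fused = np.multiply(a, b)
np.add(fused, c, out=fused)

assert np.allclose(unfused, fused)  # same values either way
```

In vanilla Python there's no way for the interpreter to merge the two ops into one loop automatically, which is why those frameworks restrict themselves to a compilable subset of the language.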
Another point is that those external libraries can now be implemented with all the SIMD tricks in a Python-like language instead of using C++/Fortran/Cython/... (like numpy did).
Importing certain external libraries, such as numpy, into Mojo appears inelegant. I'm quite interested in understanding how they incorporate other deep learning frameworks. It claims to be a superset of Python, but its syntax reminds me more of Golang, especially the usage of struct, etc.
Mojo will be a "strict superset" of Python, so anything you can do in Python will be doable in Mojo. However, there will be things you can do in Mojo that you *won't* be able to do in Python (eg, memory allocation and playing with pointers).

Depending on your goals, there are a couple of routes that might work. Goal A: build some project that's of personal interest to you. Goal B: learn the fundamental concepts behind how computers work.

If you lean towards Goal A, it's probably better to start with Python first; later, you can learn the extra functionality that Mojo offers. Python has less stuff to worry about.

On the other hand, if you want to learn foundational concepts of computers for Goal B (how memory works, how to manage it manually, etc), then becoming aware of lower-level abstractions (pointers, the stack, the heap, etc) should happen as quickly as possible. Mojo should be able to help with this (but I'm not 100% sure, since it's not out yet). The fact that Python doesn't have pointers (but Mojo does) makes me think Mojo might be better if you're trying to understand computers.

Here's an analogy related to cars. [Goal A] Do you just want to learn enough about the car to drive it from A to B? Then get an automatic car (ie, install Python on your computer), take some driver's ed courses (ie, learn Python with some beginner course), and take your road trip from A to B (ie, build your personal project). [Goal B] Do you want to understand why cars (ie, computers) work and how to tinker with your own to make it go fast? Then get a stick-shift car (ie, install Mojo on your computer), take some auto shop/mechanics class (ie, take a Mojo course that focuses on the algorithmic and hardware side of things), and start tinkering to see what happens (using benchmarking and debugging tools to see how stuff works).

Since Mojo will be new, Goal B may have limited resources for a little while.
But most people who use cars don't need (or want) to pursue Goal B
Imagine if we could train neural networks 10-50x faster just by adapting the language used to define them. You get compounding returns from wide adoption plus faster execution and a faster refinement/tuning iteration loop.
I don't think so. AI/ML libraries for Python already use Rust/C/C++ under the hood, with the same or better optimizations than demonstrated here. But the key advantage is that you can write AI code in pure Mojo and expect it to run as fast as compiled programming languages.
@hkanything I like Cython, but it is like a parasite on C (it depends on another language to survive). Mojo is a standalone solution created from scratch and not bound by C's limitations.
Speedups in the thousands make something that needs ~1 hour in Python take ~1 second in Mojo. Crazy. *Edit:* It never ceases to amaze me how much cynicism there is in the tech world. I never said that all code gets this speedup; I only put the speedup for the showcased code into perspective. Here's the deal: if it can do this for *some* code, then it is a promising and useful tool with far more applications than e.g. Numba's JIT. There's always a silver lining.
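The arithmetic behind that hour-to-second framing, taking the ~4000x figure quoted elsewhere in the thread:

```python
python_seconds = 60 * 60                    # ~1 hour in plain Python
speedup = 4000                              # the demo's best-case factor
mojo_seconds = python_seconds / speedup
print(mojo_seconds)                         # 0.9, i.e. roughly one second
```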
@andreashon Sure, but the fact that we have here a Python-like programming interface with parallelised C-like speed obviously extends to other applications. You're not going to tell me that it's a 4000x speedup for matrix multiplication only, and that otherwise it's an entirely useless language that runs at 1x Python. We can extrapolate.
@Mew__ I switched to Julia when I could not optimize my numpy+numba code any further. It has a mature community, tons of packages, and several cool syntax choices.
I understand that they want to be able to reuse Python libs, but superset languages add unnecessary complexity with mixed syntax, style, and version compatibility. I kind of wish they had just created a new language that's simple (like Python or Swift) and doesn't overlap another language, plus maybe a tool that lets you transpile a Python lib to Mojo and then import it. I just don't like the feel of two conceptual language constructs mixed together, which is one of the reasons I find TypeScript completely pointless.