Rust Belt Rust is a conference about the Rust programming language held in the Rust Belt region of the US.
Rust Belt Rust is for people at any level of Rust experience; you're welcome even if you're just interested in Rust!
The conference consists of a day of interactive workshops followed by a day of 30-minute talks on a variety of Rust-related topics.
Why is the conference taking place in the Rust Belt? For the pun opportunities, of course! We also enjoy the chance to show off our region: there are more people doing interesting things with technology here than you might have guessed!
Finally grasped the mechanical steps I need to pay attention to when dealing with the borrow checker in today's Rust code! This needs to be documented in the Rust Book the way it was explained here; the best explanation of the borrow checker yet!
I don't get it. How does a typical OO background make traits hard to understand? Also, for a "deep dive", this doesn't dive very deep. I was hoping to learn the differences between using generics on a trait and having a type field.
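Not from the talk, but here's a minimal sketch of the distinction the question seems to be after, assuming "type field" means an associated type: a generic parameter on the trait allows many impls per type, while an associated type pins down exactly one. (`Into2`, `Convert`, and `Celsius` are made-up names.)

```rust
// Generic parameter: one type may implement the trait for many `T`.
trait Into2<T> {
    fn into2(self) -> T;
}

struct Celsius(f64);

impl Into2<f64> for Celsius {
    fn into2(self) -> f64 { self.0 }
}

impl Into2<String> for Celsius {
    fn into2(self) -> String { format!("{}°C", self.0) }
}

// Associated type: exactly one `Output` per implementing type.
trait Convert {
    type Output;
    fn convert(self) -> Self::Output;
}

impl Convert for Celsius {
    type Output = f64;
    fn convert(self) -> f64 { self.0 * 9.0 / 5.0 + 32.0 }
}

fn main() {
    let f: f64 = Celsius(100.0).into2();
    let s: String = Celsius(100.0).into2();
    println!("{f} {s} {}", Celsius(100.0).convert());
}
```

With the generic version the caller picks the target type; with the associated type the impl picks it, which is why iterators use `type Item` rather than `Iterator<T>`.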
In _Raku_ (`raku_lang`) the type hierarchy starts with "Mu", which puts "the developer" in the realm of logical philosophy and Buddhist metaphysics right from the get-go.
wow, mathematicians couldn’t be bothered to type out Natural and succeed or increment, but would expect me to say ‘suck suck naturals’ with a straight face
actually a brilliant talk. thanks for making this understandable for those of us who aren’t type theorists!! i’ve always been really bothered by how i have to check the lengths of vectors and such at runtime, learnt something really cool from this talk, and i’m gonna try and apply this idea in my code later. i still stand by my point that mathematicians should get over themselves and stop with the silly symbol business already. computers have been around for decades. just learn to type faster ffs. it’s 2023 and it’s ur fault if you type so slowly that you still have to Nat this and suc that.
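A slice of this idea is available in today's Rust through const generics; a sketch (my own example, not the talk's) where the length lives in the type, so no runtime length check is needed:

```rust
// A dot product that only accepts two arrays of the same length `N`;
// mismatched lengths are rejected at compile time.
fn dot<const N: usize>(a: [f64; N], b: [f64; N]) -> f64 {
    a.iter().zip(b.iter()).map(|(x, y)| x * y).sum()
}

fn main() {
    let d = dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]);
    println!("{d}");
    // dot([1.0], [1.0, 2.0]); // compile error: expected `[f64; 1]`
}
```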
I wish this project would resume development, because I actually really like KDevelop, and I think it's just a shame that it doesn't see much use because of the lack of language support in general. That being said, the C/C++ support is fantastic.
I just wish `slice::Windows` could produce `Item`s of known length so that they could be unpacked. This would seem to be a solution. But there has got to be a limit somewhere; I think the TypeScript type system has been shown to be Turing-complete. Maybe Rust can beat it by being the first to run Doom on a type system.
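For what it's worth, on stable you can already unpack each window with a slice pattern (nightly additionally has the unstable `array_windows`, which yields `&[T; N]` directly). A sketch:

```rust
fn main() {
    let data = [1, 2, 3, 4];
    let mut diffs = Vec::new();
    for w in data.windows(2) {
        // `windows(2)` always yields length-2 slices, but the compiler
        // can't see that, so the pattern is still refutable.
        if let [a, b] = w {
            diffs.push(b - a);
        }
    }
    println!("{diffs:?}");
}
```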
Without defining what "0.999..." means or what operations you can do on that object, one can hardly consider this a proof.

Here's one definition: "0.999..." is shorthand for the sequence 0.9, 0.99, 0.999, 0.9999 and so on (*). When we say that "1 = 0.999..." what we really mean is that 1 is the limit of this sequence. If this is what we mean, the proof should follow easily from the kind of real analysis I expect first-year math students to learn at most universities.

I guess we can call the sequence X, and let 10X be the component-wise scaling by 10, i.e. 9, 9.9, 9.99 and so on; then the component-wise difference 10X - X (= 9X) is 8.1, 8.91, 8.991 and so on. We can then divide by 9 and get back X = 0.9, 0.99, 0.999 and so on, i.e. we can go back to where we started. But this doesn't prove that the sequence has a limit.

(*) Note: we can define the sequence recursively: x_0 = 0.9 and x_{n+1} = 1 - (1 - x_n)/10. This recurrence has a single fixpoint, at 1, the iterated function is continuous, and the distance to the fixpoint decreases by a constant factor. Together these facts imply convergence to the fixpoint.
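The footnote's convergence argument can be written out explicitly, using the recurrence as defined there:

```latex
x_0 = 0.9, \qquad x_{n+1} = 1 - \frac{1 - x_n}{10}
\;\implies\; 1 - x_{n+1} = \frac{1 - x_n}{10}
\;\implies\; 1 - x_n = \frac{1 - x_0}{10^{\,n}} = 10^{-(n+1)} \longrightarrow 0,
```

so x_n converges to the fixpoint 1, which is the missing limit step.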
It's expensive to type-check. The current implementation in rustc can be very slow and memory-hungry. You can try it with the nightly compiler and `RUSTFLAGS="-Zpolonius"`.
Will this solve this problem?
- Case that doesn't work: `object.call(object.value());`
- Case that works: `let value = object.value(); object.call(value);`
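For reference, a made-up `Counter` that reproduces that pattern (the names are hypothetical, not from the question). Under current borrow rules the nested call is rejected when the inner method also takes `&mut self`, while the split version compiles:

```rust
struct Counter { n: u32 }

impl Counter {
    // Takes &mut self, so the nested call below would need a second
    // mutable borrow while the receiver of `call` is already reserved.
    fn value(&mut self) -> u32 { self.n }
    fn call(&mut self, v: u32) { self.n += v; }
}

fn main() {
    let mut c = Counter { n: 1 };
    // c.call(c.value()); // error: cannot borrow `c` as mutable more than once
    let v = c.value();    // first mutable borrow ends here
    c.call(v);            // second mutable borrow is now fine
    println!("{}", c.n);  // 2
}
```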
The difference is in HOW you write unsafe code. If the programmer has to do something too advanced for the Rust compiler to comprehend, you can take over control. There's nothing wrong with calling an unsafe function that has no flaws (aka is "pure").
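For example (a made-up `first_or_zero`, not from the comment), an unsafe call can be perfectly fine when the surrounding code upholds the invariant the unsafe function requires:

```rust
// The unsafe call is sound because the index is checked first.
fn first_or_zero(v: &[u32]) -> u32 {
    if !v.is_empty() {
        // SAFETY: we just checked that index 0 is in bounds.
        unsafe { *v.get_unchecked(0) }
    } else {
        0
    }
}

fn main() {
    println!("{}", first_or_zero(&[7, 8]));
    println!("{}", first_or_zero(&[]));
}
```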
That's not dependent types, but rather an overengineered, hardcoded, typed nightmare. Imagine trying to use this in production code. Actual dependent types don't depend on types (pun not intended) and can be inferred automatically by the proof checker from context.
Dependent types cannot always be inferred by the proof assistant unfortunately. I think dependent types are better understood as a kind of macro, since (almost) all of the type information is completely deleted at compile time.
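To make the contrast concrete, a minimal sketch of a genuinely dependent type in Lean 4 (my own example): the length is part of the type, yet it carries no runtime representation once compiled.

```lean
-- Length-indexed vectors: `Vect α n` is a list of exactly `n` elements.
inductive Vect (α : Type) : Nat → Type where
  | nil  : Vect α 0
  | cons {n : Nat} : α → Vect α n → Vect α (n + 1)

-- `head` only accepts non-empty vectors, so no runtime emptiness check
-- (and no `Option`) is needed; the empty case simply cannot be passed in.
def Vect.head {α : Type} {n : Nat} : Vect α (n + 1) → α
  | cons x _ => x
```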
TBH, the most pragmatically useful content is at the beginning and end. IMO, the content in the middle, while nice background information, is a little light on utility. At ~1:20 Ms. Manning points out the difference between `string.len()` and `string.chars().count()`. At ~15:40 Ms. Manning talks through UTF-8, UTF-16, and UTF-32.
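The `len()` vs `chars().count()` distinction at ~1:20 is easy to demonstrate:

```rust
fn main() {
    let s = "héllo";                        // 'é' is 2 bytes in UTF-8
    println!("{}", s.len());                // byte length: 6
    println!("{}", s.chars().count());      // Unicode scalar values: 5
}
```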
28:00 Couldn't we save some space in the Dashed variant by making it `Option<NonZeroU32>`? The resulting Option would have the same size as a `u32`, 4 bytes. Dashed with zero spacing should theoretically look the same as Solid.
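The size claim can be checked directly; the niche optimization does make `Option<NonZeroU32>` the same size as a bare `u32`:

```rust
use std::mem::size_of;
use std::num::NonZeroU32;

fn main() {
    // The zero bit pattern is free to represent `None`.
    println!("{}", size_of::<Option<NonZeroU32>>()); // 4
    println!("{}", size_of::<u32>());                // 4
    println!("{}", size_of::<Option<u32>>());        // 8: needs a discriminant
}
```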
Wait, what happens if vector length is already usize::MAX, and then we push again? I don't see how the code presented deals with this potential problem.
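I haven't checked the talk's code, but std's `Vec` computes its new capacity with checked arithmetic, so at the extreme the growth path fails (surfacing as a "capacity overflow" panic) rather than wrapping silently. A sketch of the arithmetic involved:

```rust
fn main() {
    // At usize::MAX, the "old length + 1" computation has no representable
    // result; checked arithmetic reports that instead of wrapping to 0.
    let len = usize::MAX;
    match len.checked_add(1) {
        Some(new_len) => println!("grew to {new_len}"),
        None => println!("capacity overflow"),
    }
}
```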
I think I'm missing the point here. What I took away from this talk is that I can replace a primitive or a Vec with a Trait for no reason. I feel like I understood traits better before I watched this, but everyone else is saying this talk was great. I realize she chose simple examples on purpose, but these are cases where traits shouldn't be used.
As a Rust learner, I think it is being sold wrongly. The word "undefined behaviour" is thrown around too casually. The CPP conference people tend to abstract away hardware and concrete implementations, hence they just label anything not specified by the C++ specs as "undefined behaviour" because they cannot be sure how the implementations will handle such situations. But, almost all popular compilers do the most reasonable thing when it comes to these situations. Printing uninitialised memory will print whatever bits were stored in its place, overflowing an integer will result in it wrapping down to the smallest storable number, etc. This is not something CPP programmers would worry about while coding. This is fear-mongering. I fear I'm getting myself into a community full of these people, but damn the promise is too great not to give it a shot, and so far, I like it.
I'd like to respond to that: in all these situations, is wrapping behaviour and printing undefined bits what you want? You can achieve all of that in Rust too, but it forces you to explicitly say that that behaviour is what you want. If I'm summing positive integers in an array and get a negative result, is that what I wanted, or will this cause a problem with some assumption I make further on? Rust doesn't forbid this behaviour: there's a wrapping add, which lets you say that you're sure you want this behaviour. If you don't want to spell that out every time, there are dedicated wrapping integer types that do it on every operation. You can read uninitialized memory, or even memory from an arbitrarily specified pointer, inside an unsafe block. I feel like the purpose of Rust is to help you not get bitten by these things when you're not expecting it, making it easier to do the right thing.
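For concreteness, the two explicit opt-ins mentioned here are `wrapping_add` and the `std::num::Wrapping` type:

```rust
use std::num::Wrapping;

fn main() {
    // One-off explicit wrap:
    println!("{}", i32::MAX.wrapping_add(1)); // i32::MIN

    // A type that wraps on every arithmetic operation:
    let w = Wrapping(i32::MAX) + Wrapping(1);
    println!("{}", w.0); // i32::MIN

    // By contrast, a plain `i32::MAX + 1` panics in debug builds
    // instead of wrapping silently.
}
```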
AHHHH, that's why a Rust enum is a SUM type: a FruitSnack, so to speak. A variable can hold enum value 1 OR enum value 2 OR ...; e.g. the Option enum can be either Some(x) or None.
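A minimal sketch of that "OR" reading (hypothetical `Snack` enum, riffing on the FruitSnack image):

```rust
// A sum type: a value is exactly one of the variants, never several at once.
enum Snack {
    Fruit(String),
    Nothing,
}

fn describe(s: &Snack) -> String {
    match s {
        Snack::Fruit(name) => format!("a {name}"),
        Snack::Nothing => "nothing".to_string(),
    }
}

fn main() {
    println!("{}", describe(&Snack::Fruit("banana".into())));
    println!("{}", describe(&Snack::Nothing));
}
```

`Option<T>` is exactly this shape: `Some(x)` OR `None`.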