• 0 Posts
  • 129 Comments
Joined 1 year ago
Cake day: June 12th, 2023


  • Yep and that’s fair, but it’s still really critical that those of us who can migrate do so. It’s a chicken-and-egg problem: developers won’t feel pressured to support Linux if there’s no sizable user base, but the user base won’t grow until developers provide support for Linux. He even mentions that in that video. There’s a reason I’m only this year planning on switching my primary desktop from Windows to Linux, and it’s because of how good Proton has gotten. I’ve already checked every game in my Steam library, and while not 100% of the library runs, everything that doesn’t is something I don’t care about.


  • Nah, Linux still only accounts for about 2% of monthly active users on Steam, so it has a long way to go, but at least it’s heading in the right direction. If you count only English-speaking Steam users, that number climbs to over 5%. If Linux can get to and reliably maintain 10%, that’s probably good enough to make it a first-class target for even AAA releases, but it’s not there yet. The fact that so many games run fine under Linux these days is almost entirely down to the effort Valve has sunk into Proton, which makes it relatively easy for devs to check off Steam Deck support without putting in much work at all.



  • We don’t need everyone to migrate, just enough that companies and developers feel obligated to support Linux. We’re slowly getting there. Valve throwing their weight behind Linux for gaming was a massive win. Another important factor is the rise of the mobile-first generations and the fact that, at its core, Android is Linux-based. It’s not completely trivial to port an Android app to Linux, but it’s at least no worse than porting it to Windows.

    Microsoft may still have a stranglehold on corporate desktops, but they’ve long since lost the battle for servers, and their hold on the home desktop is slipping a little more each day. Losing a significant chunk of gamers to Linux would be a massive blow to MS, because gaming has historically been one of the few really unassailable markets for them.



  • Hmm, it’s true that cold fusion would need some kind of physics breakthrough, although I think it might be going too far to call it junk science. To be entirely fair, energy-positive hot fusion also requires some kind of physics breakthrough, though potentially a far less extreme one.

    The Sun works because of its mass, which generates the necessary temperatures and pressures to trigger fusion. Replicating those pressures and temperatures here, though, is incredibly energy intensive. On paper, the energy released by the fusion reaction should exceed those energy requirements, but when you factor in that doing so requires exceedingly rare and expensive-to-create fuel, most if not all of that energy surplus vanishes. Nobody has been able to prove that they can get more energy out of the reaction than the energy cost of creating the fuel and triggering it, so until that happens hot fusion is far from proven either. There are a few research projects that look promising, but it’s far from guaranteed that they’ll pan out.


  • Hydro is good when it’s available but also has some significant problems. The biggest is that it’s an ecological disaster even if the reach of that disaster is far more limited. The areas upstream of the dam flood while the ones downstream are in constant danger of flooding and drought. In the worst case if the dam collapses it can wipe entire towns off the map with little or no warning. It is objectively far more dangerous and damaging to the environment than any nuclear reactor. The only upside it has is that it’s effectively infinitely renewable barring massive shifts in weather patterns or geology.

    All of that is of course assuming that hydro is even an option. There’s a very specific set of geological and weather features that must be present, so the locations you can serve with hydro without significant transmission problems are limited.

    It’s certainly an option, and better than coal, oil, or gas, but still generally worse than nuclear.


  • Why? It’s an active area of research, with several companies and universities trying to solve the problem. There’s also a chance hot fusion succeeds, although to my knowledge nobody has actually gotten close to solving that particular problem either. Tokamaks and such are still energy-negative when taken as a whole (a couple have claimed energy-positive status, but only by excluding the power requirements of certain parts of their operation). I guess maybe I should have just said fusion instead of cold fusion, but either way there are no working energy-positive fusion systems currently.

    Edit: To be clear, I’m not claiming that anyone has a working cold fusion device, quite the opposite. Nobody has been able to demonstrate a working cold fusion device to date; anybody claiming they have is either lying or mistaken. But by the same token, nobody has been able to show an energy-positive hot fusion device either. There are a couple that have come close, but only by doing things like hand-waving away the cost to produce the fuel, or part of the energy cost of operating the containment vessel, to say nothing of the significant long-term maintenance costs. I’ve not seen evidence of anybody getting even remotely close to a financially viable fusion reactor of any kind.


  • The real problem is that there are no renewable solutions for base load; nuclear is the best we’ve got. Renewables are good, but they’re intermittent: you can’t produce renewable power on demand or scale it on demand, and storing it is also a problem. Because of that, you still need something to fill in the gaps. Your options there are coal, oil, gas, or nuclear. That’s it; those are your options. Pick one.

    If we can successfully get cold fusion working we’ll finally have a base power generation option that doesn’t have (many) downsides, but until then nuclear power is the least bad option.

    So yes, if you tell them “no nuclear”, you’re going to get more coal and gas plants, coal because it’s cheap, and gas because it’s marginally cleaner than coal.






  • It also massively helps with productivity

    Absolutely! Types are as much about providing the programmer with information as they are the compiler. A well-typed, well-designed API conveys so much useful information. It’s why it’s mildly infuriating when I see functions that look like something from C, like:

    pub fn draw_circle(x: i8, y: i8, red: u8, green: u8, blue: u8, r: u8) -> bool {
    

    rather than a better strongly typed version like:

    type Point = Vec2<i8>;
    type Color = Vec3<u8>;
    type Radius = NonZero<u8>;
    pub fn draw_circle(point: Point, color: Color, r: Radius) -> Result<()> {
    

    Similarly I think the ability to use an any or dynamic escape hatch is quite useful, even if it should be used very sparingly.

    I disagree with this; I don’t think those are ever necessary, assuming a powerful enough type system. Function arguments should always have a defined type, even if it’s one using dynamic dispatch. If you just want to avoid spelling out the type on a local, let bindings without an explicit type are fine, but even then the binding still has a type: you’re just letting the compiler infer it (and if it can’t, it will error).
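    As a minimal sketch of that point (the Shape trait and describe function here are hypothetical, not from any real API): the argument below uses dynamic dispatch but still has an explicitly written type, while the unannotated let binding still ends up with a concrete, compiler-inferred type.

```rust
// Hypothetical trait to illustrate typed dynamic dispatch.
trait Shape {
    fn area(&self) -> f64;
}

struct Circle {
    r: f64,
}

impl Shape for Circle {
    fn area(&self) -> f64 {
        std::f64::consts::PI * self.r * self.r
    }
}

// Dynamic dispatch, but the argument's type is still spelled out.
fn describe(shape: &dyn Shape) -> String {
    format!("area = {:.2}", shape.area())
}

fn main() {
    // No annotation, but `c` has the statically inferred type `Circle`;
    // if the compiler couldn't infer a type here, it would be an error.
    let c = Circle { r: 1.0 };
    println!("{}", describe(&c));
}
```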


  • Hmm, sort of, although that situation is a little different and nowhere near as bad. Rust’s type system and feature flags mean that most libraries actually supported both tokio and async-std; you just needed to compile them with the appropriate feature flag. Even more worked with both libraries out of the box, because they only needed the minimal functionality that Future provides. The only reason it was even an issue is that Future doesn’t provide a few mechanisms that might be necessary depending on what you’re doing, e.g. there’s no mechanism to fork/join in Future; that has to be provided by the implementation.
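    As a hedged sketch of how that feature-flag setup typically looked (the feature names here are hypothetical, and the `dep:` syntax assumes a reasonably recent Cargo), a library’s Cargo.toml might expose the runtime choice as optional dependencies behind features:

```toml
# Hypothetical async library: the downstream user picks the runtime at build time.
[features]
default = ["runtime-tokio"]
runtime-tokio = ["dep:tokio"]
runtime-async-std = ["dep:async-std"]

[dependencies]
tokio = { version = "1", optional = true }
async-std = { version = "1", optional = true }
```

    A user on async-std would then depend on the library with `default-features = false` and `features = ["runtime-async-std"]`.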

    async-std still technically exists; it’s just that most of the most popular libraries and frameworks happened to pick tokio as their default (or only) async implementation, so if you’re just going by the most downloaded async libraries, tokio ends up over-represented. Longer term I expect that chunks of tokio will get pulled in and made part of the std library, like Future was, to the point where you’ll be able to swap tokio for async-std without needing a feature flag, but that’s likely going to need some more design work to do cleanly.

    In the case of D, it was literally the case that if you used one of the two standard libraries (Phobos or Tango), you couldn’t import the other one or your build would fail, and D didn’t have feature-flag capabilities like Rust’s to let authors paper over the difference. It really did cause a hard split in D’s library ecosystem, and the only fix was getting the two teams responsible for the standard libraries to sit down and agree to merge them.


  • I’ll look into OPAM, it sounds interesting.

    I disagree that combining build and package management is a mistake, although I also agree that it would be ideal for a build/package management system to be able to manage other dependencies.

    A big chunk of the problem is how libraries are handled, particularly shared libraries. Nix sidesteps the problem with a complex system of symlinks to avoid DLL hell, but I’m sure a big part of why the Windows work is still ongoing is that Windows doesn’t resemble a Linux/Unix system the way OS X and (obviously) Linux do. Its approach to library management is entirely different, because once again there was no standard for how to handle this in C/C++, so each OS came up with its own solution.

    On Unix (and by extension Linux, and later OS X), it was via special system include and lib folders in canonical locations. On Windows it was via dumping everything into C:\Windows (and what a lovely mess that has made, made somehow even worse by mingw/Cygwin layering in Linux-style conventions that are only followed by mingw/Cygwin-built binaries). Into this mix you have the various compilers and linkers, which all either expect the given OS’s conventions to be followed, or else define their own OS-independent conventions. The problem is that now we have a second layer of divergence, with languages that follow different conventions struggling to work together. This isn’t even a purely Rust problem; other languages struggle with it too. Generally, languages that interop with C/C++ in any fashion do so by expecting C/C++ libraries to be installed in the canonical locations for that OS, as that’s the closest thing to an agreed-upon convention in the C/C++ world, and this is in fact what Rust does as well.

    In an ideal world, there would be an actual agreed upon C/C++ repository that all the C/C++ devs used and uploaded their various libraries to, with an API that build tools could use to download those libraries like Rust does with crates.io. If that was the case it would be fairly trivial to add support to cargo or any other build tool to fetch C/C++ dependencies and link them into projects. Because that doesn’t exist, instead there are various ad-hoc repositories where mostly users and occasionally project members upload their libraries, but it’s a crap-shoot as to whether any given library will exist on any given repository. Even Nix only has a tiny subset of all the C/C++ libraries on it.


  • So, it’s C#?

    No, that’s what Java would look like today if designed by a giant evil megacorp… or was that J++. Eh, same difference. /s

    This did make me laugh though. Anyone else remember that brief period in the mid-90s when MS released Visual J++ aka Alpha C#? Of course then Sun sued them into the ground and they ended up abandoning that for a little while until they were ready to release the rebranded version in 2000.



  • Rust’s ownership model is not just an alternative to garbage collection; it provides much more than that. It’s as much about preventing race conditions as it is about making sure that memory (and other resources) gets freed in a timely fashion. Just because Go has GC doesn’t mean it provides the same safety guarantees as Rust does. Go’s type system is also weaker than Rust’s, even setting aside the matter of memory management.
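    A minimal sketch of the race-condition side (this counter example is illustrative, not from the comment): the code below only compiles because the shared count goes through Arc plus Mutex; handing multiple threads a plain &mut i32 would be rejected at compile time, a guarantee a garbage collector alone doesn’t give you.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Shared counter: the ownership rules force shared state behind
// Arc (shared ownership) + Mutex (synchronized mutation).
fn parallel_count(threads: usize, per_thread: usize) -> i32 {
    let counter = Arc::new(Mutex::new(0));
    let mut handles = Vec::new();
    for _ in 0..threads {
        let counter = Arc::clone(&counter);
        handles.push(thread::spawn(move || {
            for _ in 0..per_thread {
                // Mutation is only possible while holding the lock.
                *counter.lock().unwrap() += 1;
            }
        }));
    }
    for handle in handles {
        handle.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    // 4 threads x 1000 increments each.
    println!("{}", parallel_count(4, 1000)); // prints 4000
}
```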