A string being parsed as a date-time is presumably user input, which is potentially invalid.
What happens when you coerce a string to a date-time but it's not valid?
Where I’m from (Rust), error handling is very strict and very explicit, and that’s how it should be. It forces you to properly handle everything that can potentially go wrong, instead of just crashing and looking like a fool.
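As a rough sketch of what that looks like in practice (assuming the chrono crate for date-time parsing), the invalid case shows up as a value you have to deal with, not a surprise that unwinds your program:

```rust
// A minimal sketch, assuming the chrono crate is available.
use chrono::NaiveDateTime;

fn parse_user_input(input: &str) -> Result<NaiveDateTime, String> {
    // parse_from_str returns a Result, so the invalid case can't be ignored.
    NaiveDateTime::parse_from_str(input, "%Y-%m-%d %H:%M:%S")
        .map_err(|e| format!("'{input}' is not a valid date-time: {e}"))
}

fn main() {
    // February 30th doesn't exist, so this takes the Err branch.
    match parse_user_input("2024-02-30 12:00:00") {
        Ok(dt) => println!("parsed: {dt}"),
        Err(msg) => eprintln!("{msg}"), // handled, not crashed
    }
}
```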
Dynamic typing is insane. You have to keep track of the type of absolutely everything, in your head. It’s like the assembly of type systems, except it makes your program slower instead of faster.
Difficult to test == poorly designed
It’s pretty much a natural law that GUIs are hard to thoroughly test.
Since when have Boston Dynamics robots been sentient?
Sandboxing the binary doesn’t protect you. The binary can still insert malicious code into your application.
If the binary matched the source code, that argument would hold, but it doesn’t, which is sounding alarm bells in my head. Just what is in those 600 kilobytes of machine code?
So, the general unsuspecting public will still be executing this potentially-malicious binary, and only those in the know will be able to avoid it?
The proper way to handle this is to contact the maintainer, ask why this change was made, and start a discussion arguing the drawbacks and asking to revert it.
That has already occurred. The maintainer pretty much ignored the question, as far as I can tell.
People usually behave that way when they have an ulterior motive. In this case, I worry that the plan is to slip some malware into that binary…
I’m not sure. I’ve only ever used the stock operating system on my phones.
Everything needs good security. Firewall devices only cover a specific, limited portion of the attack surface of machines behind them. One successful browser exploit or attack on an exposed port, and the firewall may as well be a paperweight.
No way. Plasma is beautiful.
Linux is already dominant on just about everything except the desktop, and it has yet to suffer significant enshittification.
Edit: Well, a bunch of Linux distributions have suffered enshittification, if that counts.
Yeah, that’s the problem. We don’t have the requisite technology to build a Star Trek utopia. If only we did…
If you have an NVIDIA video card, I would start by buying an AMD or Intel card instead. Attempting to use NVIDIA with Linux will result in misery.
I do believe that while programming has many ways of doing the same task, there is always an objectively best way to do it.
I’ve been writing code in one form or another for some 30 years now, and my observation so far has been the exact opposite: there are many problems in programming for which there is no one clearly superior solution, even in theory. Just like life in general, programming is full of trade-offs, compromises, and diminishing returns.
This artificial pseudointelligence exists because there’s the “gee whiz, that’s cool” of a computer talking like a person, and a bunch of hype chasers looking to cash in. Much like cryptocurrency before it, and the dot-com boom before that, there is little substance to it, and most of it will be commercially irrelevant a decade from now.
That is so evil. I love it.
By “user” I mean the person who is using the application.
Using exceptions for handling unexceptional errors (like invalid user input) is a footgun. You don’t know when one might be raised, nor what type it will have, so you can easily forget to catch it and handle it properly, and then your app crashes.
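For contrast, here's a rough Rust sketch of the alternative (hypothetical names, just to illustrate the idea): the failure mode is part of the function's signature, so the compiler won't let you forget about it the way an undocumented exception does.

```rust
use std::num::ParseIntError;

// The failure mode is spelled out in the signature: callers see ParseIntError
// and must deal with it somehow.
fn parse_age(input: &str) -> Result<u8, ParseIntError> {
    // `?` propagates the error explicitly; nothing unwinds invisibly past this point.
    let age = input.trim().parse::<u8>()?;
    Ok(age)
}

fn main() {
    // Leaving out the Err arm here is a compile error, not a crash in production.
    match parse_age("forty-two") {
        Ok(age) => println!("age: {age}"),
        Err(e) => eprintln!("invalid age: {e}"),
    }
}
```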