• 0 Posts
  • 31 Comments
Joined 1 year ago
Cake day: July 10th, 2023

  • Militants specifically use these pagers for security and stealth. Everyone else just uses phones.

    It’s a brilliant way to target only combatants, and also expose them to their friends and neighbours. This attack is incredibly disruptive with very little collateral damage compared to alternatives.

    And yes, it’s terrorism: an attack meant to inspire terror and disrupt communication networks, with a chilling effect much larger than the actual damage. However, it’s interesting in that, unlike most terrorism, it does not target civilians.

    It’s also terrifying to think we are living in a world where a malicious component attack is a legitimate concern. This is one of those moments that change the world - I’m sure every industry is thinking about the danger of their foreign supply chain right now.


  • That’s a valid point: the dev cycle is compressed now and customer expectations are low.

    So instead of putting in the long term effort to deliver and support a quality product, something that should have been considered a beta is just shipped and called “good enough”.

    A good example, I guess, would be a long-term embedded OSS project like Tasmota, compared to the barely functional stock firmware on the devices that people buy just to reflash to Tasmota.

    Still, there are few things that frustrate me like a Bluetooth device that really shouldn’t have been a Bluetooth device, and that has non-deterministic behaviour due to missing initialization or some other trivial fault. Why did the tractor work lights turn on purple today? Nobody knows!
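
    To illustrate the kind of trivial fault I mean, here’s a made-up sketch (not the real firmware; set_pwm() and the colour fields are invented): a local that’s only initialized on one branch is all it takes for the light colour to depend on whatever garbage was left in RAM.

    ```c
    /* Made-up sketch: a colour value that is only set on one branch.
     * Reading the other branch's values is undefined behaviour, so the
     * light comes on white, purple, or whatever was left on the stack. */
    #include <stdint.h>
    #include <stdio.h>

    static void set_pwm(uint8_t r, uint8_t g, uint8_t b)  /* stand-in LED driver */
    {
        printf("PWM r=%u g=%u b=%u\n", r, g, b);
    }

    static void work_lights_on(int night_mode)
    {
        uint8_t r, g, b;            /* bug: no default values */
        if (night_mode) {           /* only this branch sets all three */
            r = g = b = 255;
        }
        set_pwm(r, g, b);           /* day mode sends stack garbage */
    }

    int main(void)
    {
        work_lights_on(0);          /* maybe purple today, nobody knows */
        return 0;
    }
    ```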


  • My type is a dying breed too: the guys who do their best to write robust code and actually try to consider edge cases, race conditions, properly sized variables and efficient use of cycles, all the things that embedded guys have done as “embedded” evolved from the 6800 to PIC, Atmel and then ESP platforms.

    Now people seem to have embraced “move fast and break things”, but that’s the exact opposite of how embedded is supposed to be done. Don’t get me wrong, there is some great ESP code out there, but there’s also a shitload of buggy and poorly documented libraries and devices that require far too many power cycles to keep functioning.

    In my opinion, one power cycle is too many in the embedded world. Your code should not leak memory. We grew up with BYTES of RAM to use; memory leaks were unthinkable!

    And don’t get me started on the appalling mess that modern engineers can make with function blocks inside a PLC, or their seeming lack of knowledge of industrial control standards that have existed since before the PLC.
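
    For what it’s worth, the habit I’m talking about looks roughly like this: everything statically allocated and sized up front, so there is nothing to leak and the worst-case RAM use is known before the code ever runs. A minimal sketch with invented sizes:

    ```c
    /* Minimal sketch of the static-allocation habit: a fixed ring buffer
     * for received bytes, no malloc/free anywhere, and index variables
     * sized for the job. Single producer, single consumer; overflow
     * handling is omitted for brevity. */
    #include <stdint.h>

    #define RX_BUF_LEN 32u                  /* power of two so the mask works */

    static uint8_t rx_buf[RX_BUF_LEN];      /* fixed buffer, known at link time */
    static uint8_t rx_head, rx_tail;        /* 8-bit indices are plenty for 32 */

    void rx_push(uint8_t byte)              /* e.g. called from the UART ISR */
    {
        rx_buf[rx_head & (RX_BUF_LEN - 1u)] = byte;
        rx_head++;
    }

    int rx_pop(uint8_t *out)                /* returns 0 when empty */
    {
        if (rx_head == rx_tail)
            return 0;
        *out = rx_buf[rx_tail & (RX_BUF_LEN - 1u)];
        rx_tail++;
        return 1;
    }
    ```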



  • It’s complicated. The main issue is that I live on a remote farm without cell coverage, except in the tiny zone under my 50’ tower with a booster.

    However I now have Starlink, and wired and wireless APs covering a large area with high speed, low latency data.

    So, the plan: port my number to VoIP.ms, which supports SMS, and make all my calls/texts over Wi-Fi using SIP. On the road, use a basic cell plan with unlimited slow data that is still fast enough for voice. Tested, working, and so far fairly simple.

    Now the issues. RCS won’t work with my now VoIP-provisioned number, because there’s no SIM for it. The SIM in the phone has a different number, that of the new plan, which will be unreachable at the farm by voice/SMS just like the old number used to be.

    This would all be a non-issue if my provider supported VoWiFi on anything other than iPhones, but sadly that is not an option. So I’ve got service everywhere now, but I’m stuck with voice and SMS only, no RCS or MMS.




  • Right, we need to come up with better terms for talking about “AI”. Personally, at the moment I consider any transformer-type ML system to be part of the category; as you stated, none of them are any more “intelligent” than the others. They’re all just a big stack of tensor operations. So if one is AI, they all are.

    Remember long ago when “fuzzy logic” was all the hype and considered to be AI? Just a very early form of classifier network but everyone was super excited at the time.
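
    That’s really why I lump them together: strip away the branding, and whether it’s an early classifier network or a transformer layer today, the core work is the same multiply-accumulate loop repeated at wildly different scales. A toy sketch (dimensions invented):

    ```c
    /* Toy illustration: one dense layer, y = W*x + b. This is the tensor
     * operation that classifiers and transformers alike stack thousands of
     * times; everything else is plumbing around loops like this one. */
    #include <stddef.h>

    void dense_layer(const float *w,   /* rows x cols weights, row-major */
                     const float *x,   /* cols inputs  */
                     const float *b,   /* rows biases  */
                     float *y,         /* rows outputs */
                     size_t rows, size_t cols)
    {
        for (size_t r = 0; r < rows; r++) {
            float acc = b[r];
            for (size_t c = 0; c < cols; c++)
                acc += w[r * cols + c] * x[c];   /* multiply-accumulate */
            y[r] = acc;
        }
    }
    ```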


  • I’m just stating that “AI” is a broad field. These lightweight and useful transformer models are a direct product of other AI research.

    I know what you mean, but simply stating “Don’t use AI” isn’t really valid anymore, since these ML models will soon be a common component. There are even libraries and hardware acceleration support for tensor operations on the ESP32-S3.


  • evranch@lemmy.ca to Technology@lemmy.world · The AI bill that has Big Tech panicked (3 months ago)

    It’s possible for local AI models to be very economical on energy, if used for the right tasks.

    For example, I’m running RapidOCR, which uses a modern transformer architecture and absolutely blows away traditional OCR at capturing data from character displays.

    It doesn’t even need a GPU and returns results in under a second on a modern CPU. No preprocessing needed, just feed it an image. This little multimodal transformer is just as much “AI” as the bloated general-purpose GPTs, but it’s cheap, fast and useful.








  • If you don’t want memory-safe buffer overruns, don’t write C/C++.

    Fixed further?

    It’s perfectly possible to write C++ code that won’t fall prey to buffer overruns; C makes it a lot harder. However, yes, it’s far from memory-safe: you can still do stupid things with pointers and freed memory if you want to.

    I’ll admit as I grew up with C I still have a love for some of its oh so simple features like structs. For embedded work, give me a packed struct over complex serialization libraries any day.
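
    Roughly what I mean, with an invented telemetry record and uart_write() standing in for the real transmit routine: the packed attribute (GCC/Clang syntax) keeps the struct free of padding, so its bytes are the wire format, provided both ends agree on field order and endianness.

    ```c
    /* Sketch of the packed-struct approach (field names invented). With
     * __attribute__((packed)) the compiler adds no padding, so the struct's
     * bytes ARE the wire format - no serialization library needed. */
    #include <stdint.h>
    #include <string.h>

    typedef struct __attribute__((packed)) {
        uint8_t  msg_id;
        uint16_t sensor_mv;      /* millivolts */
        int16_t  temp_c10;       /* temperature * 10 */
        uint32_t uptime_s;
    } telemetry_t;               /* exactly 9 bytes, no padding */

    /* Stand-in for whatever the real transmit routine is. */
    extern void uart_write(const uint8_t *buf, uint16_t len);

    void send_telemetry(const telemetry_t *t)
    {
        uint8_t buf[sizeof(telemetry_t)];
        memcpy(buf, t, sizeof(buf));     /* copy avoids alignment surprises */
        uart_write(buf, sizeof(buf));
    }
    ```

    The receiving end just declares the same struct and memcpy()s the bytes back into it, which is the whole appeal.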

    I tend to write a hybrid of the two languages for my own projects, and I’ll be honest I’ve forgotten where exactly the line lies between them.


  • A million tiny decisions can be just as damaging. In my limited experience with several different local and cloud models, you have to review basically all of the output, as they can confidently introduce small errors. Often the code will compile and run, but it has small errors that can cause output to drift, or the aforementioned long-run overflow-type errors.

    Those are the errors that junior or lazy coders will never notice and walk away from, causing hard-to-diagnose failures down the road. And the code “looks fine”, so reviewers would need to really go over it with a fine-toothed comb, which only happens in critical industries.

    I will only use AI to write comments and documentation blocks, and to get jumping-off points for algorithms I don’t keep in my head (“Write a function to sort this array”). It’s better than Stack Exchange for that, IMO.


  • I tried using an AI tool to do some cleanup and refactoring of some legacy embedded C code, and was curious whether it could do any optimization or knew any clever algorithms.

    It’s pretty good at figuring out the function of the code and adding comments, and it did some decent refactoring of some sections to make them more readable.

    It has no clue about how to work in a resource-constrained environment, or about the main concepts that separate embedded from everything else: namely that the code has to be able to run “forever”, operate in real time on a constant flow of sensor data, and that nobody else is taking care of your memory management.

    It even explained to me that we could do input filtering by using big arrays to do simple averaging on a device with only 1kB RAM, or use a long long for a never-reset accumulator without worrying about what will happen because “it will be years before it overflows”.

    AI buddy, some of these units have run for decades without a power cycle. If lazy coders start dumping AI output into embedded systems, the whole world is going to get a lot more glitchy.
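
    For the record, one way to do that kind of filtering without the big arrays, and without anything that grows until it overflows, is a plain exponential moving average in integer math. A minimal sketch (the shift constant is arbitrary):

    ```c
    /* Exponential moving average in integer math: smooths a noisy ADC
     * reading using a few bytes of state instead of a sample array. No
     * value here grows without bound, so it can run "forever". */
    #include <stdint.h>

    #define EMA_SHIFT 4                    /* smoothing factor = 1/16 */

    static uint32_t ema_state;             /* filtered value << EMA_SHIFT */

    uint16_t ema_update(uint16_t raw_adc)  /* call once per sample */
    {
        ema_state += raw_adc - (ema_state >> EMA_SHIFT);
        return (uint16_t)(ema_state >> EMA_SHIFT);
    }
    ```

    A bigger shift gives heavier smoothing, and the whole filter costs two bytes of extra state over the raw reading, which even a 1 kB part can spare.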