• 2 Posts
  • 50 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • Oh they definitely exist. At a high level the bullshit is driven by malicious greed, but there are also people who are naive and ignorant and hopeful enough to hear that drivel and truly believe in it.

    Like when Microsoft shoves GPT-4 into notepad.exe. Obviously a terrible, terrible product from a UX/CX perspective. But it’s also extremely expensive for Microsoft, right? They don’t gain anything by stuffing their products with useless, annoying features that eat expensive cloud compute like a kid eats candy. That only happens because their management truly believes, honest to god, that this is a sound business strategy. And that would only be the case if they completely misunderstand what GPT-4 is and could be, and actually think future improvements will be so great that there’s a path to mass monetization somehow.


  • He was really popular on twitter, and if he says mastodon’s worse despite having a smaller audience there, I trust his judgement. Literally his pinned toot.

    “First replies shown are the ones the author replied to and/or liked” seems like an obvious, simple, and transparent algorithm. Like youtube comments. Give lazy reply guys an opportunity to see without scrolling down that they aren’t as original as they think they are. The fact that this isn’t implemented in even a basic form is absolutely insane and shows a very fundamental ideological disconnect between people who want “open twitter with decent moderation” and whatever the fuck it is that the mastodon OGs/devs are trying to achieve.
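    The ordering described above is simple enough to sketch in a few lines. This is purely illustrative (the `Reply` fields and function names are made up, not Mastodon's actual data model): author-engaged replies float to the top, everything else stays chronological.

```python
# Hypothetical sketch of the ordering described above: replies the post's
# author engaged with (replied to or liked) are shown first, the rest
# follow in chronological order. All names here are invented.
from dataclasses import dataclass


@dataclass
class Reply:
    id: str
    created_at: int              # unix timestamp
    author_replied: bool = False  # did the post's author reply to this reply?
    author_liked: bool = False    # did the post's author like this reply?


def order_replies(replies: list[Reply]) -> list[Reply]:
    """Author-engaged replies first, then the rest; both groups chronological."""
    return sorted(
        replies,
        key=lambda r: (not (r.author_replied or r.author_liked), r.created_at),
    )
```

    A single sort key is all it takes, which is part of the point: this isn't an opaque recommendation engine, just a transparent tie-break on author engagement.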



  • Some people don’t want a suggestion algorithm but do want full reply federation.

    Alec from Technology Connections stopped using Mastodon because of this: every post he made would get nitpicked by 20 different people from instances that did not federate the replies with each other, so each reply guy thought they were the first.

    I have a single user instance and I use a relay, but most replies are still missing if I click on a post unless I go to the original webpage.

    Lazy-federating replies when a post is viewed sounds like an obvious solution but AFAIK the mastodon devs are very opposed to this.
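    For what it's worth, the "obvious solution" is conceptually tiny. ActivityPub objects can advertise a `replies` collection, so lazy backfill is roughly "on view, fetch the origin's replies collection and keep what we haven't seen." A hedged sketch (function names and the injected fetcher are mine, not Mastodon's API; a real implementation would do signed HTTP fetches and spam/abuse filtering):

```python
# Rough sketch of lazy reply federation: when a post is opened locally,
# pull the origin server's replies collection and merge in anything the
# local instance is missing. Illustrative only, not Mastodon's actual API.


def backfill_replies(local_reply_ids: set[str], origin_replies: list[dict]) -> list[dict]:
    """Return the remote replies the local instance doesn't have yet."""
    return [r for r in origin_replies if r["id"] not in local_reply_ids]


def on_post_viewed(post: dict, local_reply_ids: set[str], fetch_collection) -> list[dict]:
    # In a real server, `fetch_collection` would perform a (signed) HTTP GET
    # against the origin's `replies` collection URL; here it is injected so
    # the logic stays self-contained.
    remote = fetch_collection(post["replies_url"])
    return backfill_replies(local_reply_ids, remote)
```

    The hard parts the devs object to (load amplification, moderation of backfilled content) live outside this core logic, which is presumably why third-party backfill bots already do something like it.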


  • I wasn’t very old then but the main thing was RAM. Fuckers in Microsoft sales/marketing made 1 GB the minimum requirement for OEMs to install Vista.

    So guess what? Every OEM installed Vista with 1 GB of RAM and a 5400 RPM hard drive (the “standard” config for XP, which is what most of those SKUs were meant to target). That hard drive would inevitably spend its short life thrashing, because if you opened IE it would immediately start swapping. Even worse with OEM bloat, but even a clean Vista install would swap real bad under light web browsing.

    It was utterly unusable. Like, everything would be unbearably slow and all you could do was (slowly) open task manager and say “yep, literally nothing running, all nonessential programs killed, only got two tabs open, still swapping like it’s the sex party of the century”.

    “Fixing” those hellspawns by adding a spare DDR2 stick is a big part of how I learned to fix computer hardware. All ya had to do was chuck 30 € of RAM in there and suddenly Vista went from actually unusable to buttery smooth.

    By the time the OEMs wised up to Microsoft’s bullshit, Seven was around the corner so everyone thought Seven “fixed” the performance issues. It didn’t, it’s just that 2 GB of RAM had become the bare minimum standard by then.

    EDIT: Just installed a Vista VM because I ain’t got nothing better to do at 2 am apparently. Not connected to the internet, didn’t install a thing, got all of 12 processes listed by task manager, and it already uses 500 MB of RAM. Aero didn’t even enable as I didn’t configure graphics acceleration.


  • Bro I wouldn’t trust most companies not to store their only copy of super_duper_important_financial_data_2024.xlsx on an old AliExpress thumb drive attached to the CFO’s laptop in a coffee shop while he’s taking a shit.

    If your company has an actual DRP (disaster recovery plan) for when your datacenter catches fire or your cloud provider disappears, you are already doing better than 98 % of your competitors, and these aren’t far-fetched disaster scenarios. Maintaining an entire separate pen-and-paper shadow process, training people for it? That’s orders of magnitude more expensive than the simplest of DRPs most companies already don’t have.

    Friendly wave to all the companies currently paying millions a year extra to Broadcom/VMware because their tools and processes are too rigid to use with literally any other hypervisor, when realistically all their needs could be covered by the free tiers of Proxmox and/or OpenStack.


  • Congrats. So you think that since you can do it (as a clearly very tech-literate person) the government shouldn’t do anything? Do you think it’s because they all researched the issues with these companies and decided to actively support them, or is it because their apathy should be considered an encouragement to continue?

    You are so haughty you’ve circled back around to being libertarian. This is genuinely a terrible but unfortunately common take that is honestly entirely indistinguishable from the kind of shit you’d hear coming from a FAANG lobby group.




  • azertyfun@sh.itjust.works to Technology@lemmy.world · *Permanently Deleted* · 2 months ago

    You are conflating Consumers with Citizens, a classic pitfall of modern neoliberal democracies.

    Just because people willingly Consume a Product does not mean they think The Product is good or even that it should exist at all. Neoliberalism is unable to acknowledge that, because Everything is a Market and the Market is Infallible.

    In reality, the game theory is such that individuals may not have the means to get out of the local minimum they found themselves stuck in. Prisoner’s dilemma and all that. That’s what representative democracy is supposed to solve, when it isn’t captured by ideology and corporate interests.



    I mean, he’s actively supporting the opposition (Trump) right now. Were Trump to win, he’d certainly be in a very good position within Trump’s desired oligarchy. Until then he’s just a very rich asshole whose main concrete political power comes from his ownership of Twitter and his (largely artificial) audience. If anything, his support of Trump kneecaps his ability to run his businesses, as the Biden and hypothetical Harris administrations are not as likely to let him keep getting away with all the blatantly illegal shit he keeps doing.

    Michael Bloomberg OTOH fits the term pretty well, as he’s a very major donor to the DNC and that certainly makes him very close to the ear of the president and policy decisions.


  • Try to turn up the contrast and saturation to 200 %, that should increase the comments on picture quality :)

    FR tho, mine is also impressively thin but like… I discovered that when I unpacked it? Thinness is not effectively conveyed by marketing material, and maybe it’s because I haven’t set foot in an electronics store in years but aren’t TVs typically laid out in a way that you don’t see them from the side?

    Maybe I’m totally off-base and it truly is a big factor for normies shopping for a TV, but I just can’t even really understand how a 3 cm thick panel would significantly impact sales compared to panel tech, size, cost, and ancillary features.

    However now that I think about it, maybe “thick” LCDs can’t go bezel-less? That I could easily understand how it impacts the overall esthetics (or even practicality with respect to Ambilight for instance).


  • What’s the overlap of the general public, people who buy “fancy sculpture TVs”, and people who still buy LCD TVs when OLED has been affordable for years now (I paid a grand for mine)? Keeping in mind that regular TVs already look impossibly thin so you gotta find someone knowledgeable enough to know that 3-5 cm is not as thin as it goes, but not knowledgeable enough to know LCD ain’t shit.

    Maybe there are enough of these people to justify a SKU to cater to their needs. But I can also believe that no market research exists to support that hypothesis, and it reads a lot like the average boomer’s understanding of “the younguns and their flat-screen television sets” as if the switch away from bulky CRTs had only happened 5 years ago and not 25.


  • Do people buy the thinnest thing? Laptops or phones maybe to some extent, but TVs I sincerely doubt.

    And having gotten to interact with the real process of product development, I gotta say in my (relatively narrow) experience it’s based a lot more on vibes/politics than market research or focus groups.

    I can totally see “make it as thin as XYZ” being a hard requirement for no better reason than a PM felt strongly about it, and no-one had all three infinity stones necessary to call them out (engineering knowledge, understanding of the PD pipeline, and political capital).


  • You’re describing proper incident response but I fail to see what that has to do with the status page. They have core metrics that they could display on that status page without a human being involved.

    IMO a customer-friendly status page would automatically display elevated error rates as “suspected outage” or whatever. Then management can add more detail and/or say “confirmed outage”. In fact that’s how the reddit status page works (or at least used to work), it even shows little graphs with error rates and processing backlogs.

    There are reasons why these automated systems don’t exist, but none of these reasons align with user interests.
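    The "automatic until a human upgrades it" logic suggested above fits in a handful of lines. A sketch under invented assumptions (the thresholds and status strings are mine; a real page would also smooth the error rate over a time window):

```python
# Sketch of an automated status page state machine: the displayed status is
# derived purely from measured error rates, and a human can only *upgrade*
# the severity to "confirmed". Thresholds below are illustrative.

SUSPECTED_THRESHOLD = 0.05  # 5 % of requests failing
MAJOR_THRESHOLD = 0.25      # 25 % of requests failing


def derive_status(error_rate: float, confirmed_by_human: bool = False) -> str:
    """Map a measured error rate (0.0–1.0) to a customer-facing status."""
    if confirmed_by_human:
        return "confirmed outage"
    if error_rate >= MAJOR_THRESHOLD:
        return "suspected major outage"
    if error_rate >= SUSPECTED_THRESHOLD:
        return "suspected outage"
    return "operational"
```

    The key design point is that the default path needs no human in the loop, so the page can never show green while a quarter of requests are failing.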



  • I looked into it after this year’s massive price hike… There’s no meaningful alternative. We’re on the FOSS version of GitLab now (GitLab-CE), but the lack of code ownership / multiple reviewers / etc. is a real pain and poses problems with accountability.

    Honestly there are not that many features in Gitlab EE that are truly necessary for a corporate environment, so a GitLab-CE fork may be able to set itself apart by providing those. To me there are two hurdles:

    • Legal uncertainties (do we need a clean room implementation to make sure Gitlab Inc doesn’t sue for re-implementing the EE-only features into a Gitlab fork?)
    • The enormous complexity of the GitLab codebase will make any fork, to put it mildly, a major PITA to maintain. 2,264 people work for GitLab, FFS (with hundreds in dev/ops); it’s indecent.

    Honestly I think I’d be happy if forgejo supported gitlab-runner, that seems like a much more reasonable ask given the clean interface between runner and server. Maybe I should experiment with that…


  • All of this has already been implemented for over a hundred years for other trades. We software people have generally escaped this conversation, but I think we’ll have to have it at some point. It doesn’t have to be heavy-handed government regulation; a self-governed trade association may well aim to set the bar for licensing requirements and industry standards. This doesn’t make it illegal to write code however you want, but it does set higher quality expectations and slightly lowers the bar for proving negligence on a company’s part.

    There should be an ISO-whateverthefuck or DIN-thisorother that every developer would know to point to when the software deployment process looks as bad as CrowdStrike’s. Instead we’re happy to shrug and move on when management doesn’t even understand what a CI is or why it should get prioritized. In other trades the follow-up for management would be a CYA email that clearly outlines the risk and standards noncompliance and sets a line in the sand liability-wise. That doesn’t sound particularly outlandish to me.


  • But a company that hires carpenters to build a roof will be held liable if that roof collapses on the first snow storm. Plumbers and electricians must be accredited AFAIK, have the final word on what is good enough by their standards, and signing off on shoddy work exposes them to criminal negligence lawsuits.

    Some software truly has no stakes (e.g. a free mp3 converter), but even boring office productivity tools can be more critical than my colleagues sometimes seem to think. Sure, we work on boring office productivity tools, but hospitals buy those tools and unreliable software means measurably worse health outcomes for the patients.

    Engineers signing off on all software is an extreme end of the spectrum, but there are a whole lot of options between that and the current free-for-all where customers have no way to know if the product they’re buying is following industry standard practices, or if the deployment process is “Dave receives a USB from Paula and connects to the FTP using a 15 year-old version of FileZilla and a post-it note with the credentials”.