I’m just some guy, you know.
Volunteer overseas for the next 4 years. Teach English or something.
I actually disagree with both options, staying on social media and leaving it, so I’m choosing the third option: joining a fake social media website that doesn’t actually connect you to anyone else, but has all the features I would want if it did.
100PB on i2p is a funny idea, but it’s not necessarily a bad one.
“Sure, Republicans killed a million Iraqis chasing fake WMDs, but I’m voting against the Democrat because she isn’t sufficiently outraged about Israel killing 40k Palestinians.”
Americans need to clean up their own shit before assuming their self-assigned role as world police.
Yes, and actively rooting for Trump to win.
Okay, so if your view is that Democrats and Republicans are the same on this issue, then you’re going to need to vote based on something other than this issue.
So, aside from Gaza, which you believe they are equal on, why do you prefer Trump over Harris? Do you support his campaign promises over hers?
You’ve convinced me. I won’t vote for Joe Biden today.
Yeah, and I plan to vote against Netanyahu’s buddy who keeps saying he should do “whatever it takes to finish the job”. Stop pretending that a Trump presidency will be better for Palestine. Literally everyone who actually cares about Palestinians in good faith knows that a vote for Harris is a vote for harm reduction.
You’re showing your privilege here by admitting you aren’t scared of Trump’s plans, which involve oppressing millions of people, because you’re confident you’re not in one of the groups he plans to oppress.
There is no CPU that is ever going to be supported for 10 years for a consumer application. ARM CPUs today are 20x faster than they were 10 years ago, and the ARM/RISC-V chips a decade from now will likely be 10-20x faster than today.
Regardless, the Kryo 670 CPU in the Fairphone 5 is already 3.5 years old, and it’s not super special, it’s just a semi-custom Snapdragon SoC. Consider that 4G LTE launched 13 years ago in the USA, and in 10 years that Kryo chip in the FP5 will be older than that. Could you handle the performance of your last 3G phone today?
Android APK shipping this weekend
Source first. I ain’t touching an APK without seeing code.
Nothing related to Loops on the Pixelfed GitHub yet.
God, this could have been a funny joke if you hadn’t presented it as a humorless blowhard.
Like debating how the “ean” in “Sean Bean” is pronounced, and insisting he can’t have it both ways. It’s one or the other, Sean…
The article does link to that URL behind the line “the first test flight”, but that seems erroneous. This story actually seems to be based on this Chinese press release: https://www.spacetransportation.com.cn/news/info/22.html
But it also seems like there’s some confusion between an “aircraft test” and a “test flight”. I’m not convinced this thing has ever flown.
Lemmy.ml is such a weird situation. Personally, I worry about the future of Lemmy when I consider who runs the circus.
It started off as the first and only Lemmy instance, run by the maintainers. It was classified as general purpose, but leaned hard to the left. The maintainers are mostly Tankies, but the users were not, so Lemmygrad was created to be the instance focused on far-left politics. Lemmy.ml was my first instance, and I was an early user. Early on, it was nice there.
As Lemmy grew, new instances appeared, and eventually lemmy.ml was no longer the largest, just the oldest. As that happened, the original intent behind the .ml TLD choice became visible: the instance shifted from general purpose to an instance for “Marxist Leninists”, just like Lemmygrad. Most early users have since migrated elsewhere, because it is no longer the “flagship” instance.
Now we’re in a position where the majority of Lemmy users exist on non-ML instances, and think that the ML instances are run by morons, yet those morons maintain the codebase that runs every Lemmy instance.
It’s a ticking timebomb. I expect we’ll all be on Kbin by 2030 after something boils over and the Lemmy maintainers start fucking around with the code to get their way.
No, anyone gatekeeping Lemmy users who aren’t on the largest instances is fundamentally working against how the fediverse is supposed to work. lemm.ee is smaller, and generally politically neutral, but it also seems to properly moderate hate speech, violence, harassment, and illegal content. As long as that persists, all should be good.
If it ever comes to a “be on .world or you’re sus”, then .world can just disable federation and become Reddit 2 while the rest of Lemmy embraces decentralization.
And without that capability, every instance would be a cesspool of Nazi and pedo content flooding in from the Fediverse’s dark side.
People are downvoting you not because you are wrong, but because it really hurts when you call people out with this kind of precision. It should be common sense that the message boards full of randos shouldn’t be the foundation of one’s political worldview, but it’s also really easy to make message boards full of randos an integral part of one’s social life.
Getting your news from credible, non-social sources, is important. Being able to read an article and move on without heading to the comments is important. Having conversations with real people offline is important. But those things don’t offer the same steady drip of dopamine that social media provides.
A lot of people here are excessively online, and in desperate need of grass touching, and they don’t want to be told that directly, but they do also need to hear it.
o7 to the brave admins of Hexbear, Lemmy’s tankie quarantine instance.
“Open Source” is mostly the right term. AI isn’t code, so there’s no source code to open up. If you provide the dataset you trained on, and open up the code used to train the model, that’s pretty close.
Otherwise, we need to consider “open weights” and “free use” to be more accurate terms.
For example, ChatGPT 3+ is undeniably closed/proprietary. You can’t download the model and run it on your own hardware. The dataset used to train it is a trade secret. You have to agree to all of OpenAI’s terms to use it.
LLaMa is way more open. The dataset is largely known (though no public master copy exists). The code used to train is open source. You can download the model for local use, and train new models based off of the weights of the base model. The license allows all of this.
It’s just not a 1:1 equivalent to open source software. It’s basically the equivalent of royalty free media, but with big collections of conceptual weights.