Facial recognition confirmed he was a criminal and the scanner confirmed he had a gun! Of course we opened fire instantly. How could we have known it was just some guy with a water bottle?
I can’t believe there are people out there just raw dogging the Internet with no AdBlock. The advertising is so aggressive and intrusive. I legitimately cannot tolerate using the Internet when 80% of the page is filled with attention-grabbing bullshit.
Where else am I going to share the fun stuff I find with StumbleUpon?
Same here. I love DuckDNS but after the third DNS outage taking down all my services I migrated to Cloudflare and haven’t had a single problem since.
It’s kind of a paradox when you think about it. Good reviewers are often just regular people with a passion for tech, but as they become more popular and prolific they become part of the industry itself. Once that happens, even if they try to stay objective and critical, their perspective is so different from a regular person’s that their reviews become part of the sales and marketing strategy rather than pro tips from an enthusiast.
Their mom’s basement, most likely.
I take no delight in killing but Russian forces could leave Ukraine at any point and put an end to it.
Backups need to be reliable and I just can’t rely on a community of volunteers or the availability of family to help.
So yeah, I pay for S3 and/or a VPS. I consider it one of the few things worth paying a larger hosting company for.
unless they open source their code and/or provide some public interface to test and validate feed content
This honestly seems like a good idea. I think one of the ways to mitigate the harm of algorithmically driven content feeds is openness and transparency.
It’s clear that Valve’s competitors undervalue the user experience that Steam provides and don’t understand why it’s so sticky.
My shitposting will make AI dumber all on its own; feedback loop not required.
I intentionally do not host my own git repos mostly because I need them to be available when my environment is having problems.
I do make use of local runners for CI/CD, which is nice, but git is one of the few things I need to not have to worry about.
I was paying for Google Play Music until they took it away from me, told me it was YouTube Premium now, and then raised the price twice.
Not exactly what I’d call a great value proposition.
IPv6 firewalls should, by default, offer similar levels of security to NAT
I think you’re probably right. We had decades of security experts saying that NAT is not a firewall, and everyone on the planet treated it like one anyway. Now we’re overexposed heading into a no-NAT IPv6 internet.
Somewhat ironically the Surface laptops are really great Linux machines.
Shouldn’t have put the ‘implode’ action on the shoulder button. It was only a matter of time before he triggered it by accident.
if you go to another country, you have to adjust to their law
Big business knows no national boundaries. They’ll build factories wherever labor is cheap, put headquarters wherever the taxes are low, and sell their wares wherever consumer rights are weak.
Do you have any links or guides that you found helpful? A friend wanted to try this out but basically gave up when he realized he’d need an Nvidia GPU.
I’ve been testing Ollama in Docker/WSL with the idea that if I like it I’ll eventually move my GPU into my home server and get an upgrade for my gaming PC. When you run a model it has to load the whole thing into VRAM. I use the 8GB models, so it takes 20-40 seconds to load the model, and then each response is really fast after that and the GPU hit is pretty small. After five minutes, I think, it will unload the model by default to free up VRAM.
Basically this means you either need to wait a bit for the model to warm up, or you extend that timeout so it stays warm longer. The tradeoff is that I can’t really use my GPU for anything else while the LLM is loaded.
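For anyone who wants to play with keeping the model warm: Ollama’s generate API accepts a keep_alive value per request (and there’s an OLLAMA_KEEP_ALIVE environment variable you can set on the container). Rough sketch below using Python and requests, assuming the default port and that you’ve already pulled something like an 8GB llama3 model; the model name, prompt, and 30m value are just placeholders for whatever you actually run:

```python
import requests

# Assumes Ollama is listening on its default port (11434)
# and the model has already been pulled with `ollama pull`.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3:8b",            # whichever model you use
        "prompt": "Say hello in one sentence.",
        "stream": False,                 # return one JSON object instead of a stream
        "keep_alive": "30m",             # keep the model warm in VRAM for 30 minutes
                                         # (as I understand it, -1 keeps it loaded indefinitely)
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

The first request still pays the load cost; the keep_alive just controls how long the model sticks around in VRAM afterwards, so you’re trading free GPU memory for faster follow-up responses.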
I haven’t tracked power usage, but besides the VRAM requirements it doesn’t seem too resource-intensive; maybe I just haven’t done anything complex enough yet.
I only go in the server room to t-pose in front of the giant air conditioner to cool off.