I went ahead and just deleted my entire pictrs cache, and I will definitely disable caching of other servers' images when that option becomes available.
Yes, I'm doing something similar, but you should probably mask your home IP in some way, using something like Cloudflare Tunnel. Keep in mind the whole idea behind federation is that you are advertising yourself to the fediverse.
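For example, a minimal cloudflared config sketch (the tunnel ID, hostname, and local port are placeholders, adjust for your own setup) that publishes the instance without exposing your home IP:

```yaml
# /etc/cloudflared/config.yml -- minimal sketch, values are placeholders
tunnel: <your-tunnel-id>
credentials-file: /etc/cloudflared/<your-tunnel-id>.json
ingress:
  # Public hostname -> whatever your local reverse proxy listens on
  - hostname: lemmy.example.xyz
    service: http://localhost:80
  # Anything else gets rejected
  - service: http_status:404
```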
Everyone there probably decided not to self-host because they can’t hide it behind their VPN lol
I do something similar, and fail2ban immediately bans the IP for a few hours. The only people going to my root domain are people I do not want sniffing around. It also does the same if you don't pass my domain at all (and are just hitting random IPs).
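A rough sketch of that setup, assuming nginx is the reverse proxy (the log path and filter name are made up for the example). A catch-all default server answers anything that doesn't send one of your real hostnames:

```nginx
# Catch-all vhost: matches IP-only requests and unknown hostnames
server {
    listen 80 default_server;
    server_name _;
    access_log /var/log/nginx/catchall.log;
    return 444;  # nginx-specific code: close the connection without replying
}
```

Then a fail2ban jail bans on the first hit (`nginx-catchall` here would be a custom filter matching any entry in that log):

```ini
[nginx-catchall]
enabled  = true
filter   = nginx-catchall
logpath  = /var/log/nginx/catchall.log
maxretry = 1
# a few hours, in seconds
bantime  = 14400
```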
Namecheap is what I use, yeah. They are also really great if you have some internal services and just want a cheap domain to get SSL certs from Let's Encrypt. All my internal traffic is SSL now, because why not: it's 85 cents a year, and no dealing with self-signed certs.
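For anyone wondering how that works for hosts that aren't reachable from the internet: Let's Encrypt's DNS-01 challenge only needs a TXT record, so the internal service never has to be exposed. A hedged sketch (the domain is a placeholder; in practice you'd use your DNS provider's certbot plugin instead of `--manual` so renewals are automatic):

```sh
# Prove ownership via a DNS TXT record; nothing needs to accept inbound traffic
certbot certonly --manual --preferred-challenges dns -d '*.internal.example.xyz'
```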
No, I mean if you want a super cheap .xyz domain, it's very cheap if you choose a domain that is digits only. For example, my Lemmy domain is 158436977.xyz; it's 89 cents a year.
You can certainly have .xyz domains that are words, just like any other TLD.
Also, if you don't mind numbers, .xyz domains can be like $1 a year. It has to be only numbers, and I think at least 9 digits.
Keep in mind they are hacked together and were not meant for mass consumption. Here is an example of one of the scripts: it queries the Gitea API and inserts the 10 most recently updated issues into Flame's database under a specific category.
```python
import sqlite3
import requests
from datetime import datetime

def insert_bookmark(name, url, category_id, order_id):
    conn = sqlite3.connect('/app/db.sqlite')
    cursor = conn.cursor()

    # Track the highest existing bookmark id
    cursor.execute("SELECT MAX(id) FROM bookmarks")
    result = cursor.fetchone()
    max_id = result[0] if result[0] else 0

    current_time = datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f %z')
    values = (name, url, category_id, "", current_time, current_time, 0, order_id)
    cursor.execute("INSERT INTO bookmarks (name, url, categoryId, icon, createdAt, updatedAt, isPublic, orderId) VALUES (?, ?, ?, ?, ?, ?, ?, ?);", values)
    max_id += 1

    conn.commit()
    conn.close()
    return max_id

def delete_bookmark(category_id):
    conn = sqlite3.connect('/app/db.sqlite')
    cursor = conn.cursor()
    cursor.execute("DELETE FROM bookmarks WHERE categoryId = ?", (category_id,))
    # Commit the changes and close the connection
    conn.commit()
    conn.close()

def get_recently_updated_issues(repo_urls, user_name, api_token):
    headers = {"Authorization": f"token {api_token}", "Content-Type": "application/json"}

    all_issues = []
    for repo_url, repo_name in repo_urls:
        # Query the Gitea API to get the issues
        response = requests.get(repo_url, headers=headers, params={"state": "all"})
        response.raise_for_status()
        issues = response.json()

        # Keep the five most recently updated issues from each repo
        sorted_issues = sorted(issues, key=lambda x: x["updated_at"], reverse=True)
        all_issues.extend(sorted_issues[:5])

    # Merge, then keep the ten most recent overall
    sorted_all_issues = sorted(all_issues, key=lambda x: x["updated_at"], reverse=True)
    recent_issue_titles = []
    recent_issue_links = []
    recent_timestamps = []
    for issue in sorted_all_issues[:10]:
        recent_issue_titles.append(issue["title"])
        recent_issue_links.append(issue["html_url"])
        recent_timestamps.append(issue["updated_at"])

    return recent_issue_titles, recent_issue_links, recent_timestamps

repo_urls = [
    ("https://gitea.example.com/api/v1/repos/user1/repo1/issues", "repo1"),
    ("https://gitea.example.com/api/v1/repos/user1/repo2/issues", "repo2"),
]
user_name = "user1"
api_token = "example token"

# Clear out the old bookmarks in category 8, then re-insert the latest issues
delete_bookmark(8)
order_id = 1
recent_issue_titles, recent_issue_links, recent_timestamps = get_recently_updated_issues(repo_urls, user_name, api_token)

for title, link, timestamp in zip(recent_issue_titles, recent_issue_links, recent_timestamps):
    print("Issue Title:", title)
    print("Issue Link:", link)
    print("Last Updated:", timestamp)
    print()
    bookmark_id = insert_bookmark(title, link, 8, order_id)
    order_id += 1
```
Really? Any proof of this?
I use Flame Dashboard for something similar to this. It uses a simple SQLite database, so I just whipped together a few Python scripts that gather information such as my to-do list, calendar, git repos, etc., and update the database table it uses.
Wouldn't doing this result in a massive amount of unnecessary load on the larger instances?
Yep, while others were complaining about issues it was smooth sailing for me.
On the flip side, discovering new communities is a pain, and whenever I subscribe to a new community it can take hours before comments start populating.
If you don’t mind switching to Tidal, you can do that with Plex. Otherwise, I don’t think it’s possible.
linkding was one for me. It sounded like a great idea at first, but I never used it. Shut it down after a couple of months.
I did it. So far I've noticed a few things: for example, you have to populate/federate the communities yourself, and it can take a long time (a way to kick that off manually is sketched below). It took hours to retrieve and catch up on all the lemmy.world posts, and I expect it to be an ongoing thing. When you first connect to a community, it downloads the first 20 posts, but all the comments are empty.
The plus side, though, is that it is very fast for me. And nobody can delete my profile.
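If you want to pre-seed a community instead of waiting for someone to search for it, you can ask your own instance to fetch the remote object via Lemmy's HTTP API. A rough sketch (the instance, community, and auth handling are placeholders; how the JWT is passed has changed between Lemmy versions):

```sh
# Ask your own instance to resolve (and start federating) a remote community
curl "https://your-instance.xyz/api/v3/resolve_object?q=https://lemmy.world/c/selfhosted&auth=$JWT"
```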
Right now I'm using Plexamp. Really nice app, the offline features work really well, and Sonic Analysis is awesome. The only issue I have is that it sometimes crashes when I'm using Google Maps.
Did you try the docker-compose file referenced in these instructions? It worked on the first try for me. The hardest part was proxying it externally: I'm used to SWAG, so I had to adapt Lemmy's nginx config to it (roughly like the sketch below the link).
https://join-lemmy.org/docs/en/administration/install_docker.html
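In case it helps anyone on the same stack, here is roughly the shape of the SWAG proxy conf. Treat it as a sketch: the service names and ports assume the official docker-compose from the link above (lemmy on 8536, lemmy-ui on 1234), and the include files are SWAG's stock ones.

```nginx
## /config/nginx/proxy-confs/lemmy.subdomain.conf -- rough sketch only
server {
    listen 443 ssl;
    server_name lemmy.*;

    include /config/nginx/ssl.conf;

    # API, federation, and media go to the Rust backend
    location ~ ^/(api|pictrs|feeds|nodeinfo|\.well-known) {
        include /config/nginx/proxy.conf;
        resolver 127.0.0.11 valid=30s;
        set $upstream_app lemmy;
        set $upstream_port 8536;
        proxy_pass http://$upstream_app:$upstream_port;
    }

    # Everything else is served by the UI container
    location / {
        include /config/nginx/proxy.conf;
        resolver 127.0.0.11 valid=30s;
        set $upstream_app lemmy-ui;
        set $upstream_port 1234;
        proxy_pass http://$upstream_app:$upstream_port;
    }
}
```

Note that Lemmy's reference nginx config also routes ActivityPub requests by Accept header (application/activity+json); this path-only split is a simplification, so check the official config if federation acts up.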
https://github.com/LemmyNet/lemmy/pull/3897