#raspberrypi500

I'm curious to hear what others are #SelfHosting! Here's my current setup:

Hardware & OS

Infrastructure & Networking

Security & Monitoring

Authentication & Identity Management

  • Authelia (Docker): Just set this up for two-factor authentication and single sign-on. Seems to be working well so far!
  • LLDAP (Docker): Lightweight LDAP server for managing authentication. Also seems to be working pretty well!
    #AuthenticationTools #IdentityManagement

Productivity & Personal Tools

Notifications & Development Workflow

  • Notifications via: #Ntfy (Docker) and Zoho's ZeptoMail (#Zoho)
  • Development Environment: Mostly using VSCode connected to my server via Remote-SSH extension. #VSCodeRemote
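
If anyone wants to wire their own services into ntfy the same way, publishing a notification is just an HTTP POST to the topic URL. A rough Python sketch (requests package; the server URL and topic are placeholders, not my real ones):

import requests

# ntfy accepts a plain POST to <server>/<topic>; the request body becomes the
# notification message, and a few optional headers control how it displays.
requests.post(
    "https://ntfy.example.com/server-alerts",  # placeholder server and topic
    data="Nightly backup finished on the Pi 500",
    headers={
        "Title": "Backup report",        # optional title
        "Priority": "default",           # min, low, default, high, urgent
        "Tags": "white_check_mark",      # optional emoji tags
    },
    timeout=10,
)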

Accessibility Focus ♿🖥️

Accessibility heavily influences my choices: I use a screen reader full-time (#ScreenReader), so I prioritize services that are usable without sight (#InclusiveDesign #DigitalAccessibility). Always open to discussing accessibility experiences or recommendations!

I've also experimented with:

  • Ollama (#Ollama): Not enough RAM on my Pi.
  • Habit trackers like Beaver Habit Tracker (#HabitTracking): Accessibility issues made it unusable for me.

I don't really have a media collection, so no Plex or Jellyfin here (#MediaServer)—but I'm always open to suggestions! I've gotten a bit addicted to exploring new self-hosted services! 😄

What's your setup like? Any cool services you'd recommend I try?

#SelfHosted #LinuxSelfHost #OpenSource #TechCommunity #FOSS #TechDIY

@selfhost @selfhosted @selfhosting

#SelfHosted #LinkAce Bookmark Manager Running, but Unable to Check for Updates or Generate a Cron Token

Hi all. Hoping someone in the #SelfHosting community can help here. I'm running LinkAce in #Docker behind non-Dockerized #Caddy and #Authelia, and most things are working, but I'm seeing "Could not check for updates" at the bottom of each page, and when I tried to generate a cron token, nothing happened except for the generate button graying out. I am seeing one or two 404 errors in my logs, but I don't know if that's causing the problem or not. I don't know much about #PHP applications.

Logs

2025-02-22 23:25:26,460 INFO supervisord started with pid 1
2025-02-22 23:25:27,465 INFO spawned: 'php-fpm' with pid 8
2025-02-22 23:25:27,467 INFO spawned: 'caddy' with pid 9
[22-Feb-2025 23:25:27] NOTICE: [pool www] 'user' directive is ignored when FPM is not running as root
[22-Feb-2025 23:25:27] NOTICE: [pool www] 'group' directive is ignored when FPM is not running as root
[22-Feb-2025 23:25:27] NOTICE: fpm is running, pid 8
[22-Feb-2025 23:25:27] NOTICE: ready to handle connections
{"level":"info","ts":1740266727.5264525,"msg":"using config from file","file":"/etc/caddy/Caddyfile"}
{"level":"info","ts":1740266727.5280282,"msg":"adapted config to JSON","adapter":"caddyfile"}
{"level":"warn","ts":1740266727.5280406,"msg":"Caddyfile input is not formatted; run 'caddy fmt --overwrite' to fix inconsistencies","adapter":"caddyfile","file":"/etc/caddy/Caddyfile","line":2}
{"level":"info","ts":1740266727.529092,"logger":"admin","msg":"admin endpoint started","address":"localhost:2019","enforce_origin":false,"origins":["//localhost:2019","//[::1]:2019","//127.0.0.1:2019"]}
{"level":"warn","ts":1740266727.529331,"logger":"http.auto_https","msg":"server is listening only on the HTTP port, so no automatic HTTPS will be applied to this server","server_name":"srv0","http_port":80}
{"level":"info","ts":1740266727.5294206,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0x40000bab00"}
{"level":"warn","ts":1740266727.530186,"logger":"http","msg":"HTTP/2 skipped because it requires TLS","network":"tcp","addr":":80"}
{"level":"warn","ts":1740266727.530195,"logger":"http","msg":"HTTP/3 skipped because it requires TLS","network":"tcp","addr":":80"}
{"level":"info","ts":1740266727.530198,"logger":"http.log","msg":"server running","name":"srv0","protocols":["h1","h2","h3"]}
{"level":"info","ts":1740266727.5412574,"msg":"autosaved config (load with --resume flag)","file":"/home/www-data/.config/caddy/autosave.json"}
{"level":"info","ts":1740266727.541271,"msg":"serving initial configuration"}
{"level":"info","ts":1740266727.5477707,"logger":"tls","msg":"cleaning storage unit","storage":"FileStorage:/home/www-data/.local/share/caddy"}
{"level":"info","ts":1740266727.5541356,"logger":"tls","msg":"finished cleaning storage units"}
2025-02-22 23:25:28,555 INFO success: php-fpm entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2025-02-22 23:25:28,555 INFO success: caddy entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
::1 - 22/Feb/2025:23:25:34 +0000 "GET /index.php" 200
::1 - 22/Feb/2025:23:25:34 +0000 "GET /index.php" 404

Docker Compose file

services:
  # --- LinkAce
  linkace:
    image: docker.io/linkace/linkace:latest
    container_name: linkace
    restart: unless-stopped
    depends_on:
      - linkace_db
    ports:
      - "0.0.0.0:3009:80"
    volumes:
      - ./.env:/app/.env
      - ./backups:/app/storage/app/backups

  # --- Database
  linkace_db:
    image: docker.io/library/mariadb:11.5
    container_name: linkace_db
    restart: unless-stopped
    command: mariadbd --character-set-server=utf8mb4 --collation-server=utf8mb4_bin
    environment:
      - MYSQL_ROOT_PASSWORD=${DB_PASSWORD}
      - MYSQL_USER=${DB_USERNAME}
      - MYSQL_PASSWORD=${DB_PASSWORD}
      - MYSQL_DATABASE=${DB_DATABASE}
    volumes:
      - db:/var/lib/mysql

  # --- Cache
  linkace_redis:
    image: docker.io/bitnami/redis:7.4
    container_name: linkace_redis
    restart: unless-stopped
    environment:
      - REDIS_PASSWORD=${REDIS_PASSWORD}

volumes:
  db:

.env (secrets redacted)

## LINKACE CONFIGURATION

# The app key is generated later, please leave it like that
APP_KEY=redacted
APP_ENV=development

## Configuration of the database connection
## Attention: Those settings are configured during the web setup, please do not modify them now.
# Set the database driver (mysql, pgsql, sqlsrv, sqlite)
DB_CONNECTION=mysql
# Set the host of your database here
DB_HOST=linkace_db
# Set the port of your database here
DB_PORT=3306
# Set the database name here
DB_DATABASE=linkace
# Set both username and password of the user accessing the database
DB_USERNAME=linkace
# Wrap your password into quotes (") if it contains special characters
DB_PASSWORD=redacted

## Redis cache configuration
# Set the Redis connection here if you want to use it
REDIS_HOST=linkace_redis
REDIS_PASSWORD=redacted
REDIS_PORT=6379
APP_DEBUG=true

# SSO configuration
SSO_ENABLED=true
SSO_OIDC_ENABLED=true
SSO_REGISTRATION_ENABLED=true
REGULAR_LOGIN_DISABLED=true
SSO_OIDC_BASE_URL=https://auth.laniecarmelo.tech/ # Your Authelia base URL
SSO_OIDC_CLIENT_ID=linkace
SSO_OIDC_CLIENT_SECRET='redacted'
SSO_OIDC_SCOPES=openid,profile,email

Caddyfile snippet

{
    email laniecarmelo@gmail.com
    debug
    acme_dns cloudflare redacted
    http_port 80
    https_port 443
    admin :2019 {
        origins 127.0.0.1:2019 0.0.0.0:2019 stormux:2019 caddy.laniecarmelo.tech
    }
}

(logconfig) {
    log {
        output stdout
        format json
    }
}

(auth_headers) {
    header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
}

(proxy_config) {
    header_up Host {http.request.host}
    header_up X-Real-IP {http.request.remote}
    header_up X-Forwarded-User {http.auth.user.id} # Pass user ID
    header_up X-Forwarded-Email {http.auth.user.email} # Pass email
}

(authelia_middleware) {
    forward_auth localhost:9091 {
        uri /api/verify?rd=https://auth.laniecarmelo.tech
        copy_headers Remote-User Remote-Email Remote-Groups Authorization
    }
}

bookmarks.laniecarmelo.tech {
    route {
        import authelia_middleware
        reverse_proxy localhost:3009 { # Directly proxy to LinkAce's web server
            import proxy_config
        }
    }
    import logconfig
    import auth_headers
}

Authelia config snippet

    - domain: "*.laniecarmelo.tech"
policy: bypass
networks:
- 192.168.1.0/24 # Local network
- 172.17.0.0/16 # Docker bridge network
- 100.64.0.0/10 # Tailscale network

- domain: "bookmarks.laniecarmelo.tech"
resources: ["^/api.*"]
policy: bypass

- domain: "*.laniecarmelo.tech"
policy: one_factor

- client_id: linkace
client_name: LinkAce bookmarking app
client_secret: redacted
public: false
authorization_policy: one_factor
scopes: [openid, groups, profile, email, offline_access]
redirect_uris:
- https://bookmarks.laniecarmelo.tech/auth/oidc/callback
grant_types: [authorization_code]
response_types: [code]
response_modes: [form_post, query]
userinfo_signed_response_alg: none
consent_mode: explicit
pre_configured_consent_duration: "1y"

Does anyone know what might be causing this and how I can fix it?
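
If it helps with diagnosis, here's a quick probe to compare how the same paths answer on the published container port versus through Caddy and Authelia (a rough Python sketch using the requests package; the "/links" path is only an example page, adjust as needed):

import requests

# Compare the container's published port (no Caddy/Authelia in front) with the
# public hostname, to see whether the 404s come from LinkAce itself or from the
# proxy/auth layer in front of it.
PATHS = ["/", "/links"]  # "/links" is an example page, not a documented route
for base in ("http://localhost:3009", "https://bookmarks.laniecarmelo.tech"):
    for path in PATHS:
        r = requests.get(base + path, allow_redirects=False, timeout=10)
        print(base + path, "->", r.status_code, r.headers.get("location", ""))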
#Linux #ArchLinuxARM #Stormux #RaspberryPi #RaspberryPi500 #RPi #RPi500 #tech #technology
@selfhost @selfhosted @selfhosting

Help Needed with Cloudflare Zero Trust, Pages, and Workers for ReactFlux + MiniFlux Setup

Hi everyone,

I'm new to #Cloudflare and have been trying to set up a #SelfHosted project on my #RaspberryPi 500. I'm mostly self-taught, so I apologize if I misunderstand anything or miss important details. Here's my situation:

Current Setup

  • I'm running the self-hosted #RSS feed reader #MiniFlux on my Raspberry Pi 500 (#ArchLinuxARM, installed via Pacman).
  • The setup uses #Caddy as a reverse proxy, a #CloudflareZeroTrust tunnel, and Cloudflare Access for SSO.
  • My #CloudflareAccess application is configured to allow all origins, methods, and headers. It has a policy that allows specific emails or login methods (e.g., GitHub).

What I'm Trying to Do

  • I want to deploy ReactFlux, an alternative frontend for MiniFlux, on #CloudflarePages.
  • Before setting it up fully, I tested the ReactFlux demo with my MiniFlux instance at https://rss.laniecarmelo.tech. However, ReactFlux couldn't log in.

Suspected Issue

I believe the issue is caused by Cloudflare Access protection blocking ReactFlux from accessing the MiniFlux API (https://rss.laniecarmelo.tech/v1/*).

What I've Tried So Far

  1. I added another hostname (rss.laniecarmelo.tech/v1/*) to my tunnel configuration and created a new Cloudflare Access application with a policy set to "Bypass" for everyone. However, this didn't work—when testing the API endpoint in a private browser window, I'm still asked to sign into Cloudflare.
  2. I also tried setting up the hostname with "Protect with Access" turned off but got the same results.
  3. Next, I attempted to use a #CloudflareWorker written in JavaScript to bypass authentication for /v1/*, but it doesn't seem to be doing anything (or isn't being triggered).
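
A script version of that private-window test, in case anyone wants to reproduce it (rough Python sketch with the requests package; no credentials involved):

import requests

# If Cloudflare Access is still intercepting /v1/*, this returns a redirect to
# *.cloudflareaccess.com (or an Access login page). If the bypass works, the
# request reaches MiniFlux, which answers 401 with a JSON error because no API
# key was sent -- which is fine, it proves the path is open.
resp = requests.get(
    "https://rss.laniecarmelo.tech/v1/me",
    allow_redirects=False,
    timeout=10,
)
print(resp.status_code)
print(resp.headers.get("location", ""))
print(resp.headers.get("content-type", ""))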

What I Need Help With

  • How can I properly configure Cloudflare so ReactFlux can access the MiniFlux API (/v1/*) while keeping the rest of my MiniFlux instance protected by Cloudflare Access?
  • I've been stuck on this for a couple of days and would really appreciate any guidance or suggestions!

Thanks in advance for your help!

#SelfHosting #ArchLinux #Linux #RSSReader #tech #technology #RaspberryPi #RPi #RPi500 #RaspberryPi500
@selfhosting @selfhost @selfhosted

ReactFlux (reactflux.pages.dev): A Simple but Powerful RSS Reader for Miniflux

#MiniFlux users, can anyone help?

Hi all. I'm having some issues with MiniFlux, a #SelfHosted #RSSReader, and hoping someone can help. MiniFlux was working fine until I tried to deploy ReactFlux on the same domain, rss.laniecarmelo.tech, under the subpath /reactflux. That didn't work, so I removed ReactFlux. I also migrated MiniFlux from #Docker to the #Pacman package, thinking it would be easier on my system, but this problem (or a similar one) was occurring before that change.

Now, rss.laniecarmelo.tech loads the MiniFlux login page, but when I log in, it redirects to a blank page at rss.laniecarmelo.tech/login. I've added trusted proxies and cookie configuration to my miniflux.conf and headers to my Caddyfile, but I still have the issue.

I'm using #Caddy for #ReverseProxy and #Cloudflare for #SSO. Has anyone seen anything like this before? This is on a #RaspberryPi500 running #ArchLinuxARM.

I've checked MiniFlux logs, and it's getting the login requests and creating sessions. I'm not sure what's happening after that. Cloudflared and Caddy seem to be working normally.
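
To take the browser out of the equation, here's a small Python sketch anyone can use to trace the redirects and cookies around login (requests package; no credentials in it, it only looks at status codes, Location headers, and Set-Cookie):

import requests

# Show what the reverse-proxied MiniFlux returns for the front page and the
# login page: status code, where it redirects, and what cookies it tries to
# set. Useful for spotting wrong cookie attributes or redirect loops.
s = requests.Session()
for path in ("/", "/login"):
    r = s.get("https://rss.laniecarmelo.tech" + path,
              allow_redirects=False, timeout=10)
    print(path, r.status_code,
          "->", r.headers.get("location", ""),
          "| set-cookie:", r.headers.get("set-cookie", "none"))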

#SelfHosting #Linux #RSS #RaspberryPi #RPi #tech #technology
@selfhost @selfhosted @selfhosting

🚨 Help Needed: #CORS and #Cloudflare Access Issues with #Nextflux + #MiniFlux Setup 🚨

Hi everyone! I’m struggling with a #SelfHosted setup and could really use some advice from the self-hosting community. Lol I've been trying to figure this out for hours with no luck. Here’s my situation:

Setup

  • MiniFlux: Running in #Docker on a #RaspberryPi500 (#Stormux, based on #ArchLinuxARM).
  • Nextflux: Hosted on Cloudflare Pages.
  • Reverse Proxy: #Caddy (installed via AUR).
  • Cloudflare Access: Enabled for security and SSO.
  • Cloudflared: Also installed via AUR.
  • CORS Settings in Cloudflare Access: Configured to allow all origins, methods, and headers.

What’s Working

  • MiniFlux is accessible from my home network after removing restrictive CORS settings in both Caddy and MiniFlux.
  • Nextflux is properly deployed on Cloudflare Pages.

The Problem

Nextflux cannot connect to MiniFlux due to persistent CORS errors and authentication issues with Cloudflare Access. Here are the errors I’m seeing in the browser console:

  1. CORS Error: Access to fetch at 'https://rss.laniecarmelo.tech/v1/me' from origin 'https://nextflux.laniecarmelo.tech' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
  2. Cloudflare Access Redirection: Request redirected to 'https://lifeofararebird.cloudflareaccess.com/cdn-cgi/access/login/rss.laniecarmelo.tech'.
  3. Failed to Fetch: TypeError: Failed to fetch.

What I’ve Tried

  1. Service Token Authentication:

    • Generated a service token in Cloudflare Access for Nextflux.
    • Added CF-Access-Client-Id and CF-Access-Client-Secret headers in Caddy for rss.laniecarmelo.tech.
    • Updated Cloudflare Access policies to include a bypass rule for this service token.
  2. CORS Configuration:

    • Tried permissive settings (Access-Control-Allow-Origin: *) in both Caddy and MiniFlux.
    • Configured Cloudflare Access CORS settings to allow all origins, methods, and headers.
  3. Policy Adjustments:

    • Created a bypass policy for my home IP range and public IP.
    • Added an "Allow" policy for authenticated users via email/login methods.
  4. Debugging Logs:

    • Checked Cloudflared logs, which show requests being blocked due to missing access tokens (AccessJWTValidator errors).
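
For anyone who wants to reproduce the preflight outside the browser, here's a small Python sketch (requests package). If Cloudflare Access answers, you get a login redirect and no CORS headers; if Caddy/MiniFlux answer, the Access-Control-Allow-* headers should show up:

import requests

# Simulate the browser's CORS preflight for the MiniFlux API and print the
# relevant response headers, to see whether Access or the origin replies.
resp = requests.options(
    "https://rss.laniecarmelo.tech/v1/me",
    headers={
        "Origin": "https://nextflux.laniecarmelo.tech",
        "Access-Control-Request-Method": "GET",
        "Access-Control-Request-Headers": "authorization,x-auth-token",
    },
    allow_redirects=False,
    timeout=10,
)
print(resp.status_code)
for h in ("access-control-allow-origin",
          "access-control-allow-headers",
          "access-control-allow-credentials",
          "location"):
    print(h, "=", resp.headers.get(h, ""))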

Current State

Despite these efforts:

  • Requests from Nextflux are still being blocked by Cloudflare Access or failing due to CORS issues.
  • The browser console consistently shows "No 'Access-Control-Allow-Origin' header" errors.

Goals

  1. Allow Nextflux (hosted on Cloudflare Pages) to connect seamlessly to MiniFlux (behind Cloudflare Access).
  2. Maintain secure access to MiniFlux for other devices (e.g., my home network or mobile devices).

My Environment

  • Raspberry Pi 500 running Arch Linux ARM.
  • Both Caddy and Cloudflared are installed via AUR packages.
  • MiniFlux is running in Docker with the following environment variables:
    CLOUDFLARE_SERVICE_AUTH_ENABLED=true
    CLOUDFLARE_CLIENT_ID=<client-id>
    CLOUDFLARE_CLIENT_SECRET=<client-secret>

Relevant Logs

From cloudflared:

ERR error="request filtered by middleware handler (AccessJWTValidator) due to: no access token in request"

From the browser console:

Access to fetch at 'https://rss.laniecarmelo.tech/v1/me' has been blocked by CORS policy.

Questions

  1. Is there a better way to configure CORS for this setup?
  2. Should I be handling authentication differently between Nextflux and MiniFlux?
  3. How can I ensure that requests from Nextflux include valid access tokens?

Any help or advice would be greatly appreciated! 🙏

Hi #SelfHosted community. I've figured out a lot of my setup. I now have a new domain, laniesplace.us, just for #HomeServer stuff. It's set up through Porkbun with Dynu for #DDNS. I've now got #Traefik, #TailscaleVPN, #Linkding, #Forgejo, #Dokuwiki, Code-Server, #Portainer, #Netdata, #Watchtower, #Cockpit, #Pihole, #MiniFlux, #TheLounge, #Filebrowser, #UptimeKuma, and the #Homer dashboard installed.

I'm now trying to set up #Authelia so I can have single sign-on to my services. It's working for some of them, but I can't get Linkding to work no matter what I do. This is on a #RaspberryPi 500 with 8 GB RAM and a 512 GB SD card, running #Stormux, which is based on #ArchLinuxARM.

What's happening is this: Linkding is supposed to be available at bookmarks.laniesplace.us. When I go there, I see a 401 Unauthorized error and a link to sign into Authelia. Once I sign in, though, it redirects back to the page with the 401 error. I've been trying to figure this out for hours with no luck. Can anyone help? My relevant config files will be in replies to this post.
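
For anyone who wants to poke at it with me, this is the kind of check that can be run from the command line (rough Python sketch with the requests package; the cookie name assumes Authelia's default session cookie, and the value is a placeholder you'd paste from the browser):

import requests

# With no session cookie, the forward-auth check should 401 or redirect to the
# Authelia portal. After signing in, pasting the browser's session cookie here
# should get a 200 from Linkding -- if it still 401s, the cookie probably isn't
# being accepted for bookmarks.laniesplace.us.
url = "https://bookmarks.laniesplace.us/"

anon = requests.get(url, allow_redirects=False, timeout=10)
print("anonymous:", anon.status_code, anon.headers.get("location", ""))

with_cookie = requests.get(
    url,
    headers={"Cookie": "authelia_session=<paste value from browser>"},
    allow_redirects=False,
    timeout=10,
)
print("with session cookie:", with_cookie.status_code)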
#SelfHosting #Linux #HomeLab #RPi #RaspberryPi500 #RPi500 #Tech #Technology
@selfhost @selfhosting @selfhosted @linux

#RaspberryPi owners, how do you back up your #RPi?
I don’t have an extra hard drive, so I need to back up to #GoogleDrive. I’ve installed #Rclone and tried creating a #BASH script, but I’m struggling with it.
If you’ve set up something similar or have recommendations for an efficient backup workflow, I’d love to hear them! Bonus points if you know any good courses or resources to improve my scripting skills.
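
For context, this is roughly the workflow I'm aiming for. I've been trying to write it in bash, but here's the same idea sketched in Python (assumes rclone is installed and a Google Drive remote named "gdrive" was already created with rclone config; the paths are examples, not my real layout):

import subprocess
from datetime import date

# Sync a few directories to a dated folder on the Google Drive remote.
SOURCES = ["/home/pi/docker", "/home/pi/documents"]   # example paths
DEST = f"gdrive:pi-backups/{date.today()}"            # e.g. gdrive:pi-backups/2025-02-23

for src in SOURCES:
    subprocess.run(
        ["rclone", "sync", src, DEST + src,
         "--transfers", "4", "--log-level", "INFO"],
        check=True,  # stop if any sync fails
    )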
#Linux #Tech #SelfHosted #SelfHosting #RaspberryPiBackup #RaspberryPi500 #RaspberryPiProjects #RPiBackup #RaspberryPiOS #RPiProjects #LinuxBackup #LinuxTips #LinuxCommunity #TechSupport #OpenSource #DataBackup #CloudBackup #GoogleDriveBackup #CloudStorage #FileBackup #Rclone #BASHScript #ShellScripting #ProgrammingHelp #CodeNewbie #LearnToCode #TechSolutions #Automation #techhacks @linux @selfhost @selfhosting @selfhosted

I just got my new #RaspberryPi500, but I’m finding #RaspberryPiOS isn’t very accessible. I’m thinking of installing another #Linux distro. Any recommendations from fellow #Blind Raspberry Pi users? I’m most familiar with #ArchLinux—would it work well on this device? Also, what’s the best way to install it without an SD card reader?
Looking forward to your insights!
#Accessibility #Tech #Technology #AssistiveTechnology #LinuxAccessibility
@mastoblind @main @accessibility

Replied to Scott Williams 🐧

@vwbusguy I had massive slowdowns on #BevyEngine as soon as I installed the #Vulkan drivers for the Broadcom VideoCore VI chipset on the #RaspberryPi4. It seems the #RaspberryPi500 ships VideoCore VII and possibly uses the same driver.

github.com/bevyengine/bevy/iss

What does your problem look like exactly? It might be a common #Wgpu bug (and not Bevy's fault after all).

GitHub: Raspberry Pi 4 Performance Regression · Issue #14253 · bevyengine/bevy (by s-mayrh)
Continued thread

No more broken services on startup now, and startup is crazy fast, even from an SD card.

Notifications now make noise and pop up, but I still can't get any Cosmic-specific apps to launch, even though they did on the Pi 400, so I continue to think this is specific to the Pi 5 GPU firmware.

That said, I can still install and launch other things. The Cosmic terminal might not launch, but alacritty does!

Today I need to figure out why the Cosmic launcher for terminal, etc., worked on the Pi400 but not on the Pi500. I suspect it's most likely some kind of race condition and not hardware related, but it's possible there's some kind of hardware acceleration in the Cosmic compositor that isn't playing well. Non-Cosmic apps still launch fine.

It's very close to being generally usable.