#anubis

Continued thread

As my initial use case for #swad was to stop #AI #bots from clogging my DSL upstream, and the #FreeBSD / #poudriere build logs they were downloading by the gigabyte aren't secret at all (on the contrary, it can sometimes be helpful to share them when hunting down build issues in the community), I wonder whether I should add a module somewhat similar to #anubis[1] for "guest logins"? 🤔 Might be a lot of work though...

[1] github.com/TecharoHQ/anubis

Weighs the soul of incoming HTTP requests using proof-of-work to stop AI crawlers - TecharoHQ/anubis
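A guest-login module along those lines would essentially hand out a random challenge and only grant a session once the client returns a nonce whose hash meets a difficulty target. Here is a rough Go sketch of the server-side half of that idea (purely illustrative, not swad's or Anubis's actual code; the hex-zero difficulty scheme is an assumption):

```go
package main

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// newChallenge returns a random token the client must incorporate into its
// proof of work before a guest session would be granted.
func newChallenge() string {
	buf := make([]byte, 16)
	rand.Read(buf)
	return hex.EncodeToString(buf)
}

// verify checks that sha256(challenge + nonce) starts with `difficulty` hex
// zeroes, i.e. that the client actually spent CPU time on the challenge.
func verify(challenge, nonce string, difficulty int) bool {
	sum := sha256.Sum256([]byte(challenge + nonce))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	c := newChallenge()
	fmt.Println("challenge:", c)
	fmt.Println("valid?", verify(c, "12345", 4)) // almost certainly false without real work
}
```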
Continued thread

It's come back, but if this continues, we'll just go to another cloud. The great part about Kubernetes is that you can just do that, but the horrible part about Kubernetes is that you have to have opinions about how to do that. We'll get back to HyperCloud, just gotta ride out the #Anubis wave!


👀 This morning, while discussing Wikimedia's problems with scraping, a programmer friend told me about the Anubis project github.com/TecharoHQ/anubis/
"It's quite simple and easy to implement on any reasonably serious website, and it automatically takes down any scraper (whether AI or anything else). What's more, they can't come up with anything that would make scraping profitable with it in place." #aiscraping #aiscrapers #wikimedia #anubis #iahastaenlaputasopa

y0 thanks to @seism0saurus's friend @kubikpixel i have a cool project to toss on the pile 😂:

#Anubis: "Anubis weighs the soul of your connection using a sha256 proof-of-work challenge in order to protect upstream resources from scraper bots.

Installing and using this will likely result in your website not being indexed by some search engines. This is considered a feature of Anubis, not a bug."

respect my robots.txt or pound sand. #AI #mitigation #scrapers #PoW #antiAI
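For a sense of what such a sha256 proof-of-work challenge amounts to on the client side, here is a minimal Go sketch of the brute-force loop (the real Anubis challenge runs as JavaScript in the visitor's browser; the challenge string and difficulty below are made up for illustration):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"strings"
)

// solve brute-forces a nonce so that sha256(challenge + nonce) starts with
// `difficulty` hex zeroes. Raising the difficulty makes each request more
// expensive for a crawler while staying cheap to verify on the server.
func solve(challenge string, difficulty int) (nonce int, digest string) {
	prefix := strings.Repeat("0", difficulty)
	for nonce = 0; ; nonce++ {
		sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
		digest = hex.EncodeToString(sum[:])
		if strings.HasPrefix(digest, prefix) {
			return nonce, digest
		}
	}
}

func main() {
	nonce, digest := solve("example-challenge-token", 4)
	fmt.Printf("nonce=%d digest=%s\n", nonce, digest)
}
```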

Replied to Xe

@cadey My thoughts on #Anubis after encountering it multiple times as a user:
* mascot is nice, creative and intuitive to understand
* as a user of tor it works! cloudflare and others reject me as a bot, but anubis let me through, thank you
* onion services do not require anubis protection, though, right? Since they have their own proof of work system integrated by default …
blog.torproject.org/introducin

… equi-x function based on what Tor uses?
pony.social/@cadey/11423626384

blog.torproject.org: Introducing Proof-of-Work Defense for Onion Services | Tor Project. Today, we are officially introducing a proof-of-work (PoW) defense for onion services designed to prioritize verified network traffic as a deterrent against denial of service (DoS) attacks with the release of Tor 0.4.8.