I honestly don’t understand the Dutch techno reference.
Wouldn’t that disrupt the usage of a phone as a server?
It’s not a god per se, rather the animal companion of Ganesh (the elephant god / god of the people, kinda like the god Fufluns of Populonia). The mouse is like the god’s mount.
It’s opening fully for me. Probably geo-paywalled…
Goes away with JS disabled. The rest of the page works fine.
Yeah, hx. It was hx that finally got me using vi-style navigation, and now I almost always choose vim over nano.
This caption is generated by Apple’s accessibility service? Wow.
newpipe
The text. And probably the images too (though the fact that the only mistake is the wrong port depiction (all USB-C) suggests a human).
ai generated lol
“Scaled by the system/themselves”… looks like those are X11 apps. Why is Firefox in this list? Run it as native Wayland with MOZ_ENABLE_WAYLAND.
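A minimal way to test that from a terminal (the env var is the real one; xlsclients is a standard X11 utility that lists whatever is still going through XWayland):

```sh
# Force Firefox's native Wayland backend for this one run
MOZ_ENABLE_WAYLAND=1 firefox

# List clients still running as X11 apps (i.e. through XWayland)
xlsclients
```

If Firefox disappears from the xlsclients output, the compositor is handling it natively.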
The bad guys use bots or solver services and are done. Regular users have to endure it while no security is added.
Put in other words, common users can’t easily become ‘bad guys’, i.e. the cost of attack is higher, hence a lower number of script kiddies and automated attacks. You want to reduce that number. These protections are nothing to botnet owners or other high-profile bad actors.
PS: reCAPTCHA (or CAPTCHA in general) isn’t a security feature. At most it can be a safety feature.
stopping automated requests
Yeah, my bad. I meant too many automated requests. Both humans and bots generate spam, and the issue is the high influx of it. Legitimate users also use bots, and that’s by no means harmful. That’s why you don’t encounter a CAPTCHA every time you visit a Google page, and why a couple of scraping scripts don’t run into problems. reCAPTCHA (or hCaptcha, say) triggers when a high volume of requests comes from the same IP. Instead of blocking everyone out to protect their servers, they can allow slower requests so legitimate users face minimal hindrance.
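A minimal sketch of that throttle-instead-of-block idea, a per-client token bucket (the class and the rate numbers are made-up illustrations, not how reCAPTCHA actually decides anything):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-client token bucket: each request costs one token; when a
    client's bucket runs dry it gets delayed, not blocked outright."""

    def __init__(self, rate=5.0, capacity=20):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # burst a client may spend at once
        self.state = defaultdict(lambda: (capacity, time.monotonic()))

    def delay_for(self, key):
        """Seconds the caller should wait before serving this request."""
        tokens, last = self.state[key]
        now = time.monotonic()
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens >= 1:
            self.state[key] = (tokens - 1, now)
            return 0.0                       # normal traffic: no delay
        self.state[key] = (tokens, now)
        return (1 - tokens) / self.rate      # heavy traffic: throttle

bucket = TokenBucket()
print(bucket.delay_for("203.0.113.7"))       # first hits: 0.0
```

Normal browsing never drains the bucket, so it never sees a delay; a scraper hammering from one address just gets served slower and slower.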
Most Google services nowadays require accounts with stronger verification (like a cell phone number), so automated spam isn’t a big deal.
And what will you do if a person behind CGNAT is DoSing/scraping your site while you want everyone else to keep access? IP-based limiting isn’t very useful either way.
hCaptcha, Microsoft CAPTCHA: they all do the same. Can you give an example of one that can’t easily be overcome just with better compute hardware?
There isn’t a good way to tell human users apart from scripts without adding too much friction to normal use. Also, bots are sometimes welcome and useful; it’s a problem when someone tries to mine data in large volume or effectively DoS the server.
Forget bots, there are centers in India and other countries where you can employ humans to do ‘automated things’ (YouTube like counts and watch hours, for example) at the same cost as bots. There are similar CAPTCHA-solving services too. Good luck with those :)
Rate limiting is the only effective option.
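One partial answer to the CGNAT problem above is to key the limiter on something finer than the raw IP when it exists. A hedged sketch (the key scheme is an assumption, not any standard):

```python
def limiter_key(session_id, client_ip):
    """Pick the bucket key for a request.

    Under CGNAT, thousands of users share one IP, so a pure per-IP
    bucket punishes them all for one abuser. Keying on a session or
    API token isolates the abuser; anonymous traffic still falls back
    to a (necessarily coarser) per-IP bucket.
    """
    return ("session", session_id) if session_id else ("ip", client_ip)
```

This plugs straight into a bucket like the one sketched earlier, e.g. `bucket.delay_for(limiter_key(sid, ip))`.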
Apart from the usual uBO, reader mode, and friends, trained eyes are a very effective content filter. We can all glance at a search result page or an article and immediately know whether it’s real content or low-effort crap.
Stay out of mainstream social media and stop consuming ‘feeds’. Stay in the realm of personal sites, blogs, and sane link aggregators/RSS to keep the mental peace of not having to filter garbage with your eyes every day.