Fixing the Tundra Tracker flashing red light issue

Some of my Tundra Trackers have been having an issue recently where they’ll get stuck in a flashing red light mode and refuse to connect or turn off. This is a known issue, and Tundra has provided a sequence of steps to fix it. For some reason, though, they opted to publish it only as a video with no transcript, which makes it both difficult to find and annoying to follow.

So I’ve opted to provide the steps in written English so that people can actually find and follow them.

  1. Disconnect all VR-related peripherals from the computer
  2. Connect the Tundra tracker to the computer via USB
  3. Open the lighthouse control application, which is usually at C:\Program Files (x86)\Steam\steamapps\common\SteamVR\tools\lighthouse\bin\win32\lighthouse_console.exe
  4. In the console window that comes up, type reboot and press Enter
  5. Wonder why people need to provide all such instructions in an impossible-to-read low-resolution poorly-narrated too-long video instead of it just being a text file
  6. Also not stated in the video: the light might not stop flashing until you unplug the tracker from the computer. That caveat, too, would have been a lot easier to add to a text file after the fact than to a video
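For the command-line-inclined, steps 2–4 boil down to the sketch below. This assumes the default SteamVR install path from step 3; lighthouse_console is interactive, so the reboot command is typed at its own prompt rather than passed as an argument:

```shell
rem Launch the SteamVR lighthouse console (default Steam install path)
"C:\Program Files (x86)\Steam\steamapps\common\SteamVR\tools\lighthouse\bin\win32\lighthouse_console.exe"

rem At the console prompt that appears, type:
rem   reboot
rem then unplug the tracker once the command completes.
```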

Read more…

Blocking abusive webcrawlers

People often talk about how bad AI is for the environment, but they focus only on the operation of the LLMs themselves. They seem to ignore the much larger impact of the AI scrapers: not only do those take massive amounts of energy and bandwidth to run, but they also drive up the server requirements and bandwidth utilization of every single website operator on the planet. That makes it everyone’s problem, since everyone ends up having to foot the bill. It’s asinine and disgusting.

At one point, fully 94% of all of my web traffic was coming from a single botnet like this. These bots do not respect robots.txt or the nofollow link rels that I use to keep crawlers from getting stuck in a trap of navigating every single tag combination on my site, and it’s ridiculous just how many resources — both mine and theirs — are constantly being wasted like this.
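For reference, the signals these bots ignore look roughly like this — a minimal sketch with placeholder paths, not my actual tag URLs:

```
# robots.txt — asks well-behaved crawlers to stay out of tag-combination pages
User-agent: *
Disallow: /tag/
```

The nofollow side is just a rel="nofollow" attribute on the internal tag links. A well-behaved crawler honors both; these scrapers honor neither.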

I’ve been using the nginx ultimate bad bot blocker to subscribe to lists of known bots, but this one particular botnet (which operates on Alibaba’s subnets) has gotten ridiculous, and enough is enough.

So, I finally did something I should have done ages ago, and set up UFW with some basic rules.
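The sort of thing I mean is sketched below. This is a hypothetical minimal example rather than my actual ruleset, and the CIDR block is a documentation placeholder (TEST-NET-3), not one of Alibaba’s real ranges:

```shell
# Deny the abusive subnet before the general web-traffic allows
# (rule order matters, so insert it at position 1).
sudo ufw insert 1 deny from 203.0.113.0/24 to any port 80,443 proto tcp

# Allow normal web traffic from everyone else, then enable the firewall.
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw enable
```

The per-subnet deny has to sit above the broad allows, since UFW evaluates rules in order and stops at the first match.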

UPDATE: This article has started getting linked to from elsewhere (including Coyote’s excellent article about the problem), but I no longer use this approach for blocking crawlers as it’s become completely ineffective thanks to the crawlers now behaving like a massive DDoS. These days I’m using a combination of gated access, sentience checks, and, unfortunately, CloudFlare.

Read more…