Author: Lisa Montague | Category: Technology Governance for Nonprofits

We recently updated three forms on a client's web application. Two of the new endpoints got locked down right away with the protections our team uses to block abusive traffic. The third one did not. We forgot.

Within hours, that unprotected URL was getting hammered by automated traffic. No human typed their way to it. No one shared the link. Bots found it the way bots always do: by scanning, probing, and methodically testing every possible opening until they hit one that didn't push back.

We caught it. We fixed it. But the lesson stuck.

This is not a scare story. It's a description of what your web infrastructure is up against right now, whether your team knows it or not.

This Is Happening to Nonprofits Today

Around the time of that incident, we were watching something else unfold across a group of client sites. AI-driven bots have been running through multiple sites simultaneously, hitting dynamic pages over and over. These are pages that require the server to do real work to respond.

The effect is not dramatic. There's no alarm. What happens is slower and quieter: the server runs out of available connections. Pages load slowly. Sometimes they don't load at all. Visitors who were ready to donate or sign up for a program hit a delay and leave.

We're working to upgrade server capacity, but it's a race. The bots are faster than they used to be, and they're getting smarter.

Nonprofits are not exempt from this. If anything, lean teams with less technical monitoring are easier targets. Bots don't care about your mission. They care about open doors.

What's Actually Protecting Your Site (Or Not)

Here's the technical concept worth understanding, without the jargon.

Modern web applications can be wrapped in a layer of protective software that sits between the public internet and your actual site. This layer watches every request coming in and applies rules. Is this IP address making 500 requests per minute? Block it. Is this URL being probed in a pattern that looks automated? Throttle it. Is this request coming from a known bad actor? Reject it before it ever touches your server.

Think of it like a front desk that checks credentials before anyone gets to the meeting room. Except the front desk never sleeps, handles thousands of visitors at once, and can recognize when someone is behaving like a bot rather than a person.
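For readers who want to peek behind the front desk, here is a minimal sketch of the "500 requests per minute" rule described above, written in Python. All names and thresholds are illustrative assumptions, not the actual configuration of any product; real protective layers (CDN firewalls, web application firewalls) implement this far more robustly.

```python
import time
from collections import defaultdict, deque

# Illustrative sketch of a fixed-window rate limiter, the kind of rule
# a protective layer applies before traffic reaches your application.
# WINDOW_SECONDS and MAX_REQUESTS are hypothetical values.
WINDOW_SECONDS = 60
MAX_REQUESTS = 500  # the "500 requests per minute" example from the text

_recent = defaultdict(deque)  # per-IP timestamps of recent requests

def allow_request(ip, now=None):
    """Return True if this IP is under the limit, False to block it."""
    now = time.monotonic() if now is None else now
    window = _recent[ip]
    # Forget requests older than the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False  # over the limit: block or throttle this request
    window.append(now)
    return True
```

The point of the sketch is the shape of the decision, not the implementation: every request is checked against a simple rule before it can touch the server, and an IP that exceeds the threshold gets turned away at the door.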

When this protection is in place and configured correctly, most malicious traffic never reaches your application. When it's missing from even one URL, bots will find that URL and use it.

That's exactly what happened to us. The protection wasn't missing from the whole application. It was missing from one new endpoint. That was enough.

What Nonprofit Leaders Should Know

You don't need to understand how this works at a technical level. What you need is to know the right questions to ask.

Bring these to your tech lead, vendor, or web developer this week:

  • Do we have any form of rate limiting or traffic filtering in place on our web application? If so, how is it configured?
  • When new pages or form endpoints are added to our site, is there a checklist or review step to confirm protections are applied before launch?
  • Are we monitoring for unusual traffic patterns? If something starts hitting our site 500 times a minute, who finds out and when?
  • Are our dynamic pages (donation pages, sign-up forms, member portals) protected against high-volume automated requests?
  • When was the last time someone reviewed our site's traffic logs for anomalies?

These are not trick questions. They're reasonable things any executive should be asking about digital infrastructure. If your tech team can answer them clearly, you're in decent shape. If the answers are vague, that's worth a follow-up conversation.
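If your tech team wants a starting point for the log-review question, the check can be surprisingly simple. The sketch below counts requests per IP per minute in a common-format access log and flags anything above a threshold. The log format, regular expression, and threshold are assumptions and would need adapting to your server's actual log layout.

```python
import re
from collections import Counter

# Hypothetical sketch: flag IPs that hit the site more than a threshold
# of times in any one minute, based on a combined/common-format access
# log. Adjust LINE_RE and THRESHOLD to your environment.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2})')
THRESHOLD = 500  # requests per minute worth investigating

def flag_noisy_ips(log_lines, threshold=THRESHOLD):
    """Return {(ip, minute): count} for traffic at or above the threshold."""
    counts = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m:
            ip, minute = m.groups()
            counts[(ip, minute)] += 1
    return {key: c for key, c in counts.items() if c >= threshold}
```

A script like this, run on a schedule, is a rough stand-in for real traffic monitoring; the questions above are about making sure someone owns that job, whatever tool they use.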

The Pace Is Accelerating

A few years ago, this level of automated attack was mostly a concern for large commercial websites. That's no longer true. The tools that power these bots have gotten cheaper, faster, and more capable. What used to require significant technical skill now runs on a subscription.

Your site doesn't need to be famous to be a target. It just needs to exist and have unprotected openings.

The good news is that the protections exist too. The problem is making sure they're applied consistently, especially as sites evolve and new features are added. One overlooked URL during a routine update is all it takes.

Want to Talk Through Where You Stand?

If you're not sure whether your organization's web infrastructure has these protections in place, Coat Rack offers a free initial 15-minute conversation with no pitch and no pressure. We can help you understand what questions to ask and what to look for.

Schedule a free consult with Coat Rack

Coat Rack is a nonprofit technology consulting firm. We help mission-driven organizations make confident, strategic decisions about their digital infrastructure.