I’ve been on a bit of a local search kick lately. Last month I created No Kings TV, which monitored news about the protests in 20 different US metro areas. Shortly after that, I made Local News TV, which lets users browse 660 local American TV stations by state/metro area and get recent content via YouTube RSS feeds.
Over the last week I’ve been exploring what I can do when I put together my own datasets. Today I finished creating Local Search America, a new way to create specific search spaces for authoritative institutions in American DMAs (Designated Market Areas, essentially metro areas; you can learn more about DMAs here). Enter a city and state and Local Search America identifies the television stations and government agencies which serve that area, as well as the institutions of higher education located there. Up to 25 of those sites can be bundled into a single Google search (using Google’s site: operator). Here’s how it works.
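To make the bundling concrete, here’s a minimal sketch (not LSA’s actual code) of how a handful of selected domains could be joined into one query with the site: operator. The function name and example domains are purely illustrative.

```python
# Minimal sketch: OR together site: restrictions for up to 25 selected domains.
# Function name and example domains are illustrative, not taken from LSA.
def bundle_sites(query: str, domains: list[str], limit: int = 25) -> str:
    """Append up to `limit` site: restrictions, ORed together, to a query."""
    clause = " OR ".join(f"site:{d}" for d in domains[:limit])
    return f"{query} ({clause})"

print(bundle_sites("flood warning", ["wmtw.com", "maine.gov", "une.edu"]))
# -> flood warning (site:wmtw.com OR site:maine.gov OR site:une.edu)
```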

Start using Local Search America by entering a city and a two-letter postal abbreviation for the state. (LSA will try to autocomplete the city name.) LSA uses local datasets to aggregate a list of FCC-licensed television stations, accredited institutions of higher education, and official government web sites in that city’s Designated Market Area. A number of filter buttons help you quickly narrow the list to certain institution types, specific TV networks, or even government structures. Each listing is displayed with a checkbox beside it. Select up to 25 sources you want to include in a Google search.
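Under the hood this step is essentially a lookup and a filter over local datasets. Here’s a hypothetical sketch of what that could look like; the record layout and sample rows are invented for illustration, while the real datasets come from the FCC, Department of Education, and CISA sources described later in this post.

```python
# Hypothetical sketch of the lookup step: every record carries its DMA, so
# finding a metro area's stations, schools, and agencies is a simple filter.
# The Listing layout and sample rows are invented for illustration.
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    domain: str
    kind: str   # "tv", "edu", or "gov"
    dma: str

DATASET = [
    Listing("WMTW", "wmtw.com", "tv", "Portland-Auburn, ME"),
    Listing("University of New England", "une.edu", "edu", "Portland-Auburn, ME"),
    Listing("City of Biddeford", "biddefordmaine.org", "gov", "Portland-Auburn, ME"),
]

def listings_for(dma: str, kind: str | None = None) -> list[Listing]:
    """All listings in a DMA, optionally limited to one institution type."""
    return [x for x in DATASET if x.dma == dma and (kind is None or x.kind == kind)]

print(listings_for("Portland-Auburn, ME", kind="tv"))
```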

Once you’ve selected all the sources you want to search, scroll back to the top of the search results, where the Google query box resides. Enter your query here. (I recommend trying basic queries without special syntax first, and don’t add site: operators yourself; LSA builds those from the sources you selected.) A set of radio buttons lets you set the timeframe for your web search.

Once you’ve got the query just how you want it, click the “Create Google Search” button. A Google search result page will open in a new tab.
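Putting those last two steps together, here’s a rough sketch of how the final search URL might be assembled, assuming the timeframe buttons map onto Google’s tbs=qdr: parameter (h, d, w, m, y for past hour through past year). The mapping and function names are my guesses at a plausible implementation, not LSA’s code.

```python
# Rough sketch of assembling and opening the final search URL. Assumes the
# timeframe buttons map onto Google's tbs=qdr: parameter; names here are
# illustrative, not taken from LSA.
import webbrowser
from urllib.parse import urlencode

TIMEFRAMES = {"past hour": "qdr:h", "past day": "qdr:d", "past week": "qdr:w",
              "past month": "qdr:m", "past year": "qdr:y"}

def open_google_search(bundled_query: str, timeframe: str | None = None) -> None:
    """Open a Google results page for the bundled query, optionally time-limited."""
    params = {"q": bundled_query}
    if timeframe in TIMEFRAMES:
        params["tbs"] = TIMEFRAMES[timeframe]
    webbrowser.open("https://www.google.com/search?" + urlencode(params))

open_google_search("flood warning (site:wmtw.com OR site:maine.gov)", "past week")
```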

Earlier in this post I noted that Local Search America searches authoritative institutions. What I mean by that is the TV stations are licensed by the FCC, the institutions of higher education are accredited via the Department of Education (which oversees the accreditors), and the government web sites come from a list published by CISA on GitHub.
When we use authoritative sites to build our search space, we’re making a space that’s stronger against information warfare attacks and attempts to inject disinformation via fake sites masquerading as news outlets or government representatives. Are spaces like this completely proof against such things? No; sites can be hacked, and universities host pages from a number of less authoritative contributors, like students and community members. However, they’re much better than a random set of sites from the open web about which you have little information.
Strengthening search spaces against disinformation is the first reason I made Local Search America. The second reason is that local search can be fantastic for topical searches and current-event searches. Looking for information about climate change is going to look very different in Biddeford, Maine than in San Antonio, Texas. Being able to restrict a super-general search like “commercial real estate” to a specific area is a powerful way to get a manageable number of results that can quickly teach you about the location-specific aspects of a topic, without your having to spend a lot of time narrowing your query.
I know the great fashion now is to use AI for everything. But we have a number of long-established systems for identifying useful sites in useful areas, and we can tap them via datasets that don’t require AI. It’s not as “sexy” or trendy as AI, but this method doesn’t burn crazy amounts of power or water either.
We don’t need AI for great search, and I’m going to keep saying that until I’m blue in the face.