Tara Calishain

I dream in data flows

Daydreaming About the Perfect Mastodon Data Broadcast Setup for Emergency Responders

I used to play Rimworld a lot in the evening, but since I started learning JavaScript, it’s much more interesting to make up my own games.

Yesterday evening I pretended I had just started a Mastodon instance for a county fire and rescue and I was designing a tool to make their posts as useful to the community as possible with the least time/work for them.

I think I would start with a two-section form that would have one space for the text of the post and a second space for a relevant location.

With location separate you can run it through a geolocator API and get lat/long, the relevant county/metro area (depending on what kind of area you're trying to cover), etc. Additionally, you might check it against your own dataset with specifics of YOUR area. Perhaps some streets need a hashtag of #FloodRisk or #BridgeArea. You can grab all that with an address.

But what do you want to get all that for?

So you can automatically generate standardized, relevant hashtags, of course! Note that #county might be the right area size, depending on your coverage.
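In JavaScript (the language I've been learning), the flow might look something like this. This is just a sketch: the shape of the `geo` object, the street-notes table, and the tag names are all made up for illustration, not any particular geocoder's API.

```javascript
// Local dataset with specifics of YOUR area: streets that always
// warrant an extra hashtag. (Street names here are placeholders.)
const localNotes = {
  "Crabtree Creek Rd": ["#FloodRisk"],
  "Old Mill Bridge Rd": ["#BridgeArea"],
};

// Turn a (hypothetical) geocoder response into standardized hashtags.
function hashtagsFor(geo) {
  const tags = [];
  if (geo.county) tags.push("#" + geo.county.replace(/\s+/g, ""));
  if (geo.city) tags.push("#" + geo.city.replace(/\s+/g, ""));
  for (const tag of localNotes[geo.street] || []) tags.push(tag);
  return tags;
}

console.log(hashtagsFor({
  county: "Wake County",
  city: "Raleigh",
  street: "Crabtree Creek Rd",
}));
// → [ '#WakeCounty', '#Raleigh', '#FloodRisk' ]
```

The operator never sees any of this: they type an address, and the tags appear.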

For data nerds, you could even transmit the lat/long of the location as a hashtag. Why not #34d143cm118d37?

(the d stands for “dot”, the c stands for “comma”, and the m stands for “minus.”)
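The encoding and its inverse are each a couple of string replacements. A sketch, assuming coordinates come in as plain numbers:

```javascript
// Encode a lat/long pair as a hashtag:
// "d" for dot, "c" for comma, "m" for minus.
function coordTag(lat, lon) {
  const enc = (n) => String(n).replace("-", "m").replace(".", "d");
  return "#" + enc(lat) + "c" + enc(lon);
}

// Decode it back into [lat, lon] numbers.
function coordFromTag(tag) {
  return tag.slice(1).split("c")
    .map((s) => Number(s.replace("m", "-").replace("d", ".")));
}

console.log(coordTag(34.143, -118.37)); // → #34d143cm118d37
```

Anyone aggregating the feed can turn the tag back into coordinates without touching a geocoder.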

So far you have your EMS hero putting in a text message and an address and generating relevant, data-rich, location-specific hashtags automatically. Great.

What else can we add automatically? I think that rather depends on what’s available in what we might call the live open data network — the open transmission of data that exists as a background process across the fediverse and becomes especially active at certain times.

We can introduce RSS feeds into that structure. Maybe we (or another org) have RSS-feed driven pages for every neighborhood in our coverage area.

Those pages are populated by live information relevant to the area (from other gov instances, news outlets, trusted citizen journalists) and, during emergencies, updated information about shelters, evacuation routes, weather updates, etc. Each post relevant to that area includes a link to that page automatically.
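The "link to that page automatically" part could be as simple as a lookup table from neighborhood to resource page. A sketch, with made-up neighborhood names and URLs:

```javascript
// Map each neighborhood in the coverage area to its RSS-driven
// resource page. (Names and URLs are placeholders for illustration.)
const neighborhoodPages = {
  "Five Points": "https://example.org/areas/five-points",
  "Brier Creek": "https://example.org/areas/brier-creek",
};

// Append the area's resource-page link to a post, if we have one.
function withAreaLink(postText, neighborhood) {
  const page = neighborhoodPages[neighborhood];
  return page ? postText + "\n\nLocal updates: " + page : postText;
}
```

Again, zero extra work for the person posting: the neighborhood comes out of the geocoded address.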

Data freaks (like your pal Cal) would use the standardized hashtags to maintain a separate ecosystem of data that could ideally be aggregated across LOTS of instances operating in the same way.

Standardizing hashtags during emergencies is better for everybody — media, aggregators, researchers, everybody. And if you can generate these rich information sources just by entering an address — holy crap!

More prosaically, posts might also have automatic hashtag timestamps – you can get that data from the feed itself, of course, but if you provide it as a hashtag it’s easier for the client to handle.

That would allow you to build a mechanism that recalls posts after a certain time, based on the timestamp, so old information doesn't get left out to rot.
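Both halves of that idea fit in a few lines. The tag format here (#ts followed by a compact UTC year-month-day-hour-minute string) is my own assumption, not a standard:

```javascript
// Generate a timestamp hashtag, e.g. 2024-05-01 14:30 UTC → #ts202405011430
function timestampTag(date = new Date()) {
  return "#ts" + date.toISOString().slice(0, 16).replace(/[-T:]/g, "");
}

// Has a post with this tag outlived its usefulness?
function isExpired(tag, maxAgeHours, now = new Date()) {
  const s = tag.slice(3); // strip "#ts"
  const posted = Date.UTC(+s.slice(0, 4), +s.slice(4, 6) - 1,
    +s.slice(6, 8), +s.slice(8, 10), +s.slice(10, 12));
  return (now - posted) / 3600000 > maxAgeHours;
}
```

A cleanup job could walk the instance's posts, check `isExpired(tag, 24)`, and delete or annotate anything stale.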

In the little Barbie Dream Information Dissemination House I’m building, I’m also thinking about the profile page.

I’d want to standardize that too, perhaps with hashtags like #WakeCountyNC or #RaleighNC depending on coverage area. Some groups might want to include ham radio call signs or other offline contacts.

Bio links might be standardized to go to a) the home page, b) an emergency services page, c) the municipal government page, and d) a feed resources page.
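A standardized profile could even be generated from a little config object, so every instance in the network fills in the same slots. Everything in this sketch is a placeholder:

```javascript
// One possible standardized-profile shape. All values are
// placeholders for illustration, including the call sign.
const profile = {
  areaTags: ["#WakeCountyNC", "#RaleighNC"],
  hamCallSign: "N0CALL", // offline contact, for groups that want one
  bioLinks: {
    home: "https://example.org/",
    emergencyServices: "https://example.org/emergency",
    municipalGovernment: "https://example.org/gov",
    feedResources: "https://example.org/feeds",
  },
};

console.log(Object.keys(profile.bioLinks).length); // → 4
```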

Now, just because there's an address in the equation, we've got a lovely dataflow going on that took absolutely zero effort for the operator to initiate.

Fellow emergency responders can use it. Local media and community workers can use it. Data randos can use it.

And because the hashtags are standardized, end users can use it too, and adapt the flow for their area, situation, or accessibility need.

Now imagine lots of instances operating with a similar setup, working with a base set of hashtags to delineate location and other parameters typical of emergency events, and a subset specific to their needs.

Imagine the data they would both aggregate and distribute across the fediverse.

Wouldn’t that be amazing?