It used to be that if I found some online writing I really liked, I’d look up and follow the author on Twitter. Now I look up the author and follow them on LinkedIn.
This opinion piece by Elizabeth Lopatto is first-rate. I love it when an entire article resonates from the jump.
‘Get in, loser, we’re going back to Web 1.0. We have the opportunity to get out from under the algorithms. So maybe it’s time to think about what a web of people looks like now.’
I have been spending a lot of time doing just that, at a scale one person with no resources can manage: exploring the Mastodon API and how it can be used to filter and display search results in a way that renders them as useful as possible. In doing this I’ve kind of reached a — well, not a conclusion, but a road that might lead to a conclusion.
I’ve talked before about the ubiquitous metadata of time and place and how you can use it to filter and recontextualize search results. (See: Pam’s Pin, Contemporary Biography Builder, Gossip Machine, etc.) When we think about refining search, we often start with time and place because they’re easy for the end user to understand.
Now I’m wondering whether an online audience, even a nonexpert one, can be assumed to understand a similar set of metadata standards for information they receive socially.
In other words, when you see this post do you understand it came from an individual? Probably. Do you understand that if you and I are friends on a social network that I have a network relationship to you? Probably. Do you understand that when you read this post you will have the opportunity to share it or save it or otherwise respond? Probably.
Does that all sound basic to you? Probably. And it is, NOW. Because lots of people and companies have spent many years explaining the Internet and developing structures to make it as easy to use as possible. Why not take advantage of that evolved understanding?
But what does this have to do with refining search results?
I’m wondering if we can take the understanding Internet users have developed as a group and build transparent filtering/narrowing methodologies that require as little algorithmic intervention as possible — or rather, an algorithm that is entirely user-generated.
I’m trying to do this with the Mastodon API because its query responses include information about shares, replies, and favorites. I am convinced there are ways users can draw on that information to refine their search results in a way that is transparent, meaningful, and even useful (because it gives the user feedback about how their own behavior influences other people’s search results).
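To make the idea concrete, here’s a minimal sketch of what an entirely user-generated “algorithm” could look like: a plain weighted sum over the engagement counts the Mastodon API already returns on each status. The field names (`reblogs_count`, `replies_count`, `favourites_count`) come from Mastodon’s Status entity, but the weighting scheme and the sample data are my own illustrative assumptions, not anything from the post.

```python
# Rank Mastodon search results by a user-supplied, fully transparent
# weighting of the engagement counts the API already returns.
# reblogs_count / replies_count / favourites_count mirror fields on
# Mastodon's Status entity; the weights and sample data are
# illustrative assumptions.

def rank_statuses(statuses, w_reblogs=1.0, w_replies=1.0, w_favourites=1.0):
    """Return statuses sorted by a user-chosen linear score.

    Because the score is just a weighted sum of publicly visible
    counts, the user can see exactly why one result outranks another.
    """
    def score(s):
        return (w_reblogs * s.get("reblogs_count", 0)
                + w_replies * s.get("replies_count", 0)
                + w_favourites * s.get("favourites_count", 0))
    return sorted(statuses, key=score, reverse=True)

sample = [
    {"id": "1", "content": "quiet post", "reblogs_count": 0,
     "replies_count": 1, "favourites_count": 2},
    {"id": "2", "content": "much-discussed post", "reblogs_count": 3,
     "replies_count": 12, "favourites_count": 5},
    {"id": "3", "content": "widely shared post", "reblogs_count": 20,
     "replies_count": 2, "favourites_count": 8},
]

# A user who cares most about conversation weights replies heavily:
by_discussion = rank_statuses(sample, w_reblogs=0.5, w_replies=3.0,
                              w_favourites=0.5)
```

The point of the linear score is its legibility: change a weight, and you can predict exactly how the ordering changes, which is the feedback loop described above.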
Honestly, Mastodon’s completely open API and how it treats hashtags have given me so many ideas about integrating social signals into transparent search that I’m not sure I’ll ever get through them all.
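One reason Mastodon’s hashtag handling is so fertile: every status in an API response carries a `tags` array of Tag entities (each with a `name` and `url`), so tallying which hashtags dominate a batch of results takes only a few lines. This is a sketch over hand-made sample data, not a live API call.

```python
# Tally hashtag frequency across a batch of Mastodon statuses.
# Each status's "tags" list mirrors the Tag entities ({"name": ...})
# Mastodon attaches to a Status; the sample data is illustrative.
from collections import Counter

def tag_counts(statuses):
    """Count how often each hashtag appears across the statuses."""
    counts = Counter()
    for status in statuses:
        for tag in status.get("tags", []):
            counts[tag["name"]] += 1
    return counts

sample = [
    {"id": "1", "tags": [{"name": "searchresearch"}, {"name": "mastodon"}]},
    {"id": "2", "tags": [{"name": "mastodon"}]},
    {"id": "3", "tags": []},
]

common = tag_counts(sample).most_common()
```

A tally like this could feed another transparent filter — show me only results carrying the hashtags I’ve opted into — without any hidden ranking at all.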
Go read Ms. Lopatto’s article. It’s really good.