Interesting Spark interview with Ted Striphas, associate professor in the department of Communication & Culture at Indiana University. (It's only about 16 minutes, so go ahead and listen to it.) He talks about our algorithmic culture and the dangers and issues surrounding the technologies that now sit at the foundation of so much of our cultural activity: recommendation algorithms, search engines, etc. Unfortunately, I think a lot of what he says falls into some typical traps in discussions of technology versus society.
There's the usual fallacy of treating human culture and ability as somehow magical and valuable, always producing quality even if not always obviously. I get the feeling that Ted and Nora (the interviewer) both think that injecting "the machine" into the cultural process flattens the results, makes it die a little inside (to be a little overly dramatic in my description). There's the assumption that technology is somehow going to control everything, that using these algorithmic tools will keep us from something we don't like. But we've always tried to avoid what we don't like, and we've always had to face it anyway, because those efforts fail. Technology has ALWAYS limited our use of culture and our experience of others. Because of how the printing press spat out the words, how the horse-and-carriage distribution network could reach people, how roads were laid out (due ultimately to limitations of technology and workload), and so on, news could only travel in so many ways, to so many people, in so many formats.
Practices like SEO (search engine optimization) are seen as somehow corrupting the authenticity of the results, making things popular which are not REALLY popular. I'm not even sure what that's supposed to mean. What's the difference between what the numbers say is popular, based on usage, and what is REALLY popular? Perhaps, if things were different, other patterns of popularity would arise. But that's always been true. What rises to the top has always been biased. We just understand that bias a little better now because we are intentionally building the tools these biases flow through.
And finally, there seems to be a narrow definition of "culture" in use: something governed by or enjoyed by a minority. Culture is created and experienced by everyone. It's the average of all our tastes and activities; it's the successful memes that have survived society. Often "high culture" is those aspects of culture enjoyed by a minority, or enjoyed by the majority in times past, but culture as a whole is a description of entire social groups. What we're doing is affecting culture. What gets remembered as culture has typically been those parts lucky enough to have been recorded somehow: in print, on records, as oral tradition, etc. But sitting in the middle of present-day society, looking around, anyone can see how much more complex culture is as it's being experienced.
Sure, algorithms are affecting culture more and more, but it has always been the case that non-cultural tools and processes have affected culture. Don't worry about it. Don't run from it. Understand it.