May 09, 2019

Tutorial for ALL social networks


The number of tutorials for individual social networks is huge. The number of YouTube screencasts, blogs, books and podcasts about the topic is around one million, maybe more. Sometimes the tutorials are well written, but in most cases the same information is repeated over and over again. Most tutorials are written in a personal style. The idea is to explain to a person how to use Facebook or another social network the right way. It is explained how to join a group and how to post a link. What is missing is an abstract description of what the purpose is.
I'd like to give a more general perspective on the topic. What we have to do first is take a step back and turn our attention to an old, rarely cited piece of software called the Planet aggregator. The software itself is not very well written, but the tutorials which explain how to use Planet are the best introduction to social media.
Planet is not yet another social network in which the user can become a friend; it is a technical tool which was developed during the rise of the semantic web for combining RSS feeds. Somebody may ask what the purpose of combining different RSS feeds is. The good news is that this question is addressed in the tutorials about Planet. The first tutorial described how 10 universities used Planet to aggregate their blogs. The second described a use case for the Planet GNOME website, and many other examples are available. The important point is that these tutorials are not about how a single user has to use a certain social network; they describe in a neutral way what content aggregation is and how to realize it with a Python script. This gives a much more expert view on the topic.
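To make this concrete, here is a minimal sketch of what such an aggregation script does. It is not the original Planet source code; the feed URLs are placeholders, and it only uses the feedparser library to fetch several RSS feeds and merge them into one chronological list.

# Minimal Planet-style aggregation sketch (not the original Planet code).
# Requires: pip install feedparser
import feedparser

FEEDS = [                                   # placeholder blog feeds
    "https://blog-a.example.org/rss",
    "https://blog-b.example.org/rss",
]

entries = []
for url in FEEDS:
    parsed = feedparser.parse(url)          # download and parse one RSS/Atom feed
    for e in parsed.entries:
        entries.append({
            "title": e.get("title", ""),
            "link": e.get("link", ""),
            "date": e.get("published_parsed") or e.get("updated_parsed"),
        })

# sort all postings from all blogs into one chronological playlist
entries.sort(key=lambda e: e["date"] or (), reverse=True)
for e in entries:
    print(e["title"], e["link"])

That is already the whole trick: many independent blogs in, one combined playlist out.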
Sure, in most social networks like Facebook the Planet software is useless. The reason is that the Facebook API works a bit differently from the Planet approach, and if the user is not interested in automating the task but prefers to insert a link into a group manually, he won't need any Planet RSS feed. What the user can learn from Planet is to see the overall ecosystem from a more general perspective. The idea of a Facebook group is not that somebody drops a URL and gets a like. The idea is that existing content written in blogs feeds into a single playlist.
It is important to know that the Planet software itself is outdated. The source code is written poorly, and the projects built on top of the Planet program, like Planet GNOME, look like an anti-pattern of how not to realize a social network. But what I can recommend reading are the tutorials and use cases around the Planet software. In most cases they are written for an academic audience and contain lots of references to technical information about the semantic web and annotated playlists. A good paper which describes the early attempts at news aggregation is: Willison, Simon Philip. "Building a decentralised collaborative annotation system for the World-Wide Web." BSc (Hons) in Computer Science, 2005.
It is a bit funny to see that the SEO community and the Planet community are doing the same thing, but from different perspectives. SEO is a marketing term used by the mainstream audience for communicating about social networks. In contrast, the Planet software and content aggregation are terms used by libraries and semantic web visionaries. The aims of both communities fit together. The problem is that they are not communicating with each other very well. Most SEO experts have never heard of the Planet software, and vice versa.
Perhaps it makes sense to compare SEO and the Planet software directly. SEO is used in the context of existing large-scale social networks like Facebook, Google+, Instagram and Twitter. The term SEO is used to describe what the marketing agency of a newspaper is doing. The problem with the SEO community is that their descriptions are not based on academic standards but resemble some kind of pseudoscience. There are lots of tutorials available, but they can't be recommended. The Planet aggregator community, on the other hand, has a different background. They are located in academic libraries and the open source movement, and some of them are familiar with Open Access and Wikipedia. They don't use the normal social networks like Facebook, but have created their own websites. A typical example is Planet GNOME, but there are many other websites built on top of the Planet software.
The reason why such projects are created is the same for SEO experts and Planet users: they want to aggregate existing content. And they are not using a full-text search engine like Google, but are providing a playlist of URLs. The idea behind Planet GNOME is that the user doesn't have to type in the term Linux to see the latest information; he visits the website and gets a linear list of blog posts. The most interesting feature of the Planet community is that they reduce social media to a minimalist approach. A typical website has the following structure:
2019 May 9:
title, URL
title, URL
2019 May 8:
title, URL
title, URL
title, URL
2019 May 6:
title, URL
title, URL
The URLs can link to different domains, which means the articles were written by different authors who uploaded the content to their individual blogs. The Planet website creates the playlist for the entire content.
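Producing this date-grouped layout from a combined feed is a small step. A sketch, assuming the merged and sorted entries list from the aggregation snippet above (the field names are my own convention): group the postings by day and print each date header with its titles and URLs underneath.

# Print a Planet-style playlist: one date header per day, then title, URL lines.
# Assumes each entry is a dict with "title", "link" and a struct_time "date".
import time
from itertools import groupby

def day(entry):
    return time.strftime("%Y %B %d", entry["date"])

dated = [e for e in entries if e["date"]]        # skip entries without a parsed date
for date, posts in groupby(dated, key=day):      # entries are already sorted by date
    print(date + ":")
    for e in posts:
        print("  " + e["title"] + ", " + e["link"])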
Exactly the same principle is described in Facebook tutorials which explain how to post a link to a Facebook group. What such a tutorial explains is how to create a curated playlist of URLs ordered chronologically.
Another important aspect of websites produced with the Planet software is that they contain a large amount of information. A normal blog created by a single author has around 30 postings per month, which is equal to one posting a day; a single author is simply not very productive. Secondly, a normal blog is based on a certain body of knowledge: the author can only write articles which fit his background knowledge. In contrast, a website curated with the Planet software contains the combined information of many blogs. A typical aggregated website gets input from around 100 blogs. 30 postings a month x 100 blogs = 3000 postings a month = 100 postings a day. That means that on a single day not a single title is available but a large list. Additionally, the articles were written by different authors independently of each other, so the overall amount of knowledge transported is much higher. The reader gets the output of a large-scale collective hive mind.
This effect is not only the result of the Planet software but can be seen in mainstream social networks like Google+ as well. It is the result of aggregated information. Somebody may think that a normal search engine like Google already has all the content in its database. But a news aggregator puts this information into a different, chronological context on the screen. If annotations and upvotes are added to the playlist, the value becomes much greater.
A Planet-aggregated website can be improved with additional features like tags. A tag is usually used to categorize the articles in a single-user blog. For example, if a single-user blog contains 30 postings each month, 4 of them will get the tag "Artificial General Intelligence". If the tag system is adapted to Planet-style aggregation, the tag catalogs the content from different blogs by different authors. That means the tag AGI contains not only 4 postings from the last week but maybe 40. Such a system is very interesting for a reader who would like to get an overview. He is not only analyzing what a single-authored blog knows about AGI, he gets an impression of what the entire community knows.
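Technically this is easy to add to the merged stream, because most RSS/Atom feeds already expose tags as category elements. A sketch, again building on the entries list from above and assuming each entry additionally stores a "tags" list:

# Build a tag index over the aggregated stream, so a reader can pull up
# everything the whole community wrote under one tag, e.g. "AGI".
from collections import defaultdict

tag_index = defaultdict(list)
for e in entries:                            # entries from the aggregation step
    for tag in e.get("tags", []):            # assumed extra "tags" field per entry
        tag_index[tag.lower()].append(e)

for e in tag_index.get("artificial general intelligence", []):
    print(e["title"], e["link"])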
Multi-authored blogs
Most blogging software like WordPress allows creating a multi-authored blog. The admin can create sub-users and grant them rights. A sub-user can log in to the blog and post an article which is reviewed by the admin. At first glance this sounds like a powerful feature, but in reality multi-authored blogs are the exception. Most blogs are a one-man show. That means the admin is the only person who writes articles.
And if this is true, the next question is how to combine all the content into a larger community. One possible answer is that weblogs are outdated and should be replaced by wikis. A wiki has the advantage that many authors can edit the content. It is a bit surprising that in reality weblogs are not dead. They are healthy and the number of blogs has increased. So it seems that the authors feel comfortable with their own blogs and don't like wikis very much.
The answer to the problem of community building for blogs is content aggregation. That means the current blog infrastructure remains the same, but on top of it a meta-blog is created. The Planet aggregator software helps to realize such a meta-blog. Right now the overall community of Planet users is experimenting with the features. They have discovered that in theory Planet is a great tool which helps to connect existing blogs into a higher structure, but they are unsure how to use the idea the right way.
A possible replacement for the Planet software is well known and was described in a previous post. It is a wiki-based social network which collects URLs and is filled manually by a group. As far as I know, the idea of using a wiki instead of the Planet software is not very common on today's internet.
Combining SEO tutorials with Planet aggregators
The most advanced technique described in classical SEO tutorials is called an "autoposter bot". This kind of technique is often described as the ultimate tool to take over an existing Facebook group, and if a user does this too often he gets banned by the admins. The interesting point is that the Planet aggregator community doesn't know of such a technique, because a Planet RSS feed works in autopost mode by default. That means an out-of-the-box Planet website will autopost all the RSS streams into one. What the Planet aggregator community has instead is a different problem: by default the stream doesn't offer the ability to make it more individual, for example by adding upvotes or posting a comment.
And surprisingly, these problems are not known in the Facebook community. In Facebook the default setting is that all postings are created manually and can be annotated. The reason why Facebook and Planet aggregators are different is that the technology behind them works differently. The idea of Facebook is that a company provides a website and the user should interact with the website, while the idea of the Planet software is to provide a tool for creating an automated playlist.
Let us go into the details. If the Planet software is installed with its default settings, the input for the software is, for example, 100 RSS streams, each containing 500 postings. After pressing the start button, the Planet software generates the combined feed, which contains 50000 postings. Every Facebook user would be impressed by such a powerful autoposter, but the Planet community doesn't call this principle autoposting, because it is the normal working of the Python script. The reason why such a large-scale RSS feed with thousands of postings is the normal case is that the Planet software doesn't need a website. The idea is not to post the stream to a group; the group has to be created around the Planet software. What most users of the Planet software do is take the created RSS stream and build a website around it. They add the ability to comment on the entries and they render the title headlines.
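For readers who want to see what "default settings" means here: the classic Planet program is driven by a single INI-style configuration file with one [Planet] section for the site and one section per subscribed feed, and whatever is listed there ends up in the combined output. The option names below are from memory and may differ slightly between Planet and Planet Venus, and the feed URLs are placeholders, so treat it only as a rough sketch.

[Planet]
name = Example Planet
link = https://planet.example.org/
owner_name = Site Admin
items_per_page = 60

[https://blog-a.example.org/rss]
name = Blog A

[https://blog-b.example.org/rss]
name = Blog B

Every feed section added here is published without any further review, which is exactly the behaviour the SEO world would call autoposting.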
We can say that both existing technologies have disadvantages. The problem with Facebook-like social networks is that the user has to interact with a GUI and it's not possible to put an RSS stream into a group, while the disadvantage of the Planet software is that the automatically generated playlist contains too much information and manual intervention is not possible. The answer to the problem is simple: we can ignore Facebook and we can ignore the Planet software, and build a news aggregator website from scratch which is superior to both.
Producing information overload and making the reader angry
The core functionality of the Planet software is to combine different blogs into a larger blog. In the easiest case the input consists of only 2 RSS feeds, which are mixed into one. If each blog has around 30 postings per month, the resulting RSS feed will contain 60 postings a month. The average reader can handle this amount of information very well, especially if he reads the RSS feed's source code. The file contains 60 postings; sometimes a post is from blog A and sometimes from blog B.
But what will happen if the concept is scaled up? From a technical perspective the Planet software can handle many more streams. It is possible to combine 1000 blogs into a single stream. Each blog has 30 postings, and the output of Planet will contain 30000 blog posts. The problem is that each entry is from a different blog, and it's very easy to get lost in the information. If the stream is displayed graphically, with previews of text, podcasts and videos, the user is in a typical information overload situation.
Let us make an example. The situation is that 1000 blogs are fed into a single Planet stream and the content is rendered on a website. The blog authors like the idea, because they are now part of a larger community and get a lot of traffic from the users. So they are motivated to post more content on their blogs. They increase their posting frequency from 30 posts a month to 40 posts a month, which can easily be handled by a single author. Now let us take a look at how the resulting combined stream will look. The overall number of postings grows from 30000 to 40000 a month, which means the average reader sees 10000 more postings on the screen and gets completely confused by the information overload. He has no ability to reduce the load, because the blog authors decide how much they will write. What the reader doesn't have is an information filter. He gets bombarded by all the blog content. Some of it is interesting to him, some of it is not.
The effect is an emotional reaction known as the social network effect. The user is fascinated and angry at the same time. He stands in the middle of a tornado and searches for a place which protects him from the information. With classical search-engine-driven filtering it's easy to avoid information overload, because by default a search engine doesn't provide any information: the reader has to actively type in a keyword to see the result page. In the case of a social media stream, the default setting is that all the information is pushed to the user. He can't control the flow.
To overcome the information overload problem of aggregated social media streams, a new kind of technology is needed which combines the power of the Planet software with a mechanism to reduce the amount of information. This new kind of technology has to answer the question of how to transform the RSS feeds of the individual blogs into the combined playlist which is displayed on the website. The best technology available today is to create the combined playlist with a wiki system. The RSS feed of each blog is fed into a MediaWiki, human users curate the content in the wiki, and the resulting wiki page is displayed to the reader.
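In practice that means the aggregator no longer publishes the merged feed directly, but writes candidate links to a curation page through the MediaWiki edit API, where human editors keep or remove them. A rough sketch against the standard action API; the wiki URL and the page title are made up, and a real installation would additionally need a logged-in bot account with edit rights.

# Append one candidate link to a MediaWiki curation page (sketch only).
# Requires: pip install requests
import requests

API = "https://wiki.example.org/w/api.php"       # hypothetical wiki
session = requests.Session()

# every edit needs a CSRF token from the API
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

# append the new entry to the curation page; the editors review it later
session.post(API, data={
    "action": "edit",
    "title": "Aggregator/Inbox",                 # hypothetical page name
    "appendtext": "\n* [https://blog-a.example.org/post Title of the post]",
    "summary": "aggregator: new candidate link",
    "token": token,
    "format": "json",
})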
A wiki system is a very effective filter. Wikis have the ability to organize large groups of editors in a productive way. Wikis also have a tendency to block unwanted content. What most wiki admins do by default is treat any edit as potential vandalism. This strict filtering mechanism results in a minimal wiki, which means that only the important content can pass the wiki filter. The problem of filtering the stream is delegated to the wiki. It is enough to analyze the wiki version history to get an impression of how a group of people does this. That means, if something is wrong with the combined RSS stream, the problem is located within the wiki. And if something is wrong with the wiki, a single wiki admin can be blamed.
This principle prevents the reader from getting angry. If he sees something in the stream which he doesn't like, he can analyze the reason why. He can go through the wiki version history to get an explanation of why the URL was put into the stream and by whom.
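Reading that version history is also just one API call. A minimal sketch, again against the standard MediaWiki action API with the same made-up wiki and page name, which prints when the curation page was edited, by whom and with which edit summary:

# List the recent revisions of the curation page: timestamp, editor, comment.
import requests

API = "https://wiki.example.org/w/api.php"       # hypothetical wiki
resp = requests.get(API, params={
    "action": "query",
    "prop": "revisions",
    "titles": "Aggregator/Inbox",                # hypothetical page name
    "rvprop": "timestamp|user|comment",
    "rvlimit": 20,
    "format": "json",
}).json()

page = next(iter(resp["query"]["pages"].values()))
for rev in page.get("revisions", []):
    print(rev["timestamp"], rev["user"], rev.get("comment", ""))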