February 24, 2020

Comparison of RHEL with Debian

The Red Hat based Linux distribution has done everything right. The company earns money by selling software to large corporations, it offers the free Fedora distribution which demonstrates the power of Linux, and it has a well organized professional community. They maintain the issue tracker and fix the bugs.

In contrast, the Debian ecosystem works quite differently. The annual conferences are pure chaos, the Debian software doesn't work as expected, and there are too many packages in the repository. One of them is fuzzylite, a C++ library for creating fuzzy logic applications. This kind of package is no longer needed, but the Debian community seems to have a different point of view on the problem.

In general we can say that Fedora is here to stay, while Debian blocks the Linux ecosystem. Without Debian, Linux would become a success on the desktop. It seems that the combination of a non-commercial project and a stable release distribution isn't the best choice. The problem is that it's not possible to argue against Debian; instead, the project should be ignored completely. After a while the users will recognize that Fedora is much better than Debian.

The problem with Debian is that the community is convinced it is part of a movement. Debian argues against Red Hat.

February 21, 2020

Why Debian is better than Arch Linux

Debian Linux increases the gap between upstream and downstream. This becomes obvious if an upstream developer tries to submit the latest version of his source code. The Debian maintainer prevents him from doing so, because only security fixes are applied to existing Debian installations. In contrast, the Arch Linux distribution works very closely with the upstream. After starting the package manager, the upstream source code gets compiled into the latest Arch Linux binary package.

The obvious question is: why would anybody increase the gap between upstream and downstream? Because the gap can be filled with a community. This community argues for the Linux distribution and against the upstream. The result is a long-running debate held at conferences in which the aim is not to solve conflicts but to increase the misunderstanding. The principle is known from Wikipedia, where the editors are in a permanent conflict with the admins. These opposing social roles are the strength of Wikipedia and of Debian as well.

In the Arch Linux distribution such a community isn't available. Instead, the pacman software connects the user with the upstream releases. Pacman doesn't understand English and won't recognize that the latest kernel introduces new security problems instead of solving the old ones. Only the Debian community is able to do so.

February 17, 2020

Docker images with Node.js

https://frontendmasters.com/courses/complete-intro-containers/alpine-node-js-container/

The classical understanding of an operating system is based on the assumption that a physical computer is used. This perspective was valid in the early 1990s. Since the advent of virtualization and Docker containers the context has changed. It makes no sense anymore to compare operating systems like Windows 10, Ubuntu and Fedora against each other, but it's possible to compare a 3 GB Fedora ISO image with a 50 MB minideb Docker image. This has complicated many things, because it's no longer possible to be part of an operating system community like Debian or Windows, and the system administrator loses all his experience from the past.

The most advanced way of running a webserver in the year 2020 is a combination of Docker, Alpine Linux and Node.js. The resulting image file has a size of less than 50 MB and it doesn't fit into the categories from the past. Alpine Linux is different from Debian or Ubuntu, it has nothing to do with Fedora Linux, and it's not part of the Microsoft world. Instead, the correct classification for such a webserver is that it's virtual, runs with Linux as a kernel, and uses JavaScript to provide the functionality.
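
As a rough sketch of how such a container could be pulled and started (the node:13-alpine tag and the port are only examples; the real size depends on the chosen tag and should be checked with docker images):

docker pull node:13-alpine
docker images node:13-alpine    # shows the image size
docker run --rm -p 3000:3000 node:13-alpine node -e "require('http').createServer((q,r)=>r.end('hello')).listen(3000)"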

What we can say is that the described Alpine Linux project is located in the GitHub universe. It has to do with social coding. This makes it hard for classical communities like Debian users or Windows Server administrators to keep up. Basically spoken, all the manuals written about Windows Server administration and all the information in the Debian forum are useless for understanding Alpine Linux plus Node.js. It's simply a new kind of technology, unknown before.

One option to deal with this challenge is to ignore the problem. An old-school admin can argue that Node.js is not the right choice as a backend programming language, and that only PHP and Perl can be used for productive systems. Another system administrator can argue that open source in general is a bad idea because no support is provided. And a Debian administrator can argue that it's possible to use the netinstall CD from Debian, which needs only 150 MB and can provide a small Debian image as well. All these arguments are wrong; they only try to explain to the world that nothing has changed and that future webservers are powered by the same technology used in the past.

Because so many things are changing all the time, it makes sense to point out a few things that remain constant. The first thing to mention is that the outdated technology and the newer technology of today are both described in books and papers. The library catalog will provide information about how a webserver was installed in the early 1990s, in the mid 2000s and in the 2020s as well. A second thing which hasn't changed in the last 30 years is that all this technology has to do with writing source code. A C program from the 1980s has a similar syntax to a C program from the year 2020. And last but not least, technology is usually connected to the profit interests of companies. The early computer revolution in the 1980s was powered by tech companies listed on the NASDAQ; 40 years later new tech companies are listed on the same stock exchange. Even the open source movement hasn't changed this situation. The flagships of open source development like Red Hat, Amazon and Google are for-profit companies. They are not programming software because it's fun, but because it's their business.

The best idea computer experts from the past have come up with is to attack new developments like Alpine Linux. It's the most hated Linux project because it's so different from operating systems of the past. There are many blog postings available in which long-term experts explain to the world why they stay away from Docker in general and Alpine in particular.

The main difference of Alpine Linux over classical operating systems is that it's very efficient. Installing a new system from scratch takes around 2 seconds and the size is around 30 MB. The exact details of how to do this aren't described in classical Linux manuals from the past; they were written from scratch by the Alpine developers. So they will become the new gurus who tell other users what to enter on the command line.
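
For illustration, bootstrapping a throwaway Alpine environment and adding a package takes only a few commands (the version tag and the package are just examples):

docker run -it --rm alpine:3.11 sh
# inside the container shell:
apk update
apk add nginx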

The only good news about Alpine Linux is that it's based on the Linux kernel, which was written in C. It's only a question of time until Forth developers feel motivated to show the world that they can provide the same functionality with a stack-based approach ...

February 16, 2020

Alternatives to Debian

The Debian distribution is a very influential Linux distribution. It has been available since the 1990s and the successful derivative Ubuntu uses Debian packages as its base. I'm sorry to explain that the Debian ecosystem is no longer stable but has developed into a failed Linux distribution. Instead of explaining the problems of Debian itself, it makes sense to describe the situation in the year 2020 if somebody likes to install a webserver. The interesting point is that a webserver doesn't mean that somebody boots up a Pentium I PC which contains a CD-ROM drive and a magnetic harddrive. Secondly it doesn't mean that somebody is using an SSD drive in a modern rack-mounted server; it means that somebody pulls an operating system into a Docker container and deploys it into the cloud.[1]

A Docker container is a virtualized server. It reduces the costs and has become the de facto standard for webservers. In theory, it's possible to install a Debian operating system in a Docker container, but in reality lightweight distributions like Alpine Linux are used for this purpose. Alpine Linux is a relatively new development. It was first published in 2010 but has become very successful. It's not based on Debian and not on Fedora; it was developed with a different purpose in mind.

What we can say for sure is that Debian can't compete with Alpine Linux. Only another Docker-friendly Linux distribution can compete with Alpine Linux. But if Debian is no longer relevant for a text-only webserver, which purpose is fulfilled by the project? Right, it has become obsolete. If the idea is to provide not a cloud webserver but a desktop operating system, there are many alternatives to Debian. One is Clear Linux. Clear Linux is very different from classical desktop operating systems. It works with a combination of highly optimized binary files plus an incremental update mechanism. Similar to Alpine Linux it's a very new development, not available in the past.

We can say in general that the computing industry has changed a lot over time. Innovations like GitHub, Stack Overflow and Google Scholar were not available 20 years ago. The disadvantage of this development is that projects from the past become obsolete. They no longer serve the needs of today's programmers. Debian is one example of an outdated project. Similar to Windows XP and Google+, the service has lost its influence. The Ubuntu operating system, which includes the well written wiki, was perhaps the greatest milestone in the long history of Debian. Bye.

[1] Zerouali, Ahmed, et al. "On the relation between outdated docker containers, severity vulnerabilities, and bugs." 2019 IEEE 26th International Conference on Software Analysis, Evolution and Reengineering (SANER). IEEE, 2019.

Debian is dead

The amount of subjective judgements about Debian is high. The problem in the past was that no alternative was available and the arguments were not based on academic standards. A user from the Docker community has changed the situation and published a paper in which a newly developed tool was used to scan Docker images for security problems.[1]

Especially Debian-based images were scanned and lots of unfixed security problems were found. Nearly 100% of the Debian Docker images which are hosted in the cloud are vulnerable to attacks. In the same paper a possible alternative was mentioned, which is Alpine Linux [2]. Alpine Linux is different from Debian or Fedora and was developed with Docker containers in mind. It's a security-hardened, slimmed-down version of an operating system and runs in the cloud but also on a Raspberry Pi computer.

The study [1] has shown very concretely what the problem with Debian is. According to the Debian manual, a typical webserver has a Debian sticker on the front panel, contains a CD-ROM drive, and minor updates are left out while only important updates are installed every 6 months. This kind of description was correct in the early 1990s but fails to describe the current situation. Basically spoken, it makes sense to argue against Debian, and the only thing which is important is to bring the argument to an academic level.

[1] Zerouali, Ahmed, et al. "On the relation between outdated docker containers, severity vulnerabilities, and bugs." 2019 IEEE 26th International Conference on Software Analysis, Evolution and Reengineering (SANER). IEEE, 2019.

[2] https://nickjanetakis.com/blog/the-3-biggest-wins-when-using-alpine-as-a-base-docker-image

February 14, 2020

Github has replaced Debian

The first Debian release was published in 1993. At that time, the GitHub repository wasn't available; GitHub was founded in 2008. Let us imagine how bug tracking and patch submission were handled by the Debian community before the advent of GitHub.

Suppose a user has identified a bug or a security issue in one of the Debian packages. What he needs is a higher instance to report the error to. This is equal to the Debian mailing list. The maintainer who receives the bug has access to the source code because it's stored in the Debian repository. So the maintainer can take a look into the code and create a patch. Then the patch is submitted to the upstream.

Basically spoken, Debian was a manual version of the GitHub repository. It was a layer between bug reports, patches and the source code. Since the year 2008 the GitHub project has been available. The consequence was that the Debian layer was no longer needed. A common workflow which includes GitHub works the following way: the user doesn't send the bug to the Debian maintainer but creates a ticket in the GitHub repository. He also has access to the source code, so the user can check out the code and create a patch.

The interesting feature is that GitHub doesn't need human intervention; it works with commands like git clone and git commit. Basically spoken, GitHub has replaced Debian.
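
A minimal sketch of this workflow, using the Geany repository mentioned below as an example (the branch name and commit message are hypothetical; in practice the repository is forked first and the push goes to the fork):

git clone https://github.com/geany/geany.git
cd geany
git checkout -b fix-some-bug      # create a branch for the fix
# edit the affected source file, then:
git commit -am "Fix the reported bug"
git push origin fix-some-bug
# finally a pull request is opened on the GitHub website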

The consequence was that from 2008 on, many so-called rolling release distributions have become popular. The most famous one is Arch Linux. The interesting point about Arch Linux is that the users are not sending bugs to the Arch Linux mailing list and they are not sending patches to the Arch Linux team. If they find a bug or like to fix a bug, they submit the commit directly to the GitHub repository.

Or let me explain it the other way: Suppose there is no GitHub website available. Then it makes sense to invent a Debian-like distribution which is based on bug tracking, stable releases and maintainers. It was not Red Hat that killed Debian, it was GitHub. The thesis is that the GitHub website acts as a Linux distribution. There are places for the X11 source code https://github.com/mirror/libX11 and the Geany text editor https://github.com/geany/geany If the process of bug creation and patch creation is handled by the git tool and the GitHub website, the amount of work located in the Linux distribution itself is lower. A rolling release distribution does nothing else than take the latest GitHub source code and compile it into a binary file. All the communication between users, maintainers and upstream is handled on the GitHub platform, not within the Debian distribution.

The funny thing is that Arch Linux and other distributions are stable release distributions as well. Stable release means that the code was frozen at a point in the past and then only security bugfixes are provided. The feature of freezing the code is called on the GitHub website “creating a new release”. The repository owner can copy the code at a certain time into a zip file which becomes a stable version. This stable release of the GitHub repository is used by a Linux distribution to compile the source code.

The funny thing is that GitHub is not only a website; they are organizing conferences as well. In the year 2000 the Debian maintainers met each other at the Debian conference, and in the year 2017 they meet each other at the GitHub Universe conference. Basically spoken, GitHub has become a universal operating system which bridges the users, developers and maintainers of different operating systems like Linux, Apple, Windows and Android.

According to the latest stats, the Debian community consists of around 1000 developers worldwide plus a large userbase which goes into the millions. And the GitHub community consists of 40 million users, 100 million repositories and nearly all the programmers in the world.

What's wrong with Debian 5 Lenny?

Debian 5 Lenny was released in 2009, which was 10 years ago. A short look at a screenshot will show that the Linux desktop looks very similar to a current desktop. Instead of ext4 the former filesystem was ext3, which worked great too. So what exactly is the difference between Debian 5 and Debian 10? It's the same operating system; only minor improvements were made and security issues were fixed. Apart from the security topic, the average user can work with Debian 5 Lenny very well. He won't miss anything. All the programs like webserver, GIMP, webbrowser and OpenOffice were available 10 years ago.

The good news is that most Linux users have updated their system because it costs nothing. So there is no problem with outdated Debian 5 workstations which are vulnerable to security issues. The problem is that for the last 10 years no improvement has been visible. The former story of highly innovative software development is broken.

To understand what is different we have to take a look into the past. A computer in the year 1980 looked very different from a computer in the year 1990. In 1980 almost no software was available and textual interfaces were the only choice for the user, while ten years later a large amount of software plus graphical user interfaces were the standard. I would go a step further and compare the year 1990 with the year 2000. The difference here was that the price was lower. A typical computer in the year 1990 cost a large amount of money; the price for the average workstation was similar to buying a car. In the year 2000 the same computer system was sold in mainstream computer stores for a low amount of money. That means the customer has seen a large improvement, and the story of a highly innovative computing industry was valid.

This plot has been broken since 2010. It was no longer possible to provide added value for the customer. Today's computers look the same and cost the same as computers ten years ago. The improvements are nearly invisible. And in most cases, the consumers reject the improvements altogether. One example is the introduction of 3D desktops, which is possible with today's computers, but the customers prefer normal 2D desktops because they run faster.

So the question is: what are the future plans of the PC industry? Are there any improvement plans which will make the computer ten years from now a highly advanced system? No such things are available. The PC industry, which includes hardware and software companies, is done. They have done their homework and the market is saturated. That means the PC industry built the market overnight in the 1980s and now the situation is relaxed. Innovation means that something is different from the past, and this is no longer the case.

PC companies are, similar to the car industry, an old business. They have developed their workflows, they produce high quality and they have no need to reinvent themselves.

The last big attempt of the PC industry to develop a new product was the iPhone from Apple. The idea was to provide the customer with something which is new and which makes sense for him. The first iPhone was released in 2007, which was 13 years ago. Since then the PC industry has failed to develop any new kind of product. The problem is that technology can be invented only once. If somebody has already programmed an operating system and manufactured a CPU, it's not possible to promote this technology as something new and exciting.

The question is what comes next after the PC industry. The first thing that is needed is the self-description of software companies that they are done. The products they are developing are known, and these products are very similar to what was available 10 years ago. In some niches like the programming languages C/C++ the software has been the same for 30 years. That means the first C++ compiler was invented in the mid 1980s and since then the C++ language has been used. Pretending that the next iteration of the C++ compiler will revolutionize the PC industry makes no sense.

Instead of describing the situation too negatively, it makes sense to see the positive sides. The PC industry has developed affordable computer technology for the masses. The internet has replaced former television and radio networks and the normal user is invited to become the owner of his own YouTube channel online. Thanks to search engines he has access to more information than in the past, and a modern PC needs less energy than a light bulb. It's not the fault of the PC industry that they have done everything right. The problem is that the world is different than in the 1980s. There is no need to program software or invent a 32-bit CPU, because these tasks are solved already.

Let us describe a well-known phenomenon. A user in the year 2020 comes to the conclusion that he would like to reinvent the PC industry. He programs his own operating system and builds his own CPU in an FPGA device. The result is software and hardware which is not new but repeats the tasks from the past. If the user has done everything right, he has invented his own 16-bit CPU which can run his own MS-DOS-like operating system. This is not the future, but learning from the past. It's similar to playing a game the user is already familiar with: he knows what is needed to solve the puzzle.

The interesting point is that large parts of the amateur tech-enthusiast community operate under this mode. 90% of the questions on Stack Overflow are about issues which have been solved already. The users are solving problems from the past because they want to feel good. What the users are not doing is solving more complicated tasks which are not researched very well. The reason is that complicated tasks result in failure, and failure is equal to frustration. Frustration is something which holds programmers down and it doesn't fit their self-image as competent programmers.

February 13, 2020

Understanding the Debian operating system

The difference between Debian and other Linux distributions is that Debian works with a fixed release workflow. The only way to realize such a release cycle is with human intervention. That means the latest update of Arch Linux (a famous rolling release distribution) can be created by a small team with fewer than 5 humans, while the latest update of Debian requires 1000 Debian developers or more.

The effect of a stable release cycle is that the distribution isn't solving problems but is asking for manpower. That means the Debian ecosystem has a demand for new developers and maintainers, while rolling release systems like Arch Linux have a tendency to reject new users. Instead of analyzing which is the better Linux distribution, we have to go a step backward and describe how commercial software development works.

The most famous operating system in the world is called Windows 10 and it is programmed and distributed by Microsoft. The interesting point about Microsoft is that the company has around 150k employees worldwide and the tendency is growing. This is a clear sign that Microsoft isn't working very efficiently like the Arch Linux and Gentoo communities, but is operating with stable release versions, which can only be created with human intervention. The thesis is that Microsoft has become successful because it is not solving problems, but generating work which has to be done by humans.

To make Linux distributions more successful, a community has to be created. The only way of creating a community is to produce a workload for this community. Debian is great at creating new unfixed issues, while rolling release distributions like Antergos, Gentoo, Clear Linux and Arch Linux have a tendency to reduce the workload, because the new version is created with an automated script.

Or let me explain it the other way around: it's not possible to create a growing community if all the problems are solved by the package manager. And without a community, a Linux distribution will fail.

Cons of Clear Linux

The Clear Linux distribution is similar to Antergos a rolling release model. It has perceived a lot of attention because it is able to outperform Debian desktop easily. It's a bit tricky to describe what is wrong with Clear Linux. The assumption is, that the projects gets terminated within 24 months because it will fail to communicate the idea to a larger audience.

What the Clear Linux website is doing is explaining to the newbies that rolling release updates are here to stay, https://community.clearlinux.org/t/does-clear-linux-have-stable-release/1346/6 If a user likes to switch off the auto-update and tries to backport security fixes into Clear Linux, the administrator of the community forum will teach the user that this is not allowed because it makes no sense. That means the newbies haven't understood what Clear Linux is about.

The problem is that most of the users are interested in deactivating the auto-update feature and in programming backports, because this is equal to building a community around a distribution. This behavior isn't compatible with the Clear Linux project. So the question for the community admins is how to convince the public that rolling release is here to stay. The answer is: they will fail. Antergos has failed because it's not possible to explain what the advantages of rolling release are, and Clear Linux will fail for the same reason.

The combination of open source with a rolling release model for a Linux distribution is an anti-pattern. It will destroy any community.

Quote: "many of you probably noticed over the past several months, we no longer have enough free time to properly maintain Antergos." https://en.wikipedia.org/wiki/Antergos

February 12, 2020

Understanding the Linux ecosystem

Before it makes sense to describe the inner workings of the Linux community, let us investigate how the Microsoft ecosystem was built from scratch. Before the year 1975 there was no Microsoft company, because that was its starting date. Basically spoken, Microsoft has built, together with other large software companies, the PC industry from scratch. They have created a market for software which consists of consumers on the one side who spend money and the employees within the companies on the other side who earn money.

This social relationship was pretty stable over the decades. The names of the products have changed from Windows NT in the early 1990s, to Windows 2000 Professional in the year 2000, to the current Windows 10 Pro which is sold for 400 US$ to the customer. The open question is: which kind of product is sold by Microsoft? It's not the software alone, because the current Ubuntu workstation ISO file available on the internet has nearly the same features as the latest release of the Windows 10 Professional operating system. It's everything around the software, which includes commercial advertisement, printed books, education in the universities and the marketing efforts in the retail stores, that is responsible for the success of Windows.

Basically spoken, if the customer enters the computer store and buys the box with the Windows operating system, he is not only buying source code; he becomes part of a larger marketing concept which is present in journals, television and educational institutions. In contrast, the Linux operating system doesn't have any kind of marketing, or it has very bad marketing. The question is how to realize marketing for open source software. Nobody has an answer to the problem, and for this reason the market share of Linux on the desktop is less than 1% while Windows 10 is used by hundreds of millions of users worldwide.

But what is marketing? Marketing is the difference between the current Ubuntu ISO and the current Windows 10 ISO file. From a technical perspective both software products are nearly the same. Windows 10 Professional comes with the powerful NTFS filesystem, a window manager and powerful network features, and Ubuntu provides all the same features. Some Linux fans argue that the Linux system is superior, because the customer gets more software, doesn't pay anything and also gets access to the source code. But it seems that it's not enough to provide the better software if the merchandise of Microsoft is superior.

How marketing works for closed source software is well known. The efforts of large companies like Apple, Microsoft and Intel are documented. They spend millions of US dollars on advertisement and the result is that the average customer likes to buy the product. In contrast, the Linux ecosystem works without any sort of marketing. Most of the Linux distributions are technically driven. That means a technician has created a piece of software and then he is surprised if the world doesn't like to install the product on its own computer.

Let us take a closer look at what Linux has to offer from a technical perspective. There is a powerful kernel, an advanced filesystem, a state-of-the-art webserver, a C/C++ compiler, an office application, video editing software, drawing software, and much other high quality software. Most of the software in the Linux ecosystem has a higher quality than its commercial counterpart. For example, the GCC compiler is more powerful than the Visual C++ compiler and the ext4 filesystem is more robust than NTFS. But the customer prefers the software from Microsoft. And the only explanation which makes sense is that Microsoft products feel better. They have a better branding. They are already introduced in the market and it's easy for the customer to install the software.

The question left open is how to reduce the entry barrier to Linux. It has to do with marketing, and it can be realized in two ways. One option is to assume that Linux is a commercial product. As a consequence a company like Red Hat has to build a brand on top of Linux. That means Red Hat makes some advertisement for Linux and then the customers will install it on their PCs. The other option is that Linux works differently from classical products and the marketing has to be realized by the user groups. This is equal to the Debian philosophy, in which not a commercial company stands behind the software, but local chapters of volunteers.

To answer which option is the preferred one we have to take a step backward and describe what is wrong with Linux. The good news is that from a technical perspective, Linux is a great piece of software. It has no serious security bugs, it contains hundreds of state-of-the-art software projects and it easily outperforms Windows 10 and Mac OS X. There is not a single case known in which a Linux user has switched back to Windows. If somebody is familiar with Linux he will laugh about Windows 10 because the system is weaker. That means there is no need to improve the source code of LibreOffice, LyX, the Linux kernel or Wayland.

The only thing that is missing in the Linux world is a community. Ubuntu was the first effort to build such a community. The project was successful. Not because Ubuntu has contributed very much to the Linux kernel, but because it has built some advertisement around Linux, which includes a wiki, a forum, some books and some conferences.

Roughly spoken, Ubuntu is nothing else than a marketing campaign to bring Linux into the mainstream. It consists of t-shirts, advertisements, indoctrination for universities and unscientific case studies. The assumption is that not technical innovation, but merchandise will bring Linux to the desktop.

Clear Linux vs Debian

Let us make a practical example to explain the difference between software and marketing. A recent state-of-the-art Linux distribution is called “Clear Linux”, which was published by the Intel company one year ago. Nearly all users of Clear Linux are excited about how fast the software runs. The package manager is superior because it downloads only delta upgrades, and the desktop runs faster than on any other Linux distribution. Clear Linux is more advanced than the current Fedora Linux, so it can be called the queen of all available distributions.
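
For illustration, the update mechanism is driven by the swupd tool; a typical session could look like this (the bundle name is only an example):

sudo swupd update                # fetch and apply the delta upgrades
sudo swupd bundle-add editors    # install an additional bundle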

So the prediction is that Clear Linux will bring Linux to the desktop, right? No, the project will fail, similar to the Antergos project a while ago. Antergos was, like Clear Linux, a very advanced Linux distribution. It was super easy to install, was compiled with the fastest flags and was based on the latest package manager available. The sad news is that Antergos is no longer available, not because of the software but because of the missing community. And exactly this is the problem with Clear Linux as well. The ISO file runs great, but most users who have tested it are not motivated to use it for daily work. The reason is that Clear Linux is technically advanced but there is no marketing community in the background.

The counterpart to Clear Linux is Debian Linux. Debian Linux is, according to most Linux experts, technically outdated. It's a failed project, because the Debian community is focussed on its own problems and not motivated to compile the latest software into an ISO file. The surprising fact is that the Debian community was established 27 years ago; it is one of the oldest Linux projects. And the assumption is that they will survive Clear Linux, Fedora Linux and all the other distributions easily. Basically spoken, Debian has its focus on marketing. Their strength is to provide nice stickers, create podcasts about the Debian operating system and start nonsense debates about the cons of systemd.

Debian is not a Linux distribution but a movement which has in part to do with creating an operating system. It seems that the focus on marketing first and the reduced priority for technical questions has made Debian successful. One episode from a Debian conference will make the point clear. There was a talk in which the speaker on the stage explained that he is not using Debian on his computer, but has installed a normal Windows operating system. At the same time, the speaker was confident to talk about the Debian project.

Or let me explain it the other way around: Debian is not about the Linux software; Debian is about open source software in general, and Linux is only one part of it.

Again, Clear Linux vs. Debian

The comparison between both Linux distributions makes a lot of sense, because each of them represents an edge case. Clear Linux can be introduced as an advanced form of Fedora Linux. Fedora is programmed by the Red Hat company. And Red Hat is the company which is programming the Linux kernel, the GNOME environment and the systemd framework. From a technical point of view, all these distributions are great: RHEL, Fedora and Clear Linux represent the latest technology available. They are equipped with virtualization software like QEMU, are nearly error free and can be installed easily.

The interesting question is: if Fedora and Clear Linux are so great, why is the user base so low? This is indeed a problem. It's an unsolved problem in the Linux ecosystem. There is on the one hand advanced software which is provided for free to the customer, and on the other hand nobody likes to install it. Only to get the facts right: the Clear Linux distribution provides all the features of Windows 10 Professional and much more, but the software is available for free as an ISO download. From a logical perspective, around 100 million Windows users would have to switch to Clear Linux, and all the Ubuntu users as well, because it's the best software ever developed.

Everybody who has analyzed the Linux ecosystem will predict that this switch of the mainstream towards Clear Linux won't happen. Instead the prediction is that the project of the Intel company to bring their own Linux distribution to the market will be suspended within 24 months from today. That means in 2023 nobody will care about Clear Linux anymore. The question is: why?

At the beginning of this blog post a possible answer was given. It makes no sense to compare Linux with Windows from a technical perspective, because most of the user decisions are the result of marketing campaigns. And the reason why Fedora and Clear Linux have failed is because they have poor or even no marketing at all. The absence of marketing is perceived by the average user and he won't install the software on his own PC. Some users will test out the new distribution in VirtualBox, but they will use a different operating system for their daily work.

Under the assumption that marketing is everything in the decision for a Linux distribution, the logical next step is to search for the Linux distribution with the best marketing and the lowest technical performance. And this is – without any doubt – Debian. The quality is the worst of all Linux distributions, and at the same time, the community is the largest.

Can Linux be made lighter?

The Lubuntu distribution needs only 4 GB of disc space in the basic installation. Compared to the 10 GB of a mainstream distribution this is very little. The key is not located in a certain compiler flag but has to do with the selected packages. Lubuntu is based on LXDE and AbiWord, not on GNOME and LibreOffice. This helps to save a lot of space.

Let us take a look at how many dependencies are generated by standard Linux packages. After installing the LyX program, the complete TeX Live software is installed too. Does anybody need such bloatware? A potential alternative is AbiWord, which needs only 20 MB, plus Cherrytree, which is lightweight note-taking software. Sure, LyX provides more features, but the price is that the development team needs to implement all these features. The main problem with Linux is that the number of developers is small and it makes sense to spend man-hours wisely. I do not see that the future of the Linux desktop is about LaTeX, GNOME and Java; it has to do with small software which runs fast.
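
On a Debian or Ubuntu based system the effect can be checked before installing anything; the simulate run only prints what would be pulled in (package availability depends on the configured repositories):

apt-get install --simulate lyx                  # lists the texlive packages which would be installed
apt-get install --no-install-recommends lyx     # skips the recommended extras
apt-get install abiword                         # the lightweight alternative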

Sure, a modern computer doesn't have hardware limitations. If a Linux distribution needs 25 GB of harddrive space nobody cares, because the price for an SSD is low and fast internet connections are available. What is a bottleneck are the lines of code which have to be created. The best example of bloatware in the Linux ecosystem is LaTeX. A closer look into the code will show that the project is way too big. Sure, it can produce beautiful documents, but at which price? Or let me give another example. There are two IDEs available under Linux: the first is the well-known Eclipse software, which is great. A lightweight editor is called Geany, which needs only a small amount of disc space.

Some expert Linux users will argue that they need LaTeX, because otherwise it's not possible to create a well-formatted PDF paper which includes bibliographic references. But let us imagine a world without LaTeX. What the user needs is AbiWord and an additional reference manager. In the reference manager a BibTeX file is stored, and the entries can be copied and pasted into the AbiWord document. The combination of a lightweight reference manager plus AbiWord will need less than 30 MB on the harddrive. There is no need to install LyX, LaTeX, and Computer Modern fonts in high resolution.

Clean up the Linux ecosystem

Writing new code is pretty easy. The result is that hundreds of software programs and thousands of Linux distributions are available, each maintained by one or two programmers. The problem with this mess is that the normal user gets poor software quality. If the manpower is distributed over lots of small projects, much of the source code is redundant.

The better idea is to focus on small but effective projects. The direction is to reduce the source code to a minimum. A concrete example is to prefer wxWidgets over GTK and prefer AbiWord over LibreOffice. The resulting ISO image which contains all the programs is smaller. Most Linux distributions today demand 20 GB on the harddrive. The same functionality can be reached with 10 GB or even with 5 GB for the binary files. From the hardware perspective this is only a minor problem, because most SSDs have enough space, but the advantage is located on the developer side.

Do we need the source code in the LaTeX project? Is Java under Linux a good idea? Technically it works great, but it's the past. The LaTeX ecosystem was created 40 years ago. The programming languages used at that time are outdated. The general question is which programming language and which software make sense. What we can say for sure is that the C/C++ ecosystem makes a lot of sense. It allows creating slim programs which run fast.

One explanation why the average Linux desktop is so huge is that the programs were created before the advent of the internet. In the past, the user had no internet access. The result was that all the programs were needed locally. This motivated the developers to create large-scale operating systems with many programs.

February 11, 2020

Extending learning from demonstration into Reward learning from demonstration

The vanilla learning-from-demonstration idea is about recording the human's demonstration and replaying the trajectory on the robot. For example, the human operator executes a trajectory (100,10),(150,30),(150,80) and these waypoints are used to control the robot's arm.

The disadvantage is that the connection between demonstration and replay is very static. One option to avoid this drawback is to use the demonstration as an indirect pathway. In the literature the concept is called reward learning, and the idea is to create a heatmap. The heatmap allows finding many different trajectories which all bring the robot into the goal state.

A heatmap, aka costmap, is a visual representation of a learned cost function. The idea is that colors from green to red are shown as an overlay picture over the normal map. The information about which pixel gets which color is given by the demonstration of the human operator. Basically spoken, the human demonstration creates a path in the map, and the path is extended to a colored heatmap. A trajectory planner like RRT is used to find a path in this map.
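
One very simple way to turn the demonstrated waypoints into such a costmap is a distance transform (this is only a sketch, not the method of a particular paper): cost(x,y) = min over all waypoints i of sqrt((x - x_i)^2 + (y - y_i)^2). Cells near the demonstrated path get a low cost (green), cells far away get a high cost (red), and the RRT planner then searches for a path with a low accumulated cost.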

Clicker training with dogs

In animal training there is a powerful technique available called clicker training. For the newbie the technique is hard to understand. The human trainer uses a noise-making device and feeds the dog some cookies. After a while the dog is able to do lots of tricks. But how does it work from a technical perspective?

Teaching skills can be done in two forms: direct and indirect. Suppose the idea is to explain how to move from start to goal. This can be done by giving direct commands. At first, the dog has to walk 10 meters ahead, and then he has to go left for 5 meters. The problem with this method is that the explanation can't be adapted to new situations. For example, if the pathway is blocked, it makes no sense to walk 10 meters ahead. So the question is how to give a tutorial which is more flexible.

Clicker training works with a cost map. What the human trainer is doing with the noise-making device is producing a cost map for the dog. He sets reward points on the map. A reward is a situation in which the dog gets a cookie. The dog labels the point on the map with the positive reward. In the replay mode, the dog approaches all the +1 rewards on its reward map and this makes the human trainer happy.

In the case of spatial maps, it's not very complicated to imagine such a map. In abstract situations the map looks more complicated. For example, if the goal is not to reach a point in space, but to walk in a circle, it's an abstract behavior. If the dog is smart he can create the cost map for such abstract tasks as well.

Creating a minimal Linux kernel

In Linux forums an interesting challenge was announced: creating a minimalist Linux kernel, https://www.quora.com/What-is-the-smallest-in-size-Linux-kernel The idea is to compile the kernel from scratch, leave out most of the modules and hope that it will fit in under 10 MB of space. Whether this task makes sense for productive systems is open, but it makes sense for learning how the Linux system works.
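
A rough sketch of such an experiment with a recent kernel source tree could look like this (the exact configuration options depend on the kernel version and the target hardware):

make tinyconfig                  # start with the smallest possible configuration
make menuconfig                  # enable only the drivers which are really needed
make -j$(nproc)                  # compile the kernel
ls -lh arch/x86/boot/bzImage     # check the size of the resulting image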

In other forums it was explained that distributions like LFS and Gentoo support this attempt. Other Linux distributions like Tiny Core Linux and Puppy Linux are the result of the goal to create a minimalist system. The sad news is that no performance improvement is visible. The runtime of a program is the same whether a 10 MB Linux kernel or a 1 GB Linux kernel was booted. Unfortunately, such a system can't be used for older computers either, because even if the kernel runs well on a 486 machine the user can't utilize it: a 486 PC has no USB connector, no LAN card and no high resolution graphics card. In the end, the experiment results in a lesson learned and the user will switch back to his normal COTS Linux distribution which includes GNOME and a full-scale kernel.

But what will happen if the challenge is taken up with the opposite aim: how to create the most complete Linux kernel which includes all the modules?

Some remarks about Clear Linux

The Intel-based Clear Linux is the rising star in the Linux desktop ecosystem. Instead of describing the technical side, which has to do with installing the system on a real PC, the more interesting feature is the background story. Clear Linux has two interesting features: the first is that, in contrast to Debian, the system wasn't developed by amateur volunteers but by professional Intel programmers; secondly, the idea is not to provide a Linux distribution which runs on outdated PCs from 20 years ago, but one compiled with the latest compilers and the most advanced flags available.

The story makes a lot of sense and it will force other Linux distributions to go in a similar direction. The interesting point is that the Clear Linux distribution and the idea of the open source ecosystem fit well together. Red Hat was the first company to prove that Linux can be a commercial product. Intel has become the second company going in that direction. The unsolved problem with Clear Linux is that a desktop Linux is addressed to the end user. And this audience is difficult to handle. The dominant reason why Microsoft is successful is because it is able to provide additional help to the normal consumer. It's hard to tell if Intel can do the same.

Suppose an end user has a problem with Clear Linux. Does he get a forum or a repository server? Does Intel provide a warranty for the software? All these points are unsolved. Let us take a look at a similar project, called Antergos. This was a minimalist desktop operating system based on Arch Linux. The problem with Antergos was that it was canceled after a while. Not because of the software itself, which was state of the art, but because of the unsolved communication between the project and the end user.

If a company publishes software to the market, the company has to provide support as well. It will depend on this issue whether Clear Linux becomes successful or not.

Let us describe the situation in detail. The Clear Linux website provides an about section https://clearlinux.org/community/bug-report in which the user is asked to send bug reports. He can do so with the GitHub issue tracker or with e-mail. What will happen if thousands of users install the software and send bugs to the company? Can Intel handle feature requests from normal end users who are not developers?

February 08, 2020

Transition from Python to Javascript

Python is known as an easy-to-learn programming language which is available on all operating systems and has object-oriented features. Creating new software in Python isn't recommended, because the JavaScript language is the natural successor. It's also available on all operating systems, provides object-oriented features and has two additional advantages:

1. GUI applications can be created much more easily compared to Python. In Python a GUI depends strongly on the underlying framework, which is tkinter, python-gtk or Python under Windows.

2. the execution speed is faster, which makes JavaScript the preferred choice for productive source code which can't be realized with a Python program.

Sure, Python is used by many programmers, but the latest version, which is Python 3, can't compete with JavaScript and Node.js. What we can say about the JavaScript language is that it's not only used for prototyping new applications; JavaScript results in working code. In contrast, a program written in Python needs to be converted into a faster programming language such as C++, Java or C#. This additional step isn't needed for JavaScript, which runs great in the normal JavaScript interpreter.

Another interesting aspect is the Node.js runtime environment. It's possible to configure the engine so that it's integrated into the programming IDE. The user types the source code not into a webbrowser but into a normal IDE, and after pressing the run button, the status box shows the result. This allows a fast edit-compile-run cycle. It's basically the same workflow as programming code in Python. The only thing that is different is the missing tab spacing.
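
A minimal sketch of this edit-run cycle on the command line (hello.js is a hypothetical file name):

echo 'console.log("hello from node.js");' > hello.js
node hello.js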

Can a computer virus infect humans?

In a comic book about a super-human AI it was explained that robots are able to rescue mankind from a virus, Mitchell Kwok: Superai #20:

http://www.lulu.com/shop/mitchell-kwok/superai-20/ebook/product-24420726.html

It was left open whether the virus is a computer virus, a fictional one or a biological one. So the first question to answer is: can a computer virus infect biological entities? A while ago, in a hoax blog post it was explained that a computer virus is able to spread to humans and animals,

https://www.scmagazine.com/home/security-news/lethal-computer-virus-spreads-in-humans/

But this can't be realized in reality, because a computer virus is located in the RAM of a computer while biological viruses are based on biological cells. What is needed to overcome the barrier between silicon and cells is a "brain computer interface". That is a device which was invented with the purpose of receiving and transmitting signals into the brain. If such an interface is used to control something, it's called in the literature "brain stimulation". Brain stimulation means that a program is started on the computer, the BCI interface sends signals to the brain, and then the biological lifeform does something.

Exactly such a pipeline could be used by a virus as well. In science fiction movies like The Lawnmower Man (1992) and eXistenZ (1999) such a device was used. It was left open whether a normal MS-DOS PC can be reprogrammed in a way that the Soundblaster card acts as a brain-machine interface. But suppose it's possible: then a computer virus would be able to spread from computers into humans and vice versa. The result is a mashup of two different film genres: Terminator II, in which an artificial intelligence takes over the world, and a zombie apocalypse movie like 'I Am Legend (2007)', in which a biological virus transforms humans into the living dead.

February 07, 2020

Linux software for creating text documents

The Linux operating system is known for its large amount of user software. In contrast to the Windows and Apple world, the average program is available for free. That means after clicking the download button in the app repository no extra costs are charged. The disadvantage of open source software is that the amount of users is low, and most programs are not documented very well. But this can be changed. In this blog post, some text programs are presented to the normal user.

- Lyx

- LaTeX

- Geany

- pandoc

- LibreOffice writer

All of these programs allow entering text with a keyboard. The difference is that pandoc, Geany and the LaTeX programs are markup oriented, while the other tools on the list have powerful graphical features. The LyX program stands somewhere in the middle and it's the most powerful tool of them all.

The LyX program is able to render images in the text, provides an outline on the left side and can manage external literature references provided by BibTeX.

The other programs on the list also support the creation of text documents. The LibreOffice Writer program comes very close to the Microsoft Word program. It's a typical WYSIWYG application which can import and export most of the text formats like RTF, PDF, plain text, MS Word and so on. The disadvantage of LibreOffice Writer is that it's cumbersome for complex layouts. If someone likes to format a complete book he will have to click through lots of menus. Only occasional users who are using a desktop PC as a replacement for a classical typewriter will benefit from the program.

The Geany program isn't a classical word processor; it's mainly a programming IDE which can open text files as well. Its strength is that many tabs can be opened at the same time, and that powerful regular expressions are available for search-and-replace actions. What the user has to do before he can type in normal text is to activate the “line wrapping” option. This transforms the program into a normal text editor. The strength of the Geany software is that it provides only the minimal functionality. In contrast to LyX or LibreOffice Writer it's a small program which allows the user to create only text files but nothing more.

The ability of Geany to print out text documents or create nicely formatted PDF papers is low. It can be realized with external tools like pandoc, otherwise the user gets only a basic layout. If the idea is to create text documents with a minimalist approach, the Geany software is here to stay.
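
A sketch of this minimalist workflow, assuming a markdown file written in Geany (notes.md is a hypothetical file name; the PDF output additionally requires a TeX engine on the system):

pandoc -s notes.md -o notes.html     # standalone HTML document
pandoc notes.md -o notes.pdf         # PDF, rendered through a TeX engine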

At the end, I'd like to provide some information about classical UNIX text processing tools like Emacs, Vim, ed and others. They are no longer relevant. Emacs was developed for text-only terminal computers in the 1980s. With the advent of modern GUI systems like GNOME, Emacs has lost all its fans. It was a great tool in the past, but nowadays there is no advantage over other programs like Geany, Eclipse or the Atom text editor. In the Windows world the Notepad++ text editor has become the standard for editing longer text documents. The software can be installed on Linux as well, but compared to Geany there is no advantage.

The good news is that the amount of text editors for Linux is very large. If the user isn't satisfied with the described programs he can choose the perfect tool from hundreds of programs. Nearly all Windows text programs have a Linux port and many other open source text programs were written in the past.

The borg backup tool

In contrast to the zip program, the borg backup tool works with an incremental algorithm. The advantage is that a backup is created within seconds, not within hours. On the command line only two actions are needed:

- init (without password): borg init -e none borg-backup/

- backup:

borg create --compression lz4 borg-backup::2020-01-25 /home

What the user has to do is change the date in the second command before he creates a new backup. The date is used to identify the snapshot. The main feature of the borg software is that if the user doesn't change the home folder or simply adds a new file, the backup is created within seconds. This is important because in most cases the backup is not stored on a local harddrive but on external media which are plugged in over USB or connected over the LAN. Creating a backup with the normal zip tool results in a high load on the CPU and the LAN connection as well. With the borg software this is no longer the case and it's possible to create a snapshot very frequently.

The interesting fact is that the borg software is nearly unknown in the computer mainstream. Apple users are proud of the Time Machine software which does the same, and Windows users have to install proprietary software from a certain company which won't be available anymore within the next 2 years. It's not hard to predict that the borg software will replace all these backup tools, similar to what git has done for version control systems. Another command makes sense in the context of borg:

borg list borg-backup/

will show all the snapshots stored in the directory. It does the same as “git log”: it provides an overview to the user.
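
For completeness: restoring files from a snapshot works with the extract command, shown here with the archive name from above (borg extract writes into the current working directory; the sub-folder path is only an example):

borg extract borg-backup::2020-01-25             # restores the whole snapshot
borg extract borg-backup::2020-01-25 home/user   # restores only a sub-folder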

Why wikis won't work

Apart from Wikipedia, wikis are used by almost no one. That is surprising, because at first look they are an innovative example of groupware. A wiki allows editing documents in a team, all the versions are tracked and it's possible to use categories, images and a powerful markup language. To investigate why wikis never were a success we have to investigate a potential use case in which the principle makes sense.

Suppose a single user wants to write down his personal notes. Instead of using an outline editor, he uses a local wiki. The categories correspond to the sections, and in the wiki the single user notes down key points. Such a use case makes sense. Wikis have the same or even better functionality than traditional note taking software. At the same time it gives a hint where the limit is. In most cases a note taking software is used by a single person, but never by a team. Most of the existing note taking and outline programs are single user applications. The idea is that only one person creates a personal mind map and fills it with knowledge. The reason is that the written key points of person 1 don't make much sense to person 2. Even if person 2 technically has access, he is not interested in extending the notes of somebody else.

The normal form of communication between humans is not key points but full sentences. E-mails, books and memos are written in natural language. This format is expected by the other side. That means a person makes personal notes for himself, but if he wants to provide the knowledge to a larger audience he has to convert the notes into sentences.

In theory, it's possible to fill a wiki with full sentences, but the advantage over a word processing tool is small. A short comparison explains the situation:

- wikis are strong for note taking purposes which are based on keywords

- group communication is based on full sentences

Basically speaking, a wiki is able to replace stand-alone note taking software, but a wiki can't replace e-mails or written PDF documents.

A stripped down version of Linux

The Slackware Linux distribution has become famous because it provides a minimal Linux system. It's possible to install the software without the X Window System and without programming tools. This saves a lot of space on the hard drive:

https://www.linuxquestions.org/questions/slackware-14/new-minimal-install-of-slackware-14-2-a-4175586921/

But even a minimal Slackware installation needs too much space. The base system needs 600 MB, and if the X Window System plus some basic apps are needed, the user needs a hard drive with more than 2 GB. The advantage over a modern Fedora distribution is not very big: a complete Fedora Linux which includes Gnome, TeX, the GNU compiler and lots of other programs needs only 15 GB in total. So the basic question is: why does a Slackware system need 600 MB and more?

It has to do with modern software development, which is built on libraries, the GNU Compiler Collection and the Linux kernel. All of these techniques use a lot of disk space. For example, in a normal Linux distribution the directory for all the libraries takes around 2 GB of disk space.

The good news is that a powerful alternative to Linux is available, and it is called Forth. A Forth system in the year 2020 needs the same amount of disk space as a Forth system 30 years ago. In most cases it fits in under 100 KB. The disadvantage of Forth is that it's more complicated to learn, because Forth is less documented than Linux and the number of libraries is small. The advantage is that Forth provides software which looks very different from Slackware Linux. Forth has much in common with the early MS-DOS operating system, which consists of 3 files and fits on a single 3.5 inch floppy disc. The main strength of Forth is that it's hard to write bloatware in it. All existing Forth programs are very small.

February 05, 2020

Programming with multiple demonstrations

In the domain of robotics there is a technique called programming by demonstration. The idea is to provide the robot's trajectory in advance, and the only thing the robot software does is replay the given trajectory. That means the overall system can't be called an intelligent robot; it has more in common with a numerically controlled machine.

Let us describe the idea from a programmer's perspective. The first thing to do is to provide a trajectory in a Python script:

trajectory=[(0,0),(10,4),(20,6),(10,5)]

What the robot does is follow the points in the list. The interesting question is how to use this idea for more advanced robot control applications. Let us observe what happens if the robot's environment differs from the one in which the trajectory was recorded, for example if an obstacle appears which wasn't anticipated by the default trajectory. The result is that programming by demonstration fails: in replay mode the robot gets blocked by the obstacle.
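For reference, plain replay is nothing more than a loop over the stored waypoints. A minimal sketch in Python; the move_to function is only a stand-in for whatever motion command the real robot API provides:

def move_to(x, y):
    # Stand-in for the real motion command; here it only prints the target.
    print("moving to", x, y)

def replay(trajectory):
    # Replay mode: visit every waypoint in the recorded order.
    for x, y in trajectory:
        move_to(x, y)

replay(trajectory)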

The answer is not to reject programming by demonstration in general, but to improve some details in the Python script. A possible way of dealing with new situations is to use multiple demonstrations which are enhanced with preconditions and divided into skills. The resulting Python data structure is:

skill_library = [
    {"name": "skill1",
     "precondition": "no obstacle",
     "trajectory": [(0, 0), (10, 4), (20, 6), (10, 5)]},
    {"name": "skill2",
     "precondition": "with obstacle",
     "trajectory": [(0, 0), (60, 4), (70, 6), (10, 5)]},
]

This data structure works in different situations. In replay mode the robot checks which trajectory can be applied. Such a skill-based data structure makes it possible to execute more complex tasks. The underlying principle has to do with multiple demonstrations. Vanilla "programming by demonstration" works with a single demonstration, which means only one trajectory is stored in the Python program; this trajectory is either replayed or not. In the second case many trajectories are provided.
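A minimal sketch of this selection step, assuming the skill_library list from above, the replay function from the earlier sketch and a hypothetical detect_obstacle() sensor check:

def detect_obstacle():
    # Stand-in for a real sensor reading; here the result is hard-coded.
    return True

def select_skill(skill_library):
    # Pick the first skill whose precondition matches the current situation.
    situation = "with obstacle" if detect_obstacle() else "no obstacle"
    for skill in skill_library:
        if skill["precondition"] == situation:
            return skill
    return None

skill = select_skill(skill_library)
if skill is not None:
    replay(skill["trajectory"])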


To understand the idea of "programming by demonstration" better, it's important to think about situations in which the technique doesn't work. Constructing a counterexample for a single demonstrated trajectory is not very complicated: we only have to imagine an environment in which the static trajectory fails. Inventing a counterexample for the second case is much harder, because the data structure is prepared to react differently.

A possible failure case is that the obstacle in replay mode is larger than in the demonstration. That means the replay mode detects the obstacle and executes the second demonstration, but it doesn't work. This results in the need for a more advanced method of storing the demonstrations. One option is to increase the number of demonstrations to 10, including larger obstacles; the other option is to extrapolate the skills to new cases.

February 04, 2020

What is the price for a cold medical plasma device?

Cold medical plasma is a relatively new technology for treating many diseases. It can be used for healing wounds and deactivating bacteria. A commercial device is sold by http://www.plasmamedicalsystems.com/ Unfortunately, the price list is hidden behind a password-protected website. But according to the pictures in the catalogue, I would guess that a device costs around 20k US$. It seems that the technology is available and is sold occasionally, but it is not available on Amazon and is only sold to a limited audience.

Can't recommend Fedora 31 update

The update from Fedora 30 to Fedora 31 was a disaster. First, the update manager downloaded 4 GB of data from the Internet without showing the progress or the reason why so many packages were needed. Secondly, after the installation the LyX software was broken.

In the first trial, LyX had problems with background flickering. A first fix to the problem introduced a new one: copy & paste stopped working. Right now, the LyX software isn't working at all. So the advice is that anybody using Fedora 30 should rather wait until Fedora 32 is published and not update too early.

The only advantage of the update was that the home folder wasn't removed, which means all the data is still available. But it seems that the maintainers are not sure whether this works in every case, so they recommend more than three times to back up all the data. Perhaps because in the past users have lost their home folder?

X11 works

The problems with the LyX software can be fixed by switching back from Wayland to the X11 environment. This can be done in the login screen: a settings button lets the user choose between Gnome under Wayland, Gnome Classic under X11 and Gnome under X11. If he selects the X11 version, LyX runs normally. That means the graphics are displayed correctly and the copy & paste function works. It seems that the problem is located within the Wayland display server.

February 03, 2020

BibTeX to MediaWiki converter

hide url:

line

Helpdesk game

newline on dialogue turn: