As Elon Musk aptly put it: technology doesn’t just happen. It develops because a lot of people work very hard on it.
This applies to a wide variety of technologies that we usually take for granted and can hardly imagine our daily lives without. Many of them rest on sophisticated computational methods, which in turn are made possible by the fact that computers with extremely high processing capacity exist and are easily accessible to us.
The methods used for neural networks and artificial intelligence are not new. They date back to 1956 and the big bang of artificial intelligence, the Dartmouth Summer Research Project on Artificial Intelligence (AI). The mathematical foundations, however, are much older; they can ultimately be traced back to Jacob Bernoulli, Kurt Gödel, Leibniz and Newton. What has changed is the computing power available to us. The first iPhone, which came onto the market in 2007, had more computing power than the entire IT infrastructure with which mankind reached the moon. In addition, there is a never-ending drive to develop software and make it freely available. This technological momentum produced freely accessible libraries for Natural Language Processing (NLP) and, ultimately, the publication of neural networks as open-source software.
A look at the image management of a current smartphone reveals many of these possibilities. All images are recognized, categorized and linked. A current smartphone can search for and find pictures of a person on the device based on their baby photo, even though it was taken many years ago. This is a very complex task even for adult humans, since a child’s facial morphology undergoes massive changes between the ages of 1 and 10. Behind this capability lies unsupervised learning in a neural network.
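The underlying idea can be sketched in a few lines: photos are mapped to embedding vectors, and an unsupervised algorithm groups similar vectors without ever being told who is who. The following is a minimal, self-contained sketch; the 3-dimensional "embeddings" and the similarity threshold are invented for illustration (real face embeddings have hundreds of dimensions and come from a trained network).

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity of two embedding vectors, independent of their length.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def cluster_embeddings(embeddings, threshold=0.8):
    """Greedy unsupervised clustering: assign each embedding to the first
    cluster whose representative is similar enough, else open a new one."""
    clusters = []  # each cluster is a list of photo indices
    reps = []      # one representative embedding per cluster
    for i, emb in enumerate(embeddings):
        for c, rep in enumerate(reps):
            if cosine_similarity(emb, rep) >= threshold:
                clusters[c].append(i)
                break
        else:
            clusters.append([i])
            reps.append(emb)
    return clusters

# Synthetic "face embeddings": two people, each photographed three times
# with small variations (hypothetical data for illustration only).
rng = np.random.default_rng(0)
person_a = np.array([1.0, 0.0, 0.0])
person_b = np.array([0.0, 1.0, 0.0])
photos = [person_a + rng.normal(0, 0.05, 3) for _ in range(3)] + \
         [person_b + rng.normal(0, 0.05, 3) for _ in range(3)]

print(cluster_embeddings(photos))  # → [[0, 1, 2], [3, 4, 5]]
```

No labels are ever provided: the grouping emerges purely from the geometry of the embeddings, which is what "unsupervised" means here.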
The Furche time machine makes use of this method: content is semantically analysed and automatically categorized. The fundamentals and algorithms of semantic content analysis have been in use for about 20 years. Since 2000, the search engine giant Google has been researching semantic search techniques that do not rely on keywords. In 2009, Marissa Mayer spoke publicly for the first time at a conference about the plan to move beyond keyword searches and introduce new methods of recognizing similarities. In 1999, a research project at the Vienna University of Technology (TU) needed 36 hours on the TU server cluster to analyse a single article of 5,000 characters.
For this reason, the use of semantic systems was for a long time reserved for a handful of institutes with the corresponding computer centres: the computing power required is immense. The giants of the industry (Google, IBM, Intel) have been working for years on processors better suited to neural networks. As a rule, graphics chips (GPUs) are still used today, because they are optimized for vector operations and can therefore perform the relevant arithmetic faster. The next generation is being brought to market as the Tensor Processing Unit (TPU) and massively accelerates the calculation of tensors (put simply, vectors in multidimensional spaces).
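What "calculating tensors" means in practice can be illustrated with a short NumPy sketch; the array shapes here are arbitrary and chosen only for illustration:

```python
import numpy as np

# A tensor is an n-dimensional array. Here: a stack of 2 matrices,
# i.e. a 3-dimensional tensor of shape (2, 3, 4).
batch = np.arange(24, dtype=float).reshape(2, 3, 4)
weights = np.ones((4, 5))  # a plain 2-D weight matrix, shape (4, 5)

# One batched matrix multiplication replaces nested loops over rows and
# matrices -- exactly the kind of massively parallel arithmetic that
# GPUs and TPUs accelerate in hardware.
result = batch @ weights   # shape (2, 3, 5)
print(result.shape)
```

A neural network's forward pass is essentially a long chain of such operations, which is why specialized hardware for them pays off.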
Let’s go back to the Furche time machine. After the articles were digitized, they were analysed semantically. Various databases serve as an ontology here, so that the similarities of terms can be exploited in their semantic meanings: “Python” can be a snake or a programming language that is used extensively in the field of artificial intelligence and neural networks. The Furche Timeline, or Navigator as it is now called, is based on these technologies. A neural network is used to calculate how discussions change over time. This analysis of discourse shifts is made possible by applying semantic content analysis in such a way that we can ultimately calculate what users are interested in when they consume certain content. Since September 11th, 2001 at the latest, the term “terror” has been inseparably linked with Islamism in our minds. The Furche Timeline shows how this term has shifted and which stations it has passed through since 1945. The contents written about it reveal these shifts, and the methods of NLP allow us to build an automated discourse analysis from semantic procedures.
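How an ontology helps decide which meaning of “Python” a text intends can be sketched with a toy, Lesk-style overlap method: pick the sense whose description shares the most words with the sentence’s context. The mini “ontology” and its glosses below are invented for illustration and stand in for the large knowledge bases real systems use.

```python
# Toy sense inventory: each sense maps to a set of gloss words
# (a hypothetical stand-in for a real ontology such as WordNet).
SENSES = {
    "python (snake)": {"snake", "reptile", "constrictor", "species", "animal"},
    "python (language)": {"programming", "language", "code", "software",
                          "neural", "library"},
}

def disambiguate(word, sentence):
    """Return the sense whose gloss overlaps most with the context words."""
    context = set(sentence.lower().replace(".", "").split())
    best, best_overlap = None, -1
    for sense, gloss in SENSES.items():
        overlap = len(context & gloss)
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

print(disambiguate("python",
                   "Python is a programming language used for neural networks"))
# → python (language)
```

Applied across decades of articles, the same principle lets a system track which senses and associations of a term like “terror” dominate in each period.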