One of the biggest difficulties in analyzing and predicting large time-series data is enabling the system to “understand” the interdependencies within the data. Newer artificial-intelligence algorithms can weigh inputs by their importance regardless of their order in the sequence, but with large datasets this proves difficult: a price from 2019, for example, can influence tomorrow’s price prediction just as strongly as a price from 2011.
Our AI model utilizes a transformer architecture that addresses the “memory”, or “attention”, problem of sequence transduction tasks such as neural machine translation, i.e. any task that converts an input sequence into an output sequence. This includes speech recognition, text-to-speech, natural-language processing, and, of course, financial time-series prediction.
Another major disadvantage of conventional sequential models is that sequential computation prohibits parallelization, confining the system to processing data linearly. Our model utilizes a self-attention mechanism with multi-head attention layers, together with a positional-encoding layer that takes the preprocessed financial data as input and embeds it as positional-encoding vectors, giving the model a notion of time order and its interdependencies. This allows the AI to take every input in a time sequence into consideration, precisely capture the interdependencies among the input samples, i.e. the financial data, and produce accurate predictions. One of the major advantages of this architecture over other neural-network architectures is that its attention window spans the entire input sequence, and it can attend to multiple inputs at once. Whereas recurrent networks quickly forget inputs from several time steps ago, the transformer network can take every part of the sequence into consideration.
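The two mechanisms described above can be illustrated with a minimal sketch. This is not the production model, only a toy NumPy example of (a) sinusoidal positional encoding, which stamps each time step with a unique vector so order information survives parallel processing, and (b) scaled dot-product self-attention (here with identity projections for brevity), in which every time step attends to every other time step:

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional-encoding vectors, one per time step."""
    positions = np.arange(seq_len)[:, None]          # (seq_len, 1)
    dims = np.arange(d_model)[None, :]               # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                 # (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])            # even dimensions
    pe[:, 1::2] = np.cos(angles[:, 1::2])            # odd dimensions
    return pe

def self_attention(x: np.ndarray) -> np.ndarray:
    """Scaled dot-product self-attention; every row (time step)
    is a weighted mix of all rows, so no input is 'forgotten'."""
    d_k = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_k)                  # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ x

# Toy sequence: 16 time steps, 8 features each.
x = np.random.default_rng(0).normal(size=(16, 8))
out = self_attention(x + positional_encoding(16, 8))
print(out.shape)  # (16, 8)
```

Because the attention weights are computed over the whole sequence at once, the influence of a sample many steps back is decided by its learned relevance, not by its distance from the present.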
The model is fed a preprocessed and analyzed data frame as input, consisting of technical, fundamental, and sentiment analysis:
• Technical analysis consists of historical prices and trading activity. It identifies trends and patterns in the historical price data and derives technical indicators used to generate trading signals.
• Fundamental analysis looks at the coin itself and its fundamental value.
• Sentiment analysis gauges the bullish or bearish sentiment of the market. This is done by scraping and analyzing segments of the internet, e.g. financial websites and investment forums. The gathered data is then processed by a secondary AI that extracts and quantifies the market sentiment for any given data point.
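A sketch of what such a combined input frame might look like, with one row per time step and one column group per analysis type. All column names and values here are hypothetical, chosen only to illustrate the three feature groups:

```python
import pandas as pd

# Hypothetical daily features; all names and numbers are illustrative.
technical = pd.DataFrame({
    "close": [101.2, 102.8, 101.9],          # historical price
    "rsi_14": [55.0, 61.3, 58.7],            # example technical indicator
}, index=pd.date_range("2024-01-01", periods=3, freq="D"))

fundamental = pd.DataFrame({
    "market_cap": [1.90e9, 1.93e9, 1.91e9],  # proxy for fundamental value
    "active_addresses": [12000, 12500, 11800],
}, index=technical.index)

sentiment = pd.DataFrame({
    # -1 (bearish) .. +1 (bullish), as quantified by the secondary AI
    "sentiment_score": [0.12, 0.45, -0.08],
}, index=technical.index)

# One aligned frame per time step, combining all three analyses.
features = technical.join([fundamental, sentiment])
print(features.shape)  # (3, 5)
```

Aligning the three groups on a shared time index is what lets the attention layers relate, say, a sentiment spike to a price move at the same or a later time step.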
The idea is that, by combining the model’s attention mechanisms with parallelized computation, the AI will be able to extract the interdependencies among fundamental, technical, and sentiment data, giving the model the ability to look at the whole picture and accurately predict price trends.