It’s true.
Origins
Some people aren’t aware of this, but that same neat feature that remembered what you searched for and filled it in for you later… is where modern “Artificial Intelligence” can trace some of its earliest roots.
Predictive text truly took off when mobile phones became commonplace.
What’s that old saying?
Necessity is the mother of invention.
Businesses realized that to retain users, they needed to make typing on these tiny devices with numeric keypads easier.
By using partial input to predict the words users intended to type, these systems were meant to help people overcome the cumbersome reality of texting on a numeric keypad.
At first, very simple algorithms made predictions based on statistical analysis of single-word frequency. Although these early predictive text applications could provide simple word-level predictions, and they were great for search engines, they lacked any ability to track conversational context.
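To give you a feel for how simple that era was, here is a toy sketch of frequency-based completion in Python. It’s my own illustration, not any vendor’s actual algorithm: it just suggests the most common word in a tiny corpus that starts with whatever you’ve typed so far.

```python
from collections import Counter

# Toy corpus standing in for a real usage history or dictionary.
corpus = "the cat sat on the mat then the cat ran to the car".split()
word_counts = Counter(corpus)

def suggest(prefix):
    # Rank candidates purely by overall frequency -- no context at all.
    candidates = [w for w in word_counts if w.startswith(prefix)]
    return max(candidates, key=word_counts.__getitem__, default=None)

print(suggest("ca"))  # -> 'cat' (seen twice, beating 'car' seen once)
```

Notice that “cat” wins over “car” purely on raw counts; the words around your input play no role whatsoever.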
Evolution
Predictive text algorithms gradually improved as computing power increased. Researchers moved to more advanced techniques, including n-gram language models, which took the probability of whole word sequences into account rather than relying on the frequency of each word in isolation.
By using the context of the words that came before, these models produced noticeably better suggestions.
Even so, they were unable to provide intelligent, coherent replies to user input.
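To see why, here’s the n-gram idea in miniature: a toy bigram model (real systems used larger n, smoothing, and vastly bigger corpora) that predicts the next word from counts of which word tends to follow which.

```python
from collections import Counter, defaultdict

# Invented mini-corpus for illustration.
sentences = ["i am on my way", "i am almost home", "i am on the bus"]

bigrams = defaultdict(Counter)
for s in sentences:
    words = s.split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1  # count how often `nxt` follows `prev`

def predict_next(word):
    # Context is exactly one word deep -- the model's entire "memory".
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("am"))  # -> 'on' (follows 'am' twice vs. 'almost' once)
```

A model whose memory is one or two words deep can finish your text message, but it can never hold up its end of a conversation.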

Introduction of GPT
2018 saw OpenAI’s breakthrough: the introduction of the first GPT model. The Generative Pre-trained Transformer, or GPT for short, was a major development in NLP. In contrast to conventional predictive text systems, GPT pre-trained its neural network using unsupervised learning on vast volumes of web text.
During the pre-training phase, the model was exposed to a wide variety of language patterns across a significant portion of the publicly published web, which let it acquire syntax and grammar and, for the first time, piece together context.
Transformers, a deep-learning architecture created especially for sequence-to-sequence tasks, served as the foundation of the OpenAI model. By capturing dependencies between words anywhere in a sequence, Transformers produced amazingly lifelike output and revolutionized natural language processing.
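For the curious, here’s a stripped-down sketch of the self-attention step at the heart of the Transformer. The vectors and sizes below are invented for illustration, and real models learn three separate query/key/value projections rather than reusing the raw embeddings as this toy does.

```python
import numpy as np

def self_attention(x):
    # x: (seq_len, d_model) -- one embedding vector per token.
    d = x.shape[-1]
    q = k = v = x  # real models use three separate learned projections
    scores = q @ k.T / np.sqrt(d)  # how strongly each word attends to every other word
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ v  # each output row blends information from the whole sequence

tokens = np.random.randn(5, 8)  # 5 tokens, 8-dim embeddings (arbitrary toy sizes)
print(self_attention(tokens).shape)  # (5, 8): every position "sees" every other
```

That last comment is the whole trick: unlike a bigram model peeking one word back, every position weighs every other position in the sequence at once.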
We Know Where We Are, But Where Are We Going?
From predictive text to ChatGPT, the simple function of word suggestion has evolved into an intelligent conversational partner. It’s kind of nuts when I think about it.
With ChatGPT, OpenAI has produced an AI assistant that can provide insightful and interesting answers on just about any topic you throw at it.
Even though GPT-based models like ChatGPT have advanced significantly, they still have drawbacks.
They remain sensitive to exactly how a prompt is worded, and they sporadically produce inaccurate or nonsensical output.
This is because, behind the beautiful facade of coherent, rational responses, a GPT model is still a statistical calculation engine: it transforms words into numeric values and attempts to predict probabilities with advanced algorithms.
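Here’s a tiny, deliberately fake illustration of that point, with an invented four-word vocabulary and random weights: words become vectors, and the “reply” is nothing more than a probability distribution over the next token.

```python
import numpy as np

vocab = {"the": 0, "cat": 1, "sat": 2, "mat": 3}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))      # each word -> a vector of numbers
output_weights = rng.normal(size=(4, len(vocab)))  # maps a vector back to word scores

context = embeddings[vocab["the"]]             # the "conversation" encoded numerically
logits = context @ output_weights              # a raw score for every word in the vocab
probs = np.exp(logits) / np.exp(logits).sum()  # softmax: scores -> probabilities

for word, i in vocab.items():
    print(f"P({word!r} | 'the') = {probs[i]:.2f}")
```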
It may seem like your conversational partner can parse context and understand you, but it’s really just number sequences arranged in a beautiful simulacrum of human understanding.
There is no passenger, just a vehicle.
Maybe one day that will change, but for right now, for all intents and purposes, you can look at GPT as a bizarre wonder of predictive text operating over entire sequences instead of single words.
What fascinates me most about this development is its potential to revolutionize trading.
If statistical pattern recognition of word sequences can lead to coherent replication of human conversation, why can’t it also lead to coherent market analysis and… highly accurate asset price predictions?
Thank you for reading!
Until next time…
Onward and Upward Everybody!
-Chris
Automated Income Lifestyle w/ C.W. Morton YouTube
#gpt #predictivetext #artificialintelligence #searchengine #origins #neat