All You Need is Attention
Is the glass half full or half empty?
I’d assume your answer is determined by where your attention is placed - the water or the space. If we delved deeper into your answer, we could propose that the weight (importance) of certain factors, such as mentality, environment, and culture, influenced your attention. In deep learning transformer models, such as the Chat Generative Pre-trained Transformer (ChatGPT), attention is a pivotal mechanism that weighs the importance of different parts of the input text, allowing the model to capture context and the relationships within the text. By focusing on the most relevant parts of the input, the model processes and generates text more effectively.
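To make that a little more concrete, here is a minimal sketch of scaled dot-product attention, the core operation inside transformers, written in NumPy. The three token vectors are random stand-ins rather than real embeddings, and the function is a simplified single-head version of what models like ChatGPT compute.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Each query scores every key; higher scores mean "pay more attention here".
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # importance weights; each row sums to 1
    return weights @ V, weights         # output is a weighted blend of the values

# Toy example: 3 tokens, each a 4-dimensional vector (random stand-ins).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(x, x, x)  # self-attention
print(weights.round(2))  # how strongly each token attends to every other token
```

Each row of the printed matrix is one token’s attention distribution - the machine’s answer to “which parts of the input matter most right now?”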
Are you still with me? Perhaps I should start with machine learning and work up to deep learning transformers for greater clarity.
In machine learning, structured data - numeric, categorical, date, and time values - is fed into the machine to be processed. The algorithms identify patterns in the data to predict specific outcomes; for example, an algorithm can categorise customers into segments based on shared characteristics or behaviours (e.g., gender, preferred product categories, membership status). Unstructured data, such as images, text, and audio, is more complex to process and requires deep learning to identify the patterns needed for outcomes such as text translation or generation. However, ‘deep learning’ in AI vastly differs from the concept of ‘deep learning’ in humans - cognitive learning.
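As a concrete illustration of that customer-segmentation example, here is a minimal sketch using k-means clustering from scikit-learn. The customer features and their values are hypothetical, invented purely for illustration, and the algorithm shown is one common choice among many for this kind of task.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical customer records: [age, monthly_spend, visits_per_month].
# These feature names and numbers are illustrative, not from a real data set.
customers = np.array([
    [22,  40.0, 2],
    [25,  55.0, 3],
    [41, 310.0, 8],
    [45, 280.0, 7],
    [63,  90.0, 1],
    [60, 110.0, 1],
])

# K-means groups rows into k segments by similarity of their features.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
segments = kmeans.fit_predict(customers)
print(segments)  # e.g. [0 0 1 1 2 2]: the cluster label assigned to each customer
```

The machine never “understands” the customers; it simply finds groups whose rows sit close together in feature space - which is exactly the pattern-finding this paragraph describes.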