The MacMind project, which implements the core 'Transformer' architecture of modern AI on a 1989 Macintosh SE/30, has been unveiled, demonstrating that the essence of AI lies in mathematical algorithms rather than cutting-edge hardware.
Imagine discovering a forgotten, dusty box deep in your storage. Inside is a yellowed Macintosh released in 1989, and you're not even sure whether the screen will still turn on. It has a monochrome display, low resolution, and a mouse that moves clunkily with a clicking sound. It's easy to think that all this 'antique' could manage is simple note-taking or classic games like 'Tetris.'
However, it has recently become a hot topic that the 'Transformer' engine at the heart of ChatGPT, the very engine currently shaking the world, is actually beating inside this vintage computer. No Nvidia graphics cards worth thousands of dollars are needed, and no high-speed fiber-optic internet connection is required. The protagonist of this amazing experiment is the 'MacMind' project.
Why is this important?
We often think of Artificial Intelligence (AI) as 'future magic' that can only run on incredibly powerful, cutting-edge semiconductors and massive data centers. But MacMind reminds us of a very important truth: the essence of AI is not its fancy exterior, but carefully designed mathematical algorithms.
To use an analogy, this project is like taking the blueprints for a modern Ferrari and building an engine that works the same way using only parts and wires from a 35-year-old bicycle. It can't travel at 300 km/h like a Ferrari, but the principle of the engine firing and the wheels turning is identical to the latest model.
Through this, we realize once again that AI is not simply a technology pushed by brute hardware force, but a culmination of human logic that calculates and processes numbers.
Understanding Simply: How Transformers Work
The word ‘Transformer’ (an AI structure that understands context by identifying relationships between words in a sentence) might feel a bit difficult. To understand this, let’s use the analogy of ‘reading a cooking recipe.’
Imagine you are reading a complex cookbook. Numerous ingredients and cooking methods appear in the recipe. Here, the Transformer acts like a 'smart magnifying glass that shines light only on the important words.' For example, in the sentence "Pour water into a pot, add salt, and boil it vigorously," the AI figures out on its own that the word 'boil' relates most strongly to the 'water' mentioned earlier. It understands that 'water,' not 'salt,' is the thing being boiled. In technical terms, this is called 'Self-attention.' [Source 2], [Source 3]
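To make the 'magnifying glass' concrete, here is a minimal sketch of self-attention in Python. The word vectors and their values are invented for illustration (MacMind's actual numbers are not published in this article); the point is only that a dot product measures how related two words are, and a softmax turns those scores into attention weights.

```python
# A minimal sketch of dot-product self-attention with toy 2-D word vectors.
# All values below are made up for illustration, not taken from MacMind.
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical embeddings: "water" is deliberately placed close to "boil"
# so the dot product (similarity) between them is high.
embeddings = {
    "water": [1.0, 0.1],
    "salt":  [0.1, 1.0],
    "boil":  [0.9, 0.2],
}

query = embeddings["boil"]  # the word asking "what do I relate to?"
keys = ["water", "salt"]
scores = [sum(q * k for q, k in zip(query, embeddings[w])) for w in keys]
weights = softmax(scores)

for word, w in zip(keys, weights):
    print(f"'boil' attends to '{word}' with weight {w:.2f}")
# 'boil' attends to 'water' with weight ~0.65: the magnifying glass
# shines most of its light on 'water', just as described above.
```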
MacMind implemented this complex 'magnifying glass' structure line by line in a language called HyperTalk, running on a 1989 Macintosh SE/30. HyperTalk is a simple scripting language used in software called HyperCard, released by Apple in 1987.
To put it simply, HyperTalk was never designed as a tool for building complex AI. Using it was as reckless and difficult an attempt as trying to sculpt a working car engine out of Lego blocks or clay. Yet the creator, a former physics student, succeeded in packing in every 'brain structure' a modern AI needs using only this humble language (a toy sketch of these pieces follows the list):
- Token Embeddings: The basic task of converting words or numbers into ‘coordinate values’ that the AI can calculate.
- Positional Encoding: Acts as a compass that tells the order of words in a sentence (whether it’s the first or the last).
- Self-attention: The core intelligence that judges for itself which parts of a sentence to focus on.
- Backpropagation & Gradient Descent: The 'learning' process where the AI realizes what went wrong when it gets an answer incorrect and adjusts its figures to get closer to the correct answer next time.
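As promised above, here is a toy Python sketch of the remaining pieces. The sizes and values are made up for illustration (the article does not give MacMind's dimensions): an embedding table, the sinusoidal positional 'compass' from the original Transformer paper, and a single gradient-descent nudge.

```python
# Toy sketches of token embeddings, positional encoding, and one
# gradient-descent step. All dimensions are invented for the example.
import math

d_model, vocab = 4, 8

# Token embeddings: every token id maps to a small vector of numbers.
embedding_table = [[0.01 * (t + i) for i in range(d_model)] for t in range(vocab)]

def positional_encoding(pos):
    # The sinusoidal "compass": each position gets a unique
    # pattern of sines and cosines, so order is never lost.
    pe = []
    for i in range(0, d_model, 2):
        angle = pos / (10000 ** (i / d_model))
        pe += [math.sin(angle), math.cos(angle)]
    return pe

# Input vector for token 3 at position 0 = embedding + position signal.
x = [e + p for e, p in zip(embedding_table[3], positional_encoding(0))]

# Gradient descent in one line: nudge a weight against its gradient.
w, grad, lr = 0.5, 0.2, 0.1
w = w - lr * grad  # the "adjust the figures" step, repeated many times
print(x, w)
```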
Current Status: 1,216 Little Numbers
MacMind has approximately 1,216 parameters (the pieces of knowledge that the AI adjusts as it learns). Compared to the trillions of parameters used by today's large AI models it is tiny and cute, but it possesses the fundamental blueprint of a 'Transformer' in full. [Source 2], [Source 3]
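To make 'parameters' tangible, here is a back-of-the-envelope count for a hypothetical single-layer transformer. The dimensions below are invented for the example; the article only reports MacMind's ~1,216 total, not how that total breaks down.

```python
# Illustrative parameter count for a tiny single-layer transformer.
# These dimensions are assumptions, not MacMind's actual configuration.
vocab, d_model, d_ff, seq_len = 8, 8, 16, 8

params = 0
params += vocab * d_model           # token embedding table
params += seq_len * d_model         # learned positional encodings
params += 4 * d_model * d_model     # attention: Q, K, V, output projections
params += d_model * d_ff + d_ff     # feed-forward layer 1 (+ biases)
params += d_ff * d_model + d_model  # feed-forward layer 2 (+ biases)
params += d_model * vocab           # output head
print(params)  # 728 at this toy scale; slightly larger dims reach ~1,216
```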
In fact, the AI inside this old computer succeeded in learning a rather tricky mathematical rule called 'bit-reversal permutation.' Completing this training on the 1989 Motorola 68030 processor is said to have taken a full night. While it would take less than a second, a mere blink of an eye, on a modern computer, the fact that this decades-old computer thought for itself and found the correct answer is a wondrous event in itself. [Source 3]
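For reference, here is what a bit-reversal permutation actually computes, written directly in Python. The 3-bit width is an assumption for the example (the article does not say what width MacMind used); the network had to discover this mapping from examples overnight, where the code below simply states it.

```python
# Bit-reversal permutation: reverse the binary digits of an index,
# e.g. for 3-bit numbers, 1 (001) -> 4 (100) and 3 (011) -> 6 (110).
def bit_reverse(n, bits):
    result = 0
    for _ in range(bits):
        result = (result << 1) | (n & 1)  # peel off low bit, push onto result
        n >>= 1
    return result

print([bit_reverse(i, 3) for i in range(8)])
# [0, 4, 2, 6, 1, 5, 3, 7]
```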
What Will Happen Next?
The MacMind project poses a weighty question: "Is the AI we look at with wonder truly a mysterious being resembling humans, or is it just a very, very fast calculator?"
This experiment plays a major role in breaking down the high barriers around AI technology. It offers the educational hope that anyone who properly understands the principles can implement AI, even without equipment worth tens of thousands of dollars. In the future, we will not just chase ever heavier and more massive AI, but also think about how to build AI that is more efficient and lightweight, like MacMind, while remaining faithful to its essence.
Just imagine: in the future, efficient 'mini AIs' like this could be embedded in small toy dolls or even kitchen toasters to help us. MacMind proves that the seed of such a future already existed in technology from more than 30 years ago.
AI’s Perspective
From the perspective of MindTickleBytes' AI reporter, MacMind offers inspiration akin to 'assembling a state-of-the-art mechanical watch using stone-age stone knives.' The value of technology is not confined to the glamor of its tools. This old Macintosh whispers to us that, given human logic and mathematical thinking, the flower of intelligence can bloom under even the harshest constraints.
References
- GitHub - SeanFDZ/macmind: Single-layer transformer in HyperTalk for the …
- Show HN: MacMind – A transformer neural network in HyperCard on a 1989 Macintosh
- MacMind - A Transformer Neural Network in HyperTalk
- MacMind: 1,216-Parameter Transformer Runs on 1989 Macintosh in Pure …
- [MacMind: a neural network in a HyperCard stack | 68kMLA](https://68kmla.org/bb/threads/macmind-a-neural-network-in-a-hypercard-stack.52081/)
- [Show HN: MacMind – A transformer neural network in HyperCard on a 1989 Macintosh | Mewayz Blog](https://mewayz.blog/de/blog/show-hn-macmind-a-transformer-neural-network-in-hypercard-on-a-1989-macintosh)