The Real Meaning Behind the AI Revolution
The AI revolution is often framed as a massive paradigm shift in how we access and use information. But the need for change, adaptation, and innovation goes beyond information access. At its core, the AI revolution is about computational power, and about how data is interpreted and processed.
The Binary Approach to Computation
When we first built computers, humanity took a binary approach to computation, which is fantastic for mathematical calculation. Our current technology is so heavily ingrained in math that many analog functions, such as audio processing, are delegated to dedicated chips whose whole job is to interface between the two forms. Data in its rawest form can be stored as binary or analog, and it can be converted between the two, but only at a cost in resources and a loss of quality.
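To make that conversion cost concrete, here is a minimal Python sketch (purely illustrative, not from any particular codebase) of what an ideal analog-to-digital converter does: it snaps continuous samples onto a fixed grid of levels, and the rounding error it introduces is exactly the "loss of quality" described above.

```python
import math

def quantize(samples, bits=8, full_scale=1.0):
    """Snap continuous samples onto 2**bits discrete levels (an ideal ADC)."""
    levels = 2 ** bits
    step = 2 * full_scale / levels          # width of one quantization step
    return [round(s / step) * step for s in samples]

# The "analog" source: one cycle of a sine wave, finely sampled.
n = 48
analog = [math.sin(2 * math.pi * k / n) for k in range(n)]

digital = quantize(analog, bits=8)
error = max(abs(a - d) for a, d in zip(analog, digital))
print(f"max quantization error at 8 bits: {error:.5f}")  # ~half a step; never zero
```

More bits shrink the error but never eliminate it, and every added bit costs storage and processing. That trade-off is the price of moving between the two forms.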
The Cost of AI Computation
The reason we require so many GPUs and so much computational power is that AI models are trying to mimic, or emulate, decision-making in real biological organisms. But there's a problem: what animal has ever run on digital data? Everything we perceive arrives in some analog form. Simplified as far as it will go, all humans can sense is varying frequencies across different dimensions.
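As a rough illustration of that emulation gap, here is a sketch (my own, illustrative) of a leaky integrate-and-fire neuron, a standard simplified model of biological decision-making. The underlying process is continuous and analog; a digital machine can only approximate it by grinding through discrete time steps.

```python
def simulate_neuron(input_current, dt=0.1, tau=10.0, threshold=1.0):
    """Euler-method approximation of the continuous dynamics dV/dt = (-V + I) / tau."""
    v, spike_times = 0.0, []
    for step, i in enumerate(input_current):
        v += dt * (-v + i) / tau      # a discrete step standing in for continuous change
        if v >= threshold:            # membrane potential crosses threshold: fire and reset
            spike_times.append(step * dt)
            v = 0.0
    return spike_times

# Constant drive above threshold produces regular spiking.
print(simulate_neuron([1.5] * 1000))  # smaller dt -> closer to the analog ideal, at more cost
```

Every refinement toward the analog ideal (a smaller `dt`, more neurons) multiplies the arithmetic, which is precisely where the GPUs go.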
The Limitations of Binary Computation
Moore's observation is running into a physical limit: transistor features are shrinking to the point where they are measured in units comparable to the size of an atom. You reach a point where "just making them smaller" no longer works. Even in theory, a single-atom transistor would likely not function reliably or be reproducible at scale. You would need at least three separate "atoms" for input, output, and the switch itself, and unless someone discovers an entirely new branch of quantum mechanics devoted to using sub-atomic units for binary computation, there is nothing smaller to use as a separate entity.
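Some back-of-the-envelope arithmetic makes the wall visible. Using the commonly cited lattice constant of crystalline silicon (about 0.54 nm), a handful of illustrative feature sizes reduce to a countable number of atoms (note that modern process-node names like "3 nm" are marketing labels, not literal gate widths):

```python
# How many silicon lattice constants fit across a transistor feature?
SILICON_LATTICE_NM = 0.543   # widely cited lattice constant of crystalline silicon

for feature_nm in (45, 14, 5, 2):          # illustrative feature sizes, in nanometers
    widths = feature_nm / SILICON_LATTICE_NM
    print(f"{feature_nm:>3} nm feature ~ {widths:5.1f} lattice constants across")
```

At a couple of nanometers you are counting individual atoms, and there is no half-atom to shrink to.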
The Efficiency of Biological Organisms
Biological organisms are far more efficient than computers. We clearly don't remember as much as computers do, but that is the key difference: instead of storing everything in recreatable formats like digital media, we store it in the neural connections of the brain. Analog computers avoid much of the overhead that scaling a binary computational system requires. Analog systems can also handle many concurrent inputs natively, something almost all modern machines delegate to a specialized subsystem: the GPU.
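A hedged sketch of the contrast: digital hardware copes with concurrent inputs by batching them and applying one operation across the whole batch, the data-parallel model GPUs are built around, whereas an analog system would process them simultaneously by physics. The numbers below are arbitrary placeholders.

```python
import numpy as np

inputs = np.random.rand(1_000_000)     # a million concurrent "sensor" readings

# Scalar view: one input at a time (a single CPU core, naively).
scalar = [x * 0.5 + 0.1 for x in inputs[:5]]

# Batched view: the whole array in one expression (what a GPU parallelizes).
batched = inputs * 0.5 + 0.1

print(scalar)
print(batched[:5])                     # same values; only the execution model differs
```

The batched form is fast only because we built silicon specialized for it; in an analog machine the concurrency would come for free.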
The Potential of Analog Chips
There is active research into modern applications of analog chips. A few years back, a company was producing application-specific AI chips that performed tasks like object detection with astonishingly improved efficiency. These chips were designed to process and display visual data. But that raises the question of how you get visual data into a purely analog system in the first place.
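One reason such chips can be so efficient: a crossbar of resistive elements computes an entire matrix-vector product in a single physical step, with input voltages driving programmable conductances and the resulting currents summing on shared wires (Ohm's and Kirchhoff's laws do the math). Here is a digital simulation of what that crossbar computes; the values are illustrative, not from any real chip.

```python
def crossbar_mvm(conductances, voltages):
    """I_row = sum_j G[row][j] * V[j]: the current that accumulates on each row wire."""
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

G = [[0.2, 0.5, 0.1],    # each entry: a programmable conductance (a stored "weight")
     [0.7, 0.3, 0.9]]
V = [1.0, 0.5, -0.2]     # the input signal, encoded as voltages

print(crossbar_mvm(G, V))  # a digital chip spends ~rows*cols multiply-adds on this
```

Since matrix-vector products dominate neural-network inference, getting them nearly for free is where the efficiency gains come from. The open question stands, though: the sensor feeding that crossbar still has to deliver the image somehow.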
The Ideal Method to Store Analog Data
What would the ideal method of storing analog data be? That is a good question, and one that is actively being researched. I don't know whether it can be done without loss; my guess is some form of low-entropy analog-input neural model. But I suspect that, at some level, continuous data collection and temporally inferred data are a large part of what keeps AI a nice tool rather than an opinionated entity with its own needs and a desire to reproduce itself as a means of security.
The Future of AI
Maybe data transfer and persistent storage are both concepts that don't exist anywhere in nature, and the difficulty we perceive is an artifact of the invented system of math we try to solve everything with. The future of AI is uncertain, but one thing is clear: we need to rethink our approach to computation and data storage if we want to build truly efficient and powerful AI systems.