It is rare to see a check for $180 million written to a company that explicitly promises not to sell anything for a while. It is even rarer when that company is run by three people who think the current standard for building artificial intelligence—feeding it the entire internet—is a dead end. This is the specific bet behind Flapping Airplanes, a new research lab that believes the future of AI isn’t about bigger data centers, but about smarter software.
Key Takeaways
- AI lab Flapping Airplanes raised $180 million from Google Ventures, Sequoia, and Index.
- Founders Ben Spector, Asher Spector, and Aidan Smith are prioritizing research over immediate commercialization.
- The team aims to make AI models 1,000x more data-efficient.
The lab is founded by brothers Ben and Asher Spector alongside Aidan Smith. While most of the industry is racing to release products and sign enterprise deals, this group is taking a step back. They have secured funding from major firms like Google Ventures and Sequoia to operate as a “neolab”—a small, focused research group prioritizing scientific breakthroughs over immediate revenue.
Their central thesis challenges the brute-force method currently dominating the field. They argue that biological brains are “the floor, not the ceiling” for what intelligence can do, and they intend to prove it by building systems that learn radically faster than today’s models.
The big deal
If you follow AI news, you know the current recipe for success is simple but costly: take a massive amount of text and images and run it through thousands of expensive computer chips. This approach works, but it has hit a wall. We are running out of high-quality data on the internet, and the energy costs are becoming unsustainable.
Flapping Airplanes is trying to solve the scarcity problem. If they succeed in making models 1,000x more data-efficient, the barrier to entry for building powerful AI collapses. You would no longer need a trillion-dollar budget or a nuclear power plant to train a top-tier model. This would shift power away from the few tech giants who currently hoard the world’s data supply.
How it works
The goal is to move from statistical guessing to genuine learning.
Think of it like learning to cook an omelet. Current AI models are like a student who has to read every cookbook in the Library of Congress to understand what an egg is. Flapping Airplanes wants to build a student who can watch a chef make an omelet twice, understand the concept, and do it themselves.
By mimicking how humans learn, the software extracts patterns and rules from a tiny amount of information rather than needing to see every possible variation of a scenario. The team is betting that the architecture of the software matters more than the volume of data fed into it.
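The contrast can be sketched with a toy example. This is purely illustrative and says nothing about the lab's actual techniques: one learner memorizes input–output pairs (stand-in for brute-force training), while the other infers the rule behind just two examples and generalizes.

```python
# Toy illustration only, not Flapping Airplanes' method: two "learners"
# try to master the hidden rule "output = 3 * input + 1".

def memorizer(examples):
    """Mimics brute-force training: can only answer inputs it has seen."""
    table = dict(examples)
    return lambda x: table.get(x)  # unseen input -> no answer

def rule_learner(examples):
    """Mimics data-efficient learning: infers the underlying linear rule
    from just two examples, then answers any input."""
    (x1, y1), (x2, y2) = examples[:2]
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return lambda x: slope * x + intercept

examples = [(1, 4), (2, 7)]  # two glimpses of the rule
memo = memorizer(examples)
rule = rule_learner(examples)

print(memo(100))  # None: the memorizer never saw this input
print(rule(100))  # 301.0: the rule-learner generalized from two examples
```

The memorizer needs to see every case in advance; the rule-learner extracts the pattern once and applies it everywhere. That gap, scaled up, is the bet the founders are making.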
The catch
This is high-risk science. The founders admit they are attempting something that most other labs have quietly given up on. The industry shifted to “vacuuming up the internet” because it was the only thing that worked reliably at scale. Trying to make AI learn like a human is a problem researchers have banged their heads against for decades with limited success.
There is also no product roadmap. Because they are prioritizing research, there is no app to download or API to plug into right now. Investors are betting on the team’s potential, not their current output. The founders also note they prioritize “creativity over credentials,” which suggests they may lack the traditional academic pedigree found in other major labs. Whether that is a weakness or a strength remains to be seen.
What now?
The team will use the $180 million to hire talent and run experiments without the pressure to turn a quick profit. They are effectively buying time to solve a hard research problem.
If you are in the industry, this is a signal that venture capital is still willing to fund moonshots if the team looks right. For the rest of us, we wait. Watch for their first published research paper to see if their math backs up their ambition.
