
MythWorx LLC, an artificial intelligence startup developing energy-efficient next-generation models, said today it has raised $5 million in seed funding at a $100 million valuation. The round was led by Eagle Venture Fund IV and Eagle Freedom Fund II, with participation from several leading angel investors.

MythWorx is pursuing what many in the industry call the holy grail of AI, artificial general intelligence, by building high-efficiency, low-power models that it says outperform traditional large language models while consuming roughly one-tenth the power on reasoning tasks.

Chief Executive Jason Williamson told SiliconANGLE in an interview that achieving accurate reasoning at extremely low power requires a specialized architecture that behaves more like the human brain. The company’s models are built around neuromorphic computing, which mimics biological processes such as strategic pruning and forgetting to improve efficiency over time.

“What MythWorx is trying to accomplish with neuromorphic computing is very divergent from the traditional path,” Williamson explained. “Neuromorphic isn’t new, but people who are actually doing real things with it. That’s new.”

MythWorx’s latest system, Echo Ego v2, a 14 billion-parameter model, achieved 71.2% accuracy on the MMLU-Pro benchmark, a reasoning and knowledge test spanning more than 12,000 tasks across 14 subjects. The model completed the evaluation without pretraining, chain-of-thought prompting or retries, outperforming far larger competitors including DeepSeek R1 at 671 billion parameters and Meta Platforms Inc.’s Llama 4 Behemoth at 4 trillion parameters.

The company said Echo Ego v2 also surpassed rivals on the ARC-AGI-1 benchmark, a test of machine reasoning and general problem-solving. It reportedly completed the suite in four hours using 208 watts with 100% accuracy, compared with OpenAI’s 23-hour run that consumed 9.5 million watts and scored 87.5%.

“Our most important accomplishment is the fact that we can provide AI capabilities at a fraction of the energy cost,” Williamson added. “It helps environmentally; it helps economically. It’s a good thing for all people if we can consume less energy and be good and ethical at something.”

Williamson said MythWorx’s architecture enables compact, high-performance models that can run on edge devices, from older Android phones to internet of things sensors and on-premises corporate networks. By using what the company calls “digital neuroplasticity,” these systems continuously rewire themselves to learn from experience and can operate for extended periods with limited or no internet connectivity.

“Neuroplasticity is not statistical probability,” Williamson said, contrasting MythWorx’s architecture with the way traditional LLMs predict the next word based on training and human feedback. “It’s self-rewiring based on what it thinks is a good experience or a bad experience.”
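MythWorx has not published details of how that rewiring works, but the ingredients Williamson describes, connections strengthened or weakened by good and bad outcomes and weak connections pruned away, are long-standing themes in neuromorphic research. The Python sketch below is a generic, heavily simplified illustration of that style of update, not MythWorx’s method; the weight matrix, learning rate, decay factor and pruning threshold are arbitrary stand-ins.

```python
import numpy as np

# Generic illustration of experience-driven rewiring with decay and pruning.
# This is NOT MythWorx's architecture; it is a toy sketch of the neuromorphic
# ideas described above: connections strengthen after "good" experiences,
# weaken after "bad" ones, fade slowly when unused, and are pruned
# ("forgotten") once they become negligible.

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.1, size=(8, 8))   # toy synaptic weight matrix

def rewire(w, pre, post, reward, lr=0.05):
    """Nudge one connection up or down depending on the outcome (+1 good, -1 bad)."""
    w[pre, post] += lr * reward
    return w

def decay_and_prune(w, decay=0.99, threshold=0.02):
    """Let all connections fade slightly, then drop those below the threshold."""
    w *= decay
    w[np.abs(w) < threshold] = 0.0
    return w

# Simulate a stream of good and bad experiences on random connections.
for _ in range(200):
    pre, post = rng.integers(0, 8, size=2)
    reward = rng.choice([1, -1])              # stand-in for a good or bad experience
    weights = rewire(weights, pre, post, reward)
    weights = decay_and_prune(weights)

print(f"Connections still active: {np.count_nonzero(weights)} of {weights.size}")
```

The toy only shows the shape of the mechanism: behavior is adjusted connection by connection in response to feedback, rather than by predicting the next token from a fixed training distribution.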
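For a sense of scale, the ARC-AGI-1 figures reported above can be turned into energy totals, assuming each wattage number represents a sustained draw over the full run, which the article does not specify. The snippet below simply multiplies power by runtime.

```python
# Back-of-the-envelope energy totals from the reported ARC-AGI-1 runs,
# assuming each wattage figure is a sustained average over the whole run.

mythworx_kwh = 208 * 4 / 1000            # 208 W for 4 hours   -> ~0.83 kWh
openai_kwh = 9_500_000 * 23 / 1000       # 9.5 MW for 23 hours -> 218,500 kWh

print(f"MythWorx run: {mythworx_kwh:.2f} kWh")
print(f"OpenAI run:   {openai_kwh:,.0f} kWh")
print(f"Ratio:        roughly {openai_kwh / mythworx_kwh:,.0f}x")
```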
The company’s models are multimodal, capable of processing text, audio and visual content and generating both language and imagery. Williamson said this versatility opens opportunities across industry and government, particularly for document understanding, cybersecurity and multidomain knowledge processing in fields such as biology, chemistry and computer science.

MythWorx has already seen “particular success within the United States Department of Defense,” Williamson said, noting that its technology performs well in edge AI use cases. In the field, military machinery and personnel are truly at the edge; they don’t always have access to digital connectivity and sometimes must remain radio-silent, he explained. An AI model that runs on very little power, reasons well and requires little to no connectivity could be highly valuable in those scenarios.

The new funding will support scaling and productization of MythWorx’s technology. With today’s investment, the company plans to expand adoption and build new feature sets on top of the systems already deployed for early customers.

“You can do more at an AI company with several million dollars than traditional startups could do before because of the opportunity of the cloud, because of the scale,” Williamson said. “Our models are coding themselves, so we even have less developer needs.”

Image: SiliconANGLE/Microsoft Designer