Generative AI, primarily large language models (LLMs), produces plausible but not always correct outputs and often lacks genuine reasoning abilities, making its results unpredictable and hard to interpret.
An alternative AI approach could address these limitations by using curated knowledge and rules, enabling an inference engine to deduce logical conclusions. However, this kind of reasoning can be slow, so such systems usually opt for faster but less expressive logic. The Cyc system has found a way to balance this trade-off, and the article suggests that future AI will need to combine the LLM approach with more formal, knowledge-based approaches.
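To make the knowledge-and-rules idea concrete, the toy sketch below shows forward-chaining inference: facts and a rule are stated explicitly, and an engine repeatedly applies the rule until no new conclusions emerge. This is an illustrative assumption about how such an engine can work in miniature, not Cyc's actual representation or machinery; all names (facts, `grandparent_rule`, `forward_chain`) are hypothetical.

```python
# Illustrative sketch of symbolic, rule-based inference (not Cyc's actual implementation).
# Facts are tuples; a rule derives new facts from existing ones.

facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

# Rule: if X is a parent of Y and Y is a parent of Z, then X is a grandparent of Z.
def grandparent_rule(facts):
    new = set()
    for (p1, x, y) in facts:
        if p1 != "parent":
            continue
        for (p2, y2, z) in facts:
            if p2 == "parent" and y2 == y:
                new.add(("grandparent", x, z))
    return new

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts can be deduced (a fixed point)."""
    while True:
        derived = set()
        for rule in rules:
            derived |= rule(facts)
        if derived <= facts:   # nothing new was deduced
            return facts
        facts = facts | derived

print(forward_chain(facts, [grandparent_rule]))
# Output includes ("grandparent", "alice", "carol") alongside the original facts.
```

Every conclusion such an engine draws can be traced back to explicit facts and rules, which is what makes its results interpretable; the cost, as noted above, is that richer logics make this deduction loop expensive.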