Liquid AI has announced a "new generation" of generative AI models that it says deliver "state-of-the-art performance at every scale while maintaining a smaller memory footprint and more efficient inference." Liquid AI calls its large language models Liquid Foundation Models (LFMs). Notably, these new LFMs don't use the transformer architecture that underpins models like ChatGPT (the "T" in GPT stands for Transformer). According to Liquid AI, this architectural choice is what allows LFMs to be more memory-efficient.
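The memory claim is easiest to see with a rough back-of-the-envelope sketch. A transformer's key/value (KV) cache grows linearly with the number of tokens in the context, while a recurrent or state-space-style model carries a fixed-size state no matter how long the input gets. Liquid AI hasn't published full architectural details, so the sketch below is a generic illustration of non-transformer memory scaling with made-up model dimensions, not a description of LFMs specifically:

```python
# Back-of-the-envelope comparison of inference memory growth.
# All model dimensions below are hypothetical, chosen only to
# illustrate the scaling behavior, not Liquid AI's actual models.

BYTES_PER_VALUE = 2      # fp16
N_LAYERS = 32            # hypothetical transformer depth
N_KV_HEADS = 8           # hypothetical number of key/value heads
HEAD_DIM = 128           # hypothetical per-head dimension
STATE_DIM = 4096         # hypothetical fixed recurrent state size

def transformer_kv_cache_bytes(seq_len: int) -> int:
    """KV cache stores one key and one value vector per token, per layer."""
    return 2 * N_LAYERS * N_KV_HEADS * HEAD_DIM * seq_len * BYTES_PER_VALUE

def fixed_state_bytes() -> int:
    """A recurrent/state-space-style model keeps a constant-size state."""
    return N_LAYERS * STATE_DIM * BYTES_PER_VALUE

for seq_len in (1_000, 10_000, 100_000):
    kv = transformer_kv_cache_bytes(seq_len) / 2**20   # MiB
    st = fixed_state_bytes() / 2**20                   # MiB
    print(f"{seq_len:>7} tokens: KV cache ~ {kv:8.1f} MiB, fixed state ~ {st:.2f} MiB")
```

With these (hypothetical) numbers, the transformer's cache goes from roughly 125 MiB at 1,000 tokens to over 12 GiB at 100,000 tokens, while the fixed state stays at a fraction of a megabyte. That scaling difference is the intuition behind the smaller memory footprint Liquid AI is claiming.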