Building modular LLMs locally

Kiki AI
Jan 29, 2024


Prediction: we will soon see large, local LLM modules running on our own computers. Most of our applications may end up interacting with a private, local large language model rather than a remote API.

This would save costs, given how expensive it is to run models in the cloud, improve privacy for all users, and make the models more customizable, since they would be self-owned.

It’s likely also going to be a mixture of experts: a set of small modules running locally on the computer that execute very quickly, with each one specialized for certain tasks.
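To make the idea concrete, here is a minimal Python sketch of what such a setup could look like. Everything here is hypothetical: the expert functions stand in for small, task-specialized local models, and the keyword router stands in for what would in practice be a small learned gating network.

```python
# Minimal sketch of a local "mixture of experts" router.
# All names here are illustrative, not a real library.

from typing import Callable, Dict

# Stand-ins for small local models, each specialized for one task.
def code_expert(prompt: str) -> str:
    return f"[code expert] completing: {prompt}"

def writing_expert(prompt: str) -> str:
    return f"[writing expert] drafting: {prompt}"

def general_expert(prompt: str) -> str:
    return f"[general expert] answering: {prompt}"

EXPERTS: Dict[str, Callable[[str], str]] = {
    "code": code_expert,
    "writing": writing_expert,
    "general": general_expert,
}

def route(prompt: str) -> str:
    """Naive keyword router; a real system would use a learned gate."""
    lowered = prompt.lower()
    if any(kw in lowered for kw in ("bug", "function", "compile", "def ")):
        return EXPERTS["code"](prompt)
    if any(kw in lowered for kw in ("email", "essay", "rewrite")):
        return EXPERTS["writing"](prompt)
    return EXPERTS["general"](prompt)

if __name__ == "__main__":
    print(route("Fix the bug in this function"))
    print(route("Rewrite this email to sound friendlier"))
    print(route("What's the capital of France?"))
```

Because only the selected expert runs for a given request, each module can stay small enough to be fast on consumer hardware while the system as a whole still covers a wide range of tasks.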
