LLM OS
The LLM at the center of this architecture has more knowledge than any single human.
Located at the center of the figure, the large language model is the core of the entire architecture: it processes input text and generates natural-language responses.
RAM (Random Access Memory): Connected to the LLM by a two-way arrow, indicating that the LLM can both read data from and write data to RAM. The RAM holds the "context window", the working memory of context that the LLM attends to while processing text.
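The context window described above can be sketched as a fixed-capacity token buffer. This is a hypothetical toy model (the class name, capacity, and FIFO eviction policy are assumptions for illustration; real systems may instead summarize or compress old context):

```python
from collections import deque

class ContextWindow:
    """Toy sketch of RAM-backed context as a bounded token buffer.
    When the window is full, the oldest tokens are evicted (FIFO)."""

    def __init__(self, capacity: int):
        self.tokens = deque(maxlen=capacity)  # deque drops oldest items automatically

    def write(self, new_tokens):
        self.tokens.extend(new_tokens)        # the LLM writes into RAM

    def read(self):
        return list(self.tokens)              # the LLM reads its working context

window = ContextWindow(capacity=4)
window.write(["the", "llm", "reads", "and", "writes"])
print(window.read())  # the oldest token, "the", has been evicted
```

The point of the bound is that, like physical RAM, the context window is a scarce resource the architecture must manage.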
Disk: Connected to the LLM through the file system, which stores "embeddings", vector representations of words or semantic content. The LLM can both read data from and write data to disk.
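Retrieval from an embedding store can be sketched as nearest-neighbour search by cosine similarity. The store contents and vectors below are invented toy data; real systems persist embeddings to disk and use approximate nearest-neighbour indexes rather than a linear scan:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy embedding store: term -> vector (hypothetical values for illustration).
store = {
    "kernel":  [0.9, 0.1, 0.0],
    "memory":  [0.1, 0.9, 0.1],
    "browser": [0.0, 0.2, 0.9],
}

def nearest(query_vec):
    """Return the stored term whose embedding is closest to the query."""
    return max(store, key=lambda term: cosine(query_vec, store[term]))

print(nearest([0.8, 0.2, 0.1]))  # "kernel" is the closest stored vector
```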
Software 1.0 Tools: "Classical computer" tools such as a calculator, a Python interpreter, and a terminal. A two-way arrow indicates that the LLM can invoke these tools, passing them input and receiving their results.
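The two-way interaction with Software 1.0 tools amounts to a dispatch loop: the model emits a tool name and arguments, the surrounding system routes the call and returns the result to the model's context. A minimal sketch, assuming a single calculator tool (the dispatcher and tool registry are illustrative, not any particular framework's API):

```python
import ast
import operator

def calculator(expr: str) -> float:
    """Safely evaluate +, -, *, / arithmetic; a stand-in for a real calculator tool."""
    ops = {ast.Add: operator.add, ast.Sub: operator.sub,
           ast.Mult: operator.mul, ast.Div: operator.truediv}

    def ev(node):
        if isinstance(node, ast.BinOp):
            return ops[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported expression")

    return ev(ast.parse(expr, mode="eval").body)

# Registry of classical tools the LLM may call.
TOOLS = {"calculator": calculator}

def dispatch(tool_name: str, argument: str):
    """Route a tool call emitted by the model and return the tool's output."""
    return TOOLS[tool_name](argument)

print(dispatch("calculator", "2 * (3 + 4)"))  # 14
```

Parsing with `ast` rather than `eval` keeps the tool from executing arbitrary code, which matters when the caller is a model rather than a trusted human.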
CPU (Central Processing Unit): Connected to the LLM by a two-way arrow, indicating that the LLM's computation runs on, and depends on, the CPU.
Ethernet: Through Ethernet, the LLM can communicate with other LLMs, so that different models can exchange data and collaborate over the network.
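LLM-to-LLM communication can be sketched as message passing between nodes. The class, names, and in-process queue below are assumptions for illustration; a real deployment would use sockets or HTTP over Ethernet, and each node would run inference on received text rather than echoing it:

```python
import queue

class LLMNode:
    """Toy model of an LLM that exchanges messages with peers over a channel."""

    def __init__(self, name, inbox):
        self.name = name
        self.inbox = inbox  # a queue stands in for the network here

    def send(self, peer, text):
        peer.inbox.put((self.name, text))

    def receive(self):
        sender, text = self.inbox.get()
        # A real node would run inference on `text`; we just report receipt.
        return f"{self.name} received {text!r} from {sender}"

alice = LLMNode("llm-a", queue.Queue())
bob = LLMNode("llm-b", queue.Queue())
alice.send(bob, "summarize this page")
print(bob.receive())
```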
Browser: Connected to the LLM by a two-way arrow, indicating that the LLM can drive the browser, for example to fetch web-page content or to display generated content in it.
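One half of that interaction, turning fetched HTML into plain text the model can read, can be sketched with the standard library. The page snippet is invented; a real pipeline would first fetch the page over the network (e.g. with urllib), which is omitted here to keep the example self-contained:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text fragments of an HTML document."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:  # skip whitespace-only fragments between tags
            self.chunks.append(text)

page = "<html><body><h1>LLM OS</h1><p>The LLM can read web pages.</p></body></html>"
extractor = TextExtractor()
extractor.feed(page)
print(" ".join(extractor.chunks))  # "LLM OS The LLM can read web pages."
```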
Peripheral Devices I/O: Video and audio devices, connected to the LLM by two-way arrows, indicating that the LLM can both consume and generate video and audio content.