A write-through cache writes data to both the cache and storage. The advantage of this approach is that newly written data is always cached, so it can be read back quickly. The drawback is that a write operation is not considered complete until the data has been written to both the cache and primary storage, so write-through caching introduces latency into write operations.
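To make the write-through behavior concrete, here is a rough Python sketch. It is only an illustration under assumed names (WriteThroughCache, a dictionary standing in for the backing storage), not any particular product's API.

    class WriteThroughCache:
        def __init__(self, backing_store):
            self.cache = {}                      # fast in-memory copy
            self.backing_store = backing_store   # e.g. a dict standing in for disk

        def write(self, key, value):
            # The write is not complete until both the cache and the backing
            # store are updated; this is the source of write-through latency.
            self.cache[key] = value
            self.backing_store[key] = value

        def read(self, key):
            # Newly written data is always in the cache, so reads are fast.
            if key in self.cache:
                return self.cache[key]
            value = self.backing_store[key]      # cache miss: go to storage
            self.cache[key] = value              # populate the cache for next time
            return value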
Write-back cache is similar to write-through caching in that all write operations are directed to the cache. The difference is that once the data is cached, the write operation is considered complete. The data is later copied from the cache to storage. In this approach, there is low latency for both read and write operations. The disadvantage is that, depending on the caching mechanism used, the data may be vulnerable to loss until it is committed to storage.
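A matching sketch of a write-back cache follows, again with made-up names; flush() simply stands in for whatever mechanism later copies cached data to storage.

    class WriteBackCache:
        def __init__(self, backing_store):
            self.cache = {}
            self.dirty = set()                   # written to cache, not yet to storage
            self.backing_store = backing_store

        def write(self, key, value):
            # The write is considered complete as soon as the cache is updated,
            # so write latency is low, but the data is vulnerable to loss
            # until flush() commits it to storage.
            self.cache[key] = value
            self.dirty.add(key)

        def flush(self):
            # Later, copy the dirty entries from the cache to storage.
            for key in self.dirty:
                self.backing_store[key] = self.cache[key]
            self.dirty.clear()

The dirty set is what distinguishes the two sketches: a write-through cache never needs one, because storage is always up to date.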
Popular uses for cache
Cache server: A dedicated network server, or service acting as a server, that saves webpages or other Internet content locally. This is sometimes referred to as a proxy cache.
Disk cache: Holds data that has recently been read and, in some cases, adjacent data areas that are likely to be accessed soon. Some disk caches cache data based on how frequently it is read: storage blocks that are read frequently are referred to as hot blocks and are automatically moved into the cache (see the sketch after this list).
Cache memory: Random access memory (RAM) that a computer microprocessor can access more quickly than it can access regular RAM. Cache memory is usually tied directly to the CPU and is used to cache instructions and data that are frequently accessed by the currently running processes. Although a RAM cache is much faster than a disk-based cache, cache memory is faster still because of its proximity to the CPU.
Flash cache: Temporary storage of data on NAND flash memory chips -- often in the form of solid-state drive (SSD) storage -- to enable requests for data to be fulfilled with greater speed than would be possible if the cache were located on a traditional hard disk drive (HDD).
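As noted under disk cache above, some caches promote storage blocks based on how often they are read. The sketch below illustrates that idea; the three-read threshold and the dictionary standing in for the disk are assumptions chosen purely for illustration.

    class HotBlockCache:
        def __init__(self, disk, hot_threshold=3):
            self.disk = disk                     # e.g. a dict of block_id -> data
            self.read_counts = {}                # reads seen per block
            self.hot_cache = {}                  # blocks promoted to the cache
            self.hot_threshold = hot_threshold

        def read_block(self, block_id):
            if block_id in self.hot_cache:       # hot block: served from the cache
                return self.hot_cache[block_id]
            data = self.disk[block_id]           # cold block: served from disk
            self.read_counts[block_id] = self.read_counts.get(block_id, 0) + 1
            if self.read_counts[block_id] >= self.hot_threshold:
                self.hot_cache[block_id] = data  # promote a frequently read block
            return data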
How to increase cache memory
Cache memory is part of the CPU complex and is therefore either included on the CPU itself or embedded into a chip on the system board. Typically, the only way to increase cache memory is to install a next-generation system board and a corresponding next-generation CPU. Some older system boards included vacant slots that could be used to increase cache memory capacity, but most newer system boards do not offer that option.