Cache Memory


Cache memory sits between the CPU and main memory (RAM). It is a small amount of very fast memory, quite unlike ordinary RAM. A cache is usually organized as a set of cache lines, each line holding a fixed number of bytes, as shown below.


The main reason for having a cache

As we said above, cache memory is not organized as individual bytes but as cache lines, each holding a small number of bytes that is a power of two, typically 16, 32, or 64. The cache automatically stores data the CPU has recently used and fetches new data when the CPU requires it. As Hyde (2001, p. 297) observes, this raises two questions that any cache design must answer.
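Because the line size is a power of two, simple bit operations can tell us where a byte falls within its line. The sketch below assumes a hypothetical 64-byte line size (one of the typical sizes mentioned above); the function names are illustrative, not from any particular system.

```c
#include <stdint.h>

/* Hypothetical 64-byte cache line (the text says 16, 32, or 64 are
 * typical).  Because the size is a power of two, a byte's position
 * inside its line is simply the low bits of its address. */
#define LINE_SIZE 64u

/* Offset of an address within its cache line. */
static inline uint32_t line_offset(uint32_t addr) {
    return addr & (LINE_SIZE - 1);   /* keep the low log2(LINE_SIZE) bits */
}

/* Address of the first byte of the line containing addr. */
static inline uint32_t line_base(uint32_t addr) {
    return addr & ~(LINE_SIZE - 1);  /* clear the offset bits */
}
```

For example, with 64-byte lines the addresses 0x1200 through 0x123F all share the same line base, so fetching any one of those bytes brings the whole line into the cache.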

The cache is typically the second-smallest memory object in the computer system (only the CPU's registers are smaller), and there is a reason for that: such fast memory cannot be made large. If the cache were big enough to hold all of a program's data, the program could run at very high speed. Unfortunately, a program's data does not sit in one tidy region; in general it is spread out all over the address space. Therefore, the cache design has to accommodate the fact that it must map data objects at widely varying addresses in memory.

The idea behind a cache system is that we can attach a different main-memory address to each cache line: if a cache line is n bytes long, it can hold an n-byte block from main memory, along with a record of where that block came from. We are now going to see how this mapping happens, how the algorithms work, and what kinds of cache systems exist.
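One common way to realize this mapping is a direct-mapped cache, where the address is split into three fields: an offset (which byte within the line), an index (which cache line), and a tag (stored alongside the line to record which block of main memory it currently holds). The sketch below is a minimal illustration assuming hypothetical parameters of 64-byte lines and 256 lines; real caches vary.

```c
#include <stdint.h>

/* Hypothetical direct-mapped cache: 64-byte lines, 256 lines (16 KiB
 * total).  These parameters are illustrative, not from the text. */
#define LINE_SIZE 64u
#define NUM_LINES 256u

/* The three fields a direct-mapped cache extracts from an address. */
typedef struct {
    uint32_t tag;    /* identifies which memory block the line holds */
    uint32_t index;  /* selects one of the NUM_LINES cache lines      */
    uint32_t offset; /* selects a byte within the line                */
} cache_addr;

static cache_addr split_address(uint32_t addr) {
    cache_addr a;
    a.offset = addr % LINE_SIZE;                 /* low 6 bits        */
    a.index  = (addr / LINE_SIZE) % NUM_LINES;   /* next 8 bits       */
    a.tag    = addr / (LINE_SIZE * NUM_LINES);   /* remaining high bits */
    return a;
}
```

On a lookup, the cache uses the index to pick a line, then compares the stored tag with the address's tag: a match is a cache hit, a mismatch means the line currently holds data from a different address and must be refilled from main memory.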