Since the size of cache memory is small, the correspondence between main memory blocks and cache blocks is specified by a mapping function. There are three standard mapping functions, namely
- Direct Mapping
- Associative Mapping
- Block set associative mapping
Direct Mapping
- In this scheme, block k of the main memory maps onto block k mod 128 of the cache
- Since more than one main memory block is mapped onto a given cache block position, contention may arise for that position even when the cache is not full
- This is overcome by allowing the new block to overwrite the currently resident block
- A main memory address can be divided into three fields, TAG, BLOCK and WORD
- The TAG bits are required to identify a main memory block when it is resident in the cache
- When a new block enters the cache, the 7-bit cache BLOCK field determines the cache position in which this block must be stored
- The TAG field of that cache block is compared with the TAG field of the address
- If they match, then the desired word is present in that block of the cache
- If there is no match, then the block containing the required word must first be read from the main memory and then loaded into the cache
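The direct-mapped lookup above can be illustrated with a minimal Python sketch. It assumes a 16-bit word address split as a 5-bit TAG, a 7-bit BLOCK field (128 cache blocks), and a 4-bit WORD field (16 words per block), which is consistent with the 7-bit block field described here; the field widths and names are illustrative assumptions, not given in full by the text.

```python
NUM_BLOCKS = 128        # assumed cache size: 128 blocks
WORDS_PER_BLOCK = 16    # assumed block size: 16 words

# Each cache entry holds a valid bit, a tag, and the block's data words.
cache = [{"valid": False, "tag": None, "data": [0] * WORDS_PER_BLOCK}
         for _ in range(NUM_BLOCKS)]

def split_address(addr):
    """Split an assumed 16-bit address into TAG, BLOCK, WORD fields."""
    word = addr & 0xF            # low 4 bits: word within the block
    block = (addr >> 4) & 0x7F   # next 7 bits: cache block position
    tag = addr >> 11             # remaining 5 bits: tag
    return tag, block, word

def read(addr, main_memory):
    tag, block, word = split_address(addr)
    entry = cache[block]
    if entry["valid"] and entry["tag"] == tag:
        return entry["data"][word]          # hit: tags match
    # Miss: fetch the whole block from main memory; the new block simply
    # overwrites the currently resident block, as described above.
    base = (addr >> 4) << 4                 # first word of the block
    entry["data"] = main_memory[base:base + WORDS_PER_BLOCK]
    entry["tag"] = tag
    entry["valid"] = True
    return entry["data"][word]
```

Note that the block position is just the main memory block number modulo 128, so two main memory blocks whose numbers differ by a multiple of 128 contend for the same cache block.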
Associative Mapping
- In this scheme, any main memory block can be loaded into any cache block position.
- Here, 12 tag bits are required to identify a main memory block when it is resident in the cache
- The tag bits of an address received from the CPU are compared with the tag bits of each cache block to see if the desired block is present in the cache
- Here we need to search all 128 tag patterns to determine whether a given block is in the cache
- This type of search is called associative search
- Here complete freedom is given in positioning a block and in choosing which block to replace, but the cost of implementation is high
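The associative search above can be sketched in Python as a scan over every cached tag. Under the same assumed 16-bit address, the 12-bit tag is simply the address without its 4-bit word field; the replacement choice shown (evicting the oldest entry) is an illustrative placeholder, since the text leaves the replacement policy open.

```python
NUM_BLOCKS = 128        # assumed cache size: 128 blocks
WORDS_PER_BLOCK = 16    # assumed block size: 16 words

cache = []              # entries: {"tag": ..., "data": [...]}

def read(addr, main_memory):
    tag, word = addr >> 4, addr & 0xF   # 12-bit tag, 4-bit word field
    # Associative search: compare the address tag against every cached tag
    # (real hardware does all 128 comparisons in parallel).
    for entry in cache:
        if entry["tag"] == tag:
            return entry["data"][word]  # hit
    # Miss: load the block; if the cache is full, any block may be
    # replaced (here, simply the oldest one).
    base = tag << 4
    entry = {"tag": tag,
             "data": main_memory[base:base + WORDS_PER_BLOCK]}
    if len(cache) >= NUM_BLOCKS:
        cache.pop(0)
    cache.append(entry)
    return entry["data"][word]
```

Because every tag must be compared on every access, a hardware implementation needs one comparator per cache block, which is what makes fully associative mapping expensive.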