When programs exhibit locality, caching works quite well. However, if programs exhibit bad locality, caching breaks down and the performance of the memory hierarchy is poor.
In particular, object-oriented programming can cause programs to exhibit less than optimal locality.
Another example of bad locality can be seen in two-dimensional array access. Arrays are typically stored in row-major order. Suppose, for the purposes of this example, that one row fits exactly in one cache block and the cache can hold all but one row of the array. If a program accesses the array one row at a time, the first access to each row produces a miss, but once the block is transferred into the cache, all subsequent accesses to that row are hits. So a 5 × 4 array would produce 5 misses and 15 hits over 20 accesses (assuming we are accessing each element of the array). If a program accesses the array in column-major order, the first access to the column results in a miss, after which an entire row is transferred in. However, the second access to the column results in another miss. The data being transferred in for each row is not being used, because the array is being accessed by column. Because the cache is not large enough, this would produce 20 misses on 20 accesses.
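The miss counts above can be checked with a small simulation. The sketch below (a hypothetical setup, not a real cache) assumes the parameters of the example: one block holds one row of a 5 × 4 array, the cache holds 4 blocks (all but one row), and replacement is LRU.

```python
from collections import OrderedDict

def count_misses(block_accesses, num_blocks=4):
    """Count misses for a sequence of block accesses with LRU replacement."""
    cache = OrderedDict()
    misses = 0
    for block in block_accesses:
        if block in cache:
            cache.move_to_end(block)       # hit: refresh LRU position
        else:
            misses += 1                    # miss: fetch the whole block (row)
            cache[block] = True
            if len(cache) > num_blocks:
                cache.popitem(last=False)  # evict the least recently used block
    return misses

ROWS, COLS = 5, 4
# Each array element maps to the block (row) that contains it.
row_major = [r for r in range(ROWS) for _ in range(COLS)]
col_major = [r for _ in range(COLS) for r in range(ROWS)]

print(count_misses(row_major))  # 5 misses: one per row, 15 hits
print(count_misses(col_major))  # 20 misses: every access misses
```

Row-major traversal misses only on the first element of each row, matching the 5 misses and 15 hits in the text; column-major traversal evicts each row before returning to it, so all 20 accesses miss.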
A third example would be a program that loops through a linear array that does not fit in the cache.