PPT-Data Prefetching
Author : yoshiko-marsland | Published Date : 2017-06-21
Smruti R. Sarangi, Data Prefetching. Instead of instructions, let us now prefetch data. An important distinction: instructions are fetched into the in-order part of the OOO pipeline, whereas data is fetched into the OOO pipeline itself.
Data Prefetching: Transcript
Smruti R. Sarangi. Data Prefetching. Instead of instructions, let us now prefetch data. Important distinction: instructions are fetched into the in-order part of the OOO pipeline; data is fetched into the OOO pipeline.

Contents: Motivation for Prefetching. Simple Schemes. Recent Work. Proactive Instruction Fetching. Return Address Stack Directed Prefetching. Pentium 4 Trace Cache.
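Among the simple schemes the deck covers, a classic example is a PC-indexed stride prefetcher: each load instruction gets a table entry tracking its last address and observed stride, and once the same stride repeats, the next address is prefetched. The sketch below is illustrative only (the class name, table format, and confidence policy are assumptions, not taken from the slides):

```python
# Minimal sketch of a PC-indexed stride prefetcher (hypothetical simulation).
# Each load PC maps to (last_addr, stride, confident). When a stride is seen
# twice in a row, the entry becomes confident; a confident entry that sees the
# same stride again issues a prefetch for addr + stride.

class StridePrefetcher:
    def __init__(self):
        self.table = {}  # pc -> (last_addr, stride, confident)

    def access(self, pc, addr):
        """Record a load at `pc` touching `addr`; return a prefetch address or None."""
        entry = self.table.get(pc)
        if entry is None:
            self.table[pc] = (addr, 0, False)  # first sighting: just record the address
            return None
        last_addr, stride, confident = entry
        new_stride = addr - last_addr
        if confident and new_stride == stride:
            self.table[pc] = (addr, stride, True)
            return addr + stride  # stride confirmed again: prefetch the next line
        # still training: remember the new stride, gain confidence if it repeated
        self.table[pc] = (addr, new_stride, new_stride == stride)
        return None
```

For a load walking an array with a 64-byte stride, the first three accesses train the entry and the fourth triggers a prefetch of the following address.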