Dr. Xiaochen Guo

Institution: Lehigh University

Department: ECE

Proposed research ideas

Prefetching is an efficient mechanism to hide long memory latency. Emerging memory technologies (e.g., 3D XPoint) trade even longer latencies for higher density as compared to DRAM. Prefetchers will therefore become increasingly important for making judicious decisions on when to move which data to faster memories. Conventional history-table-based hardware prefetchers have to be kept simple in order to provide timely predictions, which limits the size of the history tables and hence constrains prediction accuracy. Resistive memory-based neuromorphic crossbar arrays enable efficient hardware implementations of neural networks, which can be used to build hardware prefetchers. We propose to leverage resistive memory-based neural networks to improve prefetcher accuracy and timeliness. Our preliminary work published at MEMSYS’17 has already shown promising results from a Long Short-Term Memory (LSTM)-based prefetcher on synthetic traces. We would like to continue this project and demonstrate the effectiveness of the LSTM-based prefetcher, or other neural-network-based prefetchers, on real applications.
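To make the baseline concrete, a conventional history-table prefetcher can be sketched as a Markov-style next-delta predictor: it records which address delta historically followed which, and prefetches the most common successor. This is a minimal illustration of the table-based approach the proposal contrasts against, not our actual design; all names are hypothetical.

```python
from collections import defaultdict, Counter

class TablePrefetcher:
    """Minimal Markov-style history-table prefetcher (illustrative sketch).

    The table maps the last observed address delta to a histogram of the
    deltas that followed it; the prediction is the most common successor.
    """

    def __init__(self):
        self.table = defaultdict(Counter)  # last delta -> Counter of next deltas
        self.prev_addr = None
        self.prev_delta = None

    def access(self, addr):
        """Record one memory access; return the predicted next address or None."""
        prediction = None
        if self.prev_addr is not None:
            delta = addr - self.prev_addr
            if self.prev_delta is not None:
                # Learn: this delta followed the previous one.
                self.table[self.prev_delta][delta] += 1
            # Predict: most common delta that historically followed this one.
            hist = self.table[delta]
            if hist:
                prediction = addr + hist.most_common(1)[0][0]
            self.prev_delta = delta
        self.prev_addr = addr
        return prediction
```

On a simple strided trace (0, 64, 128, ...), the table locks onto the stride after a short warmup and predicts `addr + 64` thereafter. The limitation motivating the LSTM approach is visible here: the table keys on only the immediately preceding delta, so irregular or long-range patterns exceed what a timeliness-constrained table can capture.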


My primary motivation to participate in this program is to create opportunities for students to interact with world-leading researchers in computer architecture. Lehigh University is a small university, with very few faculty members working on computer architecture and systems. Participating in the Sustainable Research Pathways program will create great opportunities for the students to learn from experts outside the university. My second motivation is to get early feedback on our ongoing research. As a junior faculty member, this will be a great opportunity for me to test whether our research ideas have the potential to create long-term impact. We would like to learn and understand the demands, challenges, and trends of HPC applications. We would also like to explore and nurture collaboration opportunities with the staff at Berkeley Lab.