Doctor of Philosophy (PhD)


Division of Computer Science and Engineering

In today's Internet services, key-value systems play a crucial and indispensable role, and they have attracted extensive research attention in both academia and industry. In recent years, we have witnessed an unprecedentedly rapid development of both memory-based and flash-based key-value systems. With the fast evolution of memory and storage technologies, we are seeing both challenges and research opportunities.

In this dissertation, we focus on understanding and optimizing the efficiency of large-capacity, high-speed key-value systems from both software and hardware perspectives to meet ever-growing performance expectations. We first propose a multi-tier mapping structure, called Cascade Mapping, which exploits the unique characteristics of key-value data and workloads to reduce memory demand while delivering fast and reliable flash-based caching services. Our second research effort is to make key-value systems cache aware. We present our study on a highly cache-efficient scheme, called Cavast, which exploits the very limited CPU cache resources through a software-only approach. Furthermore, to improve the efficiency of metadata management, we abandon the traditional caching scheme and propose a novel design, called Catalyst, which allows an in-memory key-value cache system to efficiently manage billions of small items at low cost.

In our study, we reveal several scalability bottlenecks in current key-value system designs. Our experimental results show that our proposed schemes can significantly improve the performance of key-value systems with minimal changes to the existing systems. Our study contributes to the community by presenting several highly effective new designs of key-value systems, which improve the scalability and performance of current systems to achieve higher efficiency with limited system resources.



Committee Chair

Chen, Feng

Available for download on Tuesday, December 31, 2024