Files in this item

SP21-ECE499-Thesis-Zhang, Jialiang.pdf (application/pdf, 1MB), Restricted to U of Illinois
Title: Analyzing bottlenecks in large recommendation systems
Author(s): Zhang, Jialiang
Contributor(s): Hwu, Wen-Mei
Degree: B.S. (bachelor's)
Subject(s): Recommendation Systems
Machine Learning
Abstract: Training and inference for recommendation systems require analysis and computation over large volumes of unstructured, user-specific data blobs. One of the state-of-the-art recommendation models is Facebook's Deep Learning Recommendation Model (DLRM). DLRM consumes a large amount of memory, terabytes in size, for storing embedding features during training and inference. Aside from the memory cost, DLRM's long training time is another issue. In this work, we investigate the potential bottlenecks of DLRM and discuss in detail two recent improvements proposed in the literature: PipeDLRM and TT-Rec. PipeDLRM introduces pipeline parallelism, splitting the model across several GPUs to reduce compute time without compromising accuracy, while TT-Rec proposes a new compression method that reduces embedding memory consumption at an acceptable loss of accuracy. Our analysis of these two models shows that, irrespective of implementation, each still has issues to address. For instance, the embedding memory bottleneck remains in the lookup operation of the embedding tables in PipeDLRM, because PipeDLRM's embedding partition sits on a single GPU and impedes further scaling. On the other hand, even though TT-Rec succeeds in reducing the memory complexity of the model, it requires a significant amount of reuse of the compressed information to retain accuracy. These findings suggest that there is no single right solution to the memory capacity problem present in DLRM.
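To illustrate the kind of memory reduction a tensor-train compression of an embedding table can offer, here is a minimal back-of-the-envelope sketch. The table sizes, factorizations, and TT rank below are illustrative assumptions, not figures from the thesis (which works with terabyte-scale tables); the formula counts parameters of a dense table versus its TT cores.

```python
# Hypothetical sizes for illustration only (the thesis concerns far larger,
# terabyte-scale embedding tables).
N, D = 10_000_000, 64            # embedding rows (vocabulary) and dimension
full_params = N * D              # dense table parameter count

# Assumed TT-style 3-core factorization: N = 200*250*200, D = 4*4*4,
# with internal TT ranks of 32 (boundary ranks are always 1).
n = [200, 250, 200]
d = [4, 4, 4]
r = [1, 32, 32, 1]

# Each core k holds r[k] * n[k] * d[k] * r[k+1] parameters.
tt_params = sum(r[k] * n[k] * d[k] * r[k + 1] for k in range(3))

print(full_params)               # 640_000_000
print(tt_params)                 # 1_075_200
print(full_params // tt_params)  # roughly 595x fewer parameters
```

The trade-off the abstract points to is visible here: each lookup must now recompute a row from the three cores, so the compressed cores are reused heavily, trading memory capacity for extra compute and bandwidth.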
Issue Date: 2021-05
Genre: Dissertation / Thesis
Date Available in IDEALS: 2021-08-11
