NeuGraph: Parallel Deep Neural Network Computation on Large Graphs
Uploaded by: pattyhope | Published: 2020-08-28
Lingxiao Ma, Zhi Yang, Youshan Miao, Jilong Xue, Ming Wu, Lidong Zhou, Yafei Dai (Peking University and Microsoft Research). USENIX ATC '19, Renton, WA, USA.