Events Listing for: 02/10/2020

Monday, February 10, 2020 - 1:00pm
LTI Colloquium: Deep Learning Representations with Continuous Structure
Paul Smolensky, Partner Researcher and Part-Year Krieger-Eisenhower Professor of Cognitive Science
Johns Hopkins University
2315 Doherty Hall
Abstract:
Current deep learning models exploit the power of continuous, dense-vector embeddings of discrete tokens such as words. Traditional symbolic methods of AI, NLP and linguistics exploit the power of structured discrete representations. I will present models that exploit dense-vector embeddings of representations possessing continuous structure that is learned from data. The continuous structured representations of natural and formal language expressions learned by these models enhance performance and interpretability relative to the unstructured hidden representations of standard deep learning models. 
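For readers unfamiliar with how discrete symbolic structure can be embedded in dense vectors, a minimal sketch of the tensor-product-representation idea mentioned below (filler vectors bound to role vectors via outer products; the vocabulary and role names here are toy illustrations, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assign each symbol (filler) and each structural position (role) a dense vector.
fillers = {w: rng.normal(size=4) for w in ["cats", "chase", "mice"]}
roles = {r: v for r, v in zip(["subj", "verb", "obj"], np.eye(3))}

def encode(bindings):
    """Embed a set of (filler, role) pairs as a single tensor:
    the sum of outer products filler x role."""
    return sum(np.outer(fillers[f], roles[r]) for f, r in bindings)

def unbind(tpr, role):
    """Recover the filler bound to `role` (exact here because the
    role vectors are orthonormal)."""
    return tpr @ roles[role]

# "cats chase mice" as one dense structured representation.
T = encode([("cats", "subj"), ("chase", "verb"), ("mice", "obj")])
assert np.allclose(unbind(T, "obj"), fillers["mice"])
```

Because the whole structure lives in one continuous tensor, it can be learned and manipulated by gradient-based models while remaining decodable back to its symbolic parts.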
Bio:
Paul Smolensky is a partner researcher in the Deep Learning Group and part-year Krieger-Eisenhower Professor of Cognitive Science at Johns Hopkins University.

His work focuses on the integration of symbolic and neural network computation for modeling reasoning and, especially, grammar in the human mind/brain. This work created Harmony Networks (a.k.a. Restricted Boltzmann Machines); Tensor Product Representations; Optimality Theory and Harmonic Grammar (grammar frameworks grounded in neural computation); and Gradient Symbolic Computation. The work up through the early 2000s is presented in the two-volume MIT Press book with G Legendre, The Harmonic Mind.

Before assuming his position at Johns Hopkins, he was a professor in the Computer Science Department of the University of Colorado at Boulder. Prior to that, as a postdoc and Research Professor with G Hinton, D Rumelhart & J McClelland, he was one of the founding members of the Parallel Distributed Processing Research Group at UCSD, which produced the bible of the previous wave of neural network modeling, the two-volume 'PDP books'. His BA and PhD are in (Mathematical) Physics from Harvard and Indiana University, respectively.

He received the fifth David E. Rumelhart Prize for Outstanding Contributions to the Formal Analysis of Human Cognition in 2005.

Details:

John Friday jfriday@cs.cmu.edu

LTI Colloquium


Contact Us

Language Technologies Institute
5000 Forbes Avenue
Pittsburgh, PA
15213-3891

412-268-6591
ltiwebmaster@cs.cmu.edu

Carnegie Mellon University School of Computer Science
