Alexander G. Ororbia II

Contact Information:
Dept. of Computer Science
Rochester Institute of Technology
20 Lomb Memorial Dr
Rochester, NY 14623
ago AT cs DOT rit DOT edu
"The whole thinking process is still rather mysterious to us, but I believe that the attempt to make a thinking machine will help us greatly in finding out how we think ourselves."
-Alan Turing
Alex, one of the Connectionists!


Educational Background:

  • B.S.E., Computer Science & Engineering, Bucknell University
    • Philosophy Minor, Bucknell University
    • Mathematics Minor, Bucknell University
  • Ph.D., Information Sciences & Technology, Pennsylvania State University
    • Social Data Analytics (SoDA) Minor, Pennsylvania State University

You can find a link to my curriculum vitae here, as well as links to my Google Scholar page and my ResearchGate page (which is much more accurate in terms of paper citations than Google Scholar). I am also quasi-active on Quora (a question-and-answer website). I fall under the RIT Computer Science Department's Artificial Intelligence Cluster (also known as the Intelligent Systems Cluster).
[Figure: The proposed Temporal Neural Coding Network (TNCN) for continual (sequence) learning and modeling (Ororbia et al., 2018).]

News

  • Ph.D. Position for Fall 2019
    • I am looking for motivated, talented, and enthusiastic Ph.D. students to work in the area of machine learning, specifically on developing neural memory models, biologically motivated learning and optimization algorithms, and models of predictive coding, with a focus on continual (machine) learning. Please contact me with a short description of yourself, your interests, and your CV. (If you are well-versed in Scala in addition to Python, be sure to make that clear to me.)
  • December 27, 2018: New work, Learned Neural Iterative Decoding for Lossy Image Compression Systems, accepted as a full paper for presentation/publication in the proceedings of the 2019 Data Compression Conference (DCC 2019)
  • October 31, 2018: New work, Biologically Motivated Algorithms for Propagating Local Target Representations, accepted for presentation/publication at the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)
  • June 15, 2018: Dissertation defense passed: Coordinated Local Learning Algorithms for Continuously Adaptive Neural Systems (Ororbia)

Research Interests

  • Lifelong machine learning (LML), also referred to as never-ending learning, lifelong learning, and continual learning
  • Biologically inspired learning algorithms and statistical learning procedures (especially for semi-supervised/stream-based/multi-modal learning)
  • Artificial neural networks:
    • Recurrent networks, neural memory models
    • Neural probabilistic/generative models, particularly Boltzmann machines and variational autoencoders

Research Statement

The focus of my work is on lifelong learning -- an important and challenging open problem for machine learning. I study representation learning, drawing insights and ideas from cognitive science and neuroscience to create intelligent systems that are ultimately meant to improve their performance in online, semi-supervised, real-world environments. Statistical learning programs today perform well on very constrained, narrowly defined tasks, but they struggle and fail when required to extract and aggregate knowledge across multiple tasks, handle data from multiple modalities, or cope with non-stationary, one-shot, and zero-shot learning environments. My mission is to develop the learning algorithms and models needed to create such general-purpose, adaptive agents.

It is my research group's endeavor to synthesize key aspects of models of cognition and biological neuro-circuitry, as well as theories of mind and brain function, to construct new learning algorithms and architectures that generalize better to unseen data and continually adapt to novel situations. Ultimately, the hope is that by building lifelong learning machines, we might gain further insight into the workings of human intelligence itself.

Current Group Members/Students

Ankur Mali, Ph.D. student
Xu Sun, M.Sc. student
Michael Peechatt, M.Sc. student

Teaching

2018-2019:

  • CSCI 633: Biologically Inspired Intelligent Systems
  • CSCI 630: Foundations of Intelligent Systems (Intro to AI)