Hi! My name is Bo Morgan.
I was 7 years old when I learned to program an Atari 800 in the BASIC programming language.
I was in high school when my dad gave me a book titled C++ Neural Networks and Fuzzy Logic. While still in high school, I made a greenhouse gas weather simulation using four “feedforward backpropagation” neural networks trained on one gigabyte of data from the NASA website (nasa.gov). I entered my project in the high school science fair and won first place at the San Diego county level. That win was a springboard to an acceptance letter from MIT, a school I had never heard of, but one that my physics teacher, Mr. Dave Thuleen of FUHS (Fallbrook Union High School), recommended to my parents.
While at MIT, I was repeatedly honored to be introduced to, and to attempt conversations with, some of the most brilliant minds: friends, colleagues, advisors, professors, and readers. For my undergraduate studies, I focused equally on Neuroscience and Artificial Intelligence. I worked on a UROP (Undergraduate Research Opportunities Program) project with a graduate student of the great AI professor Marvin Minsky, and I built a rigid-body physical simulation of a robot for Push Singh’s PhD demo.
For my master’s studies, I focused on Artificial Intelligence applications of probabilistic semantic reasoning algorithms at the Media Lab. I used an MRF (Markov Random Field) to represent the probabilistic transition matrix of a 100,000-bit “first-person perspective commonsense language” propositional state space in a natural language story inference tool called LifeNet, which also processed sensor network data.
For my PhD studies, I focused on a combination of Neuroscience and Artificial Intelligence, trying to understand the top three layers of Prof. Minsky’s Model-6 “Emotion Machine” theory. My dissertation focused much more strictly on a subcomponent of Prof. Minsky’s theory: a computational implementation of a theory of how to plan, execute plans, and respond to plan execution failures with a very general form of reflective learning. My PhD demo implementation is called SALS (Substrate for Accountable Layered Systems) and is written in a custom Lisp-like programming language called Funk2, named after funk, the 1970s style of music related to soul and jazz.
Just after graduating with my PhD, I was lucky to be hired as CTO (Chief Technology Officer) of a Palo Alto startup called AIBrain, Inc. We worked to develop a very general conversational smartphone robot toy that plans conversations combining physical actions with speech actions. I also worked with Dr. Fumiko Hoeft at UCSF to develop an SEL (Social and Emotional Learning) training application for children that exercised their self-affirming, self-reflective thought processes, as observed scientifically through an fMRI (functional Magnetic Resonance Imaging) machine scanning the brains of children playing our app.
Briefly, I had an amazing time working at DreamWorks Animation as a Technology Lead focused on Artificial Intelligence for stories, film, and interactive experiences. Please also see my Current Research page for up-to-date information about my current public research projects.
Currently, I work at Apple as an Artificial Intelligence Project Lead.