What children can teach us about building AI
Dr Waku

Published on Mar 5, 2024

Human children are extremely proficient at picking up skills like motor control, planning, and social communication. Professor Elizabeth S. Spelke studies these abilities from the very beginning, tracking infant gazes to see what holds their attention. She concludes that language is a cognitive catalyst: beyond enabling speech, it allows six core types of knowledge to be combined. AI systems built around large language models have so far focused exclusively on language, but perhaps we should be thinking about how to utilize the other six core types as well.

Recent research, titled SyReLM, examines one such combination: formal reasoning (a symbolic solver) fused with language capabilities. The authors extend a small language model, which on its own has no numeric reasoning ability, to generate Python that calls a formal reasoning engine. The tool is deeply integrated into the model's fine-tuning process, and the experimental results are highly positive as a result.
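The "LM emits code, solver executes it" pattern can be sketched in a few lines. This is a minimal illustration, not SyReLM's actual implementation: `tiny_lm` is a hypothetical stand-in for the fine-tuned model, and a whitelisted AST evaluator plays the role of the symbolic solver.

```python
import ast
import operator

# Safe arithmetic evaluator standing in for the symbolic solver:
# only whitelisted binary operators are allowed, no eval().
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def solve(expr: str) -> float:
    """Evaluate an arithmetic expression by walking its AST."""
    def walk(node):
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval").body)

def tiny_lm(question: str) -> str:
    """Hypothetical stand-in for the small LM: maps a word problem
    to a formal expression. A real model would generate this."""
    return "3 * 12 + 4"

question = "Alice buys 3 boxes of 12 eggs plus 4 loose eggs. How many eggs?"
expression = tiny_lm(question)  # the LM handles language -> formal code
answer = solve(expression)      # the solver handles the arithmetic
print(answer)                   # 40.0
```

The division of labor is the point: the language model never does arithmetic itself; it only translates natural language into a formal expression that the solver evaluates exactly.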

Yann LeCun describes current AI systems as lacking hierarchical planning based on hierarchical world models. Humans perform hierarchical planning all the time, even if we sometimes forget small steps like where our keys are. AI systems will need more precise world models, which may depend on more of the types of reasoning already present in humans and other animals. An interesting direction for future research.
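Hierarchical planning can be pictured as recursive goal decomposition, echoing the video's "make a sandwich" example. The task tree below is invented purely for illustration; it is not LeCun's proposed architecture.

```python
# A goal maps to subgoals; anything not in the table is a primitive action.
# This tiny table is a made-up example of a hierarchical plan.
PLAN = {
    "make a sandwich": ["get ingredients", "assemble sandwich"],
    "get ingredients": ["find bread", "find filling"],
    "assemble sandwich": ["lay out bread", "add filling", "close sandwich"],
}

def expand(goal):
    """Recursively expand a goal into a flat sequence of primitive actions."""
    subgoals = PLAN.get(goal)
    if subgoals is None:  # leaf: a primitive action, executed directly
        return [goal]
    steps = []
    for sub in subgoals:
        steps.extend(expand(sub))
    return steps

print(expand("make a sandwich"))
# ['find bread', 'find filling', 'lay out bread', 'add filling', 'close sandwich']
```

The key property is that each level of the tree only needs a world model at its own granularity: "get ingredients" does not care how fingers grip a bread bag, which is why planning at every level with a single flat model scales so poorly.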

#ai #human #language

Insights in Human Knowledge, From the Minds of Babies
https://www.nytimes.com/2012/05/01/sc...

How Children Learn - Elizabeth S. Spelke, Harvard University
https://aaai.org/aaai-conference/invi...

Objective-Driven AI: Towards Machines that can Learn, Reason, and Plan [Yann LeCun]
https://aaai.org/aaai-conference/invi...

Frugal LMs Trained to Invoke Symbolic Solvers Achieve Parameter-Efficient Arithmetic Reasoning
https://arxiv.org/abs/2312.05571

Animal cognition and the evolution of human language: why we cannot focus solely on communication
https://royalsocietypublishing.org/do...

0:00 Intro
0:26 Contents
0:33 Part 1: Child psychology
0:52 Talk by Elizabeth S. Spelke
1:42 Infant gaze tracking experiments
2:07 Experiment: learning 4 vs 12 objects
2:52 Word segmentation problem
3:26 Example: exposure to Spanish as a baby
4:09 Experiment: paying attention to native language
4:48 Part 2: The special sauce
5:05 The six core knowledge types
5:35 Seventh knowledge type in humans: language
6:02 Example: learning about Myanmar
6:31 Language is a cognitive catalyst (compositional)
7:36 Recursively structured thought (Chomsky)
8:33 Do animals have language?
9:05 Example: limitations of honeybee communication
9:39 Use multiple core types of knowledge for AI
10:23 Part 3: Relation to AI research
10:34 Paper: SyReLM: fusing a symbolic solver
11:19 Generating Python or pseudocode as a formal language
11:57 Walkthrough and experimental results
12:42 Deep integration with symbolic solver
13:49 Yann LeCun: hierarchical planning and world models
14:34 LLMs don't have good planning capabilities
15:06 Example: hierarchical goals to make a sandwich
16:11 Conditions like ADHD can impede hierarchical planning
16:38 Example: hierarchical planning in programming
17:19 A hierarchical world model benefits from more types of reasoning
18:17 Conclusion
18:40 Six core knowledge systems plus language
19:19 Recursive compositional thought
19:38 AI research SyReLM and hierarchical planning
20:25 Outro
