Artificial Intelligence: past, present, future(s)

A brief history of AI

A recent discipline

Kickoff: 1956 Dartmouth workshop on Artificial Intelligence.

“AI is the science and engineering of making intelligent machines.” (John McCarthy)

“AI is the science of making machines do things that would require intelligence if done by men.” (Marvin Minsky)

Dartmouth workshop

Image credits: the Marvin Minsky family

High ambitions

“Every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.” (Dartmouth workshop proposal, 1956)

“Machines will be capable, within twenty years, of doing any work a man can do.” (Herbert Simon, 1965)

“In from three to eight years we will have a machine with the general intelligence of an average human being.” (Marvin Minsky, 1970)

What is intelligence, by the way?

Its definition is controversial: views differ on which abilities it covers and on whether it is quantifiable [Legg and Hutter, 2007].

“Intelligence is the ability to perceive or infer information, and to retain it as knowledge to be applied to adaptive behaviors within an environment or context.” [Radha R. Sharma, 2008]

“Intelligence is what you use when you don’t know what to do.” (Jean Piaget)

The AI landscape

  • Main fields of research:
    • Problem solving (e.g. search algorithms, constraint solving).
    • Reasoning and decision making (e.g. logic, knowledge representation).
    • Machine Learning (e.g. systems that improve with experience).
    • Real-world interactions (e.g. computer vision, natural language understanding).
  • Either purely software-based, or embedded in hardware devices.

A highly interdisciplinary field

AI fields

A tumultuous history

AI timeline

Image credits: David Lavenda

Two competing approaches

AI approaches

Image credits: [Cardon et al., 2018]

How to define AI?

“AI is an interdisciplinary field aiming at understanding and imitating the mechanisms of cognition and reasoning, in order to assist or substitute humans in their activities.” (Commission d’enrichissement de la langue française, 2018)

“AI is whatever hasn’t been done yet.” (Larry Tesler)

AI’s booming present

Machine Learning: a new paradigm

“The field of study that gives computers the ability to learn without being explicitly programmed.” (Arthur Samuel, 1959).

Programming paradigm

Training paradigm
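The contrast between the two paradigms can be sketched in a few lines of Python. This is an illustrative toy (Celsius-to-Fahrenheit conversion, a hypothetical example not from the slides): in the programming paradigm the rule is written by hand, while in the training paradigm the same rule is recovered from example data, here by a least-squares line fit.

```python
# Programming paradigm: the rule is explicitly written by a human.
def fahrenheit_classic(celsius):
    return celsius * 9 / 5 + 32

# Training paradigm: the rule is *learned* from example pairs.
# Ordinary least-squares fit of y = a*x + b, stdlib only.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Training data: a few known conversions.
examples = [(0, 32), (10, 50), (20, 68), (100, 212)]
a, b = fit_line([c for c, _ in examples], [f for _, f in examples])

def fahrenheit_learned(celsius):
    return a * celsius + b
```

Both functions end up computing the same relationship, but the second one was never told the formula, only shown data.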

Algorithm #1: K-Nearest Neighbors

Prediction is based on the $k$ nearest neighbors of a data sample.

K-NN
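The idea above can be sketched directly: classify a query point by a majority vote among its $k$ nearest training samples. This is a minimal pure-Python illustration on a made-up 2-D dataset (the points and labels are invented for the example).

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbors.

    `train` is a list of (point, label) pairs; points are tuples of floats.
    """
    # Sort training samples by Euclidean distance to the query point.
    neighbors = sorted(train, key=lambda pl: math.dist(pl[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two well-separated clusters.
train = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
         ((5, 5), "B"), ((6, 5), "B"), ((5, 6), "B")]
```

Note that k-NN has no training phase at all: the "model" is simply the stored dataset, and all work happens at prediction time.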

Algorithm #2: Decision Trees

Build a tree-like structure based on a series of learned questions on the data.

Decision Tree for Iris dataset
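A fitted tree is just nested if/else questions on feature values. As a sketch, here is a hand-written tree for the Iris dataset; the thresholds are illustrative values close to what a learned tree (e.g. CART) typically finds, not the output of an actual training run.

```python
def classify_iris(petal_length, petal_width):
    """Hand-written decision tree for Iris (illustrative thresholds).

    Each `if` is one learned question on the data.
    """
    if petal_length < 2.5:      # question 1: short petals?
        return "setosa"
    if petal_width < 1.75:      # question 2: narrow petals?
        return "versicolor"
    return "virginica"
```

Learning a decision tree amounts to automatically choosing which feature to ask about at each node, and at which threshold, so that the resulting splits separate the classes well.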

Algorithm #3: Artificial Neural Networks

Layers of loosely neuron-inspired computation units that can approximate any continuous function.

MNIST neural network example
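To make "layers of computation units" concrete, here is a tiny two-layer network computing XOR, a function no single linear unit can represent. The weights are set by hand for the illustration (in practice they would be learned by gradient descent).

```python
def relu(x):
    """Rectified linear unit: the standard nonlinearity between layers."""
    return max(0.0, x)

def xor_net(x1, x2):
    """Two-layer network computing XOR with hand-set weights.

    Hidden layer: h1 = relu(x1 + x2), h2 = relu(x1 + x2 - 1)
    Output layer: y  = h1 - 2 * h2
    """
    h1 = relu(x1 + x2)
    h2 = relu(x1 + x2 - 1)
    return h1 - 2 * h2
```

The nonlinearity is essential: stacking purely linear layers would collapse into a single linear function, which cannot compute XOR.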

Image credits: 3Blue1Brown

The Deep Learning tsunami

Subfield of Machine Learning consisting of multilayered neural networks trained on vast amounts of data.

AlexNet'12 (simplified)

Image credits: [Krizhevsky et al., 2012]

In a decade, it outperformed previous state-of-the-art approaches in many fields (computer vision, language processing, and much more).

From labs to everyday life in 25 years

LeCun - LeNet

Image credits: Yann LeCun

Facial recognition in Chinese elementary school

Image credits: Matthew Brennan

Reasons for success

Reasons for DL success

Image credits: DeepMind

Case study: Large Language Models

ChatGPT letters

Image credits: Shutterstock

  • Model: a function $f$ defining the relationship between inputs (data) and outputs (results).

$$f(\text{inputs}) = \text{outputs}$$

  • LLM: a (very) large model designed for language processing tasks.
  • Kickoff: ChatGPT, November 2022.

T for Transformer

Transformer architecture

  • Neural network architecture designed to handle sequential data.
  • Cheaper to train and easier to parallelize than previous approaches.
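The Transformer's core operation is scaled dot-product attention [Vaswani et al., 2017]: each output is a weighted average of value vectors, with weights derived from query-key similarity. A minimal pure-Python sketch (plain lists instead of tensors, for readability):

```python
import math

def softmax(xs):
    """Turn raw scores into a probability distribution."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention on plain Python lists.

    Each output is a softmax-weighted average of the value vectors,
    with weights given by (query . key) / sqrt(d).
    """
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs
```

Because every query attends to every key independently, all positions of the sequence can be processed in parallel, which is precisely what makes the architecture cheaper to train than sequential recurrent models.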

Image credits: [Vaswani et al., 2017]

G for Generative

  • The next word is predicted via an attention-powered statistical analysis of the other words.
  • This process can be repeated to produce entire texts.
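The repeat-to-generate loop can be illustrated with a toy bigram "language model": next-word probabilities estimated by counting word pairs in a tiny made-up corpus (real LLMs use attention over the whole context rather than bigram counts, but the autoregressive loop is the same).

```python
from collections import defaultdict, Counter

# Tiny corpus, invented for the example.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    counts[current][nxt] += 1

def next_word(word):
    """Predict the statistically most likely next word (greedy decoding)."""
    return counts[word].most_common(1)[0][0]

def generate(start, length):
    """Repeat next-word prediction to produce an entire text."""
    words = [start]
    for _ in range(length - 1):
        words.append(next_word(words[-1]))
    return " ".join(words)
```

Each generated word is fed back as input for the next prediction; this feedback loop is what "autoregressive" means.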

Autoregressive model animation

Image credits: Welch Labs

Smurfs extract (in French)

Image credits: Peyo

P for Pretrained

  • Models like ChatGPT contain billions of internal parameters.
  • They are trained on enormous amounts of data (large portions of the Internet).
  • These processes have huge energy costs.

A nuclear power plant

Image credits: Getty

The uncertain future(s) of AI

AI’s growing impact on individuals and society

AI has more and more social and societal implications:

  • Job market transformation.
  • Human/machine interactions.
  • Trust and acceptability.
  • Legal aspects and regulation (AI hallucination cases).
  • Fairness.
  • Ethics.
  • Privacy and usage of personal data.

Which AI will prevail?

  • Substitutive intelligence: replacement of humans by machines.
  • Augmented intelligence: human-centered AI for performance augmentation & autonomy enhancement.
  • Hybrid intelligence: human-machine collaboration on complex tasks.

The AGI debate

  • AGI = Artificial General Intelligence
    • The ability to perform any task as well as a human.
  • Related concept: strong AI (intentionality and consciousness).
  • Are recent models sparks of AGI [Bubeck et al., 2023] or merely stochastic parrots [Bender et al., 2021]?

The technological singularity

Compared cognitive evolutions

Image credits: Bernard Claverie

The Chinese room argument

Is showing intelligent behavior the same as being intelligent? [Searle, 1980]

Chinese room

Image credits: elementsofai

The human brain: a masterpiece of evolution

  • Approx. 86 billion neurons in about 1.4 kg.
  • Typical energy consumption: 20 W (!)
  • So much of it is still unknown.

The human brain

Babies are outstanding learners

Babies learning roadmap

Image credits: Emmanuel Dupoux

Thanks for your attention!

Any questions?

A friendly robot

Image credits: Adobe Stock