screwlisp proposes kittens

Common Lisp: classical AI and deep learning AI, the difference in two clear example implementations

Now, I am just writing this off the cuff, because I promised my friend Vassil I would, probably two months ago now. I hope the reader personally corrects any particularly egregious errors here via the Mastodon thread. But I think this example gets to the heart of things without any detours.

Here we will write Common Lisp for a minimal classical AI program and a minimal deep learning AI program, which will use classical AI and deep learning inferencing respectively to identify the artists of well-known artworks.

My own eev Emacs Lisp setup

#|
• (setq inferior-lisp-program "ecl")
• (slime)
• (setq eepitch-buffer-name "*slime-repl ECL*")
|#

Query Inputs

The programs will operate upon

(defvar *queries* '((water lillies)
		    (melting clocks)))

Data Inputs

This data basically spans the non-UNIX-Surrealism art that I am aware of.

(defvar *data*
  '(((water icarus)
     Bruegel-the-Elder)
    ((water fish big-fish eat little-fish boats)
     Bruegel-the-Elder)
    ((water fish boats)
     Monet)
    ((water lillies)
     Monet)
    ((melting clocks)
     Salvador-Dali)))

Classical AI program

(loop
  :for query :in *queries*
  
  :for result
    := (assoc query *data* :test 'equal)
  
  :do (format t "~&~a ⮕ ~a"
	      query result))

Result:

(WATER LILLIES) ⮕ ((WATER LILLIES) MONET)
(MELTING CLOCKS) ⮕ ((MELTING CLOCKS) SALVADOR-DALI)

Deep learning

A modern Hopfield net is dual to deep learning with a certain choice of activation function. This means a Hopfield net is literally a deep learning implementation.

When someone keeps saying nouveau deep learning words, I find it tremendously reassuring to remember that what they are talking about can be implemented with this bite-sized iteration (plus a lengthy numeric re-encoding of the data).
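For reference, the update performed by hopfield-update below is, I believe, the dense associative memory rule of Krotov and Hopfield, where F is the rectified polynomial activation, the ξ^μ are the encoded memories, and x is the encoded ±1 query:

```latex
x_i \leftarrow \operatorname{sign}\left(\sum_{\mu}\left[
  F\Big( +\xi^{\mu}_{i} + \sum_{j \neq i} \xi^{\mu}_{j} x_{j} \Big)
  - F\Big( -\xi^{\mu}_{i} + \sum_{j \neq i} \xi^{\mu}_{j} x_{j} \Big)
\right]\right)
```

That is, position i is flipped towards whichever of +1 or -1 gives the larger rectified overlap with the stored memories.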

(require 'alexandria) ; or, via Quicklisp: (ql:quickload :alexandria)

(defun encode-memories (data &optional (queries nil))
  "Encode DATA entries (and QUERIES) as +1/-1 vectors over the unique symbols.
Returns (values (legend . encoded-entries) encoded-queries)."
  (let ((uniq (remove-duplicates
	       (alexandria:flatten data))))
    (values (cons uniq
		  (loop
		    :for entry :in data
		    :for flat := (alexandria:flatten entry)
		    :collect
		    (loop :for u :in uniq
			  :if (member u flat)
			    :collect +1
			  :else
			    :collect -1)))
	    
	    (loop :for query :in queries
		  :collect
		  (loop
		    :with  flat := (alexandria:flatten query)
		    :for u :in uniq
		    :if (member u flat)
		      :collect +1
		    :else
		      :collect -1)))))

(defun make-rectified-polynomial (degree)
  ;; Rectified polynomial activation: x^DEGREE for positive x, else 0.
  (lambda (x) (if (plusp x) (expt x degree) 0)))
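A quick sanity check of the activation (my own example, evaluated by hand):

```lisp
;; Degree-2 rectified polynomial: squares positive inputs, zeroes the rest.
(funcall (make-rectified-polynomial 2) 3)  ; => 9
(funcall (make-rectified-polynomial 2) -3) ; => 0
```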

(defun hopfield-update (idx query memories rect)
  "Update position IDX of QUERY (a list of +1/-1) towards the stored memories."
  (loop
    :for memory :in (cdr memories) ; (car memories) is the symbol legend
    :sum
    ;; For each memory, compare the rectified overlap with position IDX
    ;; forced to +1 against the overlap with IDX forced to -1.
    (loop :for c :in query
	  :for m :in memory
	  :for count :from 0
	  :for cm := (* c m)
	  :sum (cond ((= idx count) (* +1 m))
		     ((not (= idx count)) cm))
	    :into plusp-sum
	  :sum (cond ((= idx count) (* -1 m))
		     ((not (= idx count)) cm))
	    :into minusp-sum
	  :finally
	     (return (- (funcall rect plusp-sum)
			(funcall rect minusp-sum))))
      :into tally
    :finally (setf (nth idx query)
		   (if (plusp tally) +1 -1)) ; take the sign of the vote
	     (return query)))

(defun interpret (query data)
  ;; Translate a +1/-1 vector back into the symbols it switches on.
  (loop :for q :in query :for d :in (car data)
	:when (plusp q) :collect d))

The Hopfield net requires us either to update a uniformly random neuron each step, or to update every neuron (every +1 or -1 in the query) in parallel each step. We have to keep forcing the query values or it will wander away from them.
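That parallel scheme could be sketched as follows (hopfield-sweep and its clamp argument are names I am introducing, not the article's):

```lisp
;; Sketch: one synchronous sweep. Every position is updated against the
;; OLD query (hence the copy-list per call), then the known query
;; positions listed in CLAMP are forced back to +1.
(defun hopfield-sweep (query memories rect &optional clamp)
  (let ((new (loop :for idx :below (length query)
                   :collect (nth idx (hopfield-update
                                      idx (copy-list query) memories rect)))))
    (loop :for idx :in clamp
          :do (setf (nth idx new) +1))
    new))
```

Iterating hopfield-sweep until the vector stops changing is the "every step in parallel" scheme described above.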

Now, we can see classically

(car (encode-memories *data* *queries*))
(search '(salvador-dali) *)
(search '(monet) **)

that the correct updates to try

CL-USER> (car (encode-memories *data* *queries*))
(ICARUS BIG-FISH EAT LITTLE-FISH BRUEGEL-THE-ELDER FISH BOATS WATER LILLIES
 MONET MELTING CLOCKS SALVADOR-DALI)
CL-USER> (search '(salvador-dali) *)
12
CL-USER> (search '(monet) **)
9
CL-USER> 

are at positions 12 and 9, which lets us jump straight to the ending and skip the lengthy numeric process.

(multiple-value-bind
      (data encoded-queries)
    (encode-memories *data* *queries*)
  (loop
    :with rect := (make-rectified-polynomial 2)
    
    :for encoded-query :in encoded-queries :do
      (loop
	:for idx :in '(9 12) ; cheat
	;; or
	;; := (random (length encoded-query)) ; one at a time..
	;; or
	;; :below (length encoded-query) ; need to turn this one into a new query
	
	:for encoded-update
	  := (hopfield-update idx
			      (copy-list encoded-query)
			      data
			      rect)
	:do (format t "~&Update ~d: ~a ⮕ ~a"
		    idx
		    (interpret encoded-query data)
		    (interpret encoded-update data))
	:finally (terpri))))

and so

CL-USER> (multiple-value-bind
      (data encoded-queries)
    (encode-memories *data* *queries*)
  (loop
    :with rect := (make-rectified-polynomial 2)
    
    :for encoded-query :in encoded-queries :do
      (loop
	:for idx :in '(9 12) ; cheat
	;; or
	;; := (random (length encoded-query)) ; one at a time..
	;; or
	;; :below (length encoded-query) ; need to turn this one into a new query
	
	:for encoded-update
	  := (hopfield-update idx
			      (copy-list encoded-query)
			      data
			      rect)
	:do (format t "~&Update ~d: ~a ⮕ ~a"
		    idx
		    (interpret encoded-query data)
		    (interpret encoded-update data))
	:finally (terpri))))
Update 9: (WATER LILLIES) ⮕ (WATER LILLIES MONET)
Update 12: (WATER LILLIES) ⮕ (WATER LILLIES)
Update 9: (MELTING CLOCKS) ⮕ (MELTING CLOCKS)
Update 12: (MELTING CLOCKS) ⮕ (MELTING CLOCKS SALVADOR-DALI)
NIL
CL-USER> 

Let us quickly look at one problem scenario that happens in the random updating scheme if we try to update position 8:

CL-USER> (multiple-value-bind
      (data encoded-queries)
    (encode-memories *data* *queries*)
  (loop
    :with rect := (make-rectified-polynomial 2)
    
    :for encoded-query :in encoded-queries :do
      (loop
	:for idx :in '(8) ; cheat
	;; or
	;; := (random (length encoded-query)) ; one at a time..
	;; or
	;; :below (length encoded-query) ; need to turn this one into a new query
	
	:for encoded-update
	  := (hopfield-update idx
			      (copy-list encoded-query)
			      data
			      rect)
	:do (format t "~&Update ~d: ~a ⮕ ~a"
		    idx
		    (interpret encoded-query data)
		    (interpret encoded-update data))
	:finally (terpri))))
Update 8: (WATER LILLIES) ⮕ (WATER)
Update 8: (MELTING CLOCKS) ⮕ (MELTING CLOCKS)
NIL
CL-USER> 

We can see that we actually moved away from the correct data. This is kind of a feature, and is what the "deep" in deep learning refers to. We know that this algorithm (this choice of deep learning activation function) eventually converges to the training datum most similar to the query; that is just what it says to do. Unless there is a problem. Which here, there was.

Here, it numerically decided to move away from the query data first. In our case, with very small data, this created an ambiguity as to which famous picture it was (lots of pictures have water). It is standard practice to keep forcing the query as we did, since we "know" the query is part of the answer, so re-imposing it helps the net along.

Conclusions

We were able to identify the artists of famous pictures using classical AI (a list-processing approach) and deep learning inferencing, implemented as a modern Hopfield network.

We can see that in the classical AI example it makes sense to say

(Water Lillies) was by Monet, because Monet was associated with (water lillies)

Whereas the deep learning AI had a lengthy numerical encoding of the data and queries, and the only explanation for it arriving at Monet from water lillies is

This numeric algorithm operating on this data and this query eventually places +1 at MONET’s encoding.

In the case of a Hopfield network, we know that the Hopfield network's implicit choice of deep learning activation function acts as an autoassociative memory (it picks the most similar training datum = memory).

Fin.

I can imagine I could have made a mistake; your thoughts and corrections on the Mastodon thread will be deeply appreciated.

See everyone at the live show (The Lispy Gopher Climate) in three hours, Tuesday 8pm Boston time in the Americas (00:00 UTC Wednesday), on https://anonradio.net ; archives at https://communitymedia.video/c/screwtape_channel/videos

We will talk about this article I guess, unless some guest I attempted to schedule and forgot about shows up, in which case we will talk to them.

You are especially welcome to share this article however and wherever occurs to you, since I think it clarifies modern DL and classical AI, and, as it deals with a central modern computing topic, could foster criticism and responses. The same is true for all my writings and archives (the show archives are cc/sa though; let me know if you need a different license).

screwlisp proposes kittens