
Re: Enhancing ELisp for AI Work


From: Jean Louis
Subject: Re: Enhancing ELisp for AI Work
Date: Mon, 16 Dec 2024 19:26:09 +0300
User-agent: Mutt/2.2.12 (2023-09-09)

* Tomáš Petit <petitthomas34@gmail.com> [2024-12-16 17:57]:
> Greetings,
> 
> wouldn't Common Lisp or some Scheme dialect be better suited for this job
> instead of Emacs Lisp?

Emacs Lisp reaches out to many external environments, so it is a portal
to everything else. Even when I use Common Lisp externally, I may be
invoking it from Emacs Lisp. Some people live in Emacs, and in any
case, whatever the language, it can still be edited, run, and tested
within Emacs; it feels much the same no matter which language is
running.

In my work I have to deal heavily with text, and accessing HTTP
endpoints to reach some of the Large Language Models (LLMs) is not
hard. Emacs Lisp does it.

Take preparing a dataset: within Emacs Lisp I have good tools to find
the data necessary for training an LLM within seconds. The training
itself then needs some preparation with external tools which are
ready-made for that task, but a lot may be done within Emacs, as
sketched below.
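
For example, a sketch along these lines (the function name and the
JSON-lines output format are only illustrative, not part of my
library) collects text files into one dataset file:

;; Sketch only: collect all .txt files under a directory into one
;; JSON-lines dataset file, one {"text": ...} record per file.
(require 'json)

(defun my-collect-dataset (directory output-file)
  "Write every .txt file under DIRECTORY into OUTPUT-FILE as JSON lines."
  (with-temp-buffer
    (dolist (file (directory-files-recursively directory "\\.txt\\'"))
      (let ((text (with-temp-buffer
                    (insert-file-contents file)
                    (buffer-string))))
        (insert (json-encode `((text . ,text))) "\n")))
    (write-region (point-min) (point-max) output-file)))

;; (my-collect-dataset "~/notes" "~/dataset.jsonl")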

Here is a simple function that extracts the reply from the HTTP
response buffer:

(defun rcd-llm-response (response-buffer)
  "Parse LLM's RESPONSE-BUFFER and return decoded string."
  (when response-buffer
    (with-current-buffer response-buffer
      ;; Skip HTTP headers
      (goto-char (point-min))
      (when (search-forward "\n\n" nil t)
        (let ((response (decode-coding-string
                         (buffer-substring-no-properties (point) (point-max))
                         'utf-8)))
          (kill-buffer response-buffer)
          ;; Parse JSON and extract the reply
          (let* ((json-response (json-parse-string response :object-type 'alist))
                 (choices (alist-get 'choices json-response))
                 (message (alist-get 'message (aref choices 0)))
                 (message (decode-coding-string (alist-get 'content message) 'utf-8)))
            (string-replace "</s>" "\n" message)))))))

The model Qwen2.5-Coder-32B-Instruct is under Apache 2.0, which is a
free software license.

;; `json-encode' comes from json.el; `rcd-llm-last-json' keeps the
;; last request payload around for inspection and debugging.
(require 'json)

(defvar rcd-llm-last-json nil
  "Last JSON payload sent to the LLM endpoint.")

(defun rcd-llm-huggingface (prompt &optional memory rcd-llm-model temperature max-tokens top-p stream)
  "Send PROMPT to Hugging Face API with specified parameters.

Optional MEMORY, RCD-LLM-MODEL, TEMPERATURE, MAX-TOKENS, TOP-P, and
STREAM can be used."
  (let* ((rcd-llm-model (or rcd-llm-model "Qwen/Qwen2.5-Coder-32B-Instruct"))
         (temperature (or temperature 0.5))
         (max-tokens (or max-tokens 2048))
         (top-p (or top-p 0.7))
         (stream (if stream t :json-false))
         (url-request-method "POST")
         (url-request-extra-headers
          '(("Content-Type" . "application/json")
            ("Authorization" . "Bearer hf_YOUR-API-KEY")))
         (url-request-data
          (encode-coding-string
           (setq rcd-llm-last-json
                 (json-encode
                  `((model . ,rcd-llm-model)
                    (messages . [((role . "user") (content . ,prompt))])
                    (temperature . ,temperature)
                    (max_tokens . ,max-tokens)
                    (top_p . ,top-p)
                    (stream . ,stream))))
           'utf-8))
         (buffer (url-retrieve-synchronously
                  "https://api-inference.huggingface.co/models/Qwen/Qwen2.5-Coder-32B-Instruct/v1/chat/completions")))
    (rcd-llm-response buffer)))

The whole library then does everything I need to interact with LLMs.
Emacs is for text, an LLM is for text, so they go hand in hand.
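
For example, a small interactive wrapper along these lines (just a
sketch; rcd-llm-ask-region is not part of the actual library) sends
the active region as the prompt and inserts the reply below it:

;; Sketch only: send the active region as a prompt and insert the
;; model's reply after the region.
(defun rcd-llm-ask-region (start end)
  "Send the region between START and END to the LLM and insert the reply."
  (interactive "r")
  (let ((reply (rcd-llm-huggingface
                (buffer-substring-no-properties start end))))
    (when reply
      (save-excursion
        (goto-char end)
        (insert "\n\n" reply "\n")))))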

But running the LLM's generation itself is not yet workable through
Emacs Lisp. It is surely not impossible; it is just that nobody has
tried to build it that way yet.

-- 
Jean Louis


