This is an automated email from the ASF dual-hosted git repository.

cmeier pushed a commit to branch clojure-bert-qa-example
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git

commit 03d2f4f7b3d95995d9b1fd4f542d7400570a971a
Author: gigasquid <[email protected]>
AuthorDate: Sat Apr 13 11:59:06 2019 -0400

    move input to edn file and rearrange things
---
 contrib/clojure-package/examples/bert-qa/README.md |  79 +++++++++++++---
 .../examples/bert-qa/squad-samples.edn             |  17 ++++
 .../examples/bert-qa/src/bert_qa/core.clj          | 104 +++++++++++----------
 3 files changed, 137 insertions(+), 63 deletions(-)

diff --git a/contrib/clojure-package/examples/bert-qa/README.md 
b/contrib/clojure-package/examples/bert-qa/README.md
index fc21bdd..a61e270 100644
--- a/contrib/clojure-package/examples/bert-qa/README.md
+++ b/contrib/clojure-package/examples/bert-qa/README.md
@@ -1,22 +1,73 @@
 # bert-qa
 
-A Clojure library designed to ... well, that part is up to you.
+**This example is based on the Java API one. It shows how to do inference with a pre-trained BERT network trained for question answering on the [SQuAD Dataset](https://rajpurkar.github.io/SQuAD-explorer/).**
 
-## Usage
+The pretrained model was created using GluonNLP and then exported to the MXNet 
symbol format. You can find more information in the background section below.
 
-FIXME
+In this tutorial, we will walk through the BERT QA model trained with MXNet.
+Users provide a question along with a paragraph that contains the answer, and
+the model finds the best answer span within that paragraph.
 
-## License
+Example:
 
-Copyright © 2019 FIXME
+```
+{:input-answer "Steam engines are external combustion engines, where the 
working fluid is separate from the combustion products. Non-combustion heat 
sources such as solar power, nuclear power or geothermal energy may be used. 
The ideal thermodynamic cycle used to analyze this process is called the 
Rankine cycle. In the cycle, water is heated and transforms into steam within a 
boiler operating at a high pressure. When expanded through pistons or turbines, 
mechanical work is done. The redu [...]
+  :input-question "Along with geothermal and nuclear, what is a notable 
non-combustion heat source?"
+  :ground-truth-answers ["solar"
+                         "solar power"
+                         "solar power, nuclear power or geothermal 
energysolar"]}
+```
 
-This program and the accompanying materials are made available under the
-terms of the Eclipse Public License 2.0 which is available at
-http://www.eclipse.org/legal/epl-2.0.
+The prediction in this case would be `solar power`.
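
Under the hood, the question and the answer paragraph are tokenized and packed into a fixed-length BERT input: a token sequence `[CLS] question [SEP] answer [SEP]` padded to the sequence length, a token-type vector marking the answer span, and the valid length. A rough Python sketch of that packing (illustrative only; the function name is an assumption and the real pre-processing lives in `src/bert_qa/core.clj`):

```python
def build_bert_inputs(question_tokens, answer_tokens, seq_length=384):
    """Pack tokenized question/answer into fixed-length BERT QA inputs."""
    # valid length counts only the real question + answer tokens
    valid_length = len(question_tokens) + len(answer_tokens)
    # token types: 0 over the question span, 1 over the answer span, 0 as padding
    token_types = [0] * len(question_tokens) + [1] * len(answer_tokens)
    token_types += [0] * (seq_length - len(token_types))
    # standard BERT layout with special tokens, padded out to seq_length
    tokens = ["[CLS]"] + question_tokens + ["[SEP]"] + answer_tokens + ["[SEP]"]
    tokens += ["[PAD]"] * (seq_length - len(tokens))
    return tokens, token_types, valid_length
```

The token indexes derived from `tokens`, together with `token_types` and `valid_length`, are what get wrapped into NDArrays and fed to the predictor.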
+
+## Setup Guide
+
+### Step 1: Download the model
+
+For this tutorial, you can get the model and vocabulary by running the following bash script. It uses `wget` to download these artifacts from AWS S3.
+
+From the `scala-package/examples/scripts/infer/bert/` folder run:
+
+```bash
+./get_bert_data.sh
+```
+
+Some sample questions and answers are provided in the `squad-samples.edn` file. Some are taken directly from the SQuAD dataset and one was just made up. Feel free to edit the file and add your own!
+
+
+## To run
+
+* Run `lein install` in the root of the main project directory.
+* `cd` into this project directory and run `lein run`. This executes the CPU version.
+
+* `lein run :cpu` - run on the CPU
+* `lein run :gpu` - run on the GPU
+
+## Background
+
+To learn more about how BERT works in MXNet, please follow this [MXNet Gluon 
tutorial on NLP using 
BERT](https://medium.com/apache-mxnet/gluon-nlp-bert-6a489bdd3340).
+
+The model was extracted from MXNet GluonNLP with static length settings.
+
+[Download link for the script](https://gluon-nlp.mxnet.io/_downloads/bert.zip)
+
+The original description can be found in the [MXNet GluonNLP model 
zoo](https://gluon-nlp.mxnet.io/model_zoo/bert/index.html#bert-base-on-squad-1-1).
+```bash
+python static_finetune_squad.py --optimizer adam --accumulate 2 --batch_size 6 --lr 3e-5 --epochs 2 --gpu 0 --export
+```
+This script will generate the `json` and `param` files that are the standard MXNet model format.
+By default, this model uses the `bert_12_768_12` model with extra layers for QA tasks.
+
+After that, to be able to use the model in this example, we need to export the vocabulary from the script so that text can be parsed into actual indexes. Please add the following lines after [this line](https://github.com/dmlc/gluon-nlp/blob/master/scripts/bert/staticbert/static_finetune_squad.py#L262).
+```python
+import json
+json_str = vocab.to_json()
+f = open("vocab.json", "w")
+f.write(json_str)
+f.close()
+```
+This exports the token vocabulary in JSON format.
+Once you have these three files (the symbol `json`, the `params`, and `vocab.json`), you will be able to run this example.
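
Once the vocabulary is exported, translating tokens to indexes is a plain dictionary lookup. A toy Python sketch of the idea (the flat token-to-index mapping and the `toy_vocab.json` file name are stand-ins; the real `vocab.json` produced by GluonNLP has more structure):

```python
import json

# Toy stand-in for the exported vocabulary: token -> index.
vocab = {"[PAD]": 0, "[CLS]": 1, "[SEP]": 2, "steam": 3, "engine": 4}
with open("toy_vocab.json", "w") as f:
    json.dump(vocab, f)

# Read it back and translate a token sequence into indexes.
with open("toy_vocab.json") as f:
    token2idx = json.load(f)

tokens = ["[CLS]", "steam", "engine", "[SEP]"]
indexes = [token2idx[t] for t in tokens]
print(indexes)  # [1, 3, 4, 2]
```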
 
-This Source Code may also be made available under the following Secondary
-Licenses when the conditions for such availability set forth in the Eclipse
-Public License, v. 2.0 are satisfied: GNU General Public License as published 
by
-the Free Software Foundation, either version 2 of the License, or (at your
-option) any later version, with the GNU Classpath Exception which is available
-at https://www.gnu.org/software/classpath/license.html.
diff --git a/contrib/clojure-package/examples/bert-qa/squad-samples.edn 
b/contrib/clojure-package/examples/bert-qa/squad-samples.edn
new file mode 100644
index 0000000..1eb2b13
--- /dev/null
+++ b/contrib/clojure-package/examples/bert-qa/squad-samples.edn
@@ -0,0 +1,17 @@
+[{:input-answer "Computational complexity theory is a branch of the theory of 
computation in theoretical computer science that focuses on classifying 
computational problems according to their inherent difficulty, and relating 
those classes to each other. A computational problem is understood to be a task 
that is in principle amenable to being solved by a computer, which is 
equivalent to stating that the problem may be solved by mechanical application 
of mathematical steps, such as an alg [...]
+  :input-question "By what main attribute are computational problems 
classified utilizing computational complexity theory?"
+  :ground-truth-answers ["Computational complexity theory"
+                         "Computational  complexity theory"
+                         "complexity theory"]}
+ {:input-answer "Steam engines are external combustion engines, where the 
working fluid is separate from the combustion products. Non-combustion heat 
sources such as solar power, nuclear power or geothermal energy may be used. 
The ideal thermodynamic cycle used to analyze this process is called the 
Rankine cycle. In the cycle, water is heated and transforms into steam within a 
boiler operating at a high pressure. When expanded through pistons or turbines, 
mechanical work is done. The red [...]
+  :input-question "Along with geothermal and nuclear, what is a notable 
non-combustion heat source?"
+  :ground-truth-answers ["solar"
+                         "solar power"
+                         "solar power, nuclear power or geothermal 
energysolar"]}
+ {:input-answer "In the 1960s, a series of discoveries, the most important of 
which was seafloor spreading, showed that the Earth's lithosphere, which 
includes the crust and rigid uppermost portion of the upper mantle, is 
separated into a number of tectonic plates that move across the plastically 
deforming, solid, upper mantle, which is called the asthenosphere. There is an 
intimate coupling between the movement of the plates on the surface and the 
convection of the mantle: oceanic plate [...]
+  :input-question "What was the most important discovery that led to the 
understanding that Earth's lithosphere is separated into tectonic plates?"
+  :ground-truth-answers ["seafloor spreading"]}
+ ;;; totally made up
+ {:input-answer "Susan had a cat named Sammy when she lived in the green 
house."
+  :input-question "What was Susan's cat named?"
+  :ground-truth-answers ["Sammy" "sammy"]}]
diff --git a/contrib/clojure-package/examples/bert-qa/src/bert_qa/core.clj 
b/contrib/clojure-package/examples/bert-qa/src/bert_qa/core.clj
index 079f227..a0078f5 100644
--- a/contrib/clojure-package/examples/bert-qa/src/bert_qa/core.clj
+++ b/contrib/clojure-package/examples/bert-qa/src/bert_qa/core.clj
@@ -8,7 +8,8 @@
             [org.apache.clojure-mxnet.context :as context]
             [org.apache.clojure-mxnet.layout :as layout]
             [org.apache.clojure-mxnet.ndarray :as ndarray]
-            [org.apache.clojure-mxnet.infer :as infer]))
+            [org.apache.clojure-mxnet.infer :as infer]
+            [clojure.pprint :as pprint]))
 
 (def model-path-prefix "model/static_bert_qa")
 ;; epoch number of the model
@@ -16,13 +17,6 @@
 ;; the vocabulary used in the model
 (def model-vocab "model/vocab.json")
 ;; the input question
-#_(def input-q "When did BBC Japan start broadcasting?")
-  (def input-q "What branch of theoretical computer science deals with broadly 
classifying computational problems by difficulty and class of relationship?")
-;;; the input answer
-(def input-a "Computational complexity theory is a branch of the theory of 
computation in theoretical computer science that focuses on classifying 
computational problems according to their inherent difficulty, and relating 
those classes to each other. A computational problem is understood to be a task 
that is in principle amenable to being solved by a computer, which is 
equivalent to stating that the problem may be solved by mechanical application 
of mathematical steps, such as an algorithm.")
-#_(def input-a (str "BBC Japan was a general entertainment Channel.\n"
-                  " Which operated between December 2004 and April 2006.\n"
-                  "It ceased operations after its Japanese distributor 
folded."))
 ;; the maximum length of the sequence
 (def seq-length 384)
 
@@ -72,36 +66,13 @@
         end-idx (-> (ndarray/argmax end-prob 1)
                     (ndarray/->vec)
                     (first))]
-    (println "start-idx" start-idx "end-idx" end-idx)
     (if (> end-idx start-idx)
       (subvec tokens start-idx (inc end-idx))
       (subvec tokens end-idx (inc end-idx)) )
-))
+    ))
 
-(defn infer [ctx]
-  (let [ctx (context/default-context)
-        ;;; pre-processing tokenize sentence
-        token-q (tokenizer (string/lower-case input-q))
-        token-a (tokenizer (string/lower-case input-a))
-        valid-length (+ (count token-q) (count token-a))
-        _ (println "Valid length " valid-length)
-        ;;; generate token types [0000...1111...0000]
-        qa-embedded (into (pad [] 0 (count token-q))
-                          (pad [] 1 (count token-a)))
-        token-types (pad qa-embedded 0 seq-length)
-        ;;; make BERT pre-processing standard
-        token-a (conj token-a "[SEP]")
-        token-q (into [] (concat ["[CLS]"] token-q ["[SEP]"] token-a))
-        tokens (pad token-q "[PAD]" seq-length)
-        _ (println "Pre-processed tokens " token-q)
-        ;;; pre-processing - token to index translation
-        {:keys [idx2token token2idx]} (get-vocab)
-        indexes (tokens->idxs token2idx tokens)
-        ;;; preparing the input data
-        input-batch [(ndarray/array indexes [1 seq-length] {:context ctx})
-                     (ndarray/array token-types [1 seq-length] {:context ctx})
-                     (ndarray/array [valid-length] [1] {:context ctx})]
-        input-descs [{:name "data0"
+(defn make-predictor [ctx]
+  (let [input-descs [{:name "data0"
                       :shape [1 seq-length]
                       :dtype dtype/FLOAT32
                       :layout layout/NT}
@@ -113,26 +84,61 @@
                       :shape [1]
                       :dtype dtype/FLOAT32
                       :layout layout/N}]
-        factory (infer/model-factory model-path-prefix input-descs)
-        predictor (infer/create-predictor
-                   factory
-                   {:contexts [ctx]
-                    :epoch 2})
-        ;;; start predication
-        result (first (infer/predict-with-ndarray predictor input-batch))
-        answer (post-processing result tokens)]
-    (println "Question: " input-q)
-    (println "Answer paragraph: " input-a)
-    (println "Answer: " answer)))
+        factory (infer/model-factory model-path-prefix input-descs)]
+    (infer/create-predictor
+     factory
+     {:contexts [ctx]
+      :epoch 2})))
+
+(defn pre-processing [ctx idx2token token2idx qa-map]
+  (let [{:keys [input-question input-answer ground-truth-answers]} qa-map
+       ;;; pre-processing tokenize sentence
+        token-q (tokenizer (string/lower-case input-question))
+        token-a (tokenizer (string/lower-case input-answer))
+        valid-length (+ (count token-q) (count token-a))
+        ;;; generate token types [0000...1111...0000]
+        qa-embedded (into (pad [] 0 (count token-q))
+                          (pad [] 1 (count token-a)))
+        token-types (pad qa-embedded 0 seq-length)
+        ;;; make BERT pre-processing standard
+        token-a (conj token-a "[SEP]")
+        token-q (into [] (concat ["[CLS]"] token-q ["[SEP]"] token-a))
+        tokens (pad token-q "[PAD]" seq-length)
+        ;;; pre-processing - token to index translation
+        indexes (tokens->idxs token2idx tokens)]
+    {:input-batch [(ndarray/array indexes [1 seq-length] {:context ctx})
+                   (ndarray/array token-types [1 seq-length] {:context ctx})
+                   (ndarray/array [valid-length] [1] {:context ctx})]
+     :tokens tokens
+     :qa-map qa-map}))
+
+(defn infer [ctx]
+  (let [ctx (context/default-context)
+        predictor (make-predictor ctx)
+        {:keys [idx2token token2idx]} (get-vocab)
+        ;;; samples taken from 
https://rajpurkar.github.io/SQuAD-explorer/explore/v2.0/dev/
+        question-answers (clojure.edn/read-string (slurp "squad-samples.edn"))]
+    (doseq [qa-map question-answers]
+      (let [{:keys [input-batch tokens qa-map]} (pre-processing ctx idx2token 
token2idx qa-map)
+            result (first (infer/predict-with-ndarray predictor input-batch))
+            answer (post-processing result tokens)]
+        (println "===============================")
+        (println "      Question Answer Data")
+        (pprint/pprint qa-map)
+        (println)
+        (println "  Predicted Answer: " answer)
+        (println "===============================") ))))
 
 (defn -main [& args]
   (let [[dev] args]
-            (if (= dev ":gpu")
-              (infer (context/gpu))
-              (infer (context/cpu)))))
+    (if (= dev ":gpu")
+      (infer (context/gpu))
+      (infer (context/cpu)))))
 
 (comment
 
   (infer :cpu)
 
-)
+  )
