Github user cestella commented on the issue:

    https://github.com/apache/metron/pull/795
  
    TESTING PLAN
    
    # Preliminaries
    
    Setup some environment variables for convenience:
    * `export METRON_HOME=/usr/metron/0.4.1` 
    * `export ZOOKEEPER=node1:2181`
    * `export BROKERLIST=node1:6667`
    
    ## Adjust the Profiler Period Duration
    
    * In the Ambari Profiler config section, set the "Period Duration" to 1
      minute.
    
    ## Create data generator
    * Open `~/rand_gen.py` and paste the following:
    ```
    #!/usr/bin/python
    import random
    import sys
    import time

    def main():
      mu = float(sys.argv[1])      # mean of the Gaussian
      sigma = float(sys.argv[2])   # standard deviation
      freq_s = int(sys.argv[3])    # seconds between messages
      while True:
        out = '{ "value" : ' + str(random.gauss(mu, sigma)) + ' }'
        print(out)                 # works under both Python 2 and 3
        sys.stdout.flush()
        time.sleep(freq_s)

    if __name__ == '__main__':
      main()
    ```
    This will generate random JSON maps with a numeric field called `value`.
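
    As a quick sanity check, each line the generator emits should parse as
    JSON with a numeric `value` field; a minimal sketch of that check,
    mirroring the generator's output format exactly:
    ```
    import json
    import random

    # Build one line exactly the way rand_gen.py does.
    line = '{ "value" : ' + str(random.gauss(0, 1)) + ' }'

    # It should parse as a JSON map with a single numeric "value" field.
    parsed = json.loads(line)
    print(type(parsed["value"]).__name__)  # float
    ```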
    
    ## Create Parser
    
    * Create a new parser called `dummy` by editing 
`$METRON_HOME/config/zookeeper/parsers/dummy.json`:
    ```
    {
      "parserClassName":"org.apache.metron.parsers.json.JSONMapParser",
      "sensorTopic":"dummy"
    }
    ```
    * Create the dummy kafka topic:
      `/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --zookeeper $ZOOKEEPER 
--create --topic dummy --partitions 1 --replication-factor 1`
    
    # Test Cases
    
    ## Base Case
    First, let's ensure that data flows through with no enrichments or
    profiler.
    * Push the configs via `$METRON_HOME/bin/zk_load_configs.sh -m PUSH -i 
$METRON_HOME/config/zookeeper -z $ZOOKEEPER`
    * Start the parser via `$METRON_HOME/bin/start_parser_topology.sh -z 
$ZOOKEEPER -s dummy`
    * Send some synthetic data through (stop this after a few seconds):
      `python ~/rand_gen.py 0 1 1 | 
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list 
$BROKERLIST --topic dummy`
    * Validate data has been written to the index:
    ```
    curl -XPOST 'http://localhost:9200/dummy*/_search?pretty' -d '
    {
      "_source" : [ "source:type", "value" ]
    }
    '
    ```
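
    For reference, the `_source` parameter in the query above only restricts
    which fields come back in each hit; a small offline illustration with a
    stand-in indexed document (the extra `guid` field is hypothetical):
    ```
    # Emulate Elasticsearch _source filtering on a stand-in indexed document.
    doc = {"source:type": "dummy", "value": 0.42, "guid": "abc-123"}
    source_fields = ["source:type", "value"]

    filtered = {k: v for k, v in doc.items() if k in source_fields}
    print(sorted(filtered))  # ['source:type', 'value']
    ```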
    
    ## Modify Parser
    Now we'll modify the parser and see if our modifications show up.
    * Delete the dummy indices:
      `curl -XDELETE "http://localhost:9200/dummy*"`
    * Modify `$METRON_HOME/config/zookeeper/parsers/dummy.json` like so:
    ```
    {
      "parserClassName":"org.apache.metron.parsers.json.JSONMapParser",
      "sensorTopic":"dummy",
      "fieldTransformations" : [
        {
        "transformation" : "STELLAR"
       ,"output" : [ "parser_my_name", "parser_value_abs" ]
       ,"config" : {
    "parser_my_name" : "TO_UPPER('casey')",
    "parser_value_abs" : "ABS(value)"
                   }
        }
                               ]
    }
    ```
    * Push the configs via `$METRON_HOME/bin/zk_load_configs.sh -m PUSH -i 
$METRON_HOME/config/zookeeper -z $ZOOKEEPER`
    * Send some synthetic data through (stop this after a few seconds):
      `python ~/rand_gen.py 0 1 1 | 
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list 
$BROKERLIST --topic dummy`
    * Validate data has been written to the index with the new fields, 
`parser_my_name` and `parser_value_abs`:
    ```
    curl -XPOST 'http://localhost:9200/dummy*/_search?pretty' -d '
    {
      "_source" : [ "source:type", "value", "parser_my_name", 
"parser_value_abs" ]
    }
    '
    ```
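
    To make the expected new fields concrete, here is a rough Python
    equivalent of what the two STELLAR transformations add to each message
    (`TO_UPPER` and `ABS` map directly onto Python's `str.upper` and `abs`;
    the sample `value` is arbitrary):
    ```
    # Rough Python equivalent of the STELLAR fieldTransformations above.
    message = {"value": -0.75}  # a parsed message from the dummy topic

    message["parser_my_name"] = "casey".upper()          # TO_UPPER('casey')
    message["parser_value_abs"] = abs(message["value"])  # ABS(value)

    print(message["parser_my_name"], message["parser_value_abs"])  # CASEY 0.75
    ```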
    
    ## Modify Enrichment
    * Delete the dummy indices:
      `curl -XDELETE "http://localhost:9200/dummy*"`
    * Modify `$METRON_HOME/config/zookeeper/enrichments/dummy.json` like so:
    ```
    {
      "enrichment" : {
        "fieldMap": {
          "stellar" : {
            "config" : {
              "enr_my_name" : "TO_UPPER('casey')",
              "enr_value_abs" : "ABS(value)"
            }
          }
        }
      }
    }
    ```
    * Push the configs via `$METRON_HOME/bin/zk_load_configs.sh -m PUSH -i 
$METRON_HOME/config/zookeeper -z $ZOOKEEPER`
    * Send some synthetic data through (stop this after a few seconds):
      `python ~/rand_gen.py 0 1 1 | 
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list 
$BROKERLIST --topic dummy`
    * Validate data has been written to the index with the new fields, 
`enr_my_name` and `enr_value_abs`:
    ```
    curl -XPOST 'http://localhost:9200/dummy*/_search?pretty' -d '
    {
      "_source" : [ "source:type", "value", "parser_my_name", 
"parser_value_abs", "enr_my_name", "enr_value_abs"]
    }
    '
    ```
    
    ## Modify Indexing
    We'll change the name of the index now and make sure data still flows
    through.
    
    * Delete the dummy indices:
      `curl -XDELETE "http://localhost:9200/dummy*"`
    * Modify `$METRON_HOME/config/zookeeper/indexing/dummy.json` like so:
    ```
    {
      "hdfs" : {
        "index": "smarty",
        "batchSize": 5,
        "enabled" : true
      },
      "elasticsearch" : {
        "index": "smarty",
        "batchSize": 5,
        "enabled" : true
      },
      "solr" : {
        "index": "yaf",
        "batchSize": 5,
        "enabled" : false
      }
    }
    ```
    * Push the configs via `$METRON_HOME/bin/zk_load_configs.sh -m PUSH -i 
$METRON_HOME/config/zookeeper -z $ZOOKEEPER`
    * Send some synthetic data through (stop this after a few seconds):
      `python ~/rand_gen.py 0 1 1 | 
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list 
$BROKERLIST --topic dummy`
    * Validate data has been written to the new index name:
    ```
    curl -XPOST 'http://localhost:9200/smarty*/_search?pretty' -d '
    {
      "_source" : [ "source:type", "value", "parser_my_name", 
"parser_value_abs", "enr_my_name", "enr_value_abs"]
    }
    '
    ```
    
    ## Modify Profiler
    
    * Modify `$METRON_HOME/config/zookeeper/profiler.json` like so:
    ```
    {
      "profiles": [
        {
          "profile": "stat",
          "foreach": "'global'",
          "onlyif": "true",
          "init" : {
                   },
          "update": {
            "s": "STATS_ADD(s, value)"
                    },
          "result": "s"
        }
      ]
    }
    ```
    * Push the configs via `$METRON_HOME/bin/zk_load_configs.sh -m PUSH -i 
$METRON_HOME/config/zookeeper -z $ZOOKEEPER`
    * Send some synthetic data through (let this go for some time in another 
window):
      `python ~/rand_gen.py 0 1 1 | 
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list 
$BROKERLIST --topic dummy`
    * Execute the following and ensure the resulting number is neither `null`
      nor `NaN`:
    ```
    sleep 128 && echo "STATS_MEAN(STATS_MERGE(PROFILE_GET('stat', 'global', PROFILE_WINDOW('from 5 minutes ago'))))" | $METRON_HOME/bin/stellar -z $ZOOKEEPER -na
    ```
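
    For intuition, the `stat` profile just folds every `value` seen in a
    period into a statistics summary, and `STATS_MEAN` reads the mean back
    out; a rough Python sketch of that computation over one window (the real
    implementation keeps a compact summary object rather than the raw
    numbers, and the seed here is only for reproducibility):
    ```
    import random

    random.seed(1234)  # deterministic for illustration

    # One window's worth of synthetic "value" fields, as rand_gen.py 0 1 1 emits.
    window = [random.gauss(0, 1) for _ in range(300)]

    # STATS_ADD accumulates values; STATS_MEAN returns their mean,
    # which should be close to 0 and neither null nor NaN.
    mean = sum(window) / len(window)
    print(round(mean, 3))
    ```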
    
    # REST Test 
    
    ## Parser Configurations
    * Ensure that you can retrieve all the parser configs by running the 
following and ensuring the default sensors exist:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/parser/config' | python -m json.tool | grep 
sensorTopic`
      * You should see the default sensors as well as our sensor, `dummy`:
        * `jsonMap`
        * `squid`
        * `websphere`
        * `snort`
        * `asa`
        * `bro`
        * `yaf`
        * `dummy`
    * Delete the `websphere` parser sensor via:
      `curl -u user:password -X DELETE --header 'Accept: */*' 
'http://node1:8082/api/v1/sensor/parser/config/websphere'`
    * Now, list the parser configs and ensure `websphere` is not in the list:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/parser/config' | python -m json.tool | grep 
sensorTopic`
    * Add back the websphere parser:
    ```
    curl -u user:password -X POST --header 'Content-Type: application/json' 
--header 'Accept: application/json' -d '{
      
"parserClassName":"org.apache.metron.parsers.websphere.GrokWebSphereParser",
      "sensorTopic":"websphere",
      "parserConfig":
      {
        "grokPath":"/patterns/websphere",
        "patternLabel":"WEBSPHERE",
        "timestampField":"timestamp_string",
        "dateFormat":"yyyy MMM dd HH:mm:ss"
      }
    }' 'http://node1:8082/api/v1/sensor/parser/config'
    ```
    * Now, list the parser configs and ensure `websphere` is in the list:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/parser/config' | python -m json.tool | grep 
sensorTopic`
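
    The REST calls above all follow the same shape; if you prefer scripting
    them, here is a minimal sketch with Python 3's standard library that
    builds (but does not send) the same POST the curl command issues, using
    the same placeholder credentials and host:
    ```
    import base64
    import json
    import urllib.request

    # Build the same POST request the curl command above issues.
    url = "http://node1:8082/api/v1/sensor/parser/config"
    body = {
        "parserClassName": "org.apache.metron.parsers.websphere.GrokWebSphereParser",
        "sensorTopic": "websphere",
    }
    req = urllib.request.Request(url, data=json.dumps(body).encode(), method="POST")
    req.add_header("Content-Type", "application/json")
    auth = base64.b64encode(b"user:password").decode()
    req.add_header("Authorization", "Basic " + auth)

    # urllib.request.urlopen(req) would submit it against a live REST server.
    print(req.get_method(), req.full_url)
    ```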
    
    ## Enrichment Configurations
    * Ensure that you can retrieve all the enrichment configs by running the 
following and ensuring the default sensors exist:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/enrichment/config' | python -m json.tool | 
grep "^    \"" | awk -F: '{print $1}' | sed 's/ //g'`
      * You should see the default sensors as well as our sensor, `dummy`:
        * `asa`
        * `bro`
        * `dummy`
        * `snort`
        * `websphere`
        * `yaf`
    * Delete the `websphere` enrichment via:
      `curl -u user:password -X DELETE --header 'Accept: */*' 
'http://node1:8082/api/v1/sensor/enrichment/config/websphere'`
    * Now, list the enrichment configs and ensure `websphere` is not in the 
list:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/enrichment/config' | python -m json.tool | 
grep "^    \"" | awk -F: '{print $1}' | sed 's/ //g'`
    * Add back the websphere enrichment:
    ```
    curl -u user:password -X POST --header 'Content-Type: application/json' 
--header 'Accept: application/json' -d '{
      "enrichment": {
        "fieldMap": {
          "geo": [
            "ip_src_addr"
          ],
          "host": [
            "ip_src_addr"
          ]
        },
      "fieldToTypeMap": {
          "ip_src_addr": [
            "playful_classification"
          ]
        }
      }
    }' 'http://node1:8082/api/v1/sensor/enrichment/config/websphere'
    ```
    * Now, list the enrichment configs and ensure `websphere` is in the list:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/enrichment/config' | python -m json.tool | 
grep "^    \"" | awk -F: '{print $1}' | sed 's/ //g'`

    ## Indexing Configurations
    * Ensure that you can retrieve all the indexing configs by running the 
following and ensuring the default sensors exist:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/indexing/config' | python -m json.tool | grep 
"^    \"" | awk -F: '{print $1}' | sed 's/ //g'`
      * You should see the default sensors as well as our sensor, `dummy`:
        * `asa`
        * `bro`
        * `error`
        * `dummy`
        * `snort`
        * `websphere`
        * `yaf`
    * Delete the `websphere` indexing config via:
      `curl -u user:password -X DELETE --header 'Accept: */*' 
'http://node1:8082/api/v1/sensor/indexing/config/websphere'`
    * Now, list the indexing configs and ensure `websphere` is not in the list:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/indexing/config' | python -m json.tool | grep 
"^    \"" | awk -F: '{print $1}' | sed 's/ //g'`
    * Add back the websphere indexing config:
    ```
    curl -u user:password -X POST --header 'Content-Type: application/json' 
--header 'Accept: application/json' -d '{
      "hdfs" : {
        "index": "websphere",
        "batchSize": 5,
        "enabled" : true
      },
      "elasticsearch" : {
        "index": "websphere",
        "batchSize": 5,
        "enabled" : true
      },
      "solr" : {
        "index": "websphere",
        "batchSize": 5,
        "enabled" : true
      }
    }' 'http://node1:8082/api/v1/sensor/indexing/config/websphere'
    ```
    * Now, list the indexing configs and ensure `websphere` is in the list:
      `curl -u user:password -X GET --header 'Accept: application/json' 
'http://node1:8082/api/v1/sensor/indexing/config' | python -m json.tool | grep 
"^    \"" | awk -F: '{print $1}' | sed 's/ //g'`
    


