Modified: eagle/site/docs/tutorial/site-0.3.0.html URL: http://svn.apache.org/viewvc/eagle/site/docs/tutorial/site-0.3.0.html?rev=1789954&r1=1789953&r2=1789954&view=diff ============================================================================== --- eagle/site/docs/tutorial/site-0.3.0.html (original) +++ eagle/site/docs/tutorial/site-0.3.0.html Mon Apr 3 11:23:42 2017 @@ -129,86 +129,27 @@ <li class="divider"></li> - <li class="heading">Download</li> - - <li class="sidenavli "><a href="/docs/download-latest.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Latest version</a></li> - - <li class="sidenavli "><a href="/docs/download.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Archived</a></li> - - <li class="divider"></li> - - <li class="heading">Installation</li> - - <li class="sidenavli "><a href="/docs/quick-start.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Get Started with Sandbox</a></li> - - <li class="sidenavli "><a href="/docs/deployment-in-docker.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Get Started with Docker</a></li> - - <li class="sidenavli "><a href="/docs/deployment-env.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Setup Environment</a></li> - - <li class="sidenavli "><a href="/docs/deployment-in-production.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Setup Eagle in Production</a></li> - - <li class="sidenavli "><a href="/docs/configuration.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Eagle Application Configuration</a></li> - - <li class="sidenavli "><a href="/docs/serviceconfiguration.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Eagle Service Configuration</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/ldap.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Eagle LDAP Authentication</a></li> - - <li class="divider"></li> - - <li class="heading">Tutorial</li> - - <li class="sidenavli "><a href="/docs/tutorial/site.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Site Management</a></li> + <li class="heading">Documentations</li> - <li class="sidenavli "><a href="/docs/tutorial/policy.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Policy Management</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/policy-capabilities.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Policy Engine Capabilities</a></li> - - <li class="sidenavli "><a href="/docs/hdfs-data-activity-monitoring.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">HDFS Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/hive-query-activity-monitoring.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">HIVE Query Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/hbase-data-activity-monitoring.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">HBASE Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/mapr-integration.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">MapR FS Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/cloudera-integration.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Cloudera FS Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/jmx-metric-monitoring.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Hadoop JMX Metrics Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/import-hdfs-auditLog.html" 
data-permalink="/docs/tutorial/site-0.3.0.html" id="">Stream HDFS audit logs into Kafka</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/userprofile.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">User Profile Feature</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/classification.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Data Classification Feature</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/topologymanagement.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Topology Management Feature</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/notificationplugin.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Alert Notification Plugin</a></li> - - <li class="sidenavli "><a href="/docs/metadata-api.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Metadata RESTful API</a></li> + <li class="sidenavli "><a href="/docs/latest/" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Latest version (v0.5.0)</a></li> <li class="divider"></li> - <li class="heading">Development Guide</li> - - <li class="sidenavli "><a href="/docs/development-quick-guide.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Development Quick Guide</a></li> + <li class="heading">Download</li> - <li class="sidenavli "><a href="/docs/development-in-macosx.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Development in Mac OSX</a></li> + <li class="sidenavli "><a href="/docs/download-latest.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Latest version (v0.5.0)</a></li> - <li class="sidenavli "><a href="/docs/development-in-intellij.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Development in Intellij</a></li> + <li class="sidenavli "><a href="/docs/download.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Archived</a></li> <li class="divider"></li> - <li class="heading">Advanced</li> + <li class="heading">Supplement</li> - <li class="sidenavli "><a href="/docs/user-profile-ml.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">User Profile Machine Learning</a></li> + <li class="sidenavli "><a href="/docs/security.html" data-permalink="/docs/tutorial/site-0.3.0.html" id="">Security</a></li> <li class="divider"></li> <li class="sidenavli"> - <a href="/sup/index.html">Go To Supplement</a> - </li> - <li class="sidenavli"> <a href="mailto:[email protected]" target="_blank">Need Help?</a> </li> </ul> @@ -239,35 +180,32 @@ Here we give configuration examples for <p>You may configure the default path for Hadoop clients to connect remote hdfs namenode.</p> - <div class="highlighter-rouge"><pre class="highlight"><code><span class="w"> </span><span class="p">{</span><span class="nt">"fs.defaultFS"</span><span class="p">:</span><span class="s2">"hdfs://sandbox.hortonworks.com:8020"</span><span class="p">}</span><span class="w"> -</span></code></pre> - </div> + <pre><code> {"fs.defaultFS":"hdfs://sandbox.hortonworks.com:8020"} +</code></pre> </li> <li> <p>HA case</p> <p>Basically, you point your fs.defaultFS at your nameservice and let the client know how its configured (the backing namenodes) and how to fail over between them under the HA mode</p> - <div class="highlighter-rouge"><pre class="highlight"><code><span class="w"> </span><span class="p">{</span><span class="nt">"fs.defaultFS"</span><span class="p">:</span><span class="s2">"hdfs://nameservice1"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"dfs.nameservices"</span><span 
class="p">:</span><span class="w"> </span><span class="s2">"nameservice1"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"dfs.ha.namenodes.nameservice1"</span><span class="p">:</span><span class="s2">"namenode1,namenode2"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"dfs.namenode.rpc-address.nameservice1.namenode1"</span><span class="p">:</span><span class="w"> </span><span class="s2">"hadoopnamenode01:8020"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"dfs.namenode.rpc-address.nameservice1.namenode2"</span><span class="p">:</span><span class="w"> </span><span class="s2">"hadoopnamenode02:8020"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"dfs.client.failover.proxy.provider.nameservice1"</span><span class="p">:</span><span class="w"> </span><span class="s2">"org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"</span><span class="w"> - </span><span class="p">}</span><span class="w"> -</span></code></pre> - </div> + <pre><code> {"fs.defaultFS":"hdfs://nameservice1", + "dfs.nameservices": "nameservice1", + "dfs.ha.namenodes.nameservice1":"namenode1,namenode2", + "dfs.namenode.rpc-address.nameservice1.namenode1": "hadoopnamenode01:8020", + "dfs.namenode.rpc-address.nameservice1.namenode2": "hadoopnamenode02:8020", + "dfs.client.failover.proxy.provider.nameservice1": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider" + } +</code></pre> </li> <li> <p>Kerberos-secured cluster</p> <p>For Kerberos-secured cluster, you need to get a keytab file and the principal from your admin, and configure âeagle.keytab.fileâ and âeagle.kerberos.principalâ to authenticate its access.</p> - <div class="highlighter-rouge"><pre class="highlight"><code><span class="w"> </span><span class="p">{</span><span class="w"> </span><span class="nt">"eagle.keytab.file"</span><span class="p">:</span><span class="s2">"/EAGLE-HOME/.keytab/eagle.keytab"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"eagle.kerberos.principal"</span><span class="p">:</span><span class="s2">"[email protected]"</span><span class="w"> - </span><span class="p">}</span><span class="w"> -</span></code></pre> - </div> + <pre><code> { "eagle.keytab.file":"/EAGLE-HOME/.keytab/eagle.keytab", + "eagle.kerberos.principal":"[email protected]" + } +</code></pre> <p>If there is an exception about âinvalid server principal nameâ, you may need to check the DNS resolver, or the data transfer , such as âdfs.encrypt.data.transferâ, âdfs.encrypt.data.transfer.algorithmâ, âdfs.trustedchannel.resolver.classâ, âdfs.datatransfer.client.encryptâ.</p> </li> @@ -278,15 +216,14 @@ Here we give configuration examples for <li> <p>Basic</p> - <div class="highlighter-rouge"><pre class="highlight"><code><span class="w"> </span><span class="p">{</span><span class="w"> - </span><span class="nt">"accessType"</span><span class="p">:</span><span class="w"> </span><span class="s2">"metastoredb_jdbc"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"password"</span><span class="p">:</span><span class="w"> </span><span class="s2">"hive"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"user"</span><span class="p">:</span><span class="w"> </span><span class="s2">"hive"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"jdbcDriverClassName"</span><span class="p">:</span><span class="w"> </span><span 
class="s2">"com.mysql.jdbc.Driver"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"jdbcUrl"</span><span class="p">:</span><span class="w"> </span><span class="s2">"jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true"</span><span class="w"> - </span><span class="p">}</span><span class="w"> -</span></code></pre> - </div> + <pre><code> { + "accessType": "metastoredb_jdbc", + "password": "hive", + "user": "hive", + "jdbcDriverClassName": "com.mysql.jdbc.Driver", + "jdbcUrl": "jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true" + } +</code></pre> </li> </ul> </li> @@ -299,29 +236,27 @@ Here we give configuration examples for <p>You need to sett âhbase.zookeeper.quorumâ:âlocalhostâ property and âhbase.zookeeper.property.clientPortâ property.</p> - <div class="highlighter-rouge"><pre class="highlight"><code><span class="w"> </span><span class="p">{</span><span class="w"> - </span><span class="nt">"hbase.zookeeper.property.clientPort"</span><span class="p">:</span><span class="s2">"2181"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"hbase.zookeeper.quorum"</span><span class="p">:</span><span class="s2">"localhost"</span><span class="w"> - </span><span class="p">}</span><span class="w"> -</span></code></pre> - </div> + <pre><code> { + "hbase.zookeeper.property.clientPort":"2181", + "hbase.zookeeper.quorum":"localhost" + } +</code></pre> </li> <li> <p>Kerberos-secured cluster</p> <p>According to your environment, you can add or remove some of the following properties. Here is the reference.</p> - <div class="highlighter-rouge"><pre class="highlight"><code><span class="w"> </span><span class="p">{</span><span class="w"> - </span><span class="nt">"hbase.zookeeper.property.clientPort"</span><span class="p">:</span><span class="s2">"2181"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"hbase.zookeeper.quorum"</span><span class="p">:</span><span class="s2">"localhost"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"hbase.security.authentication"</span><span class="p">:</span><span class="s2">"kerberos"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"hbase.master.kerberos.principal"</span><span class="p">:</span><span class="s2">"hadoop/[email protected]"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"zookeeper.znode.parent"</span><span class="p">:</span><span class="s2">"/hbase"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"eagle.keytab.file"</span><span class="p">:</span><span class="s2">"/EAGLE-HOME/.keytab/eagle.keytab"</span><span class="p">,</span><span class="w"> - </span><span class="nt">"eagle.kerberos.principal"</span><span class="p">:</span><span class="s2">"[email protected]"</span><span class="w"> - </span><span class="p">}</span><span class="w"> -</span></code></pre> - </div> + <pre><code> { + "hbase.zookeeper.property.clientPort":"2181", + "hbase.zookeeper.quorum":"localhost", + "hbase.security.authentication":"kerberos", + "hbase.master.kerberos.principal":"hadoop/[email protected]", + "zookeeper.znode.parent":"/hbase", + "eagle.keytab.file":"/EAGLE-HOME/.keytab/eagle.keytab", + "eagle.kerberos.principal":"[email protected]" + } +</code></pre> </li> </ul> </li>
Modified: eagle/site/docs/tutorial/site.html URL: http://svn.apache.org/viewvc/eagle/site/docs/tutorial/site.html?rev=1789954&r1=1789953&r2=1789954&view=diff ============================================================================== --- eagle/site/docs/tutorial/site.html (original) +++ eagle/site/docs/tutorial/site.html Mon Apr 3 11:23:42 2017 @@ -129,86 +129,27 @@ <li class="divider"></li> - <li class="heading">Download</li> - - <li class="sidenavli "><a href="/docs/download-latest.html" data-permalink="/docs/tutorial/site.html" id="">Latest version</a></li> - - <li class="sidenavli "><a href="/docs/download.html" data-permalink="/docs/tutorial/site.html" id="">Archived</a></li> - - <li class="divider"></li> - - <li class="heading">Installation</li> - - <li class="sidenavli "><a href="/docs/quick-start.html" data-permalink="/docs/tutorial/site.html" id="">Get Started with Sandbox</a></li> - - <li class="sidenavli "><a href="/docs/deployment-in-docker.html" data-permalink="/docs/tutorial/site.html" id="">Get Started with Docker</a></li> - - <li class="sidenavli "><a href="/docs/deployment-env.html" data-permalink="/docs/tutorial/site.html" id="">Setup Environment</a></li> - - <li class="sidenavli "><a href="/docs/deployment-in-production.html" data-permalink="/docs/tutorial/site.html" id="">Setup Eagle in Production</a></li> - - <li class="sidenavli "><a href="/docs/configuration.html" data-permalink="/docs/tutorial/site.html" id="">Eagle Application Configuration</a></li> - - <li class="sidenavli "><a href="/docs/serviceconfiguration.html" data-permalink="/docs/tutorial/site.html" id="">Eagle Service Configuration</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/ldap.html" data-permalink="/docs/tutorial/site.html" id="">Eagle LDAP Authentication</a></li> - - <li class="divider"></li> - - <li class="heading">Tutorial</li> - - <li class="sidenavli current"><a href="/docs/tutorial/site.html" data-permalink="/docs/tutorial/site.html" id="">Site Management</a></li> + <li class="heading">Documentations</li> - <li class="sidenavli "><a href="/docs/tutorial/policy.html" data-permalink="/docs/tutorial/site.html" id="">Policy Management</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/policy-capabilities.html" data-permalink="/docs/tutorial/site.html" id="">Policy Engine Capabilities</a></li> - - <li class="sidenavli "><a href="/docs/hdfs-data-activity-monitoring.html" data-permalink="/docs/tutorial/site.html" id="">HDFS Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/hive-query-activity-monitoring.html" data-permalink="/docs/tutorial/site.html" id="">HIVE Query Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/hbase-data-activity-monitoring.html" data-permalink="/docs/tutorial/site.html" id="">HBASE Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/mapr-integration.html" data-permalink="/docs/tutorial/site.html" id="">MapR FS Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/cloudera-integration.html" data-permalink="/docs/tutorial/site.html" id="">Cloudera FS Data Activity Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/jmx-metric-monitoring.html" data-permalink="/docs/tutorial/site.html" id="">Hadoop JMX Metrics Monitoring</a></li> - - <li class="sidenavli "><a href="/docs/import-hdfs-auditLog.html" data-permalink="/docs/tutorial/site.html" id="">Stream HDFS audit logs into Kafka</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/userprofile.html" 
data-permalink="/docs/tutorial/site.html" id="">User Profile Feature</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/classification.html" data-permalink="/docs/tutorial/site.html" id="">Data Classification Feature</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/topologymanagement.html" data-permalink="/docs/tutorial/site.html" id="">Topology Management Feature</a></li> - - <li class="sidenavli "><a href="/docs/tutorial/notificationplugin.html" data-permalink="/docs/tutorial/site.html" id="">Alert Notification Plugin</a></li> - - <li class="sidenavli "><a href="/docs/metadata-api.html" data-permalink="/docs/tutorial/site.html" id="">Metadata RESTful API</a></li> + <li class="sidenavli "><a href="/docs/latest/" data-permalink="/docs/tutorial/site.html" id="">Latest version (v0.5.0)</a></li> <li class="divider"></li> - <li class="heading">Development Guide</li> - - <li class="sidenavli "><a href="/docs/development-quick-guide.html" data-permalink="/docs/tutorial/site.html" id="">Development Quick Guide</a></li> + <li class="heading">Download</li> - <li class="sidenavli "><a href="/docs/development-in-macosx.html" data-permalink="/docs/tutorial/site.html" id="">Development in Mac OSX</a></li> + <li class="sidenavli "><a href="/docs/download-latest.html" data-permalink="/docs/tutorial/site.html" id="">Latest version (v0.5.0)</a></li> - <li class="sidenavli "><a href="/docs/development-in-intellij.html" data-permalink="/docs/tutorial/site.html" id="">Development in Intellij</a></li> + <li class="sidenavli "><a href="/docs/download.html" data-permalink="/docs/tutorial/site.html" id="">Archived</a></li> <li class="divider"></li> - <li class="heading">Advanced</li> + <li class="heading">Supplement</li> - <li class="sidenavli "><a href="/docs/user-profile-ml.html" data-permalink="/docs/tutorial/site.html" id="">User Profile Machine Learning</a></li> + <li class="sidenavli "><a href="/docs/security.html" data-permalink="/docs/tutorial/site.html" id="">Security</a></li> <li class="divider"></li> <li class="sidenavli"> - <a href="/sup/index.html">Go To Supplement</a> - </li> - <li class="sidenavli"> <a href="mailto:[email protected]" target="_blank">Need Help?</a> </li> </ul> Modified: eagle/site/feed.xml URL: http://svn.apache.org/viewvc/eagle/site/feed.xml?rev=1789954&r1=1789953&r2=1789954&view=diff ============================================================================== --- eagle/site/feed.xml (original) +++ eagle/site/feed.xml Mon Apr 3 11:23:42 2017 @@ -5,9 +5,9 @@ <description>Eagle - Analyze Big Data Platforms for Security and Performance</description> <link>http://goeagle.io/</link> <atom:link href="http://goeagle.io/feed.xml" rel="self" type="application/rss+xml"/> - <pubDate>Thu, 12 Jan 2017 15:28:13 +0800</pubDate> - <lastBuildDate>Thu, 12 Jan 2017 15:28:13 +0800</lastBuildDate> - <generator>Jekyll v3.3.1</generator> + <pubDate>Mon, 03 Apr 2017 19:17:59 +0800</pubDate> + <lastBuildDate>Mon, 03 Apr 2017 19:17:59 +0800</lastBuildDate> + <generator>Jekyll v2.5.3</generator> <item> <title>Apache Eagle æ£å¼åå¸ï¼åå¸å¼å®æ¶Hadoopæ°æ®å®å ¨æ¹æ¡</title> @@ -17,7 +17,7 @@ <p>æ¥åï¼eBayå ¬å¸éé宣叿£å¼å弿ºä¸çæ¨åºåå¸å¼å®æ¶å®å ¨çæ§æ¹æ¡ ï¼ Apache Eagle (http://goeagle.io)ï¼è¯¥é¡¹ç®å·²äº2015å¹´10æ26æ¥æ£å¼å å ¥Apache æä¸ºåµåå¨é¡¹ç®ãApache Eagleæä¾ä¸å¥é«æåå¸å¼çæµå¼çç¥å¼æï¼å ·æé«å®æ¶ãå¯ä¼¸ç¼©ãææ©å±ã交äºå好çç¹ç¹ï¼åæ¶éææºå¨å¦ä¹ å¯¹ç¨æ·è¡ä¸ºå»ºç«Profile以å®ç°æºè½å®æ¶å°ä¿æ¤Hadoopçæç³»ç»ä¸å¤§æ°æ®çå®å ¨ã</p> -<h2 id="èæ¯">èæ¯</h2> +<h2 id="section">èæ¯</h2> <p>éçå¤§æ°æ®çåå±ï¼è¶æ¥è¶å¤çæåä¼ä¸æè 
ç»ç»å¼å§éåæ°æ®é©±å¨åä¸çè¿ä½æ¨¡å¼ãå¨eBayï¼æä»¬æ¥ææ°ä¸åå·¥ç¨å¸ãåæå¸åæ°æ®ç§å¦å®¶ï¼ä»ä»¬æ¯å¤©è®¿é®åææ°PBçº§çæ°æ®ï¼ä»¥ä¸ºæä»¬çç¨æ·å¸¦æ¥æ ä¸ä¼¦æ¯çä½éªãå¨å ¨çä¸å¡ä¸ï¼æä»¬ä¹å¹¿æ³å°å©ç¨æµ·éå¤§æ°æ®æ¥è¿æ¥æä»¬æ°ä»¥äº¿è®¡çç¨æ·ã</p> <p>è¿å¹´æ¥ï¼Hadoopå·²ç»éæ¸æä¸ºå¤§æ°æ®åæé¢åæå欢è¿çè§£å³æ¹æ¡ï¼eBayä¹ä¸ç´å¨ä½¿ç¨Hadoopææ¯ä»æ°æ®ä¸ææä»·å¼ï¼ä¾å¦ï¼æä»¬éè¿å¤§æ°æ®æé«ç¨æ·çæç´¢ä½éªï¼è¯å«åä¼åç²¾åå¹¿åææ¾ï¼å 宿们ç产åç®å½ï¼ä»¥åéè¿ç¹å»æµåæä»¥çè§£ç¨æ·å¦ä½ä½¿ç¨æä»¬çå¨çº¿å¸åºå¹³å°çã</p> @@ -54,20 +54,20 @@ <li><strong>弿º</strong>ï¼Eagleä¸ç´æ ¹æ®å¼æºçæ åå¼åï¼å¹¶æå»ºäºè¯¸å¤å¤§æ°æ®é¢åç弿ºäº§åä¹ä¸ï¼å æ¤æä»¬å³å®ä»¥Apache许å¯è¯å¼æºEagleï¼ä»¥åé¦ç¤¾åºï¼åæ¶ä¹æå¾ è·å¾ç¤¾åºçåé¦ãåä½ä¸æ¯æã</li> </ul> -<h2 id="eagleæ¦è§">Eagleæ¦è§</h2> +<h2 id="eagle">Eagleæ¦è§</h2> <p><img src="/images/posts/eagle-group.png" alt="" /></p> -<h4 id="æ°æ®æµæ¥å ¥ååå¨data-collection-and-storage">æ°æ®æµæ¥å ¥ååå¨ï¼Data Collection and Storageï¼</h4> +<h4 id="data-collection-and-storage">æ°æ®æµæ¥å ¥ååå¨ï¼Data Collection and Storageï¼</h4> <p>Eagleæä¾é«åº¦å¯æ©å±çç¼ç¨APIï¼å¯ä»¥æ¯æå°ä»»ä½ç±»åçæ°æ®æºéæå°Eagleççç¥æ§è¡å¼æä¸ãä¾å¦ï¼å¨Eagle HDFS 审计äºä»¶ï¼Auditï¼çæ§æ¨¡åä¸ï¼éè¿Kafkaæ¥å®æ¶æ¥æ¶æ¥èªNamenode Log4j Appender æè Logstash Agent æ¶éçæ°æ®ï¼å¨Eagle Hive çæ§æ¨¡åä¸ï¼éè¿YARN API æ¶éæ£å¨è¿è¡JobçHive æ¥è¯¢æ¥å¿ï¼å¹¶ä¿è¯æ¯è¾é«çå¯ä¼¸ç¼©æ§å容鿧ã</p> -<h4 id="æ°æ®å®æ¶å¤çdata-processing">æ°æ®å®æ¶å¤çï¼Data Processingï¼</h4> +<h4 id="data-processing">æ°æ®å®æ¶å¤çï¼Data Processingï¼</h4> <p><strong>æµå¤çAPIï¼Stream Processing APIï¼Eagle</strong> æä¾ç¬ç«äºç©çå¹³å°èé«åº¦æ½è±¡çæµå¤çAPIï¼ç®åé»è®¤æ¯æApache Stormï¼ä½æ¯ä¹å 许æ©å±å°å ¶ä»ä»»ææµå¤çå¼æï¼æ¯å¦Flink æè Samzaçãè¯¥å±æ½è±¡å 许å¼åè å¨å®ä¹çæ§æ°æ®å¤çé»è¾æ¶ï¼æ éå¨ç©çæ§è¡å±ç»å®ä»»ä½ç¹å®æµå¤çå¹³å°ï¼èåªééè¿å¤ç¨ãæ¼æ¥åç»è£ ä¾å¦æ°æ®è½¬æ¢ãè¿æ»¤ãå¤é¨æ°æ®Joinçç»ä»¶ï¼ä»¥å®ç°æ»¡è¶³éæ±çDAGï¼æåæ ç¯å¾ï¼ï¼åæ¶ï¼å¼å� � ä¹å¯ä»¥å¾å®¹æå°ä»¥ç¼ç¨å°æ¹å¼å°ä¸å¡é»è¾æµç¨åEagle çç¥å¼ææ¡æ¶éæèµ·æ¥ãEagleæ¡æ¶å é¨ä¼å°æè¿°ä¸å¡é»è¾çDAGç¼è¯æåºå±æµå¤çæ¶æçåçåºç¨ï¼ä¾å¦Apache Storm Topology çï¼ä»äºå®ç°å¹³å°çç¬ç«ã</p> <p><strong>以䏿¯ä¸ä¸ªEagleå¦ä½å¤çäºä»¶ååè¦ç示ä¾ï¼</strong></p> -<div class="highlighter-rouge"><pre class="highlight"><code>StormExecutionEnvironment env = ExecutionEnvironmentFactory.getStorm(config); // storm env +<pre><code>StormExecutionEnvironment env = ExecutionEnvironmentFactory.getStorm(config); // storm env StreamProducer producer = env.newSource(new KafkaSourcedSpoutProvider().getSpout(config)).renameOutputFields(1) // declare kafka source .flatMap(new AuditLogTransformer()) // transform event .groupBy(Arrays.asList(0)) // group by 1st field @@ -75,7 +75,6 @@ StreamProducer producer = env.newSource( .alertWithConsumer(âuserActivityâ,âuserProfileExecutorâ) // ML policy evaluation env.execute(); // execute stream processing and alert </code></pre> -</div> <p><strong>åè¦æ¡æ¶ï¼Alerting Frameworkï¼Eagle</strong>åè¦æ¡æ¶ç±æµå æ°æ®APIãçç¥å¼ææå¡æä¾APIãçç¥Partitioner API 以åé¢è¦å»éæ¡æ¶çç»æ:</p> @@ -85,7 +84,7 @@ env.execute(); // execute stream process <li> <p><strong>æ©å±æ§</strong> Eagleççç¥å¼ææå¡æä¾APIå è®¸ä½ æå ¥æ°ççç¥å¼æ</p> - <div class="highlighter-rouge"><pre class="highlight"><code> public interface PolicyEvaluatorServiceProvider { + <pre><code> public interface PolicyEvaluatorServiceProvider { public String getPolicyType(); // literal string to identify one type of policy public Class&lt;? 
extends PolicyEvaluator&gt; getPolicyEvaluator(); // get policy evaluator implementation public List&lt;Module&gt; getBindingModules(); // policy text with json format to object mapping @@ -96,17 +95,15 @@ env.execute(); // execute stream process public void onPolicyDelete(); // invoked when policy is deleted } </code></pre> - </div> </li> <li><strong>çç¥Partitioner API</strong> å 许çç¥å¨ä¸åçç©çèç¹ä¸å¹¶è¡æ§è¡ãä¹å è®¸ä½ èªå®ä¹çç¥Partitionerç±»ãè¿äºåè½ä½¿å¾çç¥åäºä»¶å®å ¨ä»¥åå¸å¼çæ¹å¼æ§è¡ã</li> <li> <p><strong>å¯ä¼¸ç¼©æ§</strong> Eagle éè¿æ¯æçç¥çååºæ¥å£æ¥å®ç°å¤§éççç¥å¯ä¼¸ç¼©å¹¶åå°è¿è¡</p> - <div class="highlighter-rouge"><pre class="highlight"><code> public interface PolicyPartitioner extends Serializable { + <pre><code> public interface PolicyPartitioner extends Serializable { int partition(int numTotalPartitions, String policyType, String policyId); // method to distribute policies } </code></pre> - </div> <p><img src="/images/posts/policy-partition.png" alt="" /></p> @@ -163,29 +160,26 @@ Eagle æ¯ææ ¹æ®ç¨æ <li> <p>åä¸äºä»¶æ§è¡çç¥ï¼ç¨æ·è®¿é®Hiveä¸çæææ°æ®åï¼</p> - <div class="highlighter-rouge"><pre class="highlight"><code> from hiveAccessLogStream[sensitivityType=='PHONE_NUMBER'] select * insert into outputStream; + <pre><code> from hiveAccessLogStream[sensitivityType=='PHONE_NUMBER'] select * insert into outputStream; </code></pre> - </div> </li> <li> <p>åºäºçªå£ççç¥ï¼ç¨æ·å¨10åéå 访é®ç®å½ /tmp/private å¤ä½ 5次ï¼</p> - <div class="highlighter-rouge"><pre class="highlight"><code> hdfsAuditLogEventStream[(src == '/tmp/private')]#window.externalTime(timestamp,10 min) select user, count(timestamp) as aggValue group by user having aggValue &gt;= 5 insert into outputStream; + <pre><code> hdfsAuditLogEventStream[(src == '/tmp/private')]#window.externalTime(timestamp,10 min) select user, count(timestamp) as aggValue group by user having aggValue &gt;= 5 insert into outputStream; </code></pre> - </div> </li> </ul> <p><strong>æ¥è¯¢æå¡ï¼Query Serviceï¼</strong> Eagle æä¾ç±»SQLçREST APIç¨æ¥å®ç°éå¯¹æµ·éæ°æ®éç综å计ç®ãæ¥è¯¢ååæçè½åï¼æ¯æä¾å¦è¿æ»¤ãèåãç´æ¹è¿ç®ãæåºãtopãç®æ¯è¡¨è¾¾å¼ä»¥åå页çãEagleä¼å æ¯æHBase ä½ä¸ºå ¶é»è®¤æ°æ®åå¨ï¼ä½æ¯åæ¶ä¹æ¯æåºJDBCçå ³ç³»åæ°æ®åºãç¹å«æ¯å½éæ©ä»¥HBaseä½ä¸ºå卿¶ï¼Eagle便åçæ¥æäºHBaseåå¨åæ¥è¯¢æµ·éçæ§æ°æ®çè½åï¼Eagle æ¥è¯¢æ¡æ¶ä¼å°ç¨æ·æä¾çç±»SQLæ¥è¯¢è¯æ³æç» ç¼è¯æä¸ºHBase åççFilter 对象ï¼å¹¶æ¯æéè¿HBase Coprocessorè¿ä¸æ¥æåååºé度ã</p> -<div class="highlighter-rouge"><pre class="highlight"><code>query=AlertDefinitionService[@dataSource="hiveQueryLog"]{@policyDef}&amp;pageSize=100000 +<pre><code>query=AlertDefinitionService[@dataSource="hiveQueryLog"]{@policyDef}&amp;pageSize=100000 </code></pre> -</div> -<h2 id="eagleå¨ebayç使ç¨åºæ¯">Eagleå¨eBayç使ç¨åºæ¯</h2> +<h2 id="eagleebay">Eagleå¨eBayç使ç¨åºæ¯</h2> <p>ç®åï¼Eagleçæ°æ®è¡ä¸ºçæ§ç³»ç»å·²ç»é¨ç½²å°ä¸ä¸ªæ¥æ2500å¤ä¸ªèç¹çHadoopé群ä¹ä¸ï¼ç¨ä»¥ä¿æ¤æ°ç¾PBæ°æ®çå®å ¨ï¼å¹¶æ£è®¡åäºä»å¹´å¹´åºä¹åæ©å±å°å ¶ä»ä¸å个Hadoopé群ä¸ï¼ä»èè¦çeBay ææä¸»è¦Hadoopç10000å¤å°èç¹ã卿们çç产ç¯å¢ä¸ï¼æä»¬å·²é对HDFSãHive çé群ä¸çæ°æ®é ç½®äºä¸äºåºç¡çå®å ¨çç¥ï¼å¹¶å°äºå¹´åºä¹å䏿å¼å ¥æ´å¤ççç¥ï¼ä»¥ç¡®ä¿éè¦æ°æ®çç»å¯¹å®å ¨ãç®åï¼Eagleççç¥æ¶µçå¤ç§� �模å¼ï¼å æ¬ä»è®¿é®æ¨¡å¼ãé¢ç¹è®¿é®æ°æ®éï¼é¢å®ä¹æ¥è¯¢ç±»åãHive 表ååãHBase 表以ååºäºæºå¨å¦ä¹ 模åçæçç¨æ·Profileç¸å ³çææçç¥çãåæ¶ï¼æä»¬ä¹æå¹¿æ³ççç¥æ¥é²æ¢æ°æ®çä¸¢å¤±ãæ°æ®è¢«æ·è´å°ä¸å®å ¨å°ç¹ãæææ°æ®è¢«æªææåºå访é®çãEagleçç¥å®ä¹ä¸æå¤§ççµæ´»æ§åæ©å±æ§ä½¿å¾æä»¬æªæ¥å¯ä»¥è½»æå°ç»§ç»æ©å±æ´å¤æ´å¤æççç¥ä»¥æ¯ææ´å¤å¤å åçç¨ä¾åºæ¯ã</p> -<h2 id="åç»è®¡å">åç»è®¡å</h2> +<h2 id="section-1">åç»è®¡å</h2> <p>è¿å»ä¸¤å¹´ä¸ï¼å¨eBay é¤äºè¢«ç¨äºæ°æ®è¡ä¸ºçæ§ä»¥å¤ï¼Eagle æ 
¸å¿æ¡æ¶è¿è¢«å¹¿æ³ç¨äºçæ§èç¹å¥åº·ç¶åµãHadoopåºç¨æ§è½ææ ãHadoop æ ¸å¿æå¡ä»¥åæ´ä¸ªHadoopé群çå¥åº·ç¶åµç诸å¤é¢åãæä»¬è¿å»ºç«ä¸ç³»åçèªå¨åæºå¶ï¼ä¾å¦èç¹ä¿®å¤çï¼å¸®å©æä»¬å¹³å°é¨é¨æå¤§å¾èçäºæä»¬äººå·¥å³åï¼å¹¶ææå°æåäºæ´ä¸ªéç¾¤èµæºå°å©ç¨çã</p> <p>以䏿¯æä»¬ç®åæ£å¨å¼åä¸å°ä¸äºç¹æ§ï¼</p> @@ -202,7 +196,7 @@ Eagle æ¯ææ ¹æ®ç¨æ </li> </ul> -<h2 id="å ³äºä½è ">å ³äºä½è </h2> +<h2 id="section-2">å ³äºä½è </h2> <p><a href="https://github.com/haoch">éæµ©</a>ï¼Apache Eagle Committer å PMC æåï¼eBay åæå¹³å°åºç¡æ¶æé¨é¨é«çº§è½¯ä»¶å·¥ç¨å¸ï¼è´è´£Eagleç产åè®¾è®¡ãææ¯æ¶æãæ ¸å¿å®ç°ä»¥å弿ºç¤¾åºæ¨å¹¿çã</p> <p>æè°¢ä»¥ä¸æ¥èªApache Eagle社åºåeBayå ¬å¸çèåä½è ä»¬å¯¹æ¬æçè´¡ç®ï¼</p> @@ -216,7 +210,7 @@ Eagle æ¯ææ ¹æ®ç¨æ <p>eBay åæå¹³å°åºç¡æ¶æé¨ï¼Analytics Data Infrastructureï¼æ¯eBayçå ¨çæ°æ®ååæåºç¡æ¶æé¨é¨ï¼è´è´£eBay卿°æ®åºãæ°æ®ä»åºãHadoopãå塿ºè½ä»¥åæºå¨å¦ä¹ çåä¸ªæ°æ®å¹³å°å¼åã管çç,æ¯æeBayå ¨çåé¨é¨è¿ç¨é«ç«¯çæ°æ®åæè§£å³æ¹æ¡ä½åºåæ¶ææçä½ä¸å³çï¼ä¸ºéå¸å ¨ççä¸å¡ç¨æ·æä¾æ°æ®åæè§£å³æ¹æ¡ã</p> -<h2 id="åèèµæ">åèèµæ</h2> +<h2 id="section-3">åèèµæ</h2> <ul> <li>Apache Eagle ææ¡£ï¼<a href="http://goeagle.io">http://goeagle.io</a></li> @@ -224,7 +218,7 @@ Eagle æ¯ææ ¹æ®ç¨æ <li>Apache Eagle 项ç®ï¼<a href="http://incubator.apache.org/projects/eagle.html">http://incubator.apache.org/projects/eagle.html</a></li> </ul> -<h2 id="å¼ç¨é¾æ¥">å¼ç¨é¾æ¥</h2> +<h2 id="section-4">å¼ç¨é¾æ¥</h2> <ul> <li><strong>CSDN</strong>: <a href="http://www.csdn.net/article/2015-10-29/2826076">http://www.csdn.net/article/2015-10-29/2826076</a></li> <li><strong>OSCHINA</strong>: <a href="http://www.oschina.net/news/67515/apache-eagle">http://www.oschina.net/news/67515/apache-eagle</a></li> Modified: eagle/site/post/2015/10/27/apache-eagle-announce-cn.html URL: http://svn.apache.org/viewvc/eagle/site/post/2015/10/27/apache-eagle-announce-cn.html?rev=1789954&r1=1789953&r2=1789954&view=diff ============================================================================== --- eagle/site/post/2015/10/27/apache-eagle-announce-cn.html (original) +++ eagle/site/post/2015/10/27/apache-eagle-announce-cn.html Mon Apr 3 11:23:42 2017 @@ -93,7 +93,7 @@ <p>æ¥åï¼eBayå ¬å¸éé宣叿£å¼å弿ºä¸çæ¨åºåå¸å¼å®æ¶å®å ¨çæ§æ¹æ¡ ï¼ Apache Eagle (http://goeagle.io)ï¼è¯¥é¡¹ç®å·²äº2015å¹´10æ26æ¥æ£å¼å å ¥Apache æä¸ºåµåå¨é¡¹ç®ãApache Eagleæä¾ä¸å¥é«æåå¸å¼çæµå¼çç¥å¼æï¼å ·æé«å®æ¶ãå¯ä¼¸ç¼©ãææ©å±ã交äºå好çç¹ç¹ï¼åæ¶éææºå¨å¦ä¹ å¯¹ç¨æ·è¡ä¸ºå»ºç«Profile以å®ç°æºè½å®æ¶å°ä¿æ¤Hadoopçæç³»ç»ä¸å¤§æ°æ®çå®å ¨ã</p> -<h2 id="èæ¯">èæ¯</h2> +<h2 id="section">èæ¯</h2> <p>éçå¤§æ°æ®çåå±ï¼è¶æ¥è¶å¤çæåä¼ä¸æè ç»ç»å¼å§éåæ°æ®é©±å¨åä¸çè¿ä½æ¨¡å¼ãå¨eBayï¼æä»¬æ¥ææ°ä¸åå·¥ç¨å¸ãåæå¸åæ°æ®ç§å¦å®¶ï¼ä»ä»¬æ¯å¤©è®¿é®åææ°PBçº§çæ°æ®ï¼ä»¥ä¸ºæä»¬çç¨æ·å¸¦æ¥æ ä¸ä¼¦æ¯çä½éªãå¨å ¨çä¸å¡ä¸ï¼æä»¬ä¹å¹¿æ³å°å©ç¨æµ·éå¤§æ°æ®æ¥è¿æ¥æä»¬æ°ä»¥äº¿è®¡çç¨æ·ã</p> <p>è¿å¹´æ¥ï¼Hadoopå·²ç»éæ¸æä¸ºå¤§æ°æ®åæé¢åæå欢è¿çè§£å³æ¹æ¡ï¼eBayä¹ä¸ç´å¨ä½¿ç¨Hadoopææ¯ä»æ°æ®ä¸ææä»·å¼ï¼ä¾å¦ï¼æä»¬éè¿å¤§æ°æ®æé«ç¨æ·çæç´¢ä½éªï¼è¯å«åä¼åç²¾åå¹¿åææ¾ï¼å 宿们ç产åç®å½ï¼ä»¥åéè¿ç¹å»æµåæä»¥çè§£ç¨æ·å¦ä½ä½¿ç¨æä»¬çå¨çº¿å¸åºå¹³å°çã</p> @@ -130,20 +130,20 @@ <li><strong>弿º</strong>ï¼Eagleä¸ç´æ ¹æ®å¼æºçæ åå¼åï¼å¹¶æå»ºäºè¯¸å¤å¤§æ°æ®é¢åç弿ºäº§åä¹ä¸ï¼å æ¤æä»¬å³å®ä»¥Apache许å¯è¯å¼æºEagleï¼ä»¥åé¦ç¤¾åºï¼åæ¶ä¹æå¾ è·å¾ç¤¾åºçåé¦ãåä½ä¸æ¯æã</li> </ul> -<h2 id="eagleæ¦è§">Eagleæ¦è§</h2> +<h2 id="eagle">Eagleæ¦è§</h2> <p><img src="/images/posts/eagle-group.png" alt="" /></p> -<h4 id="æ°æ®æµæ¥å ¥ååå¨data-collection-and-storage">æ°æ®æµæ¥å ¥ååå¨ï¼Data Collection and Storageï¼</h4> +<h4 id="data-collection-and-storage">æ°æ®æµæ¥å ¥ååå¨ï¼Data Collection and Storageï¼</h4> <p>Eagleæä¾é«åº¦å¯æ©å±çç¼ç¨APIï¼å¯ä»¥æ¯æå°ä»»ä½ç±»åçæ°æ®æºéæå°Eagleççç¥æ§è¡å¼æä¸ãä¾å¦ï¼å¨Eagle 
HDFS 审计äºä»¶ï¼Auditï¼çæ§æ¨¡åä¸ï¼éè¿Kafkaæ¥å®æ¶æ¥æ¶æ¥èªNamenode Log4j Appender æè Logstash Agent æ¶éçæ°æ®ï¼å¨Eagle Hive çæ§æ¨¡åä¸ï¼éè¿YARN API æ¶éæ£å¨è¿è¡JobçHive æ¥è¯¢æ¥å¿ï¼å¹¶ä¿è¯æ¯è¾é«çå¯ä¼¸ç¼©æ§å容鿧ã</p> -<h4 id="æ°æ®å®æ¶å¤çdata-processing">æ°æ®å®æ¶å¤çï¼Data Processingï¼</h4> +<h4 id="data-processing">æ°æ®å®æ¶å¤çï¼Data Processingï¼</h4> <p><strong>æµå¤çAPIï¼Stream Processing APIï¼Eagle</strong> æä¾ç¬ç«äºç©çå¹³å°èé«åº¦æ½è±¡çæµå¤çAPIï¼ç®åé»è®¤æ¯æApache Stormï¼ä½æ¯ä¹å 许æ©å±å°å ¶ä»ä»»ææµå¤çå¼æï¼æ¯å¦Flink æè Samzaçãè¯¥å±æ½è±¡å 许å¼åè å¨å®ä¹çæ§æ°æ®å¤çé»è¾æ¶ï¼æ éå¨ç©çæ§è¡å±ç»å®ä»»ä½ç¹å®æµå¤çå¹³å°ï¼èåªééè¿å¤ç¨ãæ¼æ¥åç»è£ ä¾å¦æ°æ®è½¬æ¢ãè¿æ»¤ãå¤é¨æ°æ®Joinçç»ä»¶ï¼ä»¥å®ç°æ»¡è¶³éæ±çDAGï¼æåæ ç¯å¾ï¼ï¼åæ¶ï¼å¼åè ä¹å¯� �»¥å¾å®¹æå°ä»¥ç¼ç¨å°æ¹å¼å°ä¸å¡é»è¾æµç¨åEagle çç¥å¼ææ¡æ¶éæèµ·æ¥ãEagleæ¡æ¶å é¨ä¼å°æè¿°ä¸å¡é»è¾çDAGç¼è¯æåºå±æµå¤çæ¶æçåçåºç¨ï¼ä¾å¦Apache Storm Topology çï¼ä»äºå®ç°å¹³å°çç¬ç«ã</p> <p><strong>以䏿¯ä¸ä¸ªEagleå¦ä½å¤çäºä»¶ååè¦ç示ä¾ï¼</strong></p> -<div class="highlighter-rouge"><pre class="highlight"><code>StormExecutionEnvironment env = ExecutionEnvironmentFactory.getStorm(config); // storm env +<pre><code>StormExecutionEnvironment env = ExecutionEnvironmentFactory.getStorm(config); // storm env StreamProducer producer = env.newSource(new KafkaSourcedSpoutProvider().getSpout(config)).renameOutputFields(1) // declare kafka source .flatMap(new AuditLogTransformer()) // transform event .groupBy(Arrays.asList(0)) // group by 1st field @@ -151,7 +151,6 @@ StreamProducer producer = env.newSource( .alertWithConsumer(âuserActivityâ,âuserProfileExecutorâ) // ML policy evaluation env.execute(); // execute stream processing and alert </code></pre> -</div> <p><strong>åè¦æ¡æ¶ï¼Alerting Frameworkï¼Eagle</strong>åè¦æ¡æ¶ç±æµå æ°æ®APIãçç¥å¼ææå¡æä¾APIãçç¥Partitioner API 以åé¢è¦å»éæ¡æ¶çç»æ:</p> @@ -161,7 +160,7 @@ env.execute(); // execute stream process <li> <p><strong>æ©å±æ§</strong> Eagleççç¥å¼ææå¡æä¾APIå è®¸ä½ æå ¥æ°ççç¥å¼æ</p> - <div class="highlighter-rouge"><pre class="highlight"><code> public interface PolicyEvaluatorServiceProvider { + <pre><code> public interface PolicyEvaluatorServiceProvider { public String getPolicyType(); // literal string to identify one type of policy public Class<? 
extends PolicyEvaluator> getPolicyEvaluator(); // get policy evaluator implementation public List<Module> getBindingModules(); // policy text with json format to object mapping @@ -172,17 +171,15 @@ env.execute(); // execute stream process public void onPolicyDelete(); // invoked when policy is deleted } </code></pre> - </div> </li> <li><strong>çç¥Partitioner API</strong> å 许çç¥å¨ä¸åçç©çèç¹ä¸å¹¶è¡æ§è¡ãä¹å è®¸ä½ èªå®ä¹çç¥Partitionerç±»ãè¿äºåè½ä½¿å¾çç¥åäºä»¶å®å ¨ä»¥åå¸å¼çæ¹å¼æ§è¡ã</li> <li> <p><strong>å¯ä¼¸ç¼©æ§</strong> Eagle éè¿æ¯æçç¥çååºæ¥å£æ¥å®ç°å¤§éççç¥å¯ä¼¸ç¼©å¹¶åå°è¿è¡</p> - <div class="highlighter-rouge"><pre class="highlight"><code> public interface PolicyPartitioner extends Serializable { + <pre><code> public interface PolicyPartitioner extends Serializable { int partition(int numTotalPartitions, String policyType, String policyId); // method to distribute policies } </code></pre> - </div> <p><img src="/images/posts/policy-partition.png" alt="" /></p> @@ -239,29 +236,26 @@ Eagle æ¯ææ ¹æ®ç¨æ <li> <p>åä¸äºä»¶æ§è¡çç¥ï¼ç¨æ·è®¿é®Hiveä¸çæææ°æ®åï¼</p> - <div class="highlighter-rouge"><pre class="highlight"><code> from hiveAccessLogStream[sensitivityType=='PHONE_NUMBER'] select * insert into outputStream; + <pre><code> from hiveAccessLogStream[sensitivityType=='PHONE_NUMBER'] select * insert into outputStream; </code></pre> - </div> </li> <li> <p>åºäºçªå£ççç¥ï¼ç¨æ·å¨10åéå 访é®ç®å½ /tmp/private å¤ä½ 5次ï¼</p> - <div class="highlighter-rouge"><pre class="highlight"><code> hdfsAuditLogEventStream[(src == '/tmp/private')]#window.externalTime(timestamp,10 min) select user, count(timestamp) as aggValue group by user having aggValue >= 5 insert into outputStream; + <pre><code> hdfsAuditLogEventStream[(src == '/tmp/private')]#window.externalTime(timestamp,10 min) select user, count(timestamp) as aggValue group by user having aggValue >= 5 insert into outputStream; </code></pre> - </div> </li> </ul> <p><strong>æ¥è¯¢æå¡ï¼Query Serviceï¼</strong> Eagle æä¾ç±»SQLçREST APIç¨æ¥å®ç°éå¯¹æµ·éæ°æ®éç综å计ç®ãæ¥è¯¢ååæçè½åï¼æ¯æä¾å¦è¿æ»¤ãèåãç´æ¹è¿ç®ãæåºãtopãç®æ¯è¡¨è¾¾å¼ä»¥åå页çãEagleä¼å æ¯æHBase ä½ä¸ºå ¶é»è®¤æ°æ®åå¨ï¼ä½æ¯åæ¶ä¹æ¯æåºJDBCçå ³ç³»åæ°æ®åºãç¹å«æ¯å½éæ©ä»¥HBaseä½ä¸ºå卿¶ï¼Eagle便åçæ¥æäºHBaseåå¨åæ¥è¯¢æµ·éçæ§æ°æ®çè½åï¼Eagle æ¥è¯¢æ¡æ¶ä¼å°ç¨æ·æä¾çç±»SQLæ¥è¯¢è¯æ³æç»ç¼è¯æ 为HBase åççFilter 对象ï¼å¹¶æ¯æéè¿HBase Coprocessorè¿ä¸æ¥æåååºé度ã</p> -<div class="highlighter-rouge"><pre class="highlight"><code>query=AlertDefinitionService[@dataSource="hiveQueryLog"]{@policyDef}&pageSize=100000 +<pre><code>query=AlertDefinitionService[@dataSource="hiveQueryLog"]{@policyDef}&pageSize=100000 </code></pre> -</div> -<h2 id="eagleå¨ebayç使ç¨åºæ¯">Eagleå¨eBayç使ç¨åºæ¯</h2> +<h2 id="eagleebay">Eagleå¨eBayç使ç¨åºæ¯</h2> <p>ç®åï¼Eagleçæ°æ®è¡ä¸ºçæ§ç³»ç»å·²ç»é¨ç½²å°ä¸ä¸ªæ¥æ2500å¤ä¸ªèç¹çHadoopé群ä¹ä¸ï¼ç¨ä»¥ä¿æ¤æ°ç¾PBæ°æ®çå®å ¨ï¼å¹¶æ£è®¡åäºä»å¹´å¹´åºä¹åæ©å±å°å ¶ä»ä¸å个Hadoopé群ä¸ï¼ä»èè¦çeBay ææä¸»è¦Hadoopç10000å¤å°èç¹ã卿们çç产ç¯å¢ä¸ï¼æä»¬å·²é对HDFSãHive çé群ä¸çæ°æ®é ç½®äºä¸äºåºç¡çå®å ¨çç¥ï¼å¹¶å°äºå¹´åºä¹å䏿å¼å ¥æ´å¤ççç¥ï¼ä»¥ç¡®ä¿éè¦æ°æ®çç»å¯¹å®å ¨ãç®åï¼Eagleççç¥æ¶µçå¤ç§æ¨� �å¼ï¼å æ¬ä»è®¿é®æ¨¡å¼ãé¢ç¹è®¿é®æ°æ®éï¼é¢å®ä¹æ¥è¯¢ç±»åãHive 表ååãHBase 表以ååºäºæºå¨å¦ä¹ 模åçæçç¨æ·Profileç¸å ³çææçç¥çãåæ¶ï¼æä»¬ä¹æå¹¿æ³ççç¥æ¥é²æ¢æ°æ®çä¸¢å¤±ãæ°æ®è¢«æ·è´å°ä¸å®å ¨å°ç¹ãæææ°æ®è¢«æªææåºå访é®çãEagleçç¥å®ä¹ä¸æå¤§ççµæ´»æ§åæ©å±æ§ä½¿å¾æä»¬æªæ¥å¯ä»¥è½»æå°ç»§ç»æ©å±æ´å¤æ´å¤æççç¥ä»¥æ¯ææ´å¤å¤å åçç¨ä¾åºæ¯ã</p> -<h2 id="åç»è®¡å">åç»è®¡å</h2> +<h2 id="section-1">åç»è®¡å</h2> <p>è¿å»ä¸¤å¹´ä¸ï¼å¨eBay é¤äºè¢«ç¨äºæ°æ®è¡ä¸ºçæ§ä»¥å¤ï¼Eagle æ ¸å¿æ¡æ¶è¿è¢«å¹¿æ³ç¨äºçæ§èç¹å¥åº·ç¶åµãHadoopåºç¨æ§è½ææ 
ãHadoop æ ¸å¿æå¡ä»¥åæ´ä¸ªHadoopé群çå¥åº·ç¶åµç诸å¤é¢åãæä»¬è¿å»ºç«ä¸ç³»åçèªå¨åæºå¶ï¼ä¾å¦èç¹ä¿®å¤çï¼å¸®å©æä»¬å¹³å°é¨é¨æå¤§å¾èçäºæä»¬äººå·¥å³åï¼å¹¶ææå°æåäºæ´ä¸ªéç¾¤èµæºå°å©ç¨çã</p> <p>以䏿¯æä»¬ç®åæ£å¨å¼åä¸å°ä¸äºç¹æ§ï¼</p> @@ -278,7 +272,7 @@ Eagle æ¯ææ ¹æ®ç¨æ </li> </ul> -<h2 id="å ³äºä½è ">å ³äºä½è </h2> +<h2 id="section-2">å ³äºä½è </h2> <p><a href="https://github.com/haoch">éæµ©</a>ï¼Apache Eagle Committer å PMC æåï¼eBay åæå¹³å°åºç¡æ¶æé¨é¨é«çº§è½¯ä»¶å·¥ç¨å¸ï¼è´è´£Eagleç产åè®¾è®¡ãææ¯æ¶æãæ ¸å¿å®ç°ä»¥å弿ºç¤¾åºæ¨å¹¿çã</p> <p>æè°¢ä»¥ä¸æ¥èªApache Eagle社åºåeBayå ¬å¸çèåä½è ä»¬å¯¹æ¬æçè´¡ç®ï¼</p> @@ -292,7 +286,7 @@ Eagle æ¯ææ ¹æ®ç¨æ <p>eBay åæå¹³å°åºç¡æ¶æé¨ï¼Analytics Data Infrastructureï¼æ¯eBayçå ¨çæ°æ®ååæåºç¡æ¶æé¨é¨ï¼è´è´£eBay卿°æ®åºãæ°æ®ä»åºãHadoopãå塿ºè½ä»¥åæºå¨å¦ä¹ çåä¸ªæ°æ®å¹³å°å¼åã管çç,æ¯æeBayå ¨çåé¨é¨è¿ç¨é«ç«¯çæ°æ®åæè§£å³æ¹æ¡ä½åºåæ¶ææçä½ä¸å³çï¼ä¸ºéå¸å ¨ççä¸å¡ç¨æ·æä¾æ°æ®åæè§£å³æ¹æ¡ã</p> -<h2 id="åèèµæ">åèèµæ</h2> +<h2 id="section-3">åèèµæ</h2> <ul> <li>Apache Eagle ææ¡£ï¼<a href="http://goeagle.io">http://goeagle.io</a></li> @@ -300,7 +294,7 @@ Eagle æ¯ææ ¹æ®ç¨æ <li>Apache Eagle 项ç®ï¼<a href="http://incubator.apache.org/projects/eagle.html">http://incubator.apache.org/projects/eagle.html</a></li> </ul> -<h2 id="å¼ç¨é¾æ¥">å¼ç¨é¾æ¥</h2> +<h2 id="section-4">å¼ç¨é¾æ¥</h2> <ul> <li><strong>CSDN</strong>: <a href="http://www.csdn.net/article/2015-10-29/2826076">http://www.csdn.net/article/2015-10-29/2826076</a></li> <li><strong>OSCHINA</strong>: <a href="http://www.oschina.net/news/67515/apache-eagle">http://www.oschina.net/news/67515/apache-eagle</a></li>
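The PolicyPartitioner interface quoted in both posts above is shown without an implementation. As an illustrative sketch (not taken from the Eagle code base), a simple hash-based partitioner that keeps each policy pinned to one evaluator instance while spreading different policies across evaluators could look like this:

<pre><code>import java.io.Serializable;

// Mirrors the interface quoted in the posts above.
public interface PolicyPartitioner extends Serializable {
    int partition(int numTotalPartitions, String policyType, String policyId);
}

// Hypothetical implementation: a policy with the same type and id always maps to
// the same partition, so its evaluation state stays on a single node while the
// full set of policies is distributed across numTotalPartitions evaluators.
class HashPolicyPartitioner implements PolicyPartitioner {
    private static final long serialVersionUID = 1L;

    @Override
    public int partition(int numTotalPartitions, String policyType, String policyId) {
        int hash = (policyType + ":" + policyId).hashCode();
        return (hash & Integer.MAX_VALUE) % numTotalPartitions; // non-negative index
    }
}
</code></pre>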

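The posts also quote a query-service request (AlertDefinitionService[@dataSource="hiveQueryLog"]{@policyDef}&pageSize=100000) without showing how to issue it. Below is a hedged example; the host, port 9099, the /eagle-service/rest/list endpoint path and the admin:secret credentials are assumptions about a default deployment and may differ in yours.

<pre><code>import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class EagleQueryExample {
    public static void main(String[] args) throws Exception {
        // The query string quoted in the post; the part after "query=" must be URL-encoded.
        String query = "AlertDefinitionService[@dataSource=\"hiveQueryLog\"]{@policyDef}";
        String url = "http://localhost:9099/eagle-service/rest/list?query="
                + URLEncoder.encode(query, "UTF-8")
                + "&pageSize=100000";

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        // Assumed default credentials; replace with your deployment's values.
        String auth = Base64.getEncoder()
                .encodeToString("admin:secret".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // JSON payload with the matching policy definitions
            }
        }
    }
}
</code></pre>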