This is an automated email from the ASF dual-hosted git repository.

vinoth pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/hudi.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new bfba9f6  Travis CI build asf-site
bfba9f6 is described below

commit bfba9f6a809a4d1819917a56911d0ba58590f68c
Author: CI <[email protected]>
AuthorDate: Fri May 29 04:49:24 2020 +0000

    Travis CI build asf-site
---
 content/docs/docker_demo.html |  5 ++++-
 content/docs/powered_by.html  | 14 ++++++++++++++
 2 files changed, 18 insertions(+), 1 deletion(-)

diff --git a/content/docs/docker_demo.html b/content/docs/docker_demo.html
index 384f271..e3c531c 100644
--- a/content/docs/docker_demo.html
+++ b/content/docs/docker_demo.html
@@ -390,7 +390,7 @@ data infrastructure is brought up in a local docker cluster 
within your computer
 
 <ul>
  <li>Docker Setup :  For Mac, please follow the steps as defined in 
[https://docs.docker.com/v17.12/docker-for-mac/install/]. For running Spark-SQL 
queries, please ensure at least 6 GB and 4 CPUs are allocated to Docker (see 
Docker -&gt; Preferences -&gt; Advanced). Otherwise, Spark-SQL queries could be 
killed because of memory issues.</li>
-  <li>kafkacat : A command-line utility to publish/consume from kafka topics. 
Use <code class="highlighter-rouge">brew install kafkacat</code> to install 
kafkacat</li>
+  <li>kafkacat : A command-line utility to publish/consume from kafka topics. 
Use <code class="highlighter-rouge">brew install kafkacat</code> to install 
kafkacat.</li>
   <li>
    <p>/etc/hosts : The demo references many services running in containers by 
their hostnames. Add the following settings to /etc/hosts</p>
 
@@ -405,6 +405,9 @@ data infrastructure is brought up in a local docker cluster 
within your computer
 <span class="mf">127.0</span><span class="o">.</span><span 
class="mf">0.1</span> <span class="n">zookeeper</span>
 </code></pre></div>    </div>
   </li>
+  <li>Java : Java SE Development Kit 8.</li>
+  <li>Maven : A build automation tool for Java projects.</li>
+  <li>jq : A lightweight and flexible command-line JSON processor. Use <code 
class="highlighter-rouge">brew install jq</code> to install jq.</li>
 </ul>
 
 <p>Also, this has not been tested on some environments like Docker on 
Windows.</p>
diff --git a/content/docs/powered_by.html b/content/docs/powered_by.html
index 4065ead..a6f589d 100644
--- a/content/docs/powered_by.html
+++ b/content/docs/powered_by.html
@@ -355,10 +355,22 @@ offering, providing means for AWS users to perform 
record-level updates/deletes
 
 <p><a href="https://www.emishealth.com/">EMIS Health</a> is the largest 
provider of Primary Care IT software in the UK, with datasets including more 
than 500Bn healthcare records. HUDI is used to manage their analytics datasets 
in production and keep them up-to-date with their upstream sources. Presto is 
being used to query the data written in HUDI format.</p>
 
+<h3 id="kyligence">Kyligence</h3>
+
+<p><a href="https://kyligence.io/zh/">Kyligence</a> is the leading Big Data 
analytics platform company. We’ve built end-to-end solutions for various Global 
Fortune 500 companies in the US and China. We adopted Apache Hudi in our Cloud 
solution on AWS in 2019. With the help of Hudi, we are able to process upserts 
and deletes easily, and we use incremental views to build efficient data 
pipelines in AWS. The Hudi datasets can also be integrated to Kyligence Cloud 
directly for high concurrent OLA [...]
+
+<h3 id="lingyue-digital-corporation">Lingyue-digital Corporation</h3>
+
+<p><a href="https://www.lingyue-digital.com/">Lingyue-digital Corporation</a> 
belongs to BMW Group. Apache Hudi is used to ingest MySQL and PostgreSQL 
change data capture (CDC) data. We build upsert scenarios on Hadoop and 
Spark.</p>
+
 <h3 id="logical-clocks">Logical Clocks</h3>
 
 <p><a 
href="https://www.logicalclocks.com/blog/introducing-the-hopsworks-1-x-series">Hopsworks
 1.x series</a> supports Apache Hudi feature groups, to enable upserts and time 
travel.</p>
 
+<h3 id="sf-express">SF-Express</h3>
+
+<p><a href="https://www.sf-express.com/cn/sc/">SF-Express</a> is the leading 
logistics service provider in China. HUDI is used to build a real-time data 
warehouse, providing real-time computing solutions with higher efficiency and 
lower cost for our business.</p>
+
 <h3 id="tathastuai">Tathastu.ai</h3>
 
 <p><a href="https://www.tathastu.ai">Tathastu.ai</a> offers the largest AI/ML 
playground of consumer data for data scientists, AI experts and technologists 
to build upon. They have built a CDC pipeline using Apache Hudi and Debezium. 
Data from Hudi datasets is being queried using Hive, Presto and Spark.</p>
@@ -463,6 +475,8 @@ December 2019, AWS re:Invent 2019, Las Vegas, NV, USA</p>
       
         <div class="power-item"><img src="/assets/images/powers/yotpo.png" 
/></div>
       
+        <div class="power-item"><img src="/assets/images/powers/kyligence.png" 
/></div>
+      
         <div class="power-item"><img src="/assets/images/powers/tathastu.png" 
/></div>
       
         <div class="power-item"><img src="/assets/images/powers/shunfeng.png" 
/></div>
