Hi,

I tried building the system cube in our environment, and it works fine.

From your error log,
18/04/03 15:13:22 ERROR yarn.ApplicationMaster: User class threw exception: 
java.lang.RuntimeException: error execute 
org.apache.kylin.engine.spark.SparkCubingByLayer
java.lang.RuntimeException: error execute 
org.apache.kylin.engine.spark.SparkCubingByLayer
                at 
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
                at 
org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:636)
Caused by: java.lang.NullPointerException
                at 
org.apache.kylin.engine.mr.common.CubeStatsReader.estimateLayerSize(CubeStatsReader.java:281)
                at 
org.apache.kylin.engine.spark.SparkCubingByLayer.estimateRDDPartitionNum(SparkCubingByLayer.java:218)
                at 
org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:189)
                at 
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
                ... 6 more

Could you check the source code at line 281 of CubeStatsReader? The NPE may be 
related to no statistics being generated for the cuboids when there are no 
source records. If that is the case, it's better to check the metadata of the 
related statistics.
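To illustrate the suspected failure mode, here is a hypothetical sketch (not Kylin's actual code; class and method names are invented for illustration): per-cuboid row counts are typically looked up from a map, and if statistics were never sampled because the fact table was empty, the lookup returns null and auto-unboxing throws a NullPointerException, matching the symptom at CubeStatsReader.estimateLayerSize. A defensive guard makes the cause explicit:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of estimating a layer's size from per-cuboid
// row-count statistics. If the stats map was never populated (e.g. the
// source table had zero records), Map.get returns null; without the guard,
// auto-unboxing `rows` would throw a NullPointerException.
public class LayerSizeEstimate {
    static double estimateLayerSize(Map<Long, Long> cuboidRows, long cuboidId,
                                    double bytesPerRow) {
        Long rows = cuboidRows.get(cuboidId);
        if (rows == null) {
            // Fail fast with a descriptive message instead of an NPE later.
            throw new IllegalStateException(
                "No statistics for cuboid " + cuboidId
                + "; was the fact table empty when stats were sampled?");
        }
        return rows * bytesPerRow;
    }

    public static void main(String[] args) {
        Map<Long, Long> stats = new HashMap<>();
        stats.put(255L, 1000L);
        System.out.println(estimateLayerSize(stats, 255L, 12.5)); // 12500.0
        try {
            estimateLayerSize(stats, 511L, 12.5); // no stats for this cuboid
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

Checking that the cuboid statistics resource exists in the cube segment's metadata would confirm whether this is the trigger.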

Best regards,
Yanghong Zhong

From: 凡梦星尘 <elkan1...@gmail.com>
Reply-To: "dev@kylin.apache.org" <dev@kylin.apache.org>
Date: Tuesday, April 3, 2018 at 3:28 PM
To: "dev@kylin.apache.org" <dev@kylin.apache.org>
Subject: Re: Why Kylin system cube of dashboard can't build under spark mode?

Hi Yanghong,

I have attached all the log files, and I found an NPE during Spark cube 
building; see below. I hope it helps.


18/04/03 15:13:22 ERROR yarn.ApplicationMaster: User class threw exception: 
java.lang.RuntimeException: error execute 
org.apache.kylin.engine.spark.SparkCubingByLayer
java.lang.RuntimeException: error execute 
org.apache.kylin.engine.spark.SparkCubingByLayer
at 
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:636)
Caused by: java.lang.NullPointerException
at 
org.apache.kylin.engine.mr.common.CubeStatsReader.estimateLayerSize(CubeStatsReader.java:281)
at 
org.apache.kylin.engine.spark.SparkCubingByLayer.estimateRDDPartitionNum(SparkCubingByLayer.java:218)
at 
org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:189)
at 
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)





Thanks.

2018-04-03 2:21 GMT+08:00 Zhong, Yanghong <yangzh...@ebay.com>:
Hi,

Could you attach the app master log? For the system cube, it's better to use MR 
mode with the INMEM algorithm. However, we should figure out why Spark cubing 
does not work.
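For reference, switching the system cube build back to the MR engine with the in-memory algorithm can be sketched as the following kylin.properties fragment (property names and values assumed from Kylin 2.x defaults; verify against your version's configuration reference):

```properties
# Assumed Kylin 2.x settings: engine type 2 = MapReduce v2 (4 = Spark).
kylin.engine.default=2
# Force the in-memory cubing algorithm instead of layered cubing.
kylin.cube.algorithm=inmem
```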

Best regards,
Yanghong Zhong

Email: yangzh...@ebay.com

On 4/2/18, 10:35 AM, "凡梦星尘" <elkan1...@gmail.com> wrote:

    So can no one help me with this problem?

    2018-03-28 19:12 GMT+08:00 凡梦星尘 <elkan1...@gmail.com>:

    > Hi guys:
    >
    > Has anyone used the newest Kylin and tried opening the dashboard? I tried
    > this new feature; by default the system cube uses MR mode, and build and
    > refresh actions work without any problems. But when I switch to Spark mode
    > the build does not succeed. The error logs are below:
    >
    >
    > OS command error exit with return code: 1, error message: log4j:WARN No
    > appenders could be found for logger (org.apache.hadoop.util.Shell).
    > log4j:WARN Please initialize the log4j system properly.
    > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
    > more info.
    > Using Spark's default log4j profile: org/apache/spark/log4j-
    > defaults.properties
    > 18/03/28 18:03:30 INFO Client: Requesting a new application from cluster
    > with 3 NodeManagers
    > 18/03/28 18:03:30 INFO Client: Verifying our application has not requested
    > more than the maximum memory capability of the cluster (8192 MB per
    > container)
    > 18/03/28 18:03:30 INFO Client: Will allocate AM container, with 1408 MB
    > memory including 384 MB overhead
    > 18/03/28 18:03:30 INFO Client: Setting up container launch context for our
    > AM
    > 18/03/28 18:03:30 INFO Client: Setting up the launch environment for our
    > AM container
    > 18/03/28 18:03:30 INFO Client: Preparing resources for our AM container
    > 18/03/28 18:03:32 WARN Client: Neither spark.yarn.jars nor
    > spark.yarn.archive is set, falling back to uploading libraries under
    > SPARK_HOME.
    > 18/03/28 18:03:34 INFO Client: Uploading resource
    > 
file:/tmp/spark-e9a0636e-6d6e-476a-ad62-dfcb1e4d6102/__spark_libs__437322364481459969.zip
    > -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_
    > 1522141683864_0091/__spark_libs__437322364481459969.zip
    > 18/03/28 18:03:38 INFO Client: Uploading resource
    > file:/opt/apache-kylin-2.3.0/lib/kylin-job-2.3.0.jar ->
    > hdfs://xxx:8020/user/hdfs/.sparkStaging/application_
    > 1522141683864_0091/kylin-job-2.3.0.jar
    > 18/03/28 18:03:39 INFO Client: Uploading resource
    > file:/data/hdp/2.6.4.0-91/hbase/lib/htrace-core-3.1.0-incubating.jar ->
    > hdfs://xxx:8020/user/hdfs/.sparkStaging/application_
    > 1522141683864_0091/htrace-core-3.1.0-incubating.jar
    > 18/03/28 18:03:39 INFO Client: Uploading resource
    > file:/data/hdp/2.6.4.0-91/hbase/lib/metrics-core-2.2.0.jar ->
    > hdfs://xxx:8020/user/hdfs/.sparkStaging/application_
    > 1522141683864_0091/metrics-core-2.2.0.jar
    > 18/03/28 18:03:39 INFO Client: Uploading resource
    > file:/data/hdp/2.6.4.0-91/hbase/lib/guava-12.0.1.jar ->
    > hdfs://xxx:8020/user/hdfs/.sparkStaging/application_
    > 1522141683864_0091/guava-12.0.1.jar
    > 18/03/28 18:03:40 INFO Client: Uploading resource
    > 
file:/tmp/spark-e9a0636e-6d6e-476a-ad62-dfcb1e4d6102/__spark_conf__5932514761984685807.zip
    > -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_
    > 1522141683864_0091/__spark_conf__.zip
    > 18/03/28 18:03:40 WARN Client: spark.yarn.am.extraJavaOptions will not
    > take effect in cluster mode
    > 18/03/28 18:03:40 INFO SecurityManager: Changing view acls to: hdfs
    > 18/03/28 18:03:40 INFO SecurityManager: Changing modify acls to: hdfs
    > 18/03/28 18:03:40 INFO SecurityManager: Changing view acls groups to:
    > 18/03/28 18:03:40 INFO SecurityManager: Changing modify acls groups to:
    > 18/03/28 18:03:40 INFO SecurityManager: SecurityManager: authentication
    > disabled; ui acls disabled; users  with view permissions: Set(hdfs); 
groups
    > with view permissions: Set(); users  with modify permissions: Set(hdfs);
    > groups with modify permissions: Set()
    > 18/03/28 18:03:40 INFO Client: Submitting application
    > application_1522141683864_0091 to ResourceManager
    > 18/03/28 18:03:40 INFO YarnClientImpl: Submitted application
    > application_1522141683864_0091
    > 18/03/28 18:03:41 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:41 INFO Client:
    > client token: N/A
    > diagnostics: AM container is launched, waiting for AM container to
    > Register with RM
    > ApplicationMaster host: N/A
    > ApplicationMaster RPC port: -1
    > queue: default
    > start time: 1522231420490
    > final status: UNDEFINED
    > tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
    > user: hdfs
    > 18/03/28 18:03:42 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:43 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:44 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:45 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:46 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:47 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:48 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:49 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:50 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:51 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:52 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:03:53 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:03:53 INFO Client:
    > client token: N/A
    > diagnostics: N/A
    > ApplicationMaster host: xxx
    > ApplicationMaster RPC port: 0
    > queue: default
    > start time: 1522231420490
    > final status: UNDEFINED
    > tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
    > user: hdfs
    > 18/03/28 18:03:54 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:03:55 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:03:56 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:03:57 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:03:58 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:03:59 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:00 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:01 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:02 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:03 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:04 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:05 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:06 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:07 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:08 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:09 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:10 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:11 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:12 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:13 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:14 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:15 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:16 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:16 INFO Client:
    > client token: N/A
    > diagnostics: AM container is launched, waiting for AM container to
    > Register with RM
    > ApplicationMaster host: N/A
    > ApplicationMaster RPC port: -1
    > queue: default
    > start time: 1522231420490
    > final status: UNDEFINED
    > tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
    > user: hdfs
    > 18/03/28 18:04:17 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:18 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:19 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:20 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:21 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:22 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:23 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:25 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:26 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:27 INFO Client: Application report for
    > application_1522141683864_0091 (state: ACCEPTED)
    > 18/03/28 18:04:28 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:28 INFO Client:
    > client token: N/A
    > diagnostics: N/A
    > ApplicationMaster host: xxx
    > ApplicationMaster RPC port: 0
    > queue: default
    > start time: 1522231420490
    > final status: UNDEFINED
    > tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
    > user: hdfs
    > 18/03/28 18:04:29 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:30 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:31 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:32 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:33 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:34 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:35 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:36 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:37 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:38 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:39 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:40 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:41 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:42 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:43 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:44 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:45 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:46 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:47 INFO Client: Application report for
    > application_1522141683864_0091 (state: RUNNING)
    > 18/03/28 18:04:48 INFO Client: Application report for
    > application_1522141683864_0091 (state: FINISHED)
    > 18/03/28 18:04:48 INFO Client:
    > client token: N/A
    > diagnostics: User class threw exception: java.lang.RuntimeException: error
    > execute org.apache.kylin.engine.spark.SparkCubingByLayer
    > ApplicationMaster host: xxx
    > ApplicationMaster RPC port: 0
    > queue: default
    > start time: 1522231420490
    > final status: FAILED
    > tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
    > user: hdfs
    > Exception in thread "main" org.apache.spark.SparkException: Application
    > application_1522141683864_0091 finished with failed status
    > at org.apache.spark.deploy.yarn.Client.run(Client.scala:1180)
    > at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1226)
    > at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    > at sun.reflect.NativeMethodAccessorImpl.invoke(
    > NativeMethodAccessorImpl.java:62)
    > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
    > DelegatingMethodAccessorImpl.java:43)
    > at java.lang.reflect.Method.invoke(Method.java:498)
    > at org.apache.spark.deploy.SparkSubmit$.org$apache$
    > spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:744)
    > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(
    > SparkSubmit.scala:187)
    > at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    > 18/03/28 18:04:48 INFO ShutdownHookManager: Shutdown hook called
    > 18/03/28 18:04:48 INFO ShutdownHookManager: Deleting directory
    > /tmp/spark-e9a0636e-6d6e-476a-ad62-dfcb1e4d6102
    > The command is:
    > export HADOOP_CONF_DIR=/opt/apache-kylin-2.3.0/hadoop-conf &&
    > /opt/apache-kylin-2.3.0/spark/bin/spark-submit --class
    > org.apache.kylin.common.util.SparkEntry  --conf
    > spark.executor.instances=1  --conf spark.yarn.queue=default  --conf
    > 
spark.yarn.am<https://na01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fspark.yarn.am&data=02%7C01%7Cyangzhong%40ebay.com%7Cfeab4c1ee92e4334aba608d599347bf9%7C46326bff992841a0baca17c16c94ea99%7C0%7C1%7C636583373062626303&sdata=ajdeiDDmQfxuWGujEuk27vxTnh8AAbbdCqqreOegcnk%3D&reserved=0>.extraJavaOptions=-Dhdp.version=current
  --conf
    > spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf
    > spark.driver.extraJavaOptions=-Dhdp.version=current  --conf
    > 
spark.io.compression.codec=org.apache.spark.io<https://na01.safelinks.protection.outlook.com/?url=http%3A%2F%2Forg.apache.spark.io&data=02%7C01%7Cyangzhong%40ebay.com%7Cfeab4c1ee92e4334aba608d599347bf9%7C46326bff992841a0baca17c16c94ea99%7C0%7C1%7C636583373062626303&sdata=j%2BmY2HNg8yAeISvOP9S5LlfoHPXPekmp3kplDXPGVFw%3D&reserved=0>.SnappyCompressionCodec
    > --conf spark.master=yarn  --conf 
spark.executor.extraJavaOptions=-Dhdp.version=current
    > --conf spark.hadoop.yarn.timeline-service.enabled=true  --conf
    > spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf
    > spark.eventLog.dir=hdfs:///kylin/spark-history  --conf
    > spark.executor.cores=2  --conf spark.submit.deployMode=cluster --jars
    > /data/hdp/2.6.4.0-91/hbase/lib/htrace-core-3.1.0-
    > incubating.jar,/data/hdp/2.6.4.0-91/hbase/lib/metrics-core-
    > 2.2.0.jar,/data/hdp/2.6.4.0-91/hbase/lib/guava-12.0.1.jar,
    > /opt/apache-kylin-2.3.0/lib/kylin-job-2.3.0.jar -className
    > org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable
    > kylin_tmp.kylin_intermediate_kylin_hive_metrics_job_qa_
    > 912467dc_7a1c_42ce_8ff5_60c2a5b11441 -output hdfs://xxx:8020/kylin/kylin_
    > 
metadata/kylin-f943d829-3436-4816-bfa6-da2234401862/KYLIN_HIVE_METRICS_JOB_QA/cuboid/
    > -segmentId 912467dc-7a1c-42ce-8ff5-60c2a5b11441 -metaUrl
    > kylin_metadata@hdfs,path=hdfs://xxx:8020/kylin/kylin_
    > metadata/metadata/912467dc-7a1c-42ce-8ff5-60c2a5b11441 -cubename
    > KYLIN_HIVE_METRICS_JOB_QA
    >
    > I am not sure whether this is a problem with my HDP environment, but I can
    > confirm that the sample cube builds fine in Spark mode. Others have raised
    > the same issue before without getting a resolution. So please tell me what
    > to do.
    >
    > PS:
    >
    > I can't find more detailed information about this exception; this is only
    > what I could get from the kylin.log file. If you need anything more, tell
    > me what steps to take.
    >
    > Thanks.
    >
    >

