andygrove commented on code in PR #2474:
URL: https://github.com/apache/datafusion-comet/pull/2474#discussion_r2389372975
##########
docs/source/contributor-guide/benchmarking_aws_ec2.md:
##########
@@ -148,76 +157,11 @@
 aws_access_key_id=your-access-key
 aws_secret_access_key=your-secret-key
 ```
-## Run Spark Benchmarks
-
-Run the following command (the `--data` parameter will need to be updated to point to your S3 bucket):
-
-```shell
-$SPARK_HOME/bin/spark-submit \
-    --master $SPARK_MASTER \
-    --conf spark.driver.memory=4G \
-    --conf spark.executor.instances=1 \
-    --conf spark.executor.cores=8 \
-    --conf spark.cores.max=8 \
-    --conf spark.executor.memory=16g \
-    --conf spark.eventLog.enabled=false \
-    --conf spark.local.dir=/mnt/tmp \
-    --conf spark.driver.extraJavaOptions="-Djava.io.tmpdir=/mnt/tmp" \
-    --conf spark.executor.extraJavaOptions="-Djava.io.tmpdir=/mnt/tmp" \
-    --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
-    --conf spark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain \
-    tpcbench.py \
-    --benchmark tpch \
-    --data s3a://your-bucket-name/top-level-folder \
-    --queries /home/ec2-user/datafusion-benchmarks/tpch/queries \
-    --output . \
-    --iterations 1
-```
-
-## Run Comet Benchmarks
-
-Install Comet JAR from Maven:
-
-```shell
-wget https://repo1.maven.org/maven2/org/apache/datafusion/comet-spark-spark3.5_2.12/0.9.0/comet-spark-spark3.5_2.12-0.9.0.jar -P $SPARK_HOME/jars
-export COMET_JAR=$SPARK_HOME/jars/comet-spark-spark3.5_2.12-0.9.0.jar
-```
-
-Run the following command (the `--data` parameter will need to be updated to point to your S3 bucket):
-
-```shell
-$SPARK_HOME/bin/spark-submit \
-    --master $SPARK_MASTER \
-    --conf spark.driver.memory=4G \
-    --conf spark.executor.instances=1 \
-    --conf spark.executor.cores=8 \
-    --conf spark.cores.max=8 \
-    --conf spark.executor.memory=16g \
-    --conf spark.memory.offHeap.enabled=true \
-    --conf spark.memory.offHeap.size=16g \
-    --conf spark.eventLog.enabled=false \
-    --conf spark.local.dir=/mnt/tmp \
-    --conf spark.driver.extraJavaOptions="-Djava.io.tmpdir=/mnt/tmp" \
-    --conf spark.executor.extraJavaOptions="-Djava.io.tmpdir=/mnt/tmp" \
-    --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
-    --conf spark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain \
-    --jars $COMET_JAR \
-    --driver-class-path $COMET_JAR \
-    --conf spark.driver.extraClassPath=$COMET_JAR \
-    --conf spark.executor.extraClassPath=$COMET_JAR \
-    --conf spark.plugins=org.apache.spark.CometPlugin \
-    --conf spark.shuffle.manager=org.apache.spark.sql.comet.execution.shuffle.CometShuffleManager \
-    --conf spark.comet.enabled=true \
-    --conf spark.comet.expression.allowIncompatible=true \
-    --conf spark.comet.exec.replaceSortMergeJoin=true \
-    --conf spark.comet.exec.shuffle.enabled=true \
-    --conf spark.comet.exec.shuffle.fallbackToColumnar=true \
-    --conf spark.comet.exec.shuffle.compression.codec=lz4 \
-    --conf spark.comet.exec.shuffle.compression.level=1 \
-    tpcbench.py \
-    --benchmark tpch \
-    --data s3a://your-bucket-name/top-level-folder \
-    --queries /home/ec2-user/datafusion-benchmarks/tpch/queries \
-    --output . \
-    --iterations 1
+Modify the scripts to add the following configurations.
+
+```shell
+--conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
+--conf spark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain \
 ```
+
+Now run the `spark-tpch.sh` and `comet-tpch.sh` scripts.
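For reference, a minimal sketch of what a plain Spark TPC-H run looks like once the two S3A settings above are included. Every flag, path, and the placeholder bucket name is carried over from the spark-submit command this hunk removes; the actual `spark-tpch.sh` and `comet-tpch.sh` scripts may set additional options.

```shell
# Sketch only: a plain Spark TPC-H benchmark run with the two S3A settings added.
# All flags and paths mirror the spark-submit command removed by this hunk;
# update --data to point at your own S3 bucket.
$SPARK_HOME/bin/spark-submit \
    --master $SPARK_MASTER \
    --conf spark.driver.memory=4G \
    --conf spark.executor.instances=1 \
    --conf spark.executor.cores=8 \
    --conf spark.executor.memory=16g \
    --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
    --conf spark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain \
    tpcbench.py \
    --benchmark tpch \
    --data s3a://your-bucket-name/top-level-folder \
    --queries /home/ec2-user/datafusion-benchmarks/tpch/queries \
    --output . \
    --iterations 1
```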
Review Comment: Thanks, I'll go ahead and merge this one but will follow up in a separate PR when I update the official benchmark results.
