eric-wang-1990 opened a new pull request, #3688: URL: https://github.com/apache/arrow-adbc/pull/3688
## Summary

Add custom BenchmarkDotNet columns to display Peak Memory, Total Rows, and Total Batches in the summary table instead of requiring users to check the console output.

## Changes

- Add a `BenchmarkMetrics` class to store peak memory, total rows, and total batches
- Store metrics in a temp file (`cloudfetch_benchmark_metrics.json`) for cross-process access (sketched below)
- Update `PeakMemoryColumn` to read from the temp file instead of a static dictionary (column sketch below)
- Add `TotalRowsColumn` to display total rows processed
- Add `TotalBatchesColumn` to display total batches processed
- Register all three custom columns in `CloudFetchBenchmarkRunner`
- Update the README with .NET Framework 4.7.2 instructions for Power BI testing
- Update the README with documentation and examples for the new metrics columns

## Problem Solved

This fixes the "See previous console output" issue, where custom columns could not access metrics because BenchmarkDotNet runs iterations in separate processes. The temp-file approach ensures the metrics are available when the final summary table is generated.

## Before

```
| Peak Memory (MB)             |
|-----------------------------:|
| See previous console output  |
```

## After

```
| Peak Memory (MB) | Total Rows | Total Batches |
|-----------------:|-----------:|--------------:|
|           256.48 |  1,441,548 |           145 |
```

## Testing

- Built successfully on macOS with net8.0
- All three custom columns now display actual values in the summary table
- Metrics are written to the temp file during execution
- README updated with net472 instructions for Windows/Power BI testing
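For reference, here is a minimal sketch of the temp-file exchange. It assumes `System.Text.Json` is available on both target frameworks; the property names and the `Save`/`Load` helpers are illustrative rather than the exact code in this PR.

```csharp
using System.IO;
using System.Text.Json;

// Illustrative shape only; the real BenchmarkMetrics class may differ.
public class BenchmarkMetrics
{
    public double PeakMemoryMb { get; set; }
    public long TotalRows { get; set; }
    public long TotalBatches { get; set; }

    // Well-known path reachable from both the benchmark (child) process and the host process.
    public static string MetricsFilePath =>
        Path.Combine(Path.GetTempPath(), "cloudfetch_benchmark_metrics.json");

    // Called in the benchmark process once peak memory / row / batch counts are known.
    public void Save() =>
        File.WriteAllText(MetricsFilePath, JsonSerializer.Serialize(this));

    // Called in the host process when the summary table is rendered.
    public static BenchmarkMetrics Load() =>
        File.Exists(MetricsFilePath)
            ? JsonSerializer.Deserialize<BenchmarkMetrics>(File.ReadAllText(MetricsFilePath))
            : null;
}
```

Writing the file from the benchmark (child) process and reading it back from the host process is what lets the summary columns show values that were produced in a different process.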
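And a sketch of one custom column plus its registration, using BenchmarkDotNet's `IColumn` interface and `AddColumn` config API and assuming the `BenchmarkMetrics` sketch above. The legend text, value formatting, and the `CloudFetchBenchmark` class name are placeholders, not the exact code in this PR.

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Columns;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Reports;
using BenchmarkDotNet.Running;

// Reads the metrics that the benchmark process wrote to the temp file.
public class PeakMemoryColumn : IColumn
{
    public string Id => nameof(PeakMemoryColumn);
    public string ColumnName => "Peak Memory (MB)";
    public bool AlwaysShow => true;
    public ColumnCategory Category => ColumnCategory.Custom;
    public int PriorityInCategory => 0;
    public bool IsNumeric => true;
    public UnitType UnitType => UnitType.Size;
    public string Legend => "Peak memory observed during the benchmark, in MB";

    public string GetValue(Summary summary, BenchmarkCase benchmarkCase)
    {
        var metrics = BenchmarkMetrics.Load();
        return metrics == null ? "N/A" : metrics.PeakMemoryMb.ToString("F2");
    }

    public string GetValue(Summary summary, BenchmarkCase benchmarkCase, SummaryStyle style) =>
        GetValue(summary, benchmarkCase);

    public bool IsDefault(Summary summary, BenchmarkCase benchmarkCase) => false;
    public bool IsAvailable(Summary summary) => true;
}

// Placeholder benchmark class; TotalRowsColumn and TotalBatchesColumn would follow
// the same pattern as PeakMemoryColumn and be registered the same way.
public class CloudFetchBenchmark
{
    [Benchmark]
    public void FetchAll() { /* run the CloudFetch read, then call BenchmarkMetrics.Save() */ }
}

public static class CloudFetchBenchmarkRunner
{
    public static void Main()
    {
        var config = DefaultConfig.Instance.AddColumn(new PeakMemoryColumn());
        BenchmarkRunner.Run<CloudFetchBenchmark>(config);
    }
}
```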
🤖 Generated with [Claude Code](https://claude.com/claude-code)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]