Djjanks commented on code in PR #14:
URL: https://github.com/apache/arrow-js/pull/14#discussion_r2324681999


##########
src/ipc/reader.ts:
##########
@@ -354,12 +358,31 @@ abstract class RecordBatchReaderImpl<T extends TypeMap = any> implements RecordB
         return this;
     }
 
-    protected _loadRecordBatch(header: metadata.RecordBatch, body: any) {
-        const children = this._loadVectors(header, body, this.schema.fields);
+    protected _loadRecordBatch(header: metadata.RecordBatch, body: Uint8Array): RecordBatch<T> {
+        let children: Data<any>[];
+        if (header.compression != null) {
+            const codec = compressionRegistry.get(header.compression.type);
+            if (typeof codec?.decode === 'function') {
+                const { decompressedBody, buffers } = this._decompressBuffers(header, body, codec);
+                children = this._loadCompressedVectors(header, decompressedBody, this.schema.fields);
+                header = new metadata.RecordBatch(
+                    header.length,
+                    header.nodes,
+                    buffers,
+                    null
+                );
+            } else {
+                throw new Error('Record batch is compressed but codec not found');
+            }
+        } else {
+            children = this._loadVectors(header, body, this.schema.fields);
+        }
+
         const data = makeData({ type: new Struct(this.schema.fields), length: header.length, children });
         return new RecordBatch(this.schema, data);
     }
-    protected _loadDictionaryBatch(header: metadata.DictionaryBatch, body: any) {
+
+    protected _loadDictionaryBatch(header: metadata.DictionaryBatch, body: Uint8Array) {

Review Comment:
   Can you please tell me a bit more about your use case?
   
   1.  In which scenarios do you encounter issues?
   - [ ] When an arrow-js compressed file is read by another tool.
   - [ ] When a compressed file generated by another tool is read by arrow-js.
   - [ ] When an arrow-js compressed file is read by arrow-js.
   2.  Have you reverted the changes made in commit [1e43814 fix(ipc/writer): handle dictionary batch correctly when compression is enabled](https://github.com/apache/arrow-js/pull/14/commits/1e43814079ffe3039577159eb90303be421bbfee)? I ask because that change removed compression from `_writeDictionaryBatch`.
   3.  Have you run the arrow-js tests? They pass with commit `1e43814`, but if you roll it back, they start failing because the writing side is incorrect.
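   
   For context, here is a minimal standalone sketch of the codec-lookup pattern the diff above relies on. The registry and codec shapes here are assumptions for illustration, not the actual arrow-js API; only the guard and the "codec not found" error mirror the PR code.

```typescript
// Hypothetical sketch of a compression codec registry. The names
// CompressionRegistry, Codec, and decodeBody are illustrative assumptions;
// only the typeof-guard and error message mirror _loadRecordBatch above.
type CompressionType = 'LZ4_FRAME' | 'ZSTD';

interface Codec {
    encode?(data: Uint8Array): Uint8Array;
    decode?(data: Uint8Array): Uint8Array;
}

class CompressionRegistry {
    private codecs = new Map<CompressionType, Codec>();
    set(type: CompressionType, codec: Codec): void {
        this.codecs.set(type, codec);
    }
    get(type: CompressionType): Codec | undefined {
        return this.codecs.get(type);
    }
}

const registry = new CompressionRegistry();

// A no-op "codec" standing in for a real LZ4 binding.
registry.set('LZ4_FRAME', {
    encode: (data) => data,
    decode: (data) => data,
});

// Mirrors the guard in _loadRecordBatch: decode only when a usable
// codec is registered, otherwise fail loudly.
function decodeBody(type: CompressionType, body: Uint8Array): Uint8Array {
    const codec = registry.get(type);
    if (typeof codec?.decode !== 'function') {
        throw new Error('Record batch is compressed but codec not found');
    }
    return codec.decode(body);
}
```

   The point of routing every decode through the registry is that a reader can fail with a clear error when a file uses a compression scheme no codec was registered for, instead of silently misreading buffers.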



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
