acezen commented on PR #518:
URL: https://github.com/apache/incubator-graphar/pull/518#issuecomment-2172094296

   > alritey folks .. done
   >
   > On Wed, Jun 12, 2024 at 4:39 PM Weibin Zeng ***@***.***> wrote:
   >
   > > please send me an example for spark + csv and i ll refactor it …
   > >
   > > On Wed, Jun 12, 2024 at 2:51 PM Semyon ***@***.***> wrote:
   > >
   > > > May we add also a test for graphar spark+json?
   > >
   > > You can refer to the existing test case and just add a test with updated JSON test data, like:
   > >
   > > - Spark: https://github.com/apache/incubator-graphar/blob/ff057cf9bfb3325e2822b31a3e3041e3314c2d97/maven-projects/spark/graphar/src/test/scala/org/apache/graphar/TestReader.scala#L99-L117
   > >
   > >   The JSON test can be:
   > >
   > >   ```scala
   > >   test("read vertex chunks") {
   > >     // construct the vertex information
   > >     val prefix = testData + "/ldbc_sample/json/"
   > >     val vertex_yaml = prefix + "Person.vertex.yml"
   > >     val vertex_info = VertexInfo.loadVertexInfo(vertex_yaml, spark)
   > >     // construct the vertex reader
   > >     val reader = new VertexReader(prefix, vertex_info, spark)
   > >     // test reading the number of vertices
   > >     assert(reader.readVerticesNumber() == 903)
   > >     val property_group = vertex_info.getPropertyGroup("gender")
   > >     // test reading a single property chunk
   > >     val single_chunk_df = reader.readVertexPropertyChunk(property_group, 0)
   > >     assert(single_chunk_df.columns.length == 4)
   > >     assert(single_chunk_df.count() == 100)
   > >     val cond = "gender = 'female'"
   > >     var df_pd = single_chunk_df.select("firstName", "gender").filter(cond)
   > >   }
   > >   ```
   > >
   > > - PySpark: https://github.com/apache/incubator-graphar/blob/ff057cf9bfb3325e2822b31a3e3041e3314c2d97/pyspark/tests/test_reader.py#L29-L67
   > >
   > >   The JSON test can be:
   > >
   > >   ```python
   > >   def test_vertex_reader_with_json(spark):
   > >       initialize(spark)
   > >       vertex_info = VertexInfo.load_vertex_info(
   > >           GRAPHAR_TESTS_EXAMPLES.joinpath("/ldbc_sample/json/")
   > >           .joinpath("Person.vertex.yml")
   > >           .absolute()
   > >           .__str__()
   > >       )
   > >       vertex_reader = VertexReader.from_python(
   > >           GRAPHAR_TESTS_EXAMPLES.joinpath("/ldbc_sample/json/").absolute().__str__(),
   > >           vertex_info,
   > >       )
   > >       assert VertexReader.from_scala(vertex_reader.to_scala()) is not None
   > >       assert vertex_reader.read_vertices_number() > 0
   > >       assert (
   > >           vertex_reader.read_vertex_property_group(
   > >               vertex_info.get_property_group("name")
   > >           ).count()
   > >           > 0
   > >       )
   > >       assert (
   > >           vertex_reader.read_vertex_property_chunk(
   > >               vertex_info.get_property_groups()[0], 0
   > >           ).count()
   > >           > 0
   > >       )
   > >       assert (
   > >           vertex_reader.read_all_vertex_property_groups().count()
   > >           >= vertex_reader.read_vertex_property_group(
   > >               vertex_info.get_property_group("age")
   > >           ).count()
   > >       )
   > >       assert (
   > >           vertex_reader.read_multiple_vertex_property_groups(
   > >               [vertex_info.get_property_group("name")]
   > >           ).count()
   > >           > 0
   > >       )
   > >   ```
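
   [Editor's sketch, not from the thread: the tests above assume JSON-flavored `ldbc_sample` test data exists under `/ldbc_sample/json/`. A plausible shape for such a vertex chunk is newline-delimited JSON (one record per line), which Spark's JSON source reads by default. The file layout, field names, and values below are illustrative assumptions only.]

   ```python
   import json
   import pathlib
   import tempfile

   # Invented sample rows following the Person schema used by the CSV
   # test data (id, firstName, lastName, gender); values are made up.
   rows = [
       {"id": 0, "firstName": "Ada", "lastName": "Lovelace", "gender": "female"},
       {"id": 1, "firstName": "Alan", "lastName": "Turing", "gender": "male"},
   ]

   # Hypothetical chunk location mirroring the CSV layout, but under json/.
   chunk_dir = pathlib.Path(tempfile.mkdtemp(), "ldbc_sample", "json", "vertex", "person")
   chunk_dir.mkdir(parents=True)
   chunk_file = chunk_dir / "chunk0"

   # Write newline-delimited JSON: one standalone JSON object per line.
   chunk_file.write_text("".join(json.dumps(r) + "\n" for r in rows))

   # Each line round-trips independently, as Spark's line-delimited
   # JSON reader expects.
   parsed = [json.loads(line) for line in chunk_file.read_text().splitlines()]
   ```

   A chunk like this would let the suggested `read_vertex_property_chunk` assertions run against JSON data in the same way they do against the existing CSV chunks.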
   
   Hi @amygbAI, can you add me as a collaborator on your graphar fork? I can help you fix the formatting and the tests.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

