Hello! Well, the COPY command does allow you to do column mapping:
COPY FROM '/path/to/local/file.csv' INTO tablename (columnName, columnName, ...) FORMAT CSV

If you need to do non-trivial transformations, you can use the JDBC driver in SET STREAMING ON mode.

Regards,
--
Ilya Kasnacheev

On Tue, 29 Oct 2019 at 13:11, Muhammed Favas <[email protected]> wrote:

> Hi,
>
> I have tried a simple Python program without using Spark. First I read the whole CSV into a Python dataframe using the pandas library.
>
> Now I want to bulk insert the whole dataframe into an Ignite table without looping through it.
>
> The purpose of this test is to evaluate the best (i.e. fastest) way to bulk load CSV files into Ignite.
>
> I cannot use the Ignite COPY command here, because I need an option to do column mapping while importing CSV files.
>
> Regards,
> Favas
>
> From: Stephen Darlington <[email protected]>
> Sent: Monday, October 28, 2019 5:05 PM
> To: [email protected]
> Subject: Re: Write python dataframe to ignite table.
>
> What have you tried? As long as your command line includes the right JAR files, it seems to more-or-less just work for me:
>
> https://medium.com/@sdarlington/the-trick-to-successfully-integrating-apache-ignite-and-pyspark-890e436d09ba
>
> Regards,
> Stephen
>
> On 22 Oct 2019, at 11:41, Muhammed Favas <[email protected]> wrote:
>
> Hi,
>
> Is there a way to bulk load Python dataframe values to an Ignite table?
>
> Regards,
> Favas
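To make the column-mapping idea above concrete: one way to combine mapping with batched inserts is to do the mapping client-side and build multi-row parameterized INSERT statements that you then hand to your driver (e.g. over a JDBC connection with SET STREAMING ON). Below is a minimal stdlib-only sketch; the table name `employee`, the CSV headers, and the header-to-column mapping are all made up for illustration, and the actual execution of the statements against Ignite is left to whichever driver you use.

```python
import csv
import io

# Hypothetical CSV whose header names differ from the Ignite table's columns.
raw = """emp_id,emp_name,dept
1,Alice,Sales
2,Bob,Engineering
"""

# Column mapping: CSV header -> Ignite table column (names are invented here).
mapping = {"emp_id": "id", "emp_name": "name", "dept": "department"}

def build_inserts(csv_text, table, mapping, batch_size=100):
    """Yield (sql, params) pairs, packing up to batch_size rows per statement."""
    reader = csv.DictReader(io.StringIO(csv_text))
    cols = list(mapping.values())
    placeholders = "(" + ", ".join("?" * len(cols)) + ")"
    sql_prefix = f"INSERT INTO {table} ({', '.join(cols)}) VALUES "
    batch, params = [], []
    for row in reader:
        batch.append(placeholders)
        # Pull values in mapping order, so they line up with the column list.
        params.extend(row[src] for src in mapping)
        if len(batch) == batch_size:
            yield sql_prefix + ", ".join(batch), params
            batch, params = [], []
    if batch:
        yield sql_prefix + ", ".join(batch), params

for sql, params in build_inserts(raw, "employee", mapping):
    print(sql)
    print(params)
# INSERT INTO employee (id, name, department) VALUES (?, ?, ?), (?, ?, ?)
# ['1', 'Alice', 'Sales', '2', 'Bob', 'Engineering']
```

With a pandas dataframe instead of a raw CSV, the same approach applies: iterate the frame once (e.g. `itertuples`), accumulate parameter batches, and execute one multi-row statement per batch rather than one INSERT per row.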
