[jira] [Created] (IGNITE-10818) GridQueryTypeDescriptor should have cacheName and alias field

2018-12-25 Thread Ray Liu (JIRA)
Ray Liu created IGNITE-10818:


 Summary: GridQueryTypeDescriptor should have cacheName and alias 
field
 Key: IGNITE-10818
 URL: https://issues.apache.org/jira/browse/IGNITE-10818
 Project: Ignite
  Issue Type: Improvement
Reporter: Ray Liu
Assignee: Ray Liu


Currently, GridQueryTypeDescriptor doesn't have cacheName and alias fields.

We have to cast GridQueryTypeDescriptor to QueryTypeDescriptorImpl to get these two fields.
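
For illustration, a rough sketch of the workaround as it stands today (the accessor name cacheName() on the internal class is an assumption here and may differ from the actual code):

import org.apache.ignite.internal.processors.query.GridQueryTypeDescriptor;
import org.apache.ignite.internal.processors.query.QueryTypeDescriptorImpl;

final class TypeDescriptorAccess {
    /** Gets the cache name by downcasting to the internal implementation class. */
    static String cacheNameOf(GridQueryTypeDescriptor desc) {
        // Works only when the descriptor really is a QueryTypeDescriptorImpl;
        // having the field on the interface itself would make this cast unnecessary.
        return ((QueryTypeDescriptorImpl)desc).cacheName();
    }
}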



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (IGNITE-10585) JDBC driver returns Double in column metadata for Float type

2018-12-06 Thread Ray (JIRA)
Ray created IGNITE-10585:


 Summary: JDBC driver returns Double in column metadata for Float 
type
 Key: IGNITE-10585
 URL: https://issues.apache.org/jira/browse/IGNITE-10585
 Project: Ignite
  Issue Type: Bug
  Components: jdbc
Reporter: Ray
Assignee: Ray
 Fix For: 2.8


When I create a table using 

create table c(a varchar, b float, primary key(a));

the metadata reported for column b is wrong when I check it with !desc c:

0: jdbc:ignite:thin://127.0.0.1/> !desc c
TABLE_CAT
TABLE_SCHEM PUBLIC
TABLE_NAME C
COLUMN_NAME A
DATA_TYPE 12
TYPE_NAME VARCHAR
COLUMN_SIZE null
BUFFER_LENGTH null
DECIMAL_DIGITS null
NUM_PREC_RADIX 10
NULLABLE 1
REMARKS
COLUMN_DEF
SQL_DATA_TYPE 12
SQL_DATETIME_SUB null
CHAR_OCTET_LENGTH 2147483647
ORDINAL_POSITION 1
IS_NULLABLE YES
SCOPE_CATLOG
SCOPE_SCHEMA
SCOPE_TABLE
SOURCE_DATA_TYPE null
IS_AUTOINCREMENT NO
IS_GENERATEDCOLUMN NO

TABLE_CAT
TABLE_SCHEM PUBLIC
TABLE_NAME C
COLUMN_NAME B
DATA_TYPE 8
TYPE_NAME DOUBLE   (wrong: expected FLOAT)
COLUMN_SIZE null
BUFFER_LENGTH null
DECIMAL_DIGITS null
NUM_PREC_RADIX 10
NULLABLE 1
REMARKS
COLUMN_DEF
SQL_DATA_TYPE 8
SQL_DATETIME_SUB null
CHAR_OCTET_LENGTH 2147483647
ORDINAL_POSITION 2
IS_NULLABLE YES
SCOPE_CATLOG
SCOPE_SCHEMA
SCOPE_TABLE
SOURCE_DATA_TYPE null
IS_AUTOINCREMENT NO
IS_GENERATEDCOLUMN NO
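
For reference, the same metadata can be checked programmatically through the standard JDBC metadata API; a minimal sketch, reusing the thin-driver URL and table name from the report above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class DescribeTableC {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1/");
             ResultSet rs = conn.getMetaData().getColumns(null, "PUBLIC", "C", "%")) {
            // For column B this currently prints DOUBLE where FLOAT is expected.
            while (rs.next())
                System.out.println(rs.getString("COLUMN_NAME") + " -> " + rs.getString("TYPE_NAME"));
        }
    }
}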



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (IGNITE-10569) Null meta information when getting meta for a customized schema cache through JDBC driver

2018-12-06 Thread Ray (JIRA)
Ray created IGNITE-10569:


 Summary: Null meta information when getting meta for a customized 
schema cache through JDBC driver
 Key: IGNITE-10569
 URL: https://issues.apache.org/jira/browse/IGNITE-10569
 Project: Ignite
  Issue Type: Bug
  Components: jdbc
Reporter: Ray
Assignee: Ray
 Fix For: 2.8


When I create a cache with a customized schema (not PUBLIC) and then query the column metadata through the thin JDBC driver, null is returned.

 

Analysis:

The schema name is stored with different letter case in GridQueryTypeDescriptor and CacheConfiguration.

So the schema validation

if (!matches(table.schemaName(), req.schemaName()))

in the JdbcRequestHandler.getColumnsMeta method does not pass.
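
A tiny self-contained illustration of the failure mode (this assumes the comparison ends up being case-sensitive, which is what the behavior suggests; the real matches() also handles SQL patterns, omitted here):

public class SchemaMatchDemo {
    // Simplified stand-in for the validation used in JdbcRequestHandler.getColumnsMeta.
    static boolean matches(String str, String ptrn) {
        return str != null && str.equals(ptrn);
    }

    public static void main(String[] args) {
        // The type descriptor keeps the schema name in its original case,
        // while the JDBC request may carry a differently-cased form.
        System.out.println(matches("MySchema", "MYSCHEMA")); // false -> no column metadata returned
    }
}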



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (IGNITE-10356) JDBC thin driver returns wrong data type for Date and Decimal SQL type

2018-11-20 Thread Ray (JIRA)
Ray created IGNITE-10356:


 Summary: JDBC thin driver returns wrong data type for Date and 
Decimal SQL type
 Key: IGNITE-10356
 URL: https://issues.apache.org/jira/browse/IGNITE-10356
 Project: Ignite
  Issue Type: Bug
Affects Versions: 2.6, 2.7
Reporter: Ray
Assignee: Ray
 Fix For: 2.8


The JDBC thin driver returns wrong column type metadata when a user creates a table with Date and Decimal columns.

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Spark dataframe API will get wrong schema if user executes add/drop column DDL

2018-11-19 Thread Ray
Currently, when a user performs add/drop column DDL, the QueryEntity does not change.

This results in Spark getting the wrong schema, because Spark relies on QueryEntity to construct the data frame schema.

According to Vladimir Ozerov's reply on the dev list,
http://apache-ignite-developers.2346864.n4.nabble.com/Schema-in-CacheConfig-is-not-updated-after-DDL-commands-Add-drop-column-Create-drop-index-td38002.html,
this behavior is by design, so I decided to fix the issue on the Spark side.

 

So I propose the following solution: instead of deriving the schema from QueryEntity, get it from the result of a SQL SELECT.
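
A rough sketch of the idea (illustration only, not the actual ignite-spark code; the table name is a placeholder): run a query that returns no rows and build the schema from its ResultSetMetaData, which should reflect DDL changes, instead of from QueryEntity.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class SchemaBySelect {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1/");
             Statement stmt = conn.createStatement()) {
            // Zero-row query: only the metadata is needed.
            ResultSetMetaData md = stmt.executeQuery("SELECT * FROM PERSON WHERE 1 = 0").getMetaData();

            for (int i = 1; i <= md.getColumnCount(); i++)
                System.out.println(md.getColumnName(i) + " " + md.getColumnTypeName(i));
        }
    }
}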

Nikolay Izhikov, what do you think about this solution?

I already created a ticket in JIRA,
https://issues.apache.org/jira/browse/IGNITE-10314

If you think this solution is OK, I'll start implementing it.



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Schema in CacheConfig is not updated after DDL commands(Add/drop column, Create/drop index)

2018-11-19 Thread Ray
When a user performs column or index modification operations in SQL (e.g. create index, drop index, add column, drop column), the QueryEntity in the CacheConfiguration of the modified cache is not updated.

Here's my analysis:

The QueryEntity in QuerySchema is a local copy of the original QueryEntity, so the original QueryEntity is not updated when a modification happens.

I have created a ticket for this issue 
https://issues.apache.org/jira/browse/IGNITE-10314

But as Vlad said in the comments "public configuration is immutable, it
represents initial cache parameters. So it is expected that configuration
will not be updated after DDL commands. Real changes are accumulated in
separate query entity which is hidden from user and used internally"

But I think it's only reasonable to return the newest QueryEntity to the user.

For example, a user adds a column to a table and then reads data through the Spark data frame API, which currently relies on QueryEntity to construct the data frame schema, so the user gets the wrong schema.
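
A minimal sketch of how the stale schema can be observed from the public API (the cache name is a placeholder, and it assumes a SQL table already exists and a column was just added via ALTER TABLE):

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.QueryEntity;
import org.apache.ignite.configuration.CacheConfiguration;

public class StaleQueryEntityDemo {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            CacheConfiguration<?, ?> ccfg = ignite.cache("SQL_PUBLIC_PERSON")
                .getConfiguration(CacheConfiguration.class);

            // Still lists only the original columns, not the one added by ALTER TABLE.
            for (QueryEntity qe : ccfg.getQueryEntities())
                System.out.println(qe.getFields());
        }
    }
}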

What do you guys think?






--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


[jira] [Created] (IGNITE-10314) QueryEntity is not updated when column and index added or dropped

2018-11-18 Thread Ray (JIRA)
Ray created IGNITE-10314:


 Summary: QueryEntity is not updated when column and index added or 
dropped
 Key: IGNITE-10314
 URL: https://issues.apache.org/jira/browse/IGNITE-10314
 Project: Ignite
  Issue Type: Bug
Affects Versions: 2.6, 2.7
Reporter: Ray
Assignee: Ray
 Fix For: 2.8


When a user performs column or index modification operations in SQL (e.g. create index, drop index, add column, drop column), the QueryEntity in the CacheConfiguration of the modified cache is not updated.

 

Analysis:

The QueryEntity in QuerySchema is a local copy of the original QueryEntity, so the original QueryEntity is not updated when a modification happens.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Apache Ignite 2.7 release

2018-09-19 Thread Ray
Hello, Igniters. 

Is there any specific reason why this ticket was removed from the 2.7 scope?
I think this ticket is important for both usability and performance.
Without it, we have to manually create an index identical to the primary key if we want to use SQL queries.
https://issues.apache.org/jira/browse/IGNITE-8386



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: Apache Flink Sink + Ignite: Ouch! Argument is invalid

2018-07-26 Thread Ray
Hi Saikat,

The results Flink calculates before sending them to the sink are correct, but the results in Ignite are not.
You can remove the sink and print the stream content to verify this.



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: Ignite 2.6 emergency release suggestion

2018-06-12 Thread Ray
Igniters,

Can you squeeze this ticket into the 2.6 scope?
https://issues.apache.org/jira/browse/IGNITE-8534

The ignite-spark module is relatively independent, and two users on the user list tried to use Spark 2.3 with Ignite in the last week alone:

http://apache-ignite-users.70518.x6.nabble.com/Spark-Ignite-connection-using-Config-file-td21827.html

http://apache-ignite-users.70518.x6.nabble.com/Spark-Ignite-standalone-mode-on-Kubernetes-cluster-td21739.html





--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


[jira] [Created] (IGNITE-8697) Flink sink throws java.lang.IllegalArgumentException when running in flink cluster mode.

2018-06-04 Thread Ray (JIRA)
Ray created IGNITE-8697:
---

 Summary: Flink sink throws java.lang.IllegalArgumentException when 
running in flink cluster mode.
 Key: IGNITE-8697
 URL: https://issues.apache.org/jira/browse/IGNITE-8697
 Project: Ignite
  Issue Type: Bug
Affects Versions: 2.5, 2.4, 2.3
Reporter: Ray
Assignee: Roman Shtykh


If I submit the application to the Flink cluster using the Ignite Flink sink, I get this error:

 
java.lang.ExceptionInInitializerError
at 
org.apache.ignite.sink.flink.IgniteSink$SinkContext.getStreamer(IgniteSink.java:201)
at 
org.apache.ignite.sink.flink.IgniteSink$SinkContext.access$100(IgniteSink.java:175)
at org.apache.ignite.sink.flink.IgniteSink.invoke(IgniteSink.java:165)
at 
org.apache.flink.streaming.api.functions.sink.SinkFunction.invoke(SinkFunction.java:52)
at 
org.apache.flink.streaming.api.operators.StreamSink.processElement(StreamSink.java:56)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:560)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:535)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:515)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:679)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:657)
at 
org.apache.flink.streaming.api.operators.TimestampedCollector.collect(TimestampedCollector.java:51)
at 
org.myorg.quickstart.InstrumentStreamer$Splitter.flatMap(InstrumentStreamer.java:97)
at 
org.myorg.quickstart.InstrumentStreamer$Splitter.flatMap(InstrumentStreamer.java:1)
at 
org.apache.flink.streaming.api.operators.StreamFlatMap.processElement(StreamFlatMap.java:50)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:560)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:535)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:515)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:679)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:657)
at 
org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
at 
org.apache.flink.streaming.api.functions.source.SocketTextStreamFunction.run(SocketTextStreamFunction.java:110)
at 
org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:87)
at 
org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
at 
org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:99)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:306)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Ouch! Argument is invalid: Cache 
name must not be null or empty.
at 
org.apache.ignite.internal.util.GridArgumentCheck.ensure(GridArgumentCheck.java:109)
at 
org.apache.ignite.internal.processors.cache.GridCacheUtils.validateCacheName(GridCacheUtils.java:1581)
at 
org.apache.ignite.internal.IgniteKernal.dataStreamer(IgniteKernal.java:3284)
at 
org.apache.ignite.sink.flink.IgniteSink$SinkContext$Holder.<clinit>(IgniteSink.java:183)
... 27 more



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review request for IGNITE-8534 Upgrade Ignite Spark Module's Spark version to 2.3.

2018-05-30 Thread Ray
Before 2.3, Spark was compiled separately against Scala 2.11 and Scala 2.10.
The spark-2.10 module in Ignite exists only to accommodate this; it's not used anywhere else in the project.
Since Spark 2.3 removed support for Scala 2.10, we can safely remove the spark-2.10 module from Ignite.

It won't affect the visor-console-2.10 and scala-2.10 modules, because those are Ignite-internal modules that support legacy users on Scala 2.10.



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Review request for IGNITE-8534 Upgrade Ignite Spark Module's Spark version to 2.3.

2018-05-27 Thread Ray
Valentin Kulichenko and Nikolay Izhikov, can you please take a look at the PR and provide comments? Other reviewers are welcome as well.

https://github.com/apache/ignite/pull/4033

I modified the Ignite Spark module to fit the Spark 2.3 APIs and removed the spark-2.10 module because Spark 2.3 dropped support for Scala 2.10.
https://issues.apache.org/jira/browse/SPARK-19810

Also, please find the dev list discussion thread here.
http://apache-ignite-developers.2346864.n4.nabble.com/Discussion-Upgrade-Ignite-Spark-Module-s-Spark-version-to-2-3-0-td30762.html

Ray




--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: How to turn debug log on in Intellij Idea console for ignite core module?

2018-05-27 Thread Ray
Not yet.

I tried your way, but it doesn't seem to work.
The default logger is JavaLogger when I start an Ignite node from CommandLineStartup,
and the ignite-core module does not contain Log4JLogger, so it will not load \ignite\config\ignite-log4j.xml.

I tried adding the ignite-log4j module to the ignite-core module, but IntelliJ reports a module cycle, because ignite-log4j also depends on ignite-core.

I also tried commenting out the scope of the log4j dependency in the pom, but still no luck:

<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
</dependency>






--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: [jira] [Created] (IGNITE-8595) SQL: Ability to cancel DDL operations

2018-05-24 Thread Ray
Thanks for the reply, Vladimir.

Is there a ticket to track the cancel feature you described?



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: [jira] [Created] (IGNITE-8595) SQL: Ability to cancel DDL operations

2018-05-24 Thread Ray
Hello Vladimir.

From https://apacheignite-sql.readme.io/docs/query-cancellation, it is only possible to cancel a SELECT query via the Java API.
So a SELECT query submitted from a thin client (sqlline)/JDBC/ODBC is not cancellable; is my understanding correct?
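
For context, here is a minimal sketch of the Java-side cancellation described in that page (cache and table names are placeholders): closing the query cursor cancels the running query.

import java.util.List;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.cache.query.QueryCursor;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class CancelQueryDemo {
    static void runAndCancel(Ignite ignite) {
        IgniteCache<?, ?> cache = ignite.cache("personCache");

        QueryCursor<List<?>> cur = cache.query(new SqlFieldsQuery("SELECT * FROM Person"));

        // ... decide the query is taking too long ...

        cur.close(); // closing the cursor cancels the query
    }
}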





--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: How to turn debug log on in Intellij Idea console for ignite core module?

2018-05-23 Thread Ray
Thanks for the reply.

But your approach only seems to work for tests in the Ignite core module.
What I'm looking for is a way to enable DEBUG logging when starting an Ignite node from
modules/core/src/main/java/org/apache/ignite/startup/cmdline/CommandLineStartup.java.



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


How to turn debug log on in Intellij Idea console for ignite core module?

2018-05-23 Thread Ray
I tried changing .level=INFO to FINE in the config/java.util.logging.properties file and disabling quiet mode with -DIGNITE_QUIET=false, but it still didn't work and there are no DEBUG or TRACE level logs in the console.
Please advise me how to turn debug logging on in the IntelliJ IDEA console for the ignite-core module.
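
(For reference, a hedged sketch of what I am trying to achieve, assuming the node falls back to JavaLogger, i.e. java.util.logging: besides the root logger level, the console handler's own level also has to be raised, which may be why the properties change alone is not enough.)

import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.apache.ignite.Ignition;

public class DebugStartup {
    public static void main(String[] args) {
        Logger root = Logger.getLogger("");
        root.setLevel(Level.FINE);

        // Console handlers filter at INFO by default, so raise them as well.
        for (Handler h : root.getHandlers())
            h.setLevel(Level.FINE);

        System.setProperty("IGNITE_QUIET", "false");
        Ignition.start("config/default-config.xml");
    }
}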



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


How to run an Ignite node inside Intellij Idea?

2018-05-22 Thread Ray
I'm trying to run an Ignite node inside IntelliJ IDEA for debugging purposes.
I ran the mvn clean install -Pall-java,all-scala,licenses -DskipTests command from the devnotes and it completed without errors.
But when I try to run an Ignite node from the CommandLineStartup class with the default XML config selected, I get the following error:

Choose configuration file ('c' to cancel) [0]: 0

Using configuration: config/default-config.xml

class org.apache.ignite.IgniteException: Failed to create Ignite component
(consider adding ignite-spring module to classpath) [component=SPRING,
cls=org.apache.ignite.internal.util.spring.IgniteSpringHelperImpl]
at
org.apache.ignite.internal.util.IgniteUtils.convertException(IgniteUtils.java:990)
at org.apache.ignite.Ignition.start(Ignition.java:355)
at
org.apache.ignite.startup.cmdline.CommandLineStartup.main(CommandLineStartup.java:301)
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to create
Ignite component (consider adding ignite-spring module to classpath)
[component=SPRING,
cls=org.apache.ignite.internal.util.spring.IgniteSpringHelperImpl]
at
org.apache.ignite.internal.IgniteComponentType.componentException(IgniteComponentType.java:320)
at
org.apache.ignite.internal.IgniteComponentType.create0(IgniteComponentType.java:296)
at
org.apache.ignite.internal.IgniteComponentType.create(IgniteComponentType.java:207)
at
org.apache.ignite.internal.IgnitionEx.loadConfigurations(IgnitionEx.java:742)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:945)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:854)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:724)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:693)
at org.apache.ignite.Ignition.start(Ignition.java:352)
... 1 more
Caused by: java.lang.ClassNotFoundException:
org.apache.ignite.internal.util.spring.IgniteSpringHelperImpl
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at
org.apache.ignite.internal.IgniteComponentType.create0(IgniteComponentType.java:282)
... 8 more
Failed to start grid: Failed to create Ignite component (consider adding
ignite-spring module to classpath) [component=SPRING,
cls=org.apache.ignite.internal.util.spring.IgniteSpringHelperImpl]
Note! You may use 'USER_LIBS' environment variable to specify your
classpath.

Any advice?
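
(Side note: a minimal sketch that sidesteps the Spring XML by starting the node programmatically, which should need only ignite-core on the classpath; the instance name is arbitrary.)

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;

public class IdeStartup {
    public static void main(String[] args) {
        IgniteConfiguration cfg = new IgniteConfiguration().setIgniteInstanceName("debug-node");

        // No ignite-spring module is needed when the configuration is built in code.
        Ignite ignite = Ignition.start(cfg);

        System.out.println("Started node, topology size: " + ignite.cluster().nodes().size());
    }
}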



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Discussion: Upgrade Ignite Spark Module's Spark version to 2.3.0

2018-05-21 Thread Ray
Spark released its newest version, 2.3.0, on Feb 28th, so I'd like to open a discussion about whether we should upgrade the Ignite Spark module to the latest version.
According to the release notes,
https://spark.apache.org/releases/spark-release-2-3-0.html, Spark 2.3 introduced many useful new features as well as many performance and stability improvements.

So in the next release, I think we should support Spark 2.3 in the Ignite Spark module.
I have already created a ticket in JIRA,
https://issues.apache.org/jira/browse/IGNITE-8534, and the patch is ready for review.

Spark 2.3 dropped support for Scala 2.10
(https://issues.apache.org/jira/browse/SPARK-19810), so we also need to remove the spark-2.10 module if we decide to support Spark 2.3.

Please share your input, Igniters.



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


[jira] [Created] (IGNITE-8534) Upgrade Ignite Spark Module's Spark version to 2.3.0

2018-05-20 Thread Ray (JIRA)
Ray created IGNITE-8534:
---

 Summary: Upgrade Ignite Spark Module's Spark version to 2.3.0
 Key: IGNITE-8534
 URL: https://issues.apache.org/jira/browse/IGNITE-8534
 Project: Ignite
  Issue Type: Improvement
  Components: spark
Reporter: Ray
Assignee: Ray
 Fix For: 2.6


Spark released its newest version, 2.3.0, on Feb 28th; we should upgrade the Ignite Spark module to the latest version.

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Request to contribute

2018-05-18 Thread Ray
Hello Alexey,

My jira id is ldz, sorry for the confusion.





--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Request to contribute

2018-05-18 Thread Ray
Hello Ignite community,

I'd like to work on an enhancement to Ignite; my JIRA id is ldzhjn.
Please grant me permissions.

Thanks



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: [jira] [Created] (IGNITE-6769) SQL: prototype for hand-made parser

2017-10-30 Thread Ray
Hi Vladimir,

Thanks for the reply.

Is there a roadmap for this new execution engine?
For example, in version 2.4 the new engine might process SELECT commands while the remaining commands are still handled by H2.





--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: [jira] [Created] (IGNITE-6769) SQL: prototype for hand-made parser

2017-10-30 Thread Ray
Hi Vladimir,

What's the purpose of this ticket?

Does it mean that in the future Ignite will have its own SQL parser instead of the H2 parser it uses now?



--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Ignite SQL join performance

2017-10-30 Thread Ray
Currently the only way to join data is the nested-loops approach, as stated in these two tickets:
https://issues.apache.org/jira/browse/IGNITE-6201
https://issues.apache.org/jira/browse/IGNITE-6202

So we can anticipate that SQL join performance will degrade significantly once the data size reaches a certain amount.
After some research, I found out that underneath, Ignite uses the H2 database to perform the join, and H2 currently supports only the nested-loop join approach.

Does it mean we'll have to remove H2 if we want to implement tickets 6201 and 6202?




--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/


Re: Spark+Ignite SQL syntax proposal

2017-10-07 Thread Ray
Hi Nikolay,

Could you also implement DataFrame support for the spark-2.10 module?
There are still some legacy Spark users on Spark 1.6 who need the DataFrame features too.

Thanks




--
Sent from: http://apache-ignite-developers.2346864.n4.nabble.com/

