[ http://issues.apache.org/jira/browse/DERBY-1624?page=comments#action_12428846 ]

Manish Khettry commented on DERBY-1624:
---------------------------------------

Here is a description of the problem.

Derby rewrites queries involving GROUP BY by adding an outer select and 
transforming the HAVING clause into a WHERE clause on the outer query, with 
some modifications. So in this case, a query like:

select alias.x as c0, count(*) as c1
from foo alias
group by alias.x having count(*) > 0;

gets rewritten to:

select * from (select alias.x as c0, count(*) as c1 ... ) where generated_col > 0;

Subsequently, the "*" in the outer query gets expanded to:

select alias.c0, c1 from (....) where generated_col > 0;
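
A version that would presumably bind is one where the outer references drop the 
subquery-internal qualifier altogether; this is just my sketch of what the 
expansion could look like, not what the code currently produces:

select c0, c1 from (....) where generated_col > 0;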

To me the expansion to alias.c0 seems a bit fishy: it looks like "alias" is 
scoped to the subquery and not really visible in the outer query. For this 
reason, the following query also fails (users have to add a correlation name 
for subselects):

select alias.c0 from (select alias.x as c0 from foo alias) tabname;
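
If the outer reference uses the correlation name instead of the 
subquery-internal alias, I would expect it to bind fine; that is the workaround 
implied above, though I have not re-verified this against the checked-in code:

select tabname.c0 from (select alias.x as c0 from foo alias) tabname;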

Now the funny thing is that without the having clause the query works, and 
that's because of the rather obfuscated five-way search for column references 
in FromSubquery. Notice the huge difference in the search code between cases 2 
and 5. (BTW, the code I pasted in my previous comment is not what's in the 
codeline; I was playing around with it, so please look at the checked-in code.)

One fishy thing I came across was the notion of a "clause" in a ValueNode, 
i.e. IN_SELECT_LIST, IN_WHERE_CLAUSE, IN_HAVING_CLAUSE, etc. First, we only 
seem to use IN_SELECT_LIST; I do not see (unless I'm missing something) the 
last two being used at all. Second, what exactly are the semantics of the 
clause instance variable? In a query like this, to which clause does the 
expression "expr" belong?

select .. from ... having c0 > (select max(expr) ...);

Is "expr" in a having clause or a select clause? I think answering this would 
help because the column binding logic depends on this-- look at case 2, again: 
Should expr be considered to be in a having clause and therefore bound by this 
bit of code?

                else if (generatedForGroupByClause && generatedForHavingClause &&
                             (columnsTableName != null ||
                              columnReference.getClause() != ValueNode.IN_SELECT_LIST)) // 2
                {

Anyway, I realize that I am posing more questions than providing solutions, 
and that all of this is deep down in the innards of the query parsing/binding 
code, which most of us are only incompletely familiar with; but if you have 
any insights, please update the bug.

I'd hate to see Hibernate unusable with Derby due to bugs like this.




> use of direct column name rather than alias make aggregation fail (Hibernate 
> depends on that)
> ---------------------------------------------------------------------------------------------
>
>                 Key: DERBY-1624
>                 URL: http://issues.apache.org/jira/browse/DERBY-1624
>             Project: Derby
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 10.1.1.0, 10.1.3.1
>            Reporter: Emmanuel Bernard
>
> Error: org.apache.derby.client.am.SqlException: Column 'MODEL0_.COL_0_0_' is 
> either not in any table in the FROM list or appears within a join 
> specification and is outside the scope of the join specification or appears 
> in a HAVING clause and is not in the GROUP BY list. If this is a CREATE or 
> ALTER TABLE  statement then 'MODEL0_.COL_0_0_' is not a column in the target 
> table., SQL State: 42X04, Error Code: -1
> for
> select
>         model0_.balance as col_0_0_,
>         count(*) as col_1_0_ 
>     from
>         account model0_ 
>     group by
>         model0_.balance 
>     having
>         count(*) > 1
