[ 
http://issues.apache.org/jira/browse/DERBY-119?page=comments#action_12425049 ] 
            
Deepa Remesh commented on DERBY-119:
------------------------------------

I think the following additional tests will be good to add:

1) Check that a column in a primary key cannot be modified to null.
2) Check that a column defined as UNIQUE cannot be modified to null.
3) Check that specifying "COLUMN" in the ALTER statement is optional.

I tried the above tests with the 'alterColumnNotNull_1.diff' patch. 1) and 3) 
pass. In case of 2), no exception is thrown: we can alter a column defined as 
UNIQUE to null, and we can also insert null values into the unique column.
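
For reference, here is roughly what I ran for 1) and 3) (table and column names 
are just placeholders, and I am assuming the ALTER [COLUMN] ... [NOT] NULL 
syntax the patch introduces):

  create table pk_t (c1 int not null primary key, c2 int);
  -- 1) expected to fail: a primary key column cannot be made nullable
  alter table pk_t alter column c1 null;
  -- 3) the COLUMN keyword should be optional
  alter table pk_t alter c2 not null;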

The Derby reference guide says this about the UNIQUE constraint:
"Specifies that values in the columns must be unique. The identified columns 
must be defined as NOT NULL."

The Derby Developer's Guide says this:
"The SQL standard defines that unique constraints on nullable columns allow any 
number of nulls; Derby does not permit unique constraints on nullable columns."

Based on this, I think we should check that a UNIQUE column cannot be modified 
to null.
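
To illustrate (just a sketch, with made-up names): with the patch applied, the 
following sequence currently succeeds, even though declaring the same 
constraint on a nullable column up front is rejected:

  create table uq_t (c1 int not null unique, c2 int);
  -- per the reference guide this should fail, but with the patch it succeeds:
  alter table uq_t alter column c1 null;
  insert into uq_t (c1, c2) values (null, 1);
  -- by contrast, a unique constraint on a nullable column is rejected outright:
  create table uq_bad (c1 int unique);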

I see that in the test, you have a comment in the form of a question:
+ij> -- Now that B has a null value, can I modify it to NOT NULL?

I think the current behaviour is correct. It should not be possible to modify a 
column that already contains a null value to NOT NULL.
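
That is, a sequence along these lines should continue to fail (again just a 
sketch with made-up names):

  create table n_t (a int, b int);
  insert into n_t values (1, null);
  -- b already contains a null, so changing it to NOT NULL should be rejected:
  alter table n_t alter column b not null;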

> Add ALTER TABLE option to change column from NULL to NOT NULL
> -------------------------------------------------------------
>
>                 Key: DERBY-119
>                 URL: http://issues.apache.org/jira/browse/DERBY-119
>             Project: Derby
>          Issue Type: New Feature
>          Components: SQL
>            Reporter: Bernd Ruehlicke
>         Assigned To: Bryan Pendleton
>         Attachments: alterColumnNotNull_1.diff
>
>
> There was a thread about this on the Cloudscape forum
> http://www-106.ibm.com/developerworks/forums/dw_thread.jsp?message=4103269&cat=19&thread=59941&forum=370#4103269
> Since this describes the problem, I will just copy the content of that entry 
> as my description. The content was: 
> "
> Hi,
> I stumbled across a behaviour of Cloudscape which is not a bug but IMHO an 
> implementation choice. To assign a primary key to a table using ALTER TABLE 
> all columns must be declared NOT NULL first, which can only be specified upon 
> column creation (no ALTER TABLE statement exists to change the NOT NULL 
> property of a column).
> Most databases I know do two things differently:
> 1) when a primary key is assigned all pk columns are automatically set to NOT 
> NULL, if one of them contains NULL values, the ALTER TABLE statement fails
> 2) it is possible to alter the column to set the NOT NULL property after 
> column creation (fails when there are already records containing NULL values)
> If I have understood the limitations correctly in Cloudscape I have no choice 
> but to remove and re-add the column which is supposed to be used in the 
> primary key, if it is not already declared as NOT NULL. This means that in 
> the case of a table containing valid data (unique and not null) in the column 
> in all records, I would have to export the data, remove and re-add the column 
> and reimport that data, which would not be necessary e.g. in Oracle or MaxDB.
> Is it possible to change that behaviour or is there a good reason for it? It 
> looks as if it makes the life of the user more difficult than necessary for 
> certain metadata manipulations. Making it possible to alter the NOT NULL 
> property of a column would solve this and IMHO having a primary key 
> constraint do this implicitly makes sense as well. 
> Thanks in advance for any insight on this,
> Robert"
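
For context, the workflow requested above boils down to something like the 
following (hypothetical table; assuming the ALTER [COLUMN] ... NOT NULL syntax 
this patch adds):

  create table orders (id int, descr varchar(20));
  -- tighten the column in place instead of dropping and re-creating it:
  alter table orders alter column id not null;
  -- then the primary key can be added:
  alter table orders add constraint orders_pk primary key (id);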

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: 
http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira