persistence@glassfish.java.net

Re: [Fwd: Re: Issue 134 - @Column attributes are ignored during java2db]

From: King Wang <king.wang@oracle.com>
Date: Thu, 19 Jan 2006 15:46:12 -0500

My answers are inlined.
King

Pramod Gopinath wrote:

> Hi King
> Thanks for your reply.
>
> There is an issue with calling the following three methods without
> checking the field type:
> fieldDef.setSize(dbField.getLength());
> fieldDef.setSize(dbField.getPrecision());
> fieldDef.setSubSize(dbField.getScale());
>
> If the above three methods are called without a type check, this is
> the behavior:
>
>
> Field Type   @Column defined          Output obtained
> ----------   ----------------------   ---------------
> int          -no-                     INTEGER
> String       -no-                     VARCHAR(255)
> String       length=14                VARCHAR(255)
> BigDecimal   -no-                     DECIMAL(-1,-1)
> BigDecimal   precision=12, scale=0    DECIMAL(12)
> BigDecimal   precision=4, scale=1     DECIMAL(4,1)
>
>
> A couple of issues:
> 1. For a "String" field, even though I have defined length=14, we
> still get the dbField as VARCHAR(255). In this case
> dbField.getLength() correctly returns 14, but
> dbField.getPrecision() returns 0. For this value of 0 we get back
> the default varchar length of 255.

This must be a bug. Do you derive the varchar size from the
precision/scale value? That sounds odd to me. If the length is set
correctly, then for a String field 14 should be the size of the varchar.
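King's expected mapping, with length driving VARCHAR for String fields and precision/scale driving DECIMAL for BigDecimal fields, can be sketched in plain Java. `ColumnSizing` and `ddlFor` are illustrative names, not the actual java2db API, and only the types from Pramod's table are handled:

```java
import java.math.BigDecimal;

class ColumnSizing {
    // Illustrative only: pick which @Column attribute feeds the DDL size.
    // String -> length (VARCHAR(n)); BigDecimal -> precision/scale (DECIMAL(p,s)).
    static String ddlFor(Class<?> javaType, int length, int precision, int scale) {
        if (javaType == String.class) {
            // Fall back to the 255 default when no length was specified.
            return "VARCHAR(" + (length > 0 ? length : 255) + ")";
        }
        if (javaType == BigDecimal.class && precision > 0) {
            return scale > 0
                ? "DECIMAL(" + precision + "," + scale + ")"
                : "DECIMAL(" + precision + ")";
        }
        // Unannotated BigDecimal falls back to a bare DECIMAL here;
        // everything else is treated as the int case from the table.
        return javaType == BigDecimal.class ? "DECIMAL" : "INTEGER";
    }
}
```

Under this dispatch, the `String length=14` row of the table would come out as VARCHAR(14) rather than the buggy VARCHAR(255).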

>
> 2. If a BigDecimal field is used without any @Column defined,
> dbField.getLength() and dbField.getPrecision() return -1. Setting
> these values into the field as-is is wrong.

We should then change the default value to 0 or some positive number
rather than -1. But I can see the (default) value is passed through
properly.
This should be an issue even if you only call one or two of the
setters instead of all three, right?
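King's suggestion of replacing the -1 sentinel with the spec default could be as simple as a clamp applied before the setters run. This is a sketch; `SizeDefaults` is a hypothetical helper, not real java2db code:

```java
class SizeDefaults {
    // Treat any negative value (e.g. the -1 seen for an unannotated
    // BigDecimal) as "not specified" and fall back to 0, the spec default.
    static int normalize(int value) {
        return value < 0 ? 0 : value;
    }
}
```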

>
>
> To tackle both issues 1 and 2, should we check that these values are
> > 0 before setting them on the fieldDef? This would solve issues 1
> and 2 without requiring a check on the field type (String or
> BigDecimal). But this solution (the > 0 check without the field-type
> check) leads to a new problem.
> If I have @Version defined for an int field, we create the dbField as
> "DECIMAL(15)". But dbField.getLength() returns 255, while
> dbField.getPrecision() and dbField.getScale() return 0. So the above
> logic would give us a field of DECIMAL(255), which is wrong.

This might be another bug. The precision of 15 is not passed through.
We need to find out when the dbField was built and how the values were
passed in. I assume you would hit this issue even if you did not call
those three setters. The bottom line is that dbField.getLength()
returns an incorrect result.
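Pramod's > 0 guard, and the @Version pitfall discussed above, can be sketched with simplified stand-ins. `DbField`, `FieldDef`, and `SizeMapper` below are hypothetical mocks, not the real java2db classes, and the second setSize call deliberately overwrites the first, as in the quoted code:

```java
// Simplified stand-ins for the real java2db types (illustrative only).
class DbField {
    final int length, precision, scale;
    DbField(int length, int precision, int scale) {
        this.length = length;
        this.precision = precision;
        this.scale = scale;
    }
}

class FieldDef {
    Integer size;     // null = let the DDL generator use its default
    Integer subSize;
    void setSize(int s)    { size = s; }
    void setSubSize(int s) { subSize = s; }
}

class SizeMapper {
    // Pramod's proposal: only copy values that are > 0, so the 0 and -1
    // "not set" sentinels never reach the field definition.
    static void apply(DbField db, FieldDef def) {
        if (db.length > 0)    def.setSize(db.length);
        if (db.precision > 0) def.setSize(db.precision); // overwrites length, as in the quoted code
        if (db.scale > 0)     def.setSubSize(db.scale);
    }
}
```

With a String field annotated length=14 (precision and scale 0), this yields size 14; with an unannotated BigDecimal reporting -1 everywhere, nothing is set and the defaults apply. But an @Version int whose DbField wrongly reports length=255 still comes out as size 255, so the guard alone does not fix the third case.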

>
>
> For the default values of scale and precision: yes, we all agree
> that the default value of 0 is wrong. As far as the DDL generation
> is concerned, I am not doing anything specific for it.
> I am using the test case that was provided when the issue was filed,
> and I am working on making the testcase more comprehensive.
>
> Thanks
> Pramod
>
>
>
> King Wang wrote:
>
>> No, I was not on the reply list and just saw your message this morning.
>> I think you are right about the field length, size, and subsize.
>> Sorry, my bad.
>> But I still think we should always call those three methods no
>> matter what the type is:
>>
>> fieldDef.setSize(dbField.getLength());
>> fieldDef.setSize(dbField.getPrecision());
>> fieldDef.setSubSize(dbField.getScale());
>>
>> This should work for any Java type, like String and BigDecimal as
>> well as BigInteger, Long, etc. Again, the default value (0, based
>> on the spec) will be set for the non-applicable cases.
>>
>> I just had a brief chat with Gordon. He thinks the spec default
>> value of 0 for scale and precision does not make sense. I agree.
>> But I am not sure how you guys deal with this.
>>
>> Last thing: do you have a testcase for the change? I would like to
>> define a set of annotations for fields with various Java types,
>> some of them primary key fields. The testcase would catch issues
>> caused by the changes.
>>
>> Thanks,
>> King
>>
>