Hi Pramod,
There are a few issues with your changes:
1. If @Column is defined and the value is passed in from a
DatabaseField object, your change only sets the type name on the
field definition but omits all of the setter calls (PK, plus the ones you just
added: allowNull, unique) that should be called whether or not the @Column
annotation is defined. The new field definition would lose some of the data
passed through the database field (see the sketch below this list).
2. I am not sure this setting is right:
fieldDef.setSize(dbField.getLength());
dbField.length is the size of the field's column name (in Oracle, the limit is
30 for a column name). The field definition size is the size of the field
itself (e.g. Varchar2 size 255). Those are two different things.
3. We may want to always set the size and sub-size passed through the
database field, regardless of the field type. The default value ("-1") would be
set for some types where size does not apply, but that should be OK.
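To make points 1 and 3 concrete, here is a rough, untested sketch of the shape I have in mind, based on the method in your attachment (the two size/sub-size calls at the end are placeholders, since which DatabaseField value is the right source is exactly the question in point 2):

    private FieldDefinition getFieldDefFromDBField(DatabaseField dbField, boolean isPrimaryKey) {
        FieldDefinition fieldDef = (FieldDefinition) this.fieldMap.get(dbField);
        if (fieldDef != null) {
            return fieldDef;
        }

        fieldDef = new FieldDefinition();
        fieldDef.setName(dbField.getName());

        //only the type resolution depends on whether @Column(columnDefinition=...) was given
        if (dbField.getColumnDefinition().length() > 0) {
            fieldDef.setTypeName(dbField.getColumnDefinition());
        } else {
            Class fieldType = dbField.getType();
            if ((fieldType == null) || (!fieldType.isPrimitive() && (new DatabaseSessionImpl(project).getPlatform().getFieldTypeDefinition(fieldType) == null))) {
                //default unresolved types to String, as before (bug 4352820)
                fieldDef.setType(ClassConstants.STRING);
            } else {
                //convert the primitive type if applicable
                fieldDef.setType(ConversionManager.getObjectClass(fieldType));
            }
        }

        //these setters run whether or not a columnDefinition was given (point 1)
        fieldDef.setIsPrimaryKey(isPrimaryKey);
        fieldDef.setShouldAllowNull(dbField.isNullable());
        fieldDef.setUnique(dbField.isUnique());

        //always carry size/sub-size through, regardless of the field type (point 3);
        //the "-1" default is harmless for types where size does not apply.
        //Which accessors to read from is the open question in point 2, so treat
        //these two calls as placeholders.
        fieldDef.setSize(dbField.getPrecision());
        fieldDef.setSubSize(dbField.getScale());

        fieldMap.put(dbField, fieldDef);

        return fieldDef;
    }
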
King
Pramod Gopinath wrote:
> Hi King
> As part of trying to solve this issue, I found that in
> DefaultTableGenerator.getFieldDefFromDBField() we create a new
> FieldDefinition. This field definition object is the one used to
> create the SQL as part of the DDL generation. I have made some changes
> to this method to ensure that we take the @Column annotation
> details (if specified) into consideration.
>
> I am attaching the Java file that has the changed
> getFieldDefFromDBField() method. Currently I am using the precision and
> scale values only for java.math.BigDecimal fields, and I am not sure if this
> is correct.
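>
> For example, a hypothetical field mapped like the one below (made-up
> entity and column names, just to illustrate the @Column details I am
> trying to pick up) should end up with the matching sizes, nullability
> and uniqueness in the generated DDL:
>
> import java.math.BigDecimal;
> import javax.persistence.Column;
> import javax.persistence.Entity;
> import javax.persistence.Id;
>
> @Entity
> public class Order {
>     @Id
>     private long id;
>
>     //length should drive the VARCHAR size
>     @Column(name = "CUST_NAME", length = 80, nullable = false, unique = true)
>     private String customerName;
>
>     //precision/scale should drive the numeric size/sub-size
>     @Column(name = "TOTAL", precision = 10, scale = 2)
>     private BigDecimal total;
> }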
>
> Could you review this code for me?
>
> Thanks
> Pramod
>
>------------------------------------------------------------------------
>
>/*
> * The contents of this file are subject to the terms
> * of the Common Development and Distribution License
> * (the "License"). You may not use this file except
> * in compliance with the License.
> *
> * You can obtain a copy of the license at
> * glassfish/bootstrap/legal/CDDLv1.0.txt or
> * https://glassfish.dev.java.net/public/CDDLv1.0.html.
> * See the License for the specific language governing
> * permissions and limitations under the License.
> *
> * When distributing Covered Code, include this CDDL
> * HEADER in each file and include the License file at
> * glassfish/bootstrap/legal/CDDLv1.0.txt. If applicable,
> * add the following below this CDDL HEADER, with the
> * fields enclosed by brackets "[]" replaced with your
> * own identifying information: Portions Copyright [yyyy]
> * [name of copyright owner]
> */
>package oracle.toplink.essentials.tools.schemaframework;
>
>import java.sql.DatabaseMetaData;
>import java.sql.ResultSet;
>import java.sql.SQLException;
>import java.util.HashMap;
>import java.util.Hashtable;
>import java.util.Iterator;
>import java.util.Map;
>import java.util.Vector;
>
>import oracle.toplink.essentials.exceptions.DatabaseException;
>import oracle.toplink.essentials.internal.helper.ClassConstants;
>import oracle.toplink.essentials.internal.helper.ConversionManager;
>import oracle.toplink.essentials.internal.helper.DatabaseField;
>import oracle.toplink.essentials.internal.helper.DatabaseTable;
>import oracle.toplink.essentials.internal.sessions.DatabaseSessionImpl;
>import oracle.toplink.essentials.logging.AbstractSessionLog;
>import oracle.toplink.essentials.logging.SessionLog;
>import oracle.toplink.essentials.mappings.AggregateCollectionMapping;
>import oracle.toplink.essentials.mappings.DatabaseMapping;
>import oracle.toplink.essentials.mappings.DirectCollectionMapping;
>import oracle.toplink.essentials.mappings.DirectMapMapping;
>import oracle.toplink.essentials.mappings.ManyToManyMapping;
>
>import oracle.toplink.essentials.descriptors.ClassDescriptor;
>import oracle.toplink.essentials.internal.sessions.AbstractSession;
>import oracle.toplink.essentials.sessions.Project;
>import oracle.toplink.essentials.threetier.ServerSession;
>
>
>/**
> * DefaultTableGenerator is a utility class used to generate a default table schema for a TopLink project object.
> *
> * The utility can be used in TopLink CMP for OC4J to perform the table auto-creation process, which can be triggered
> * at deployment time whether the TopLink project descriptor is absent (default mapping) or present.
> *
> * The utility can also be used by any TopLink application to perform table drop/creation at runtime.
> *
> * The utility handles all direct/relational mappings, inheritance, multiple tables, interfaces with/without tables,
> * optimistic version/timestamp locking, nested relationships, and BLOB/CLOB generation.
> *
> * The utility is platform-agnostic.
> *
> * Usage:
> * - CMP
> * 1. set "autocreate-tables=true|false, autodelete-tables=true|false" in oc4j application deployment
> * descriptor files (config/system-application.xml, config/application.xml, or orion-application.xml in an .ear)
> *
> * 2. Default Mapping: the same as CMP, plus system properties setting -Dtoplink.defaultmapping.autocreate-tables='true|false'
> * and -Dtoplink.defaultmapping.autodelete-tables='true|false'
> *
> * - Non-CMP:
> * TODO: sessions.xml support (CR 4355200)
> * 1. Configuration: through sessions.xml
> * 2. Directly runtime call through schema framework:
> * SchemaManager mgr = new SchemaManager(session);
> * mgr.replaceDefaultTables(); //drop and create
> * mgr.createDefaultTables(); //create only
> *
> * The utility currently only supports relational projects.
> *
> * @author King Wang
> * @since Oracle TopLink 10.1.3
> */
>public class DefaultTableGenerator {
> //the project object used to generate the default data schema.
> Project project = null;
>
> //used to track the table definition: keyed by the table name, and valued
> //by the table definition object
> private Map tableMap = null;
>
> //used to track the field definition: keyed by the database field object, and
> //valued by the field definition.
> private Map fieldMap = null;
>
> /**
> * Default constructor
> */
> public DefaultTableGenerator(Project project) {
> this.project = project;
> tableMap = new HashMap();
> fieldMap = new HashMap();
> }
>
> /**
> * Generate a default TableCreator object from the TopLink project object.
> */
> public TableCreator generateDefaultTableCreator() {
> TableCreator tblCreator = new TableCreator();
>
> //go through each descriptor and build the table/field definitions out of mappings
> Iterator descIter = project.getDescriptors().values().iterator();
>
> while (descIter.hasNext()) {
> ClassDescriptor desc = (ClassDescriptor)descIter.next();
>
> //aggregate RelationalDescriptor does not contain table/field data
> if (!desc.isAggregateDescriptor()) {
> initTableSchema((ClassDescriptor)desc);
> }
> }
>
> //Post-init the schema for relation tables and direct collection/map tables, and handle several special mappings.
> descIter = project.getOrderedDescriptors().iterator();
>
> while (descIter.hasNext()) {
> ClassDescriptor desc = (ClassDescriptor)descIter.next();
>
> if (!desc.isAggregateDescriptor()) {
> postInitTableSchema(desc);
> }
> }
>
> tblCreator.addTableDefinitions(tableMap.values());
>
> return tblCreator;
> }
>
> /**
> * Generate a default TableCreator object from the TopLink project object,
> * and perform the table existence check through JDBC table metadata, filtering out
> * tables which are already in the database.
> */
> public TableCreator generateFilteredDefaultTableCreator(AbstractSession session) throws DatabaseException {
> TableCreator tblCreator = generateDefaultTableCreator();
>
> try {
> //table existence check.
> java.sql.Connection conn = null;
> if (session.isServerSession()) {
> //acquire a connection from the pool
> conn = ((ServerSession)session).getDefaultConnectionPool().acquireConnection().getConnection();
> } else if (session.isDatabaseSession()) {
> conn = ((DatabaseSessionImpl)session).getAccessor().getConnection();
> }
> if (conn == null) {
> //TODO: this is not pretty, connection is not obtained for some reason.
> return tblCreator;
> }
> DatabaseMetaData dbMetaData = conn.getMetaData();
> ResultSet resultSet = dbMetaData.getTables(null, dbMetaData.getUserName(), null, new String[] { "TABLE" });
> java.util.List tablesInDatabase = new java.util.ArrayList();
>
> while (resultSet.next()) {
> //save all tables from the database
> tablesInDatabase.add(resultSet.getString("TABLE_NAME"));
> }
>
> resultSet.close();
>
> java.util.List existedTables = new java.util.ArrayList();
> java.util.List existedTableNames = new java.util.ArrayList();
> Iterator tblDefIter = tblCreator.getTableDefinitions().iterator();
>
> while (tblDefIter.hasNext()) {
> TableDefinition tblDef = (TableDefinition) tblDefIter.next();
>
> //check if the to-be-created table is already in the database
> if (tablesInDatabase.contains(tblDef.getFullName())) {
> existedTables.add(tblDef);
> existedTableNames.add(tblDef.getFullName());
> }
> }
>
> if (!existedTableNames.isEmpty()) {
> session.getSessionLog().log(SessionLog.FINEST, "skip_create_existing_tables", existedTableNames);
>
> //remove the existing tables; they won't be created.
> tblCreator.getTableDefinitions().removeAll(existedTables);
> }
> } catch (SQLException sqlEx) {
> throw DatabaseException.errorRetrieveDbMetadataThroughJDBCConnection();
> }
>
> return tblCreator;
> }
>
> /**
> * Build table/field information into the table creator object from a TopLink descriptor.
> * This should handle most of the direct/relational mappings except many-to-many and direct
> * collection/map mappings, which must be done in the postInit method.
> */
> protected void initTableSchema(ClassDescriptor desc) {
> TableDefinition tblDef = null;
> DatabaseTable dbTbl = null;
> Iterator dbTblIter = desc.getTables().iterator();
>
> //create a table definition for each mapped database table
> while (dbTblIter.hasNext()) {
> dbTbl = (DatabaseTable) dbTblIter.next();
> tblDef = getTableDefFromDBTable(dbTbl);
> }
>
> //build each field definition and figure out which table it goes to
> Iterator fieldIter = desc.getFields().iterator();
> DatabaseField dbField = null;
>
> while (fieldIter.hasNext()) {
> dbField = (DatabaseField) fieldIter.next();
>
> boolean isPKField = false;
>
> //first check if the field is a pk field in the default table.
> isPKField = desc.getPrimaryKeyFields().contains(dbField);
>
> //then check if the field is a pk field in the secondary table(s); this applies only to the multiple-table case.
> Hashtable secondaryKeyMap = (Hashtable) desc.getAdditionalTablePrimaryKeyFields().get(dbField.getTable());
>
> if (secondaryKeyMap != null) {
> isPKField = isPKField || secondaryKeyMap.containsValue(dbField);
> }
>
> //build or retrieve the field definition.
> FieldDefinition fieldDef = getFieldDefFromDBField(dbField, isPKField);
>
> //find the table the field belongs to, and add it to the table, only if not already added.
> tblDef = (TableDefinition) tableMap.get(dbField.getTableName());
>
> if (!tblDef.getFields().contains(fieldDef)) {
> tblDef.addField(fieldDef);
> }
> }
> }
>
> /**
> * Build additional table/field definitions for the descriptor, like relation tables
> * and direct-collection/direct-map tables, as well as resetting the LOB type for serialized
> * object mappings and type conversion mappings for LOB usage.
> */
> private void postInitTableSchema(ClassDescriptor desc) {
> Iterator mappingIter = desc.getMappings().iterator();
>
> while (mappingIter.hasNext()) {
> DatabaseMapping mapping = (DatabaseMapping) mappingIter.next();
>
> if (mapping.isManyToManyMapping()) {
> buildRelationTableDefinition((ManyToManyMapping) mapping);
> } else if (mapping.isDirectCollectionMapping()) {
> buildDirectCollectionTableDefinition((DirectCollectionMapping) mapping, desc);
> } else if (mapping.isAggregateCollectionMapping()) {
> //need to figure out the target foreign key field and add it into the aggregate target table
> addForeignkeyFieldToAggregateTargetTable((AggregateCollectionMapping) mapping);
> }
> }
> }
>
> /**
> * Build relation table definitions for all many-to-many relationships in a TopLink descriptor.
> */
> private void buildRelationTableDefinition(ManyToManyMapping mapping) {
> //first create relation table
> TableDefinition tblDef = getTableDefFromDBTable(mapping.getRelationTable());
>
> DatabaseField dbField = null;
>
> //add source foreign key fields into the relation table
> Vector srcFkFields = mapping.getSourceRelationKeyFields();
>
> for (int index = 0; index < srcFkFields.size(); index++) {
> dbField = resolveDatabaseField((DatabaseField) srcFkFields.get(index), (DatabaseField) mapping.getSourceKeyFields().get(index));
> setFieldToRelationTable(dbField, tblDef);
> }
>
> //add target foreign key fields into the relation table
> Vector targFkFields = mapping.getTargetRelationKeyFields();
>
> for (int index = 0; index < targFkFields.size(); index++) {
> dbField = resolveDatabaseField((DatabaseField) targFkFields.get(index), (DatabaseField) mapping.getTargetKeyFields().get(index));
> setFieldToRelationTable(dbField, tblDef);
> }
> }
>
> /**
> * Build direct collection table definitions in a TopLink descriptor
> */
> private void buildDirectCollectionTableDefinition(DirectCollectionMapping mapping, ClassDescriptor desc) {
> //first create direct collection table
> TableDefinition tblDef = getTableDefFromDBTable(mapping.getReferenceTable());
>
> DatabaseField dbField = null;
>
> //add the table reference key(s)
> Vector refPkFields = mapping.getReferenceKeyFields();
>
> for (int index = 0; index < refPkFields.size(); index++) {
> dbField = resolveDatabaseField((DatabaseField) refPkFields.get(index), (DatabaseField) mapping.getSourceKeyFields().get(index));
> tblDef.addField(getDirectCollectionReferenceKeyFieldDefFromDBField(dbField));
> }
>
> //add the direct collection field to the table.
> tblDef.addField(getFieldDefFromDBField(mapping.getDirectField(), false));
>
> //if the mapping is direct-map field, add the direct key field to the table as well.
> if (mapping.isDirectMapMapping()) {
> dbField = ((DirectMapMapping) mapping).getDirectKeyField();
> tblDef.addField(getFieldDefFromDBField(dbField, false));
> }
> }
>
> /**
> * Add the foreign key to the aggregate collection mapping target table
> */
> private void addForeignkeyFieldToAggregateTargetTable(AggregateCollectionMapping mapping) {
> //unlike normal one-to-many mapping, aggregate collection mapping does not have 1:1 back reference
> //mapping, so the target foreign key fields are not stored in the target descriptor.
> Iterator targFKIter = mapping.getTargetForeignKeyFields().iterator();
>
> while (targFKIter.hasNext()) {
> DatabaseField dbField = (DatabaseField) targFKIter.next();
>
> //retrieve the target table definition
> TableDefinition targTblDef = getTableDefFromDBTable(dbField.getTable());
>
> //add the target foreign key field definition to the table definition
> targTblDef.addField(getFieldDefFromDBField(dbField, false));
> }
> }
>
> /**
> * Build a table definition object from a database table object
> */
> private TableDefinition getTableDefFromDBTable(DatabaseTable dbTbl) {
> TableDefinition tblDef = (TableDefinition) this.tableMap.get(dbTbl.getName());
>
> if (tblDef == null) {
> //table not built yet, simply build it
> tblDef = new TableDefinition();
> tblDef.setName(dbTbl.getName());
> tblDef.setQualifier(dbTbl.getTableQualifier());
> tableMap.put(dbTbl.getName(), tblDef);
> }
>
> return tblDef;
> }
>
> /**
> * Resolve the foreign key database field metadata in a relation table or direct collection/map table.
> * That metadata includes the type, and possibly dbtype/size/sub-size if the DatabaseField carries that info.
> */
> private DatabaseField resolveDatabaseField(DatabaseField childField, DatabaseField parentField) {
> //carry the type through from the source table key field to the relation or direct collection table key field.
> DatabaseField resolvedDatabaseField = new DatabaseField();
> resolvedDatabaseField.setName(childField.getName());
> resolvedDatabaseField.setType(getFieldDefFromDBField(parentField, true).getType());
>
> return resolvedDatabaseField;
> }
>
> /**
> * Build a field definition object from a database field.
> */
> private FieldDefinition getFieldDefFromDBField(DatabaseField dbField, boolean isPrimaryKey) {
> FieldDefinition fieldDef = (FieldDefinition) this.fieldMap.get(dbField);
>
> if (fieldDef == null) {
> //not built yet, build one
> fieldDef = new FieldDefinition();
> fieldDef.setName(dbField.getName());
>
> if (dbField.getColumnDefinition().length() > 0) {
> fieldDef.setTypeName(dbField.getColumnDefinition());
> } else {
> Class fieldType = dbField.getType();
>
> if ((fieldType == null) || (!fieldType.isPrimitive() && (new DatabaseSessionImpl(project).getPlatform().getFieldTypeDefinition(fieldType) == null))) {
> //TODO: log a warning for an inaccessible or non-convertible type.
> AbstractSessionLog.getLog().log(SessionLog.FINEST, "field_type_set_to_java_lang_string", dbField.getQualifiedName(), fieldType);
>
> //set the default type (lang.String) for all unresolved java types (null, Number, util.Date, NChar/NType, Calendar,
> //sql.Blob/Clob, Object, or unknown types). Please refer to bug 4352820.
> fieldDef.setType(ClassConstants.STRING);
>
> } else {
> //need to convert the primitive type if applicable.
> fieldDef.setType(ConversionManager.getObjectClass(fieldType));
> }
>
> fieldDef.setIsPrimaryKey(isPrimaryKey);
>
> if(fieldType.getName().equals("java.lang.String")) {
> fieldDef.setSize(dbField.getLength());
> } else if(fieldType.getName().equals("java.math.BigDecimal")) {
> fieldDef.setSize(dbField.getPrecision());
> fieldDef.setSubSize(dbField.getScale());
> }
>
> fieldDef.setShouldAllowNull(dbField.isNullable());
> fieldDef.setUnique(dbField.isUnique());
>
> fieldMap.put(dbField, fieldDef);
> }
> }
>
> return fieldDef;
> }
>
> /**
> * Build a reference key field definition for a direct collection/map table from a database field.
> */
> private FieldDefinition getDirectCollectionReferenceKeyFieldDefFromDBField(DatabaseField dbField) {
> FieldDefinition fieldDef = (FieldDefinition)getFieldDefFromDBField(dbField, true).clone();
> //the direct collection/map table reference key field is not unique, so set it as non-pk.
> fieldDef.setIsPrimaryKey(false);
> return fieldDef;
> }
>
> /**
> * Build and add a field definition object to relation table
> */
> private void setFieldToRelationTable(DatabaseField dbField, TableDefinition tblDef) {
> FieldDefinition fieldDef = getFieldDefFromDBField(dbField, false);
>
> if (!tblDef.getFields().contains(fieldDef)) {
> //only add the field once, to avoid adding it twice if the m:m mapping is bi-directional.
> tblDef.addField(fieldDef);
> }
> }
>}
>
>