My experience has also been that JDBC is better for inserting bulk data.
However, the drawback *seems* to be that when the rest of the application
is using JPA (for instance, handing data from the database to JSF for
reporting), JPA remains unaware of the new data in the tables.
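To make that concrete, here is a rough sketch of what I mean: load rows
with plain JDBC batching, then tell JPA to drop its stale view. This
assumes a JPA 2.0 provider (for the shared `Cache` API), an open
`EntityManager`, and a hypothetical `ACCOUNT(ID, NAME)` table; treat it
as a sketch, not tested code.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import javax.persistence.EntityManager;

public class BulkLoad {
    // Bulk-insert with JDBC, then invalidate JPA's cached state so
    // subsequent JPA queries see the rows inserted behind its back.
    public static void load(EntityManager em, String jdbcUrl) throws Exception {
        try (Connection con = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO ACCOUNT (ID, NAME) VALUES (?, ?)")) {
            for (int i = 0; i < 10_000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "acct-" + i);
                ps.addBatch();          // queue the row locally
            }
            ps.executeBatch();          // send the whole batch at once
        }
        // Discard stale managed entities and the shared (second-level) cache:
        em.clear();
        em.getEntityManagerFactory().getCache().evictAll();
    }
}
```

Without the `evictAll()` at the end, JPA can keep serving the old cached
entities even though the table now has more rows.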
Dennis
On Tue, Nov 27, 2007 at 12:20 PM, <glassfish_at_javadesktop.org> wrote:
> If you insert something new into the database, you're not corrupting the
> cache, because there's no entry for that data row in the cache yet.
>
> Typical cache corruption happens when you update the DB behind the back of
> the cache, i.e. changing a row so that the DB has different data than the
> cached row. But an insert doesn't have this problem, because the row doesn't
> exist in the cache yet. So, using JDBC for bulk inserts is no problem.
>
> Updates are best managed through JPA, precisely because of this caching
> issue.
>
> But bulk updates are readily applied using JPA and EQL (EJB QL), as EQL is
> cache-aware and will "do the right thing" when you use it for updates.
>
> But if you are bulk loading a lot of data, direct JDBC inserts are simply
> far more efficient.
> [Message sent by forum member 'whartung' (whartung)]
>
> http://forums.java.net/jive/thread.jspa?messageID=247408
>
>
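For reference, the cache-aware bulk update whartung describes could look
roughly like this with the JPA `Query` API. The `Account` entity and its
fields are hypothetical, and whether the provider invalidates affected
cache entries is provider-specific (TopLink Essentials does); a sketch,
not tested code.

```java
import javax.persistence.EntityManager;

public class BulkUpdate {
    // Bulk update via EQL/JPQL rather than raw JDBC, so the JPA provider
    // gets a chance to keep its cache consistent with the change.
    public static int deactivateStale(EntityManager em) {
        em.getTransaction().begin();
        int updated = em.createQuery(
                "UPDATE Account a SET a.active = false "
                + "WHERE a.lastLogin < :cutoff")
            .setParameter("cutoff", java.sql.Date.valueOf("2007-01-01"))
            .executeUpdate();   // returns the number of rows changed
        em.getTransaction().commit();
        return updated;
    }
}
```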
--
Dennis R. Gesker
email: dennis_at_gesker.com
google talk: gesker_at_gmail.com
aim: dennisgesker
pgp/gpg: 0x267D3FE8
First things first, but not necessarily in that order. -- Unknown