Sekhar Kosuru | 19 Jul 13:43 2014

Database hanging after Heap space error

Hi,

We are using Derby for our application, running it as a service. When the application encounters a Derby error, we check whether the service is running and restart it if it is down.

But in some cases a heap space error appears in the database log files, and after that we are unable to start Derby. It issues no connections:

DriverManager.getConnection() hangs.
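Until the root cause of the heap exhaustion is found, wrapping the connection attempt in a timeout at least keeps the caller from blocking forever. A minimal sketch (the URL in the comment is a placeholder, not from the original report):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Sketch: run a blocking call (such as DriverManager.getConnection)
// on a worker thread so the caller gets a TimeoutException instead of
// hanging indefinitely when the engine is wedged.
public class ConnectWithTimeout {

    static <T> T callWithTimeout(Callable<T> task, long seconds) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            return pool.submit(task).get(seconds, TimeUnit.SECONDS);
        } finally {
            pool.shutdownNow(); // best effort; a truly stuck thread may linger
        }
    }

    public static void main(String[] args) throws Exception {
        // Intended usage (placeholder URL, needs the Derby client driver):
        //   Connection c = callWithTimeout(
        //       () -> DriverManager.getConnection("jdbc:derby://localhost:1527/mydb"),
        //       30);
        System.out.println(callWithTimeout(() -> "ok", 5)); // prints "ok"
    }
}
```

Note that the timeout only protects the caller; the worker thread stuck inside getConnection() may survive the interrupt, so the underlying heap problem still needs fixing.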

Please help us to solve this.

Regards
Sekhar
china_wang | 26 Jun 08:08 2014

You cannot invoke other java.sql.Clob/java.sql.Blob methods

Dear All,

I have a problem: I can't read a Clob when using Spring 3.0.5 + iBATIS 2.3.0 + Derby 10.10.0.1.
My code snippets:

<select id="PUB-DATASETS" resultClass="java.util.HashMap"
        parameterClass="map" remapResults="true">
    select $columns$ from pub_datasets where 1=1
    <dynamic prepend="and">
        <isNotEmpty property="dataSoursIds">
            DATASOURCE_ID in ($dataSoursIds$)
        </isNotEmpty>
    </dynamic>
    <dynamic prepend="and">
        <isNotEmpty property="dataSetIds">
            DATASET_ID in ($dataSetIds$)
        </isNotEmpty>
    </dynamic>
</select>

ArrayList<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
list = (ArrayList) datasetDAO.findAll("MS-PUB-DATASETS-SELECT-ALL-BY-ADMIN", params);

for(Map<String, Object> map:list)
{
	if(map.get("TEMPLATE")!=null)
	{
	  System.out.println(((Clob)map.get("TEMPLATE")).length());
	}
}

The call ((Clob) map.get("TEMPLATE")).length() throws an exception.
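Derby raises "You cannot invoke other java.sql.Clob/java.sql.Blob methods..." when a Clob locator is used after the statement, result set, or transaction that produced it has closed, which can happen here if the DAO call commits before the loop touches the Clob. One way out is to materialize each Clob into a String while the connection is still open (for example in a result-map type handler). A sketch of such a helper, using the JDK's SerialClob purely for demonstration:

```java
import java.io.Reader;
import java.io.StringWriter;
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

// Sketch: copy a java.sql.Clob into a String while the connection that
// produced it is still open, so no live locator escapes the transaction.
public class ClobUtil {

    static String clobToString(Clob clob) throws Exception {
        StringWriter out = new StringWriter();
        try (Reader in = clob.getCharacterStream()) {
            char[] buf = new char[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // SerialClob is a JDK-provided, fully materialized Clob implementation.
        Clob demo = new SerialClob("hello template".toCharArray());
        System.out.println(clobToString(demo)); // prints "hello template"
    }
}
```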

 sorry my english level is poor...

Thanks for any help.

wang


Kempff, Malte | 24 Jun 17:45 2014

Re: Problems with import of CSV to Table

I have found my problem now: the input file did not exist, which I had overlooked in the error report.

ERROR XIE04: Data file not found: memorecords.dat

 

But I also have another question. When I tried to pass nulls to the import/export procedures, ij gave me a syntax error.

What do I need to do so that ij accepts my null values and uses the defaults?
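For what it's worth, ij normally accepts the unquoted SQL keyword null as a procedure argument, which tells the import/export procedures to fall back to their defaults; quoting it as 'null' would pass a string instead. A sketch, untested against 10.1:

-- Unquoted nulls let the procedure use its defaults
-- (comma delimiter, double-quote character, platform encoding):
call SYSCS_UTIL.SYSCS_IMPORT_TABLE('APP', 'MEMO_RECORDS', 'memorecords.dat',
                                   null, null, null, 0);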

 

Thanks for hints

 

Malte

 

From: Kempff, Malte
Sent: Tuesday, 24 June 2014 17:21
To: Derby Discussion (derby-user-PvNy6fhA98DNLxjTenLetw@public.gmane.org)
Subject: Problems with import of CSV to Table

 

Hi,

I have this Table

ij(DB_8_2)> describe memo_records;

COLUMN_NAME         |TYPE_NAME|DEC&|NUM&|COLUM&|COLUMN_DEF|CHAR_OCTE&|IS_NULL&

------------------------------------------------------------------------------

RECORD_ID           |VARCHAR  |NULL|NULL|512   |NULL      |1024      |NO

INPUTFILE_ID        |INTEGER  |0   |10  |10    |NULL      |NULL      |YES

TMPST_IN            |TIMESTAMP|9   |10  |29    |CURRENT_T&|NULL      |NO

TMPST_USED          |TIMESTAMP|9   |10  |29    |NULL      |NULL      |YES

STATE               |CHAR     |NULL|NULL|1     |'I'       |2         |NO

 

And I have this CSV structure to import:

"174145"|1382|"2014-03-28 05:09:44.441"||"I"                            

"174146"|1382|"2014-03-28 05:09:44.558"||"I"                             

"174147"|1382|"2014-03-28 05:09:44.585"||"I"                            

"174507"|1424|"2014-04-07 05:09:56.649"|"2014-04-10 18:11:45.388"|"U"   

"174508"|1424|"2014-04-07 05:09:56.738"||"I"

I am getting this error in ij:

ERROR 38000: The exception 'java.sql.SQLException: The exception 'java.lang.reflect.InvocationTargetException' was thrown while evaluating an expression.' was thrown while evaluating an expression.

ERROR 38000: The exception 'java.lang.reflect.InvocationTargetException' was thrown while evaluating an expression.

 

Actually I cannot find the problem; I suspect it might be the date.

But shouldn't an export produce CSV that the import accepts?

 

Here is the Statement for export:

call SYSCS_UTIL.SYSCS_EXPORT_QUERY('select RECORD_ID                   

                                          ,INPUTFILE_ID                

                                          ,TMPST_IN                    

                                          ,TMPST_USED                  

                                          ,STATE                       

                                    from memo_records ',               

                                   'memorecords.dat', '|', '"', 'utf8');

And here is my statement for import:

call syscs_util.syscs_import_table ('APP', 'MEMO_RECORDS', 'memorecords.dat', '|', '"', 'utf8',0);

 

I am currently using 10.8.2 and need to migrate down to 10.1.1.0 (crazy, isn't it?).

Does somebody have an idea what I could do here to solve the problem?

 

Thanks a lot in advance

 

Malte

 

Myrna van Lunteren | 11 Jun 17:39 2014

ApacheCon CFP closes June 25


Dear Apache DDLUtils, Torque, JDO and Derby enthusiasts,

As you may be aware, ApacheCon will be held this year in Budapest, on
November 17-23. (See http://apachecon.eu for more info.)

The Call For Papers for that conference is still open, but will be
closing soon. We need your talk proposals to represent the DB project at
ApacheCon. We need all kinds of talks: deep technical talks, hands-on
tutorials, introductions for beginners, and case studies about the
awesome stuff you're doing with DDLUtils, Torque, JDO and Derby.

Please consider submitting a proposal, at
http://events.linuxfoundation.org//events/apachecon-europe/program/cfp

Thanks!

Myrna van Lunteren
DB Project chair
Tim Dudgeon | 11 Jun 11:40 2014

trigger with cascade delete problem

I've encountered a tricky problem that I can't yet see a solution to.
Let me describe it.

I have 3 tables:
MAIN - the main data table, containing the results.
GROUPING - a table related to MAIN, with MAIN having a FK constraint
to GROUPING.
AGGREGATES - a table containing aggregated information from MAIN, in
part grouped by the info in GROUPING.

I'm filling the data in AGGREGATES using triggers on MAIN that first
delete the old aggregate value and then insert a new one (i.e. two
after insert/update/delete "for each statement" triggers).

Mostly it works. When I delete a row in MAIN, the row in AGGREGATES
gets deleted and then inserted again with the new aggregate.
But when I delete a row from GROUPING and the cascade delete causes the
corresponding rows in MAIN to be deleted, it does not work.
I'm pretty sure this is because part of the selection criteria for the
rows to delete involves a join to GROUPING, and the rows to join to have
just been blown away by the delete operation, so nothing in AGGREGATES
gets deleted.

Is there a solution to this?
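One possible direction, sketched with hypothetical table and column names (the post does not give the actual schema): clean up AGGREGATES from a trigger on GROUPING itself, which still sees the deleted group row even after the cascade has removed the matching MAIN rows.

-- Hypothetical columns: GROUPING(id), AGGREGATES(group_id, ...).
-- REFERENCING OLD exposes the deleted GROUPING row to the trigger body.
CREATE TRIGGER grouping_cleanup
AFTER DELETE ON grouping
REFERENCING OLD AS old_row
FOR EACH ROW
  DELETE FROM aggregates WHERE group_id = old_row.id;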

Tim

Chris Olver | 4 Jun 10:42 2014

Apache Derby - Locks up.

Hi,

 

We are looking to use Apache Derby (Network Server) as a caching layer for our ETL process. Unfortunately we are finding that it locks up quite frequently. We have two tables with around 10 million rows each, and two indexes on each. It can happen while reading (a straight SELECT * FROM) or while writing updates.

 

CPU will spike to 100% (it's on a rather powerful box), and then all existing and new JDBC clients are unable to connect. Output of runtimeinfo (when it locks up, issuing this command can take a few minutes to get a response):

 

--- Derby Network Server Runtime Information ---

---------- Session Information ---------------

Session # :116

Database :etl

User : abc

# Statements:1

Prepared Statement Information:

        Stmt ID         SQLText

        -------------   -----------

        SYSLH0001       SELECT * FROM APP.USERS

 

Session # :117

-------------------------------------------------------------

# Connection Threads : 4

# Active Sessions : 2

# Waiting  Sessions : 0

 

Total Memory : 1756889088       Free Memory : 306272128

 

No errors can be seen in the log. I am rather confused: Derby seems like the perfect solution, yet it just locks up.

 

Thoughts or advice appreciated.

 

OS: Windows 8.1

Java Runtime: 1.8.0_05-b13

Derby: 10.10.2.0

 

Regards,

 

-Chris

蓦然回首 | 14 May 11:38 2014

Derby DB suddenly can't open

I have been using 10.8.2.x for more than 2 years, and it worked well until today. Suddenly I can't open the database, even after upgrading to 10.8.3. The stack is below:

java.sql.SQLException: Database 'XingDB' not found.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.newSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleDBNotFound(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:207)


What could the reason be? Thanks in advance!
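One common cause of a sudden "Database 'XingDB' not found" on a database that previously worked: a relative database name in the JDBC URL is resolved against derby.system.home (or the JVM's working directory), so launching the application from a different directory makes Derby look in the wrong place. A sketch (paths are placeholders) that removes the ambiguity by putting an absolute path in the URL:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch: build an embedded-Derby URL from an absolute filesystem path,
// so the database is found regardless of the working directory.
public class DerbyUrl {

    static String embeddedUrl(Path dbDir) {
        // forward slashes are accepted in JDBC URLs on Windows too
        return "jdbc:derby:" + dbDir.toAbsolutePath().toString().replace('\\', '/');
    }

    public static void main(String[] args) {
        System.out.println(embeddedUrl(Paths.get("data", "XingDB")));
        // e.g. jdbc:derby:/home/app/data/XingDB
        // then: DriverManager.getConnection(embeddedUrl(...));
    }
}
```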

Regards,
Liu

John English | 14 May 15:52 2014

Unique constraints and nulls

I have a table with two columns whose combined value I want to be unique:

   create table Foo (
     A integer,
     B integer,
     primary key(A),
     unique(A,B)
   );

This works fine except when B is null: then I can have multiple rows containing
identical values of the form (A, null).

Is there an easy way to constrain the values of A to be unique even when B is 
null? (I could try to change things so that empty strings are used instead of 
nulls, but that would involve changing existing code and it will take quite a 
bit of work to ensure that there aren't any unexpected knock-on effects, so I 
prefer to stick with nulls if I can.)
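For what it's worth, the table exactly as shown could not actually hold duplicate (A, null) rows, since primary key(A) already forces A to be unique on its own; presumably the real table differs. Assuming so, one option worth testing: in Derby, a unique index (unlike a unique constraint on nullable columns) treats NULL like an ordinary value when checking for duplicates.

-- Sketch: a unique index should reject a second (1, NULL) row in Derby,
-- whereas unique(A, B) as a constraint lets the NULL rows repeat.
-- Worth verifying against your Derby version before relying on it.
CREATE UNIQUE INDEX foo_a_b ON Foo(A, B);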

TIA,
-- 
John English

Patrick Meyer | 19 Apr 03:34 2014

CREATE TABLE lexical error

I have an application that allows users to import data into Derby, so the users specify the column names. A user encountered a lexical error that I have been able to reproduce with the following CREATE TABLE statements. Can anyone explain why the column names appear to cause a lexical error, and how to avoid it? I am using 10.9.1.

 

These two statements result in errors:

 

CREATE TABLE TBLPAT1 (xb1x DOUBLE, xb2x DOUBLE)

Error: Lexical error at line 1, column 24.  Encountered: "\ufeff" (65279), after : "".

 

CREATE TABLE TBLPAT2 (xb1x VARCHAR(50), xb2x VARCHAR(50))

Error: Lexical error at line 1, column 24.  Encountered: "\ufeff" (65279), after : "".

SQLState:  42X02

ErrorCode: 30000

 

 

These statements work just fine.

 

CREATE TABLE TBLPAT3 (xvar1x DOUBLE, xvar2x DOUBLE)

 

CREATE TABLE TBLPAT4 (xvar1x VARCHAR(50), xvar2x DOUBLE)

 

CREATE TABLE TBLPAT5 (xvar1x VARCHAR(50), xvar2x VARCHAR(50))
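For context: "\ufeff" (65279) is the Unicode byte-order mark, an invisible character that has evidently been pasted into the failing column names (column 24 falls inside the first identifier). Since the names come from users, stripping invisible format characters before building the DDL avoids the lexical error. A sketch (the helper name is mine, not from any library):

```java
// Sketch: remove invisible Unicode "format" characters (category Cf,
// which includes the U+FEFF byte-order mark) from a user-supplied
// identifier before it is used in a CREATE TABLE statement.
public class IdentifierSanitizer {

    static String stripInvisible(String identifier) {
        StringBuilder out = new StringBuilder(identifier.length());
        for (int i = 0; i < identifier.length(); i++) {
            char c = identifier.charAt(i);
            if (Character.getType(c) != Character.FORMAT) {
                out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String pasted = "\uFEFFb1"; // a name copied along with a leading BOM
        System.out.println(stripInvisible(pasted)); // prints "b1"
    }
}
```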

 

 

Thanks,

Patrick

 

Chux | 18 Apr 09:08 2014

Best way to have a DB browser in your desktop app

Hello guys,

I have a desktop app using JavaFX, with Derby as the embedded DB.
I deploy these builds to my clients.

I need, however, a little tool to access the embedded database for
viewing, and maybe some on-the-fly modifications.

When the FX app is up and running, the DB is locked to it, so my option is
to include a built-in DB manager tool inside the app.

So I was wondering: do you know any Java-based database viewers that I can
import and use inside my app?

Best,
Chux
