vaithal | 15 Dec 05:40 2014

Service could not be stopped in OpenJdk

Hi Friends,

We have converted our Java applications into Windows services using
procrun. When we were using Oracle's Java, "net stop <serviceName>" worked
without any issues. We recently switched to OpenJDK, and since then the stop
fails with the error message "could not be stopped" after about 10 seconds,
although it takes another 20 seconds for the service to actually stop. It
does not appear to wait for the value specified by the "--StopTimeout"
parameter. Have you come across this issue, and do you know how to fix it?
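For context, the service wrapper follows the usual procrun jvm-mode shape of
a blocking start method and a stop method that releases it; a simplified
sketch (class and method names here are illustrative, not our real ones):

import java.util.concurrent.CountDownLatch;

// The usual procrun jvm-mode shape: a static start method that blocks until
// a static stop method releases it. (Simplified; the class and methods are
// whatever --StartClass/--StopClass and --StartMethod/--StopMethod point at.)
public class ServiceMain {

    private static final CountDownLatch shutdown = new CountDownLatch(1);

    public static void start(String[] args) throws InterruptedException {
        // ... initialise the application here ...
        shutdown.await();      // block until stop() is invoked
        // ... clean up here; the service counts as stopped once this returns ...
    }

    public static void stop(String[] args) {
        shutdown.countDown();  // let start() return promptly
    }
}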

thanks,
Veena.

Kristian Rosenvold | 12 Dec 18:27 2014

[compress] Transferring entries from one zip file to another ?

I am investigating the possibility of transferring zip entries from one
file to another without actually decompressing and recompressing them. Is
this possible? (It appears not...) If not, would it be feasible to implement?
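To make it concrete, this is roughly the code I would like to be able to
write; getRawInputStream and addRawArchiveEntry are the shape of API I have
in mind, and I am not sure either exists in the current release:

import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.util.Enumeration;

import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream;
import org.apache.commons.compress.archivers.zip.ZipFile;

public class RawZipCopy {
    public static void main(String[] args) throws Exception {
        ZipFile source = new ZipFile(new File(args[0]));
        ZipArchiveOutputStream target =
                new ZipArchiveOutputStream(new FileOutputStream(args[1]));
        try {
            Enumeration<ZipArchiveEntry> entries = source.getEntries();
            while (entries.hasMoreElements()) {
                ZipArchiveEntry entry = entries.nextElement();
                // Copy the still-compressed bytes straight across, reusing the
                // entry's method, sizes and CRC from the source archive.
                InputStream raw = source.getRawInputStream(entry);
                try {
                    target.addRawArchiveEntry(entry, raw);
                } finally {
                    raw.close();
                }
            }
        } finally {
            target.close();
            source.close();
        }
    }
}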

Kristian
Andrew Kolbus | 9 Dec 03:15 2014

Help putting TelnetClient into binary mode?

I am using the TelnetClient from commons-net 3.3 in JDK 8.  When I try to receive xmodem data over a telnet session, some bytes do not make it through to my code.

Example:
The remote device sends this (confirmed using windows network monitor):
01 01 FE [...] C0 00 FF FF FF 00 00 00 00 27 27 00 [...] (133 byte xmodem-crc packet)

However, from the telnet client, I get this:
01 01 FE [...] C0 00 FF 00 00 00 27 27 00 [...] (130 bytes total)

From what I have read, this is related to NVT processing.  It looks like the client is collapsing [FF FF] to [FF] (removing one byte) and dropping [FF 00] entirely (removing another two bytes, for a total of three).

I tried calling telnetClient.addOptionHandler(new SimpleOptionHandler(TelnetOption.BINARY, true, true, true, true)), but that did not change the data I received.

A sample of the code I used to discover the issue is below.  Is there a way to get the TelnetClient to pass all data through?
TelnetClient telnetClient = new TelnetClient();

// Tried with and without the following OptionHandler:
//telnetClient.addOptionHandler(new SimpleOptionHandler(TelnetOption.BINARY, true, true, true, true));

telnetClient.connect(ip,port);
telnetClient.setKeepAlive(true);
InputStream inputStream = telnetClient.getInputStream();
BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream));
PrintWriter printWriter = new PrintWriter(telnetClient.getOutputStream(),true);

// Authentication code removed

// Receive XModem data
// Send command to initiate XMODEM-CRC transfer
byte start = 0x43;
printWriter.write(start);
printWriter.flush();

// Enter loop to display received binary data
ByteArrayOutputStream collector = new ByteArrayOutputStream();
while(true){
    collector.write(bufferedReader.read());
    log.debug(Hex.encodeHexString(collector.toByteArray()));
}
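
One thing I want to rule out: wrapping the stream in an InputStreamReader also runs the bytes through a charset decoder, which could mangle values above 0x7F on its own. A variant I plan to try, reading the raw InputStream directly (untested against the device):

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.commons.net.telnet.SimpleOptionHandler;
import org.apache.commons.net.telnet.TelnetClient;
import org.apache.commons.net.telnet.TelnetOption;

public class RawTelnetRead {
    public static void main(String[] args) throws Exception {
        TelnetClient client = new TelnetClient();
        // Request BINARY in both directions before connecting so it is part
        // of the initial option negotiation.
        client.addOptionHandler(new SimpleOptionHandler(TelnetOption.BINARY, true, true, true, true));
        client.connect(args[0], Integer.parseInt(args[1]));  // host and port as arguments

        // Authentication omitted, as in the sample above.

        InputStream in = client.getInputStream();
        OutputStream out = client.getOutputStream();

        out.write(0x43);   // 'C' requests an XMODEM-CRC transfer
        out.flush();

        // Collect raw bytes; no Reader/charset decoding in the path.
        ByteArrayOutputStream collector = new ByteArrayOutputStream();
        int b;
        while ((b = in.read()) != -1) {
            collector.write(b);
        }
        client.disconnect();
    }
}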

Robert Egan | 3 Dec 14:36 2014

[daemon] StartParams and StopParams

Can these options be used with the prunsrv start and stop commands? The
documentation does not say, and the only working examples I've seen use
them with the install command.

-- 
*Robert Egan*
Senior Developer, Telecommuter - US
SCM: AD Infrastructure Services
T +1 617 455 1425

ACI Worldwide, 2815 Coliseum Centre Drive, Suite 300, Charlotte, NC, 28217,
USA

www.aciworldwide.com

pappu prasad | 30 Nov 19:52 2014

Fwd: [net] mismatch between timestamp and timezone

Hi,

I am using FTPClient.java, FTPFile.java and FTPListParseEngine.java from
commons-net-3.2.jar to access files on an FTP server (a Linux machine)
located in the EET time zone. After accessing the files, when I look at a
file's time zone and timestamp, the timestamp is shown in UTC but the time
zone is shown as the time zone of the place where the file is being accessed.

I expected both to be reported in EET.

Example:
Suppose a file sample.txt on the FTP server (in the EET zone) is created at
11/27/2014 8:28:20 AM.

While accessing the file (also from the EET time zone) using the jar
mentioned above, FTPFile's getTimestamp() method returns the time as
11/27/2014 6:28:20 AM EET.

Instead it should have been 11/27/2014 8:28:20 AM EET.
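
For reference, FTPClientConfig seems to offer a way to declare the server's
time zone to the list parser; a minimal sketch of what I understand the
usage to be (the zone id and credentials are placeholders, and I have not
confirmed it resolves the mismatch):

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPClientConfig;
import org.apache.commons.net.ftp.FTPFile;

public class EetListing {
    public static void main(String[] args) throws Exception {
        FTPClient client = new FTPClient();

        // Tell the list parser which time zone the server reports its
        // timestamps in; configured before connect().
        FTPClientConfig config = new FTPClientConfig(FTPClientConfig.SYST_UNIX);
        config.setServerTimeZoneId("Europe/Helsinki"); // an EET zone id, placeholder
        client.configure(config);

        client.connect("ftp.example.com");             // placeholder host
        client.login("user", "password");              // placeholder credentials

        for (FTPFile f : client.listFiles()) {
            // getTimestamp() returns a Calendar; the printed zone reflects the
            // client JVM, but the instant should then be the server's.
            System.out.println(f.getName() + " " + f.getTimestamp().getTime());
        }
        client.logout();
        client.disconnect();
    }
}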

Please help me to resolve this issue.

Thanks & Regards,

Pappu
Michael Howard | 24 Nov 19:47 2014

[math] - Fitting curve with 6 parameters

Hello!

I am trying to estimate the parameters of a large number of curves. They are all instances of the generalized
logistic function (http://en.wikipedia.org/wiki/Generalised_logistic_function). Right now, I am
using a very basic hill climbing algorithm that I wrote quickly just to see if these curves would indeed be
suitable for the data I'm working with.  It would be nice to take advantage of the property that the function
is differentiable. I am not an expert in optimization, but I think using an algorithm like
Levenberg-Marquardt would likely give a better fit, faster than my crude technique. Plus, L-M is what
I've seen used in research papers which fit this function.

I'm trying to do this with what is available in the commons library, but I can't make sense of how to
accomplish it.

First, there is actually already an implementation of the generalized logistic curve in the commons, under
org.apache.commons.math3.analysis.function.Logistic.Parametric. The
'ParametricUnivariateFunction' interface looks like it is intended for use with curve fitting algorithms,
via subclasses of org.apache.commons.math3.fitting.AbstractCurveFitter. But there isn't a fitter
implementation that accepts Logistic. If I understand correctly, I would have to write my own
curve fitter and have it extend AbstractCurveFitter.
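
To make that concrete, here is roughly what I imagine such a subclass would look like (untested; it assumes
AbstractCurveFitter's protected TheoreticalValuesFunction helper can wrap Logistic.Parametric, and it relies
on the default optimizer being Levenberg-Marquardt):

import java.util.Collection;

import org.apache.commons.math3.analysis.function.Logistic;
import org.apache.commons.math3.fitting.AbstractCurveFitter;
import org.apache.commons.math3.fitting.WeightedObservedPoint;
import org.apache.commons.math3.fitting.leastsquares.LeastSquaresBuilder;
import org.apache.commons.math3.fitting.leastsquares.LeastSquaresProblem;
import org.apache.commons.math3.linear.DiagonalMatrix;

public class LogisticCurveFitter extends AbstractCurveFitter {

    private final double[] initialGuess; // parameter order as per the Logistic.Parametric javadoc

    public LogisticCurveFitter(double[] initialGuess) {
        this.initialGuess = initialGuess.clone();
    }

    @Override
    protected LeastSquaresProblem getProblem(Collection<WeightedObservedPoint> points) {
        final double[] target = new double[points.size()];
        final double[] weights = new double[points.size()];
        int i = 0;
        for (WeightedObservedPoint p : points) {
            target[i] = p.getY();
            weights[i] = p.getWeight();
            i++;
        }
        // Wraps Logistic.Parametric so its value() and gradient() provide the
        // model function and Jacobian for the least-squares problem.
        AbstractCurveFitter.TheoreticalValuesFunction model =
                new AbstractCurveFitter.TheoreticalValuesFunction(new Logistic.Parametric(), points);

        return new LeastSquaresBuilder()
                .maxEvaluations(Integer.MAX_VALUE)
                .maxIterations(Integer.MAX_VALUE)
                .start(initialGuess)
                .target(target)
                .weight(new DiagonalMatrix(weights))
                .model(model.getModelFunction(), model.getModelFunctionJacobian())
                .build();
    }
}

If that is right, new LogisticCurveFitter(initialGuess).fit(observations) should return the six fitted parameters.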

Next, I considered following what's described on the Optimization package documentation
(http://commons.apache.org/proper/commons-math/userguide/optimization.html). The partial
derivatives for the generalized logistic function are listed on the Wikipedia page and so there's
nothing hard about coding them.

But I don't see how to code the partial derivatives in the required format. In the quadratic problem example
in the documentation, the Jacobian is computed in the following way:

            jacobian[i][0] = x.get(i) * x.get(i); // x^2
            jacobian[i][1] = x.get(i);            // x
            jacobian[i][2] = 1.0;                 // 1

But in my case, coding the partial derivatives also requires knowing the values of some of the other
parameters. For example, dY/dA = 1 - (1 + Q*e^(-B(x-M)))^(-1/v). I don't know how I can compute this
properly from just the observations I have.
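
Writing it down, perhaps the piece I am missing is that the optimizer passes its current parameter estimate
to the Jacobian callback on every iteration, so the other parameter values are available when each partial is
evaluated. A sketch of what I mean (untested; it assumes Logistic.Parametric's gradient() returns the partials
in the order the fitter expects, and xObs stands for my observed x values):

import org.apache.commons.math3.analysis.MultivariateMatrixFunction;
import org.apache.commons.math3.analysis.function.Logistic;

public class LogisticJacobian {

    // Builds a Jacobian callback over the observed x values; the optimizer
    // supplies its current parameter estimate on every evaluation.
    static MultivariateMatrixFunction jacobian(final double[] xObs) {
        final Logistic.Parametric logistic = new Logistic.Parametric();
        return new MultivariateMatrixFunction() {
            public double[][] value(double[] params) {
                double[][] j = new double[xObs.length][];
                for (int i = 0; i < xObs.length; i++) {
                    // Partials of the model w.r.t. each parameter, evaluated
                    // at x_i and at the current parameter estimate.
                    j[i] = logistic.gradient(xObs[i], params);
                }
                return j;
            }
        };
    }
}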

Is there a way of correctly representing this kind of function, perhaps by using the DerivativeStructure class?

Thanks,
-Michael
Gary Gregory | 23 Nov 01:30 2014

[ANNOUNCEMENT] Apache Commons CSV 1.1 Released

The Apache Commons CSV team is pleased to announce the 1.1 release!

The Apache Commons CSV library provides a simple interface for reading and
writing
CSV files of various types.
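
For example, reading a file with the default format takes only a few lines (a minimal illustrative
snippet; the file name and column indexes are placeholders):

import java.io.FileReader;
import java.io.Reader;

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVRecord;

public class ReadCsvExample {
    public static void main(String[] args) throws Exception {
        try (Reader in = new FileReader("example.csv")) {   // placeholder file
            // CSVFormat.DEFAULT.parse(Reader) returns a parser iterable over records.
            for (CSVRecord record : CSVFormat.DEFAULT.parse(in)) {
                System.out.println(record.get(0) + " -> " + record.get(1));
            }
        }
    }
}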

This is our second release.

Changes in this version include:

New features:
o [CSV-129] Add CSVFormat#with 0-arg methods matching boolean arg methods.
o [CSV-131] Save positions of records to enable random access. Thanks to
Holger Stratmann.
o [CSV-139] CSVPrinter.printRecord(ResultSet) with metadata.

Fixed Bugs:
o [CSV-140] QuoteMode.NON_NUMERIC doesn't work with
CSVPrinter.printRecords(ResultSet). Thanks to Damjan Jovanovic.
o [CSV-130] CSVFormat#withHeader doesn't work well with #printComment, add
withHeaderComments(String...). Thanks to Sergei Lebedev.
o [CSV-128] CSVFormat.EXCEL should ignore empty header names.
o [CSV-132] Incorrect Javadoc referencing org.apache.commons.csv.CSVFormat
withQuote(). Thanks to Sascha Szott.

Changes:
o [CSV-124] Improve toString() implementation of CSVRecord. Thanks to
Kalyan.
o [CSV-134] Unified parameter validation. Thanks to wu wen.

For complete information on Commons CSV, including instructions on how to
submit bug reports, patches, or suggestions for improvement, see the Apache
Commons CSV website:

Site: http://commons.apache.org/proper/commons-csv/

Download: http://commons.apache.org/csv/download_codec.cgi

Happy Coding!
Happy Thanksgiving!
Gary Gregory on behalf of the Apache Commons CSV team

--

-- 
E-Mail: garydgregory <at> gmail.com | ggregory <at> apache.org
Java Persistence with Hibernate, Second Edition
<http://www.manning.com/bauer3/>
JUnit in Action, Second Edition <http://www.manning.com/tahchiev/>
Spring Batch in Action <http://www.manning.com/templier/>
Blog: http://garygregory.wordpress.com
Home: http://garygregory.com/
Tweet! http://twitter.com/GaryGregory
Kristian Rink | 21 Nov 16:16 2014

[fileupload] multipart parsing failed: null?

Folks;

I am trying to track down a strange error that frequently occurs on our
infrastructure when users are, well, uploading files using a
multipart/form-data upload form. The trace is below. Worth noting:

- The application runs in an embedded jetty (HTTP) behind an apache2 
mod_proxy reverse proxy (HTTPS).

- The issue does not appear consistently; I did quite a bit of uploading
myself today and never managed to reproduce this behaviour, even while
uploading loads of files, large files, and both together.

- It does not seem to be generally tied to a particular browser; the 
users associated with these messages use Firefox, MSIE or Chrome.

- Looking at the network traffic (and the transfer monitor in the app), it
_seems_ that all data sent with the request was successfully transmitted,
yet parsing the request ultimately fails.

- On _some_ clients in such situations, users reported that the upload was
cancelled with a "connection reset by peer" error, even though I do not see
any reason for that in our mod_proxy server log.

So far I feel a bit clueless about where to look next. Does anyone out
there have an idea what could possibly be going wrong?

Thanks in advance for any hints on that,
Kristian

Trace:

[...]

org.apache.commons.fileupload.FileUploadBase$IOFileUploadException: Processing of multipart/form-data request failed. null
        at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:351) ~[commons-fileupload-1.3.1.jar:1.3.1]
        at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:115) ~[commons-fileupload-1.3.1.jar:1.3.1]
[...]
Caused by: org.eclipse.jetty.io.EofException: null
        at org.eclipse.jetty.server.HttpInput$3.noContent(HttpInput.java:465) ~[jetty-server-9.1.4.v20140401.jar:9.1.4.v20140401]
        at org.eclipse.jetty.server.HttpInput.read(HttpInput.java:125) ~[jetty-server-9.1.4.v20140401.jar:9.1.4.v20140401]
        at org.apache.commons.fileupload.MultipartStream$ItemInputStream.makeAvailable(MultipartStream.java:999) ~[commons-fileupload-1.3.1.jar:1.3.1]
        at org.apache.commons.fileupload.MultipartStream$ItemInputStream.read(MultipartStream.java:903) ~[commons-fileupload-1.3.1.jar:1.3.1]
        at java.io.InputStream.read(InputStream.java:101) ~[na:1.7.0_51]
        at org.apache.commons.fileupload.util.Streams.copy(Streams.java:100) ~[commons-fileupload-1.3.1.jar:1.3.1]
        at org.apache.commons.fileupload.util.Streams.copy(Streams.java:70) ~[commons-fileupload-1.3.1.jar:1.3.1]
        at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:347) ~[commons-fileupload-1.3.1.jar:1.3.1]
[...]
Thib Guicherd-Callin | 18 Nov 00:45 2014

[codec] Codec 1.10 requires Java 6 or Java 7?


The 'Releases' paragraph of the front page:

http://commons.apache.org/proper/commons-codec/index.html

says "Codec 1.10 (mirrors) requires Java 1.6". But the release notes for 
1.10:

http://commons.apache.org/proper/commons-codec/changes-report.html#a1.10

list JIRA issue CODEC-178:

https://issues.apache.org/jira/browse/CODEC-178

as a bug fix, and the title "Deprecate Charsets Charset constants in 
favor of Java 7's java.nio.charset.StandardCharsets" could be construed 
to mean Java 7 is required. So, is Java 7 required, or does the 1.10 fix 
for CODEC-178 work with Java 6?
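
To make the question concrete, these are the two spellings in play; the 
first is the constant that 1.10 deprecates, the second is the Java 7 
replacement the javadoc points to:

import java.nio.charset.Charset;

public class CharsetChoice {
    // Deprecated as of Codec 1.10 (CODEC-178):
    static final Charset VIA_CODEC = org.apache.commons.codec.Charsets.UTF_8;

    // The recommended replacement, only available on Java 7 and later:
    static final Charset VIA_JDK = java.nio.charset.StandardCharsets.UTF_8;
}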

Thib
Syed Mudassir | 14 Nov 19:09 2014

Setting FtpFile Permissions

Hello,
  When I connect to an FTP server, I get access to a remote file through the FTPFile class.
  I want to set POSIX file permissions on this file, such as rw-r--r--.  I tried FTPFile's setPermission(), but to my surprise there is no change on the remote side.
  I have reverted to FTPClient.sendSiteCommand(), but I am afraid this could be platform-dependent.
  What is the right way to set permissions in a platform-independent way?
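
  For reference, the SITE-based workaround I am currently using looks roughly like this (the "CHMOD 644" syntax is whatever the particular server accepts, which is exactly my portability concern):

import org.apache.commons.net.ftp.FTPClient;

public class SitePermissionsExample {
    public static void main(String[] args) throws Exception {
        FTPClient client = new FTPClient();
        client.connect("ftp.example.com");       // placeholder host
        client.login("user", "password");        // placeholder credentials

        // Sends "SITE CHMOD 644 /path/to/file"; works only if the server
        // implements the CHMOD site command (common on Unix FTP daemons,
        // but not guaranteed anywhere).
        boolean accepted = client.sendSiteCommand("CHMOD 644 /path/to/file");
        System.out.println("SITE CHMOD accepted: " + accepted);

        client.logout();
        client.disconnect();
    }
}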
--
Duncan Jones | 14 Nov 12:56 2014

Re: [JCS] disk persistence - same conf file, different results

Hi João,

This is probably best addressed to the Commons user mailing list. I've
BCC'd the issues list so people know where this email thread has moved to.
Please also prefix the subject line with the Commons component (in
this case [JCS], which I've done for you).

Duncan

On 14 November 2014 11:25, joao tiago a. m. viegas <jtviegas <at> gmail.com> wrote:
> hello,
>
> I've been testing JCS. I want a cache that lets me save all the elements
> when I shut down gracefully, so that when I later create a new instance of
> the same cache I can get back the same objects I had in the first
> incarnation of that cache.
> I wrote this test using two identical configurations, but for some reason
> I'm getting different results.
> Could you please help me understand why?
>
> tests are:
>
> private static final String[] configs = new String[]
>         {"ObjectPersistingInMemoryAndDisk_x1.ccf",
>          "ObjectPersistingInMemoryAndDisk_x2.ccf"};
>
> private static final String[] expecteds = new String[] {null, null};
>
> private static final String region = "OUR_REGION";
> private static final String key = "key", value = "value";
>
> .....
> .....
>
> @Test
> public void testObjectPersistingInMemoryAndDisk_1() throws Exception {
>     performObjectPersistingInMemoryAndDisk(JCS_CONFIG_DIR + configs[0],
>             expecteds[0]);
> }
>
> @Test
> public void testObjectPersistingInMemoryAndDisk_2() throws Exception {
>     performObjectPersistingInMemoryAndDisk(JCS_CONFIG_DIR + configs[1],
>             expecteds[1]);
> }
>
> private void performObjectPersistingInMemoryAndDisk(String configFile,
>         String expected) throws Exception {
>
>     System.out.println("-----------------------------------------------------------");
>     System.out.println("testing with config file:" + configFile);
>     JCS.setConfigFilename(configFile);
>     JCS cache = JCS.getInstance(region);
>     cache.put(key, value);
>     Thread.sleep(5000);
>     Assert.assertNotNull(cache.get(key));
>     Assert.assertEquals(value, cache.get(key));
>
>     IStats stats = cache.getStatistics().getAuxiliaryCacheStats()[0];
>     cache.freeMemoryElements(Integer.parseInt(stats.getStatElements()[0].getData()));
>     Thread.sleep(5000);
>     Assert.assertNotNull(cache.get(key));
>     Assert.assertEquals(value, cache.get(key));
>     System.out.println(cache.getStats());
>     CompositeCacheManager.getInstance().shutDown();
>
>     Thread.sleep(5000);
>
>     JCS.setConfigFilename(configFile);
>     cache = JCS.getInstance(region);
>     Assert.assertEquals(expected, cache.get(key));
>
>     System.out.println(cache.getStats());
>     cache.clear();
>     cache.dispose();
>     CompositeCacheManager.getInstance().shutDown();
> }
>
>
> ...the confs are:
>
> -- ObjectPersistingInMemoryAndDisk_x1.ccf --
> # DEFAULT CACHE REGION
> jcs.default=DISK_REGION
> jcs.default.cacheattributes=org.apache.jcs.engine.CompositeCacheAttributes
> jcs.default.cacheattributes.MaxObjects=1000
> jcs.default.cacheattributes.MemoryCacheName=org.apache.jcs.engine.memory.lru.LRUMemoryCache
> #jcs.default.cacheattributes.DiskUsagePatternName=UPDATE
>
> jcs.region.OUR_REGION=DISK_REGION
> jcs.region.OUR_REGION.cacheattributes=org.apache.jcs.engine.CompositeCacheAttributes
> jcs.region.OUR_REGION.cacheattributes.MaxObjects=1000
> jcs.region.OUR_REGION.cacheattributes.MemoryCacheName=org.apache.jcs.engine.memory.lru.LRUMemoryCache
> jcs.region.OUR_REGION.cacheattributes.UseMemoryShrinker=true
> jcs.region.OUR_REGION.cacheattributes.MaxMemoryIdleTimeSeconds=3600
> jcs.region.OUR_REGION.cacheattributes.ShrinkerIntervalSeconds=60
> jcs.region.OUR_REGION.cacheattributes.MaxSpoolPerRun=500
> jcs.region.OUR_REGION.elementattributes=org.apache.jcs.engine.ElementAttributes
> jcs.region.OUR_REGION.elementattributes.IsEternal=false
>
> jcs.auxiliary.DISK_REGION=org.apache.jcs.auxiliary.disk.indexed.IndexedDiskCacheFactory
> jcs.auxiliary.DISK_REGION.attributes=org.apache.jcs.auxiliary.disk.indexed.IndexedDiskCacheAttributes
> jcs.auxiliary.DISK_REGION.attributes.DiskPath=/tmp/jcs/cache
> jcs.auxiliary.DISK_REGION.attributes.maxKeySize=100000
>
>
> -- ObjectPersistingInMemoryAndDisk_x2.ccf --
> # DEFAULT CACHE REGION
> jcs.default=DISK_REGION
> jcs.default.cacheattributes=org.apache.jcs.engine.CompositeCacheAttributes
> jcs.default.cacheattributes.MaxObjects=1000
> jcs.default.cacheattributes.MemoryCacheName=org.apache.jcs.engine.memory.lru.LRUMemoryCache
> #jcs.default.cacheattributes.DiskUsagePatternName=UPDATE
>
> jcs.region.OUR_REGION=DISK_REGION
> jcs.region.OUR_REGION.cacheattributes=org.apache.jcs.engine.CompositeCacheAttributes
> jcs.region.OUR_REGION.cacheattributes.MaxObjects=1000
> jcs.region.OUR_REGION.cacheattributes.MemoryCacheName=org.apache.jcs.engine.memory.lru.LRUMemoryCache
> jcs.region.OUR_REGION.cacheattributes.UseMemoryShrinker=true
> jcs.region.OUR_REGION.cacheattributes.MaxMemoryIdleTimeSeconds=3600
> jcs.region.OUR_REGION.cacheattributes.ShrinkerIntervalSeconds=60
> jcs.region.OUR_REGION.cacheattributes.MaxSpoolPerRun=500
> jcs.region.OUR_REGION.elementattributes=org.apache.jcs.engine.ElementAttributes
> jcs.region.OUR_REGION.elementattributes.IsEternal=false
>
> jcs.auxiliary.DISK_REGION=org.apache.jcs.auxiliary.disk.indexed.IndexedDiskCacheFactory
> jcs.auxiliary.DISK_REGION.attributes=org.apache.jcs.auxiliary.disk.indexed.IndexedDiskCacheAttributes
> jcs.auxiliary.DISK_REGION.attributes.DiskPath=/tmp/jcs/cache
> jcs.auxiliary.DISK_REGION.attributes.maxKeySize=100000
>
>
>
>
> os melhores cumprimentos / best regards / Mit freundlichen Grüssen / Saludos
>
> joão tiago viegas
