Explodey | 30 Jul 00:33 2014

Mixer: Speeding/slowing audio

I've been learning SDL 1.2, and I recently started learning the SDL_mixer functions, so I've been adding music and sounds to my games. I'm working on a game where I need to play a sound at different sample rates, so that the speed and pitch of the sound are altered. I was planning to just use Pro Tools to create several versions of the same sound, played at different speeds, but then it occurred to me that my game might use less RAM if I could just change the speed of the audio on the fly. Is there a function in the mixer that would let me do that?

I don't need to do anything fancy, like time-shifting without changing the pitch, or ramping up the speed as the audio plays. I just want to take a 22.05kHz wav file and play it at maybe 28kHz or so. As some of you already know, I'm very new to programming, so if this is something that would be very complicated, I'll probably just make some new versions of the wav file instead. But I figured there might be a simple way to do it.
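For what it's worth, SDL_mixer 1.2 doesn't expose a per-channel playback-rate control as far as I know, so one option is to resample the chunk's buffer yourself once at load time, which still uses less RAM than shipping several WAV files. A minimal nearest-neighbor sketch (plain C; mono signed 16-bit samples and the function name are my own assumptions, not SDL API):

```c
#include <stddef.h>
#include <stdint.h>

/* Nearest-neighbor resample: reads `in_len` mono 16-bit samples and writes
 * them as if played back `rate` times faster (rate > 1.0 raises the pitch
 * and shortens the sound). Returns the number of samples written to `out`,
 * which must hold at least (size_t)(in_len / rate) + 1 samples. */
size_t resample_speed(const int16_t *in, size_t in_len,
                      int16_t *out, double rate)
{
    size_t n = 0;
    double pos = 0.0;
    while ((size_t)pos < in_len) {
        out[n++] = in[(size_t)pos];
        pos += rate; /* rate > 1 skips input samples, rate < 1 repeats them */
    }
    return n;
}
```

Resampling a 22.05 kHz chunk with rate = 28000.0 / 22050.0 would give roughly the 28 kHz-sounding version. Nearest-neighbor adds some aliasing; linear interpolation between adjacent samples is a small step up if it sounds rough.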

SDL mailing list
SDL <at> lists.libsdl.org
Werner Stoop | 29 Jul 18:51 2014

NVidia OpenGL performance problems

Dear SDL Users,

I have written an OpenGL application in SDL 2.0 and the performance is decent (+/- 50fps) on my development machine (Windows 7, with a Radeon graphics card and Catalyst drivers).

The problem is that the performance is almost non-existent (1/3 fps) on all of the Windows machines with NVidia graphics cards that I've tested it on. It is almost as if the application is being rendered entirely in software. I've tried it on 3 separate machines, all with the same results.

I've played around with the `SDL_GL_SetAttribute()` function, like trying to set `SDL_GL_CONTEXT_PROFILE_MASK` to `SDL_GL_CONTEXT_PROFILE_COMPATIBILITY` and setting the `SDL_GL_CONTEXT_MAJOR_VERSION` and `SDL_GL_CONTEXT_MINOR_VERSION` flags to a variety of values, and calling `SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);` but none of them seems to make any difference.

On the NVidia machines, `glGetString(GL_VERSION)` reports OpenGL version 1.1.0 (whereas on my Radeon box it reports "4.1.10600 Compatibility Profile Context").
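For reference, an OpenGL version of 1.1.0 on Windows is the classic signature of Microsoft's built-in software implementation (its renderer string is "GDI Generic"), which is what you fall back to when no accelerated pixel format could be chosen. A quick sanity check at startup might look like this (the literal strings are the well-known values; the helper name is made up):

```c
#include <string.h>

/* Returns 1 if the reported GL strings look like Microsoft's software
 * fallback (OpenGL 1.1, "GDI Generic" renderer), 0 otherwise. Pass the
 * results of glGetString(GL_VERSION) and glGetString(GL_RENDERER). */
int looks_like_software_gl(const char *version, const char *renderer)
{
    if (renderer && strstr(renderer, "GDI Generic"))
        return 1;
    if (version && strncmp(version, "1.1", 3) == 0)
        return 1;
    return 0;
}
```

If this triggers, the usual suspect is the requested attribute combination (e.g. an unusual depth/stencil/alpha size) not matching any accelerated format; calling SDL_GL_GetAttribute() after context creation shows which requested attributes were actually honored.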

Perhaps I should mention that it still uses some old OpenGL features like `glBegin(); ...; glEnd();`. It turns out my OpenGL knowledge is severely outdated. The problem is that my deadline is approaching, and since this application is more of a demo, I don't have the time or inclination to rewrite it using modern OpenGL. I also didn't think it would be a problem, because OpenGL implementations should be backwards compatible. Shouldn't they?

So, my questions:
  • Has anyone else experienced problems like this?
  • Is there an explanation for the problem? How can I resolve it?
On the surface it looks like a driver issue, but the machines I've been testing on are used for gaming and CAD, and those applications seem fine. GL Extensions Viewer on the one machine I've tested reports:
Renderer: GeForce GTX 550 Ti/PCIe/SSE2
Vendor: NVIDIA Corporation
Memory: 0 MB
Version: 4.4.0
Shading language version: 4.40 NVIDIA via Cg compiler

So it would seem to rule out an outdated driver.

It seems that my problem is related to these issues: http://stackoverflow.com/q/15183930/115589 and http://gamedev.stackexchange.com/q/11533, but none of those questions seem to have a definitive answer.

Thank you,
Werner Stoop
Sik the hedgehog | 27 Jul 12:31 2014

Candidate list with ibus in fullscreen mode

Making a separate thread since this is a different bug.

The issue is simple: if the window is fullscreen (even
SDL_WINDOW_FULLSCREEN_DESKTOP), the candidate list won't be visible.
The reason for this is that the fullscreen window ends up on top of
it. I was trying to look for a workaround and eventually found this:


It appears that fullscreen windows that have focus will be on top of
everything else no matter what, even on top of always-on-top windows.
So it seems there isn't any proper way to fix this except messing with
the protocol (*again*, as if we didn't have enough fullscreen problems
already).

For more realistic fixes, I can only think of the following two:

1) Have a way to take control over the candidate list so the program
itself draws it instead of ibus. This is the hard way and requires an
API change, and I'm not sure how much ibus allows here. It's bound to
be the most stable option though, if it works.

2) Temporarily remove the _NET_WM_STATE_FULLSCREEN flag when entering
text input mode (and restore it when exiting). I have absolutely no
idea how stable this would be, but it should allow the candidate list
window to float on top of the SDL window. The upside is that this
would work as-is if implemented (no API changes, no need to modify
ibus).

Any comments?
Ryan C. Gordon | 27 Jul 06:25 2014


As of tonight in Mercurial, you can now "push" audio to an SDL audio 
device instead of using a callback.

Open the audio device like you normally would, but specify a NULL 
callback (this would previously cause SDL_OpenAudio*() to fail).
Then you may call:

    SDL_QueueAudio(myDeviceId, myAudioData, numberOfBytes);

Queue as much or as little as you want; SDL will drain it to the audio 
device as necessary. If you provide too little, SDL will feed silence to 
the device until you queue more data.

To see how much is waiting to be fed to the device still:

    bytesRemaining = SDL_GetQueuedAudioSize(myDeviceId);

Between these two calls, you can manage the audio device without the 
control inversion inherent in the usual callback system. This new API is 
built on top of the old one (internally, SDL provides its own callback 
to feed from data you have queued), but might be simpler to use, and 
easier to wrap one's head around...and since it doesn't need to operate 
in a separate thread, one doesn't have to think about locking at the 
application level.
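As a worked example of the push model, here is a tiny tone generator plus the queueing loop it would feed (an integer square wave for brevity; `dev`, `buf`, CHUNK and the 44100 Hz rate in the comment are placeholders for your own setup, not part of the API):

```c
#include <stddef.h>
#include <stdint.h>

/* Fill `buf` with `len` mono signed 16-bit samples of a square wave.
 * `*pos` carries the phase (in samples) across calls so consecutive
 * buffers join seamlessly; `period` is the wave period in samples. */
void fill_square(int16_t *buf, size_t len, size_t *pos,
                 size_t period, int16_t amp)
{
    for (size_t i = 0; i < len; i++) {
        buf[i] = (*pos < period / 2) ? amp : (int16_t)-amp;
        *pos = (*pos + 1) % period;
    }
}

/* The push loop itself would then be something like:
 *
 *     if (SDL_GetQueuedAudioSize(dev) < 44100 / 4 * sizeof(int16_t)) {
 *         fill_square(buf, CHUNK, &pos, 100, 8000);
 *         SDL_QueueAudio(dev, buf, CHUNK * sizeof(int16_t));
 *     }
 *
 * i.e. keep roughly a quarter second of audio queued ahead of the device.
 */
```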

The patch (with the documentation in SDL_audio.h) is here:


Sik the hedgehog | 27 Jul 05:40 2014

Candidate list appearing at the wrong coordinates

Having problems implementing IME support in my game. While I know
support for it in Linux is already flaky (candidate list will get
hidden in fullscreen mode), I thought I should still work on the parts
that do work (i.e. windowed mode). So this happened:


The code responsible for this mess:
      int list_x = x;
      int list_y = FILE_NAME_Y2;
      virtual_to_real_coord(&list_x, &list_y);

      SDL_Rect list_rect;
      list_rect.x = list_x;
      list_rect.y = list_y;
      list_rect.w = 1;
      list_rect.h = 1;


Yes, I did think that maybe coordinates are being calculated wrong, so
I checked where the list ended up in the screenshot to make sure. It
seems like the calculations are correct, but the list position is
relative to the window frame, not the inside (so the main reason it
appears higher than it should is that the titlebar is heavily skewing
the position - if the frame was thicker it would also appear
noticeably shifted to the left).

So the solution seems obvious: offset the coordinates so they're
relative to the window contents and not the frame. Should SDL be doing
it itself or should I be doing it on my own? (probably the former) If
the latter, how do I get the window frame size? (in a
platform-independent way, that is)
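For what it's worth, whichever side ends up responsible, the correction itself is just offsetting the point by the frame's left/top extents, which on X11 the window manager reports via the _NET_FRAME_EXTENTS property (newer SDL2 versions also expose these through SDL_GetWindowBordersSize()). A sketch, with the struct and helper names made up for illustration:

```c
/* Frame extents as reported by the window manager, e.g. the four values
 * of the X11 _NET_FRAME_EXTENTS property: left, right, top, bottom. */
typedef struct {
    int left, right, top, bottom;
} frame_extents_t;

/* Convert a point given relative to the client area (the drawable inside)
 * into the same point relative to the window frame's top-left corner. */
void client_to_frame(const frame_extents_t *fe, int *x, int *y)
{
    *x += fe->left; /* the left border shifts the client origin right */
    *y += fe->top;  /* the titlebar shifts the client origin down */
}
```

With a typical decoration (say 4 px borders and a 24 px titlebar), this is exactly the titlebar-height shift described above.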
ShiroAisu | 26 Jul 14:11 2014

How many threads / mutexes can we use?


I'm creating a heavily threaded game and I wanted to know: is there a limit on the number of mutexes and threads we can use / define / start?

Thank you for your attention.
Zammalad | 26 Jul 12:37 2014

Safe/acceptable to call SDL_SetTextureColorMod a lot?

As a simple way to render multiple textures that are identical apart from color, I load a plain white circle into an SDL_Texture and then just call SDL_SetTextureColorMod(), giving it the color I want to make the circle.

This all works fine if the textures are individual (Example 1) but if I am sharing the SDL_Texture so that multiple objects all reference it, it means that SDL_SetTextureColorMod() must be called every render frame before the object renders the texture since the color it gave last time may have been changed by another object (Example 2).

Is calling SDL_SetTextureColorMod() every render frame, for potentially quite a lot of objects sharing a texture, going to cause major performance issues?

The reason it is required is that the system is designed around shared-texture functionality with basic reference counting (I understand that there are probably better ways to do this using smart pointers, but that is not the topic for discussion here). Is it going to be better to just let each object have its own copy of the SDL_Texture, so it only has to set the color once (or whenever it needs to change), rather than every render frame?

Example 1:

SDL_Texture* tex1;
SDL_Texture* tex2;
SDL_Texture* tex3;
// All 3 instances have their own SDL_Texture
MyObject A(tex1);
MyObject B(tex2);
MyObject C(tex3);
// A call to set the color of the texture is only required once for each class

Example 2:

SDL_Texture* tex;
// All 3 instances share the SDL_Texture
MyObject A(tex);
MyObject B(tex);
MyObject C(tex);
// A call to set the color of the texture must be made before rendering
// each object, to ensure that another object has not set a different
// color. E.g. if the draw order was A, B, C and each object used a
// different color, then before rendering B each frame we would need to
// set the color again, otherwise it would still have A's color, and the
// same goes for the others.

Edit: This would also extend to SDL_SetTextureAlphaMod() and SDL_SetTextureBlendMode() or other similar functions
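For what it's worth, SDL_SetTextureColorMod() mostly just records state on the texture that the renderer reads at draw time, so per-frame calls are normally cheap. If the call volume still worries you, a last-value cache that skips redundant sets is easy to add; the sketch below uses a counting stub in place of the real SDL call so the effect is visible:

```c
#include <stdint.h>

/* Stand-in for SDL_SetTextureColorMod(); in real code this would be the
 * SDL call. The counter just makes the caching effect observable. */
int g_set_calls = 0;
void set_color_mod_stub(uint8_t r, uint8_t g, uint8_t b)
{
    (void)r; (void)g; (void)b;
    g_set_calls++;
}

/* Last color mod issued for one shared texture. */
typedef struct {
    uint8_t r, g, b;
    int valid; /* 0 until the first set */
} color_cache_t;

/* Issue the set only when the requested color differs from the last one. */
void set_color_mod_cached(color_cache_t *c, uint8_t r, uint8_t g, uint8_t b)
{
    if (c->valid && c->r == r && c->g == g && c->b == b)
        return; /* same color as the previous draw: skip the call */
    set_color_mod_stub(r, g, b);
    c->r = r; c->g = g; c->b = b;
    c->valid = 1;
}
```

With draw order A, B, C each using a different color, the cache saves nothing (every set differs from the last), but whenever consecutive draws share a color it collapses them into one call.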

For reference I'm using SDL2.

The Z-Buffer
Alvin Beach | 25 Jul 20:11 2014

SDL_CreateWindowFrom and blocking SDL_CreateRenderer


I am working on updating an older application that uses SDL 1.2, where the main SDL_Surface is embedded into
another window. This works great with SDL 1.2.

However, I am having issues doing the same with SDL2, but with an SDL_Window (not an SDL_Surface). I call
SDL_CreateWindowFrom() once I have the window handle. It seems to work just fine, as I do get a non-NULL
pointer. However, calling SDL_CreateRenderer() just hangs. I've stepped through SDL_CreateRenderer() and it
appears to block here:



	/* Blocking wait for "UnmapNotify" event */
	X11_XIfEvent(display, &event, &isUnmapNotify, (XPointer)&data->xwindow);   /* <--- hangs here */

I have a suspicion that the X11 events are being consumed by the application toolkit (FLTK) and are not
reaching SDL2.

Does anyone have any experience with embedding a SDL2 SDL_Window inside of another window? Or any insights
on how it is done differently than SDL 1.2?
Any information or advice would be greatly appreciated. Thanks.


Alexey Petruchik | 23 Jul 21:01 2014


Here is a small patch that adds support for Android's SYSTEM_UI_FLAG_LOW_PROFILE, which makes games look more immersive.

Regards, Alexey.
# HG changeset patch
# User stopiccot <alexey.petruchik <at> gmail.com>
# Date 1406141828 -10800
# Node ID 0597ac969fec6c62baa0e4bc449b68f51b2d9e0e
# Parent  6d059ed9b6cafdeb276802426880715f2b296a7f
Hint for SYSTEM_UI_FLAG_LOW_PROFILE on android

diff --git a/android-project/src/org/libsdl/app/SDLActivity.java b/android-project/src/org/libsdl/app/SDLActivity.java
--- a/android-project/src/org/libsdl/app/SDLActivity.java
+++ b/android-project/src/org/libsdl/app/SDLActivity.java
@@ -85,8 +85,15 @@

         // Set up the surface
         mSurface = new SDLSurface(getApplication());
+        nativeAddHintCallback(SDL_HINT_ANDROID_USE_UI_LOW_PROFILE, new SDLHintCallback() {
+            @Override
+            public void callback(String name, String oldValue, String newValue) {
+                updateLowProfileSettings(newValue);
+            }
+        });

-        if(Build.VERSION.SDK_INT >= 12) {
+        if (Build.VERSION.SDK_INT >= 12) {
             mJoystickHandler = new SDLJoystickHandler_API12();
         }
         else {
@@ -112,9 +119,9 @@
         Log.v("SDL", "onResume()");
+        updateLowProfileSettings(nativeGetHint(SDL_HINT_ANDROID_USE_UI_LOW_PROFILE));

     @Override
     public void onWindowFocusChanged(boolean hasFocus) {
@@ -202,6 +209,19 @@

+    void updateLowProfileSettings(String value) {
+        if ("1".equals(value)) {
+            runOnUiThread(new Runnable() {
+                @Override
+                public void run() {
+                    getWindow().getDecorView().setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
+                }
+            });
+        }
+    }

     // Messages from the SDLMain thread
     static final int COMMAND_CHANGE_TITLE = 1;
@@ -298,7 +318,14 @@
                                                int is_accelerometer, int nbuttons, 
                                                int naxes, int nhats, int nballs);
     public static native int nativeRemoveJoystick(int device_id);
+    interface SDLHintCallback {
+        void callback(String name, String oldValue, String newValue);
+    }
     public static native String nativeGetHint(String name);
+    public static native void nativeAddHintCallback(String name, SDLHintCallback callback);

      * This method is called by SDL using JNI.
diff --git a/include/SDL_hints.h b/include/SDL_hints.h
--- a/include/SDL_hints.h
+++ b/include/SDL_hints.h
@@ -477,6 +477,11 @@

+ /**
+ * \brief If set to 1, sets UI_LOW_PROFILE setting for SDLActivity
+ */

  *  \brief  An enumeration of hint priorities
diff --git a/src/core/android/SDL_android.c b/src/core/android/SDL_android.c
--- a/src/core/android/SDL_android.c
+++ b/src/core/android/SDL_android.c
@@ -396,6 +396,32 @@
     return result;

+void Android_JNI_HintCallback(void *userdata, const char *name, const char *oldValue, const char *newValue) {
+    JNIEnv *env = Android_JNI_GetEnv();
+    jobject callback = (jobject)userdata;
+    jclass cls = (*env)->GetObjectClass(env, callback);
+    jmethodID method = (*env)->GetMethodID(env, cls, "callback", "(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)V");
+    jstring javaName     = (*env)->NewStringUTF(env, name);
+    jstring javaOldValue = (*env)->NewStringUTF(env, oldValue);
+    jstring javaNewValue = (*env)->NewStringUTF(env, newValue);
+    (*env)->CallVoidMethod(env, callback, method, javaName, javaOldValue, javaNewValue);
+    (*env)->DeleteLocalRef(env, javaName);
+    (*env)->DeleteLocalRef(env, javaOldValue);
+    (*env)->DeleteLocalRef(env, javaNewValue);
+}
+
+void Java_org_libsdl_app_SDLActivity_nativeAddHintCallback(JNIEnv* env, jclass cls, jstring name, jobject callback) {
+    const char *utfname = (*env)->GetStringUTFChars(env, name, NULL);
+    SDL_AddHintCallback(utfname, Android_JNI_HintCallback, (*env)->NewGlobalRef(env, callback));
+    (*env)->ReleaseStringUTFChars(env, name, utfname);
+}
              Functions called by SDL into Java
mr_tawan | 23 Jul 18:33 2014

Re: load bmp returns NULL

DAVIDlavi20091997 wrote:
mr naith and mr tawan tanke you very much to help me dill with this problam the problam just was that the i didnt crated the renderr befure the sdl texture and that i didnt copied the image to the main cpp dirctaury, imn very surry for wasting your time on such an abstract and Trivial matter,tanke you very very much

I know you're quite excited right now, but you might want to consider asking someone to spell-check first ^^' It's kinda hard to read.

Glad you got it working (I didn't really help much after all; all the credit goes to Naith).
Naith | 23 Jul 18:25 2014

Re: load bmp returns NULL

And what did you do to make it work?