Crash when external screen is disconnected while app in background

brunchboy

Greetings,

I am working on an app that renders video to an external display, and am finding this a truly delightful framework to work with. I have run into a crash, however, that is beyond my ability to diagnose at this stage. I have poked around in the source some, and installed appledoc and generated the documentation in the hope that it might set me in the right direction, but I think it is time to ask for help.

The problem scenario is as follows. If I enable the external screen (I am using AirPlay mirroring to do that), my app detects it and creates a window and GPUImageView to start rendering to it. However, if I suspend the app and disconnect the external screen by turning off AirPlay while outside the app, I get the following crash as soon as I return to the app. The first “Screen disconnected” line below is my going-into-the-background code protectively detaching from the screen; then there is a time gap while I navigate over to the AirPlay UI, turn off mirroring, and return to the app. The second “Screen disconnected” line is iOS telling me the screen is actually gone, after which I try turning video capture back on and the crash happens:

2012-05-14 15:40:34.027 ClubCam[5059:707] Entering background
2012-05-14 15:40:34.265 ClubCam[5059:707] Screen disconnected
2012-05-14 15:40:50.412 ClubCam[5059:707] Screen disconnected
2012-05-14 15:40:50.532 ClubCam[5059:707] *** Assertion failure in -[GPUImageView createDisplayFramebuffer], /Users/jim/Dropbox/Projects/ClubCam/ClubCam/GPUImage/framework/Source/GPUImageView.m:161
2012-05-14 15:40:50.537 ClubCam[5059:707] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Failure with display framebuffer generation'
*** First throw call stack:
(0x358ec88f 0x37c93259 0x358ec789 0x353d43a3 0x84cb 0x823f 0x3539713f 0x35396da5 0x3536d997 0x333fd269 0x353fa4ff 0x358b8547 0x35844097 0x3536e3eb 0x333fc51b 0x33320b03 0x33320567 0x3331ff3b 0x374df22b 0x358c0523 0x358c04c5 0x358bf313 0x358424a5 0x3584236d 0x374de439 0x3334ecd5 0x4047 0x3fec)
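
For reference, the setup that runs when the external screen appears looks roughly like this. It is a condensed sketch of my code (in the real app the window and GPUImageView live in a separate view controller, externalView, as in the snippets further down), but the notification names and UIKit calls are the real ones:

    // Registering for screen connect/disconnect notifications at startup:
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(screenDidConnect:)
                                                 name:UIScreenDidConnectNotification
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(screenDidDisconnect:)
                                                 name:UIScreenDidDisconnectNotification
                                               object:nil];

    - (void)screenDidConnect:(NSNotification *)notification
    {
        // Assume the external display is the last screen in the list.
        UIScreen *externalScreen = [[UIScreen screens] lastObject];

        self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
        self.externalWindow.screen = externalScreen;

        GPUImageView *externalImageView = [[GPUImageView alloc] initWithFrame:externalScreen.bounds];
        [self.externalWindow addSubview:externalImageView];

        // Hook the external view into the GPUImage filter chain ("filter" is my own object).
        [filter addTarget:externalImageView];

        self.externalWindow.hidden = NO;
    }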

It looks like the crash comes because the GPU pipeline is trying to re-establish itself, and of course the external screen is gone. I have tried various things to shut down the rendering to the external screen when going into the background, but they don’t seem to do the trick:

    [videoCamera stopCameraCapture];
    [self screenDidDisconnect:nil];

Within screenDidDisconnect, I remove the GPUImageView associated with the external screen from the filter chain:

    [filter removeTarget:externalView.imageView];
    [externalView shutDown];
    externalView = nil;

And within the external view controller’s shutDown method, I hide the window, try detaching it from the external screen, and nil out my references to it and the GPUImageView:

    self.externalWindow.hidden = YES;
    self.externalWindow.screen = [[UIScreen screens] objectAtIndex:0];
    self.externalScreen = nil;
    self.externalWindow = nil;

Nonetheless, as soon as the app leaves the background with the screen detached and I call [videoCamera startCameraCapture], the above crash takes place.

Also perhaps relevant: if I do not actually background the app all the way to the Springboard screen, but simply double-tap home to access the AirPlay interface, the crash does not happen, and I am able to add and remove the external screen quite happily and repeatedly.

So how can I avoid this? Is there a need/way to flush the rendering pipeline upon exit? Or should I just discard the entire old pipeline and allocate a new GPUImageVideoCamera whenever I come back into the foreground? I don’t know enough about what is going on “under the covers” to understand the best practice here.

Brad Larson

That framebuffer failure crash generally occurs when you try to create a framebuffer of an illegal size, usually (0,0). Something here must be causing the pipeline to try to recreate the framebuffer at a null size, but I'm not sure what's triggering that. I would have thought you'd be safe disabling camera capture when the app goes to the background, so that no frames are sent into the pipeline when you come back, but that doesn't seem to be working.

Looking at it, the assertion is in your GPUImageView, and I see that I recreate the display framebuffer when the view is resized. I wonder whether, if you logged the view's size in the -observeValueForKeyPath:ofObject:change:context: method there, you'd see the size go to zero when the external display is disconnected. If so, you might be able to work around this by altering GPUImageView not to destroy and recreate the framebuffer when the view size is set to 0,0.
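
Something like this near the top of that method would both log the size and skip the rebuild when the size collapses to zero. This is just a sketch of the idea, typed here rather than tested against the current code, so the surrounding details in GPUImageView.m will differ:

    // In -observeValueForKeyPath:ofObject:change:context: in GPUImageView.m,
    // before the framebuffer is destroyed and recreated:
    CGSize newViewSize = self.bounds.size;
    NSLog(@"GPUImageView resized to %@", NSStringFromCGSize(newViewSize));

    if (newViewSize.width < 1.0 || newViewSize.height < 1.0)
    {
        // A 0,0 size is what I'd expect when the external display goes away;
        // don't tear down and rebuild the framebuffer for it.
        return;
    }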

brunchboy

Right, since posting this I discovered that the crash does not occur if I don’t re-enable camera capture when the app returns from the background (but of course that means the app stops being useful, so it is not a solution). This makes me wonder whether that GPUImageView is somehow not being successfully removed from the pipeline. Is there an easy way to test that?
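
One quick-and-dirty check I am considering, assuming GPUImageOutput exposes its target list (I believe there is a targets accessor, but I would need to double-check the header): log the filter's targets before suspending and again after resuming, and see whether the external GPUImageView is still in there.

    // Untested sketch; 'targets' is assumed to be an accessor on GPUImageOutput
    // that returns the filter's current downstream targets.
    NSLog(@"Filter targets: %@", [filter targets]);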

brunchboy

Ah hah. It was my own fault, of course. I put some breakpoints in GPUImageOutput’s addTarget: and removeTarget: methods and watched what was going on, and discovered that I had been careless: two different notifications would each cause a window to be allocated for the external screen and added as a target for the filtered video, and both were unexpectedly being received. So I was ending up with an extra window in the pipeline, and only getting rid of one of them when the application was suspended later.

So I will tighten up my code and expect everything to work just fine. It will also be doing half as much work, with only the one window being rendered to.
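
Concretely, the tightening up just amounts to making the connection handling idempotent; roughly this (sketched, my actual handlers are a bit more involved):

    - (void)screenDidConnect:(NSNotification *)notification
    {
        if (externalView != nil)
        {
            // A window and GPUImageView already exist for the external screen;
            // don't create and attach a second one.
            NSLog(@"External screen already set up; ignoring duplicate notification");
            return;
        }

        // ... existing window/GPUImageView creation and [filter addTarget:...] ...
    }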

Having the source for the library is so helpful, and this exercise helped me learn a bit more about how it actually works.

Brad Larson

Cool. Good to hear that you were able to get to the bottom of it. I know someone else here who was also trying to filter video to an AirPlay output, so if there was a bug with that they'd want to know about it too.

If you do run into any bugs with the framework when doing this (I haven't done much testing yet with the external screen outputs), please file them on the Issues tracker on the project GitHub page. That's worked pretty well as a to-do list for fixing the issues people are running into in the wild.

brunchboy

I may well be the same person you’re thinking of; I decided to move my questions to this forum because they are more likely to help others using the framework in the future than if I kept them on the course mailing list. But yes, if I find any actual issues, I will certainly file them. And if I add anything that might be useful as part of the framework, I will let you know and perhaps submit a pull request, if I can figure out that side of git. Like another forum user, I might end up finding a use for looped video sources. And I have a glimmer in my eye about animated sprite overlays… Someday. But things are working great for now, and I am excited and having fun!
