Strange Canny filter output

jeff

I get non-binary results from the Canny filter, i.e., I expect the output to be either 0x00 or 0xFF for the R, G, B values of each pixel. But after running the filter and inspecting the pixel values in memory, I see other values such as 0xEF, 0x08, 0x10, and 0xF7. I think a few bits somehow got flipped between channels, but I'm not sure.

Has anybody had this problem? I'm trying to figure out whether it's unique to my app, or a glitch in OpenGL or in Apple's OpenGL ES implementation.

If I modify the Canny filter to output the same value for every pixel, it works fine (e.g., gl_FragColor = vec4(1.0, 1.0, 0.0, 1.0)). But once I reintroduce the step function (in GPUImageThresholdEdgeDetection.m) or use an if statement to assign different values (either 0.0 or 1.0) to the components of gl_FragColor, I again get erroneous values in memory.
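
For reference, here's a stripped-down sketch of the kind of thresholding I mean (the uniform and varying names below are just illustrative, not the exact shader source in GPUImageThresholdEdgeDetection.m). Since step() can only return 0.0 or 1.0, I'd expect a shader like this to write nothing but pure black or pure white:

// Illustrative OpenGL ES 2.0 fragment shader; names are placeholders,
// not GPUImage's actual shader source.
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;
uniform lowp float threshold;

void main()
{
    // Edge magnitude from the previous pass, read from the red channel.
    lowp float magnitude = texture2D(inputImageTexture, textureCoordinate).r;

    // step() returns exactly 0.0 or 1.0, so every fragment should come out
    // as either pure black or pure white.
    lowp float edge = step(threshold, magnitude);

    gl_FragColor = vec4(vec3(edge), 1.0);
}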

Brad Larson

I just tried this with two different images, and both produced results as I would expect, with pixels either being (0,0,0) or (255,255,255). I couldn't find any mid-range values in either image.

Are you sure you aren't saving these with JPEG compression and seeing compression artifacts? Have you somehow resized the resulting image, where interpolation might be blurring some of the sharp white lines? If you're capturing raw data, are you capturing at the exact same size as the output image from the Canny filter?
