Playing audio and video



For many people, the second most important function of the iPhone (after making phone calls) is to play music and video.  For the iPod touch, it's the primary function.  It therefore makes sense that Apple exposes interfaces for designing iPhone applications that can play audio or video, whether from the user's iPod library or as part of an immersive game.


There are a number of different audio and video frameworks available to you on the iPhone.  These include iPod library access via the Media Player framework, AV Foundation, Audio Units, OpenAL, and HTTP Live Streaming, each of which is described below.



iPod library access


Starting in iPhone OS 3.0, Apple gives you read-only access to the user's iPod library of music and audio books.  This access comes through the high-level Media Player framework, which makes it easy to retrieve music, playlists, or album artwork, and to control the playback of the iPod application.


Media items within the iPod library are represented by instances of MPMediaItem.  An MPMediaItem not only provides the means to play the audio for that item, but also carries all of the item's associated metadata.  To access a particular piece of metadata, you call valueForProperty: on the media item.  The constants that represent the various metadata properties include: MPMediaItemPropertyPersistentID, MPMediaItemPropertyMediaType, MPMediaItemPropertyTitle, MPMediaItemPropertyAlbumTitle, MPMediaItemPropertyArtist, MPMediaItemPropertyAlbumArtist, MPMediaItemPropertyGenre, MPMediaItemPropertyComposer, MPMediaItemPropertyPlaybackDuration, MPMediaItemPropertyAlbumTrackNumber, MPMediaItemPropertyAlbumTrackCount, MPMediaItemPropertyDiscNumber, MPMediaItemPropertyDiscCount, MPMediaItemPropertyArtwork, MPMediaItemPropertyLyrics, and MPMediaItemPropertyIsCompilation.
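As a short sketch of reading metadata this way (the item variable stands for any MPMediaItem you've obtained from the picker or a query):

```objc
// Assumes 'item' is an MPMediaItem obtained elsewhere (picker or query).
MPMediaItem *item = /* an MPMediaItem from the picker or a query */;

// valueForProperty: returns the metadata value for the given property key.
NSString *title = [item valueForProperty:MPMediaItemPropertyTitle];
NSString *artist = [item valueForProperty:MPMediaItemPropertyArtist];
NSNumber *duration = [item valueForProperty:MPMediaItemPropertyPlaybackDuration];

NSLog(@"%@ by %@ (%.0f seconds)", title, artist, [duration doubleValue]);
```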


MPMediaItems come in several types, represented by another set of constants: MPMediaTypeMusic, MPMediaTypePodcast, MPMediaTypeAudioBook, and MPMediaTypeAnyAudio.


A media item can be retrieved in one of two ways: using the system-provided media picker or by performing a manual query against the user's media library.  To bring up the interface for picking media, you use an instance of MPMediaPickerController.  To present only certain media types, specify them via the mediaTypes property.  Likewise, you can allow the selection of multiple items by setting the allowsPickingMultipleItems property to YES.


Once you've presented the MPMediaPickerController, you will need to respond to the choices made in the picker, and to its cancellation, using delegate methods.  The delegate of the media picker must conform to the MPMediaPickerControllerDelegate protocol and implement the following methods:


- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection

- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker
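Putting these together, a minimal sketch of presenting the picker and handling its delegate callbacks might look like this (assuming a view controller that adopts MPMediaPickerControllerDelegate; the presentation code would live in one of its action methods):

```objc
// Present a picker restricted to music, allowing multiple selections.
MPMediaPickerController *picker =
    [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
picker.delegate = self;
picker.allowsPickingMultipleItems = YES;
[self presentModalViewController:picker animated:YES];
[picker release];

// Called when the user picks one or more items.
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker
  didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];
    // Hand mediaItemCollection to a music player, store it, etc.
}

// Called when the user cancels the picker.
- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
    [self dismissModalViewControllerAnimated:YES];
}
```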


The AddMusic sample application provides a good demonstration of this process.


If you don't want to present an interface for picking media items, but would rather handle that yourself, you can use an MPMediaQuery to find specific items in the user's library.  To create an MPMediaQuery, you can use one of the convenience constructors, such as +albumsQuery or +songsQuery, to get a preset query, or initialize an instance and build your own custom query.  When customizing an MPMediaQuery, you can use addFilterPredicate: and removeFilterPredicate: to narrow the results to specific media items, as well as set a groupingType, which determines how the results are grouped into collections.  To execute the query, access its items or collections property, which returns the matching results.
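For example, a sketch of a custom query that finds all songs by a particular artist (the artist name here is purely illustrative):

```objc
// Start from the preset songs query, then narrow it with a predicate.
MPMediaQuery *query = [MPMediaQuery songsQuery];

MPMediaPropertyPredicate *artistPredicate =
    [MPMediaPropertyPredicate predicateWithValue:@"Johann Sebastian Bach"
                                     forProperty:MPMediaItemPropertyArtist];
[query addFilterPredicate:artistPredicate];

// Accessing the items property executes the query.
NSArray *items = [query items];
```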


If you want your application to stay up to date as items change in the user's media library (most often due to synchronization actions), you can use the beginGeneratingLibraryChangeNotifications method on MPMediaLibrary's +defaultMediaLibrary, then listen for the MPMediaLibraryDidChangeNotification.
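A sketch of wiring this up (libraryDidChange: is a hypothetical handler method on your controller):

```objc
MPMediaLibrary *library = [MPMediaLibrary defaultMediaLibrary];
// The library only posts change notifications after this call.
[library beginGeneratingLibraryChangeNotifications];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(libraryDidChange:)
                                             name:MPMediaLibraryDidChangeNotification
                                           object:library];
```

When you no longer need the updates, balance this with endGeneratingLibraryChangeNotifications and remove your observer.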


In addition to media items themselves, you can extract the artwork for a specific media item using the MPMediaItemArtwork class.  To obtain an image from an MPMediaItemArtwork instance, you use imageWithSize:, which returns a UIImage sized to fit the dimensions you specify.  These images are generated on the fly from the album artwork in the library.
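For example, a sketch of pulling a 100 x 100 point image out of a media item's artwork:

```objc
// Assumes 'item' is an MPMediaItem obtained elsewhere (picker or query).
MPMediaItemArtwork *artwork = [item valueForProperty:MPMediaItemPropertyArtwork];
if (artwork != nil) {
    // The UIImage is generated on the fly at the requested size.
    UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(100.0, 100.0)];
    // artworkImage can now be assigned to a UIImageView, etc.
}
```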


You can play music obtained from the iPod library in two ways: completely within your application or by switching the iPod music player to a particular song or playlist.  An in-application music player is great for playing background music, but its playback stops as soon as your application quits.  Controlling the iPod music player lets music keep playing after your application exits, but it disrupts the user's current playlist or song.  To create one of these music players, you use code like the following:


MPMusicPlayerController* appMusicPlayer = [MPMusicPlayerController applicationMusicPlayer];


or


MPMusicPlayerController* iPodMusicPlayer = [MPMusicPlayerController iPodMusicPlayer];


In order to play something through an MPMusicPlayerController, you need to set up a play queue.  You can do this using setQueueWithQuery: if you want to use the results from a media query, or setQueueWithItemCollection: to feed in an MPMediaItemCollection.  Playback can be controlled via methods like play, pause, and stop.
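Putting that together, a sketch of queueing up every song in the library and starting playback:

```objc
MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];

// Use a preset media query as the play queue; a specific
// MPMediaItemCollection from the picker would work just as well
// via setQueueWithItemCollection:.
[player setQueueWithQuery:[MPMediaQuery songsQuery]];
[player play];
```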


Apple recommends that if you've implemented playback controls, you not toggle the play / pause state as soon as the user presses the button, but instead listen for MPMusicPlayerControllerPlaybackStateDidChangeNotification and change the buttons to match the current playback state.  This lets you respond to the user pulling out their headphones or a playlist hitting its end.
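A sketch of observing that notification (playbackStateChanged: is a hypothetical handler on your controller):

```objc
MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playbackStateChanged:)
                                             name:MPMusicPlayerControllerPlaybackStateDidChangeNotification
                                           object:player];
// The player only posts playback notifications after this call.
[player beginGeneratingPlaybackNotifications];
```

In the handler, check the player's playbackState property (for example, MPMusicPlaybackStatePlaying versus MPMusicPlaybackStatePaused) and update your buttons to match.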


Overall, these iPod library access classes make it simple for you to play music and video from the user's iPod library.


AV Foundation


When dealing with audio playback in your application, you'll need to make sure that it plays well with the rest of the system.  For example, if you have a game that uses OpenAL, you might want to allow the user to keep playing iPod music in the background.  Similarly, you might want to make sure that your game's audio is properly muted on removal of headphones or setting the ringer switch to silent.


These audio session settings are handled through the use of AVAudioSession.  It's recommended that you read Apple's documentation in the Audio Session Programming Guide to get a firm grasp on all the things you need to keep in mind when working with audio sessions.
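As a minimal sketch, an application that wants its sound effects to mix with iPod audio and respect the ringer switch might configure its session like this:

```objc
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];

// The Ambient category mixes with iPod audio and is silenced by the
// ringer switch; other categories make different trade-offs.
[session setCategory:AVAudioSessionCategoryAmbient error:&error];
[session setActive:YES error:&error];
```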


Audio Units


Audio units are audio processing plug-ins that let you access properties of an audio stream and perform manipulations of it.  System-supplied audio units include mixer units (such as the 3D Mixer and Multichannel Mixer), the iPod Equalizer, a format converter unit, and the Remote I/O and Voice Processing I/O units.
For more on audio units, refer to the System Audio Unit Access Guide.


OpenAL


OpenAL is an open standard for playing positional (3-D) audio.  It is the audio sibling of OpenGL, which handles 3-D graphics.  As such, it is a C-based API, without the niceties of Cocoa abstractions.  It even shares OpenGL's coordinate space and many of its conventions.  The iPhone's OpenAL implementation is version 1.1, without any audio capture functionality.


OpenAL uses the concept of a listener, which represents the user's point of view in 3-D space and can be pointed in any 3-D direction.  In addition to this are sources, which are points in space that emit audio, and buffers, which are containers for audio to be played.


Before you place audio sources or a listener, you need to initialize an OpenAL context.  To do so, you'd use code like the following:


ALenum error;

ALCcontext *newContext = NULL;

ALCdevice *newDevice = NULL;

newDevice = alcOpenDevice(NULL);

newContext = alcCreateContext(newDevice, 0);

alcMakeContextCurrent(newContext);


This opens the default audio device for the iPhone, associates a context with it, and makes that the current context.


To orient a listener, you use code like the following:


ALfloat listenerOrientation[6] = { 0.0, 0.0, 1.0, 0.0, 1.0, 0.0 };

alListenerfv(AL_ORIENTATION, listenerOrientation);


This will orient the listener facing the positive Z direction, oriented in such a direction that up is in the positive Y direction.


Likewise, a listener can be positioned using code like the following:


ALfloat listenerPosition[3] = { 0.0, 0.0, 0.0 };

alListenerfv(AL_POSITION, listenerPosition);


To set up a sound source, you use code like the following:


alSourcei(source, AL_BUFFER, buffer);

alSourcei(source, AL_LOOPING, AL_TRUE);

alSourcefv(source, AL_POSITION, source_position_xyz);

alSourcef(source, AL_REFERENCE_DISTANCE, 5.0f);

alSourcePlay(source);


This attaches a buffer to the audio source, sets the source to loop, sets its position and reference distance, and starts it playing.


The buffer and source themselves must have been generated beforehand, using code like the following:


alGenBuffers(1, &_buffer);

if((error = alGetError()) != AL_NO_ERROR) {

printf("Error generating buffers: %x\n", error);

exit(1);

}

alGenSources(1, &_source);

if((error = alGetError()) != AL_NO_ERROR) {

printf("Error generating sources: %x\n", error);

exit(1);

}


Just like OpenGL, OpenAL lets you do powerful audio manipulation in 3-D, and it is recommended that you read the OpenAL Programmer's Guide.


HTTP Live Streaming


A new technology in iPhone OS 3.0, HTTP Live Streaming lets you stream video to an iPhone application from a standard web server in such a way that it can dynamically switch between streams at various data rates, as well as broadcast live content in a scalable manner.  
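On the playback side, an HTTP Live Streaming URL can simply be handed to MPMoviePlayerController like any other movie URL (the URL below is hypothetical):

```objc
// Hypothetical .m3u8 playlist served from a standard web server.
NSURL *streamURL =
    [NSURL URLWithString:@"http://example.com/stream/index.m3u8"];

MPMoviePlayerController *moviePlayer =
    [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
[moviePlayer play];
```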


For applications that use video streaming, Apple has this to say:


If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)


If your app uses HTTP Live Streaming over cellular networks, you are required to provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).


For more, see Apple's HTTP Live Streaming Overview.