

How can I use Apple's Core Audio C API to create a simple, real-time I/O stream on OS X?

c,osx,audio,core-audio
After spending quite a while traversing the extensive Core Audio docs maze, I'm still unsure of what part of the C API I should be using to create a basic audio sample I/O stream in OS X. When I say "I/O stream" I mean a low-latency stream that is spawned...

iOS how to play midi notes?

ios,core-audio,midi,audiounit,audiotoolbox
I have searched and have already built an OS X app that can play MIDI notes, but when I tried the same in iOS, nothing happened. Here is the core code: AUGraph graph; AudioUnit synthUnit; AUNode synthNode, outNode; NewAUGraph(&graph); AudioComponentDescription cd; cd.componentManufacturer = kAudioUnitManufacturer_Apple; cd.componentType = kAudioUnitType_MusicDevice; cd.componentSubType = kAudioUnitSubType_MIDISynth; AUGraphAddNode(graph, &cd,...

Guitar Tuner in iOS: Goertzel Algorithm doesn't work for 2 of 6 strings

ios,core-audio,frequency,goertzel-algorithm
I'm trying to build a guitar tuner in iOS 8 and I got some code from someone who had already built one: it uses the Goertzel algorithm, which in short compares the magnitudes of fixed frequencies as defined for the strings E-A-D-G-B-E. Here is the routine which is...

Core Audio Swift Equalizer adjusts all bands at once?

ios,swift,core-audio,audiounit,equalizer
I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ: var cd:AudioComponentDescription = AudioComponentDescription(componentType: OSType(kAudioUnitType_Effect),componentSubType: OSType(kAudioUnitSubType_NBandEQ),componentManufacturer: OSType(kAudioUnitManufacturer_Apple),componentFlags: 0,componentFlagsMask: 0) // Add the node to the graph status = AUGraphAddNode(graph, &cd, &MyAppNode) println(status) // Once the graph has been opened get an instance...

How to play PCM data/buffer just using AVAudioPlayer or AVPlayer?

ios,objective-c,ios7,core-audio,audiounit
How can I play PCM data/buffers just using AVAudioPlayer or AVPlayer? I know I can play PCM data/buffers using AudioUnit/AudioQueue by feeding the data into the play callback method, but I don't want to do this. I have searched Google a lot but couldn't find any helpful answer or any...

AVAudioFile in Swift Playground results in “error -54”

swift,core-audio,swift-playground,avaudiofile
I decided to have a play with AVAudioPlayer in a Swift Playground. The following code works fine in a normal Swift project, but returns the following error when running it in a Playground (in the assistant editor): 2015-05-12 00:08:04.374 AVAudioFile[2481:141158] 00:08:04.374 ERROR: AVAudioFile.mm:266: AVAudioFileImpl: error -54 Here is the code:...

Core Audio - Write data to beginning or middle of audio file (somewhere other than the end)

ios,objective-c,core-audio,extaudiofile
So, I have an audio recording iOS app I am working on. In my app I have the need to write audio at any point in the file (in the documents directory) the user chooses, which will overwrite whatever audio is there. Example: I record 3 min of audio, scroll...

iOS record audio and draw waveform like Voice Memos

ios,opengl-es,core-graphics,core-audio,avaudiorecorder
I'm going to ask this at the risk of being too vague or asking too many things in one question, but I'm really just looking for a point in the right direction. In my app I want to record audio, show a waveform while recording, and scroll through the waveform...

Single-value-tuple as last member of struct in swift

swift,core-audio,coremidi
MusicPlayer's API relies on variable length arrays as the last member of a struct to handle passing around data of unknown size. Looking at the generated interface for MusicPlayer, the structs used in this method present their last element in a single value tuple. example: struct MusicEventUserData { var length:...

Precise time of audio queue playback finish

ios,objective-c,core-audio,audioqueue,audioqueueservices
I am using Audio Queues to play back audio files. I need precise timing on the finish of the last buffer. I need to notify a function no later than 150–200 ms after the last buffer is played... Through the callback method I know how many buffers are enqueued. I know the buffer...

Why does the argument ioNumBytes of AudioFileReadPacketData cause a crash?

ios,core-audio,exc-bad-access,audioqueue
I have run into something really strange (at least to me; I am a novice). UInt32 numBytesReadFromFile; OSStatus err = AudioFileReadPacketData( audioFile, // The audio file whose audio packets you want to read. NO, // is cache set? &numBytesReadFromFile, // On output, the number of bytes of audio data that...

VoIP limiting the number of frames in rendercallback

ios,objective-c,audio,core-audio,audiounit
I am currently developing a VoIP application and one of the libraries I am using requires me to send the frames in the input callback. The requirement is that I must send a sample count, which is defined as the number of samples in a frame. This callback will...

Core Audio: Audio Unit to boost signal level

ios,audio,boost,core-audio,avaudiosession
Our VOIP app uses both the Voice Processing IO Unit and the Remote IO Unit (we rebuild the AUGraph depending on which IO unit we require). We've noted that the audio output level is not as loud as some other VOIP apps such as Skype. Rather than manipulating the incoming...

How to use an AU3DMixer with The Amazing Audio Engine?

ios,cocoa-touch,core-audio,audiounit,the-amazing-audio-engine
I am using the (indeed!) Amazing Audio Engine to play some tracks (with an AUFilePlayer, each in a separate AEAudioChannel), which works quite nicely. Now, I would like to add the 3D Mixer Audio Unit kAudioUnitSubType_AU3DMixerEmbedded, but after searching high and low, I can't find any information about how this...

xcode - AUiPodEQ AUGraph

xcode,avfoundation,core-audio,equalizer
I'm developing a music application for iOS using AVAudioPlayer, in which I want to implement an equalizer. I searched the internet for a good solution, and ended up with an AUGraph configuration like this: // multichannel mixer unit AudioComponentDescription mixer_desc; mixer_desc.componentType = kAudioUnitType_Mixer; mixer_desc.componentSubType = kAudioUnitSubType_MultiChannelMixer; mixer_desc.componentManufacturer = kAudioUnitManufacturer_Apple;...

How do I output CAF file size in log?

ios,xcode,swift,core-audio,caf
I know the URL of the blank.caf audio file that I am creating (recording) in my iPhone app. I am concerned about its size and would like to output its size to the log. I could not find a method to do so. I am also interested in finding out the audio...

Read audio file, perform filters (i.e. reverb), and then write audio file without playback on iOS

ios,audio,core-audio
I'm working on an app which has a requirement for running some basic audio filters (such as normalisation and reverb) on a file. The idea is to take an existing audio file, add the filters, and then write the data to a new file. Crucially, this must be done without...

Frequency drift in Core Audio on OS X

osx,core-audio,audiounit
I have a skeleton audio app which uses kAudioUnitSubType_HALOutput to play audio via a AURenderCallback. I'm generating a simple pure tone just to test things out, but the tone changes pitch noticeably from time to time; sometimes drifting up or down, and sometimes changing rapidly. It can be up to...

Core Audio Matrix Mixer gets stuck and abuses memory when setting number of input or output elements

ios,swift,core-audio,audiounit
I managed to build and use simple audio graphs with Core Audio and Swift, but I can't find the right way to use the Matrix Mixer. When I try to set the number of elements, it looks like the program goes into an infinite loop which ends up using lots...

What's the reason of using Circular Buffer in iOS Audio Calling APP?

ios,objective-c,audio,core-audio,audiounit
My question is pretty much self-explanatory. Sorry if it seems too dumb. I am writing an iOS VoIP dialer and have checked some open-source code (an iOS audio calling app). Almost all of those use a circular buffer for storing recorded and received PCM audio data. So I am wondering why...

Swift Core Audio Learning Resources

ios,swift,core-audio
I am trying to learn iOS Core Audio for Swift. I started out with Swift right away, so I have no understanding of Objective-C. I am an engineer with training in sound engineering, so I don't need to learn the basics of "sound" (sample rate, bit depth, etc.), just the way it...

Capture iOS microphone audio in ulaw format

ios,avfoundation,core-audio
I need to capture microphone voice input in real time and stream it upstream via RTSP. The audio format needs to be ulaw. I need to obtain the raw bytes so I can feed them to the Live 555 RTSP library. Given the various pieces of Core Audio and AV Foundation...

OSStatus error -50 (invalid parameters) AudioQueueNewInput recording audio on iOS

ios,objective-c,core-audio
I've been trawling the internet for ages trying to find the cause of this error but I'm stuck. I've been following the Apple Developer documentation for using Audio Services to record audio and I keep getting this error whatever I do. I can record audio fine using AVAudioRecorder into any...

AVAudioSession properties after Initializing AUGraph

ios,core-audio,avaudiosession
To start a call, our VOIP app sets up an AVAudioSession, then builds, initializes and runs an AUGraph. During the call, we allow the user to switch back and forth between a speakerphone mode using code such as: avSession = [AVAudioSession sharedInstance]; AVAudioSessionCategoryOptions categoryOptions = [avSession categoryOptions]; categoryOptions |= AVAudioSessionCategoryOptionDefaultToSpeaker;...

iOS float buffer to audio playback

ios,objective-c,core-audio
I am new to iOS programming and am trying to manipulate some audio characteristics of an audio file, which requires the audio data as float values, which I am able to successfully obtain. I am stuck on how to play back those float values as audio data. I investigated a lot...

How to replace the deprecated AudioUnitSampleType on iOS (Audio Units)?

ios,objective-c,core-audio,audiounit
I've been studying the Audio Unit Hosting Guide for iOS, and then trying to play around with the sample projects provided. However, all of these projects are using the deprecated AudioUnitSampleType (which seems to be a simple typedef). It might be a rookie question, but how do I go about...

AudioFileReadPackets Error -50

osx,compiler-errors,core-audio,audiotoolbox
I'm playing with the Audio API and my first step was to read the content of a file into a buffer. I have the following code, but when AudioFileReadPackets is executed I get return code -50, which I don't understand, since it doesn't appear in the documentation. Can you please...

Does Media Player Framework need audio sessions to be managed?

ios,media-player,core-audio,avaudiosession,mpmusicplayercontroller
I'd like to include a simple media player in my iOS 7+ app, and I've found some posts dealing with Media Player Framework and the MPMusicPlayerController, and I think that can meet my needs. However, I couldn't find the related documentation in Apple's docs, and instead I found that there...