Cocoa: How do I draw camera frames onto the screen?
What I am trying to do is display the camera feed within an NSView using AVFoundation. I know this can be achieved with AVCaptureVideoPreviewLayer; however, since my long-term plan is frame processing for tracking hand gestures, I would prefer to draw the frames manually. The way I did this was to use AVCaptureVideoDataOutput and implement the -(void)captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
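For context, a minimal sketch of the capture pipeline this implies (the post does not show this part, so the device, queue name, and pixel format are my assumptions):

```objc
// Sketch of an AVCaptureVideoDataOutput pipeline feeding a delegate.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) [session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// Request BGRA so the pixel buffer layout matches the CGBitmapContext
// settings used in the delegate method below.
output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
// Delegate callbacks arrive on this serial queue, NOT on the main thread.
[output setSampleBufferDelegate:self
                          queue:dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:output]) [session addOutput:output];
[session startRunning];
```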
Below is my implementation of the delegate method. Inside it I create a CGImage from the sample buffer and render it onto a CALayer. This does not work: I do not see any video frames rendered on screen. The CALayer (mDrawLayer) is created in awakeFromNib and attached to a custom view in the storyboard. I verified that the layer is created correctly by setting its background color to orange, and that works.
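The layer setup described here would look roughly like the following sketch (the view outlet name mCustomView is an assumption, since this code is not shown in the post):

```objc
- (void)awakeFromNib {
    [super awakeFromNib];
    mDrawLayer = [CALayer layer];
    // Orange background as a sanity check that the layer is actually on screen.
    mDrawLayer.backgroundColor = [[NSColor orangeColor] CGColor];
    // The custom view from the storyboard must be layer-backed for sublayers to render.
    mCustomView.wantsLayer = YES;
    mDrawLayer.frame = mCustomView.bounds;
    [mCustomView.layer addSublayer:mDrawLayer];
}
```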
```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef imgRef = CGBitmapContextCreateImage(newContext);

    mDrawLayer.contents = (id)CFBridgingRelease(imgRef);
    [mDrawLayer display];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```
Obviously I am not doing something correctly. How should I render the camera frames onto the CALayer one by one? I would also like to know whether my approach is correct, and what the standard way of doing this is.
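For what it's worth, one common cause of this exact symptom (an assumption on my part, not confirmed by the post) is that the sample-buffer delegate runs on the capture queue, while CALayer updates must happen on the main thread. A sketch of the adjusted tail of the delegate method:

```objc
// Sketch: hand the finished CGImage to the main thread instead of
// mutating the layer from the capture queue (assumed fix, untested).
CGImageRef imgRef = CGBitmapContextCreateImage(newContext);
dispatch_async(dispatch_get_main_queue(), ^{
    // Assigning .contents is enough; the layer updates implicitly,
    // so the explicit [mDrawLayer display] call can be dropped.
    mDrawLayer.contents = (id)CFBridgingRelease(imgRef);
});
CGContextRelease(newContext);     // also avoids leaking the context per frame
CGColorSpaceRelease(colorSpace);  // ...and the color space
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```

Note also that CVPixelBufferGetBaseAddress only returns a usable pointer for chunky (non-planar) formats, which is another reason to request kCVPixelFormatType_32BGRA in the output's videoSettings.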
Your help is appreciated. Thanks :)
cocoa camera avfoundation