Tuesday, 15 March 2011

opencv - Fixed Number of Keypoints for similar Image detection -

I have a set of more than 1000 images. From every image I extract SURF descriptors. Then I add a query image and want to find the most similar image in the set. For performance and memory reasons I only extract 200 keypoint descriptors per image, and that is more or less the problem. At the moment I filter the matches by doing this:
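One common way to enforce such a cap is to rank the detected keypoints by detector response and keep only the strongest ones before computing descriptors. A minimal sketch, assuming OpenCV's Java KeyPoint type (the class and method names here are illustrative only):

import org.opencv.core.KeyPoint;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class KeypointLimiter {

    // Keeps only the maxCount strongest keypoints, ranked by detector response.
    // Detection and descriptor extraction are assumed to happen elsewhere.
    public static List<KeyPoint> strongest(List<KeyPoint> keypoints, int maxCount) {
        List<KeyPoint> sorted = new ArrayList<>(keypoints);
        // Strongest (highest response) keypoints first
        sorted.sort(Comparator.comparingDouble((KeyPoint k) -> k.response).reversed());
        return new ArrayList<>(sorted.subList(0, Math.min(maxCount, sorted.size())));
    }
}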

Symmetry matching: simple brute-force matching in both directions, i.e. image1 to image2 and image2 to image1. I only keep the matches that exist in both directions (see the sketch after the code below).

List<Matches> match1 = BruteForceMatching.bfMatch(act.interestPoints, query.interestPoints);
List<Matches> match2 = BruteForceMatching.bfMatch(query.interestPoints, act.interestPoints);

List<Matches> finalMatch = FeatureMatchFilter.doSymmetryTest(match1, match2);

float distance = 0;
for (int i = 0; i < finalMatch.size(); i++)
    distance += finalMatch.get(i).distance;

act.pic.distance = distance * (float) query.interestPoints.size() / (float) finalMatch.size();
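For reference, a minimal sketch of what such a symmetry (cross-check) test could look like, assuming a simple Match type with queryIndex, trainIndex and distance fields (these names are placeholders, not the actual classes used above): a match (i -> j) from image1 to image2 is kept only if the reverse matching also maps j back to i.

import java.util.ArrayList;
import java.util.List;

// Hypothetical match type: keypoint index in each image plus descriptor distance.
class Match {
    int queryIndex;   // keypoint index in the "from" image
    int trainIndex;   // keypoint index in the "to" image
    float distance;   // descriptor distance of this match
}

class FeatureMatchFilterSketch {
    // Keep a match (i -> j) from matches12 only if matches21 maps j back to i.
    static List<Match> doSymmetryTest(List<Match> matches12, List<Match> matches21) {
        List<Match> symmetric = new ArrayList<>();
        for (Match m12 : matches12) {
            for (Match m21 : matches21) {
                if (m12.queryIndex == m21.trainIndex && m12.trainIndex == m21.queryIndex) {
                    symmetric.add(m12);
                    break;
                }
            }
        }
        return symmetric;
    }
}

OpenCV's BFMatcher provides essentially the same behaviour through its crossCheck option.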

I know there are more filter methods. As you can see, I try to weight the distances by the number of final matches, but I don't have the feeling I am doing this correctly. Other approaches seem to compare against all of the interest points extracted from an image. Does anyone have an approach for this, or an idea of how to weight the distances?

I know there is no golden solution, but experiences, ideas and other approaches would be helpful.

so "match1" represents directed matches of 1 of database images , "match2" query image, "finalmatch" matches between images , "finalmatch.get(i).distance" kind of mean value between 2 directed distances.

So what you do is calculate the mean of the summed distances and scale it by the number of interest points you have, i.e. score = (sum of match distances / number of final matches) * number of query interest points. The goal, I assume, is to have a measure of how well the images match overall.

I am pretty sure the distance you calculate doesn't reflect the similarity very well. Dividing the sum of distances by the number of matches makes sense and might give you an idea of the similarity when you compare against the other images in the set, but scaling that value by the number of interest points isn't meaningful.

First of all, I suggest getting rid of the scaling. I'm not sure what your brute-force matching does exactly, but in addition to the symmetry test you should discard matches where the ratio between the first and the second candidate is too high (if I remember correctly, Lowe suggests a threshold of 0.8). Then, if it is a rigid scene, I suggest applying some kind of fundamental matrix estimation (8-point algorithm + RANSAC) and filtering the result using epipolar geometry. I'm pretty sure the mean descriptor distance of the "real" matches will give you a good idea of the "similarity" between a database image and the query. A sketch of these steps is shown below.
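A minimal sketch of such a pipeline, assuming OpenCV's Java bindings (exact method overloads differ slightly between OpenCV versions, and the descriptor/keypoint parameters are placeholders): k-NN matching with k = 2, Lowe's ratio test at 0.8, fundamental matrix estimation with RANSAC, and the mean descriptor distance of the surviving inliers as the final score.

import org.opencv.calib3d.Calib3d;
import org.opencv.core.*;
import org.opencv.features2d.DescriptorMatcher;

import java.util.ArrayList;
import java.util.List;

class SimilarityScoreSketch {

    // Returns the mean descriptor distance of the matches that survive the
    // ratio test and the epipolar (RANSAC) filter; lower means more similar.
    static double score(Mat descriptorsQuery, Mat descriptorsDb,
                        KeyPoint[] keypointsQuery, KeyPoint[] keypointsDb) {

        // 1) Brute-force k-NN matching with k = 2, needed for the ratio test.
        DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
        List<MatOfDMatch> knn = new ArrayList<>();
        matcher.knnMatch(descriptorsQuery, descriptorsDb, knn, 2);

        // 2) Lowe's ratio test: keep a match only if it is clearly better
        //    than the second-best candidate (threshold 0.8 as suggested above).
        List<DMatch> good = new ArrayList<>();
        for (MatOfDMatch m : knn) {
            DMatch[] pair = m.toArray();
            if (pair.length == 2 && pair[0].distance < 0.8f * pair[1].distance)
                good.add(pair[0]);
        }
        if (good.size() < 8) return Double.MAX_VALUE;  // not enough points for the 8-point algorithm

        // 3) Fundamental matrix with RANSAC; the mask marks epipolar-consistent inliers.
        Point[] ptsQ = new Point[good.size()];
        Point[] ptsD = new Point[good.size()];
        for (int i = 0; i < good.size(); i++) {
            ptsQ[i] = keypointsQuery[good.get(i).queryIdx].pt;
            ptsD[i] = keypointsDb[good.get(i).trainIdx].pt;
        }
        Mat mask = new Mat();
        Calib3d.findFundamentalMat(new MatOfPoint2f(ptsQ), new MatOfPoint2f(ptsD),
                                   Calib3d.FM_RANSAC, 3.0, 0.99, mask);

        // 4) Mean descriptor distance of the inliers as the similarity measure.
        double sum = 0;
        int inliers = 0;
        for (int i = 0; i < good.size(); i++) {
            if (mask.get(i, 0)[0] != 0) {
                sum += good.get(i).distance;
                inliers++;
            }
        }
        return inliers > 0 ? sum / inliers : Double.MAX_VALUE;
    }
}

Note that the epipolar filtering only makes sense for rigid scenes, as mentioned above; for non-rigid content you would stop after the ratio test and average over those matches instead.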

opencv image-processing computer-vision surf cbir
