6533b86ffe1ef96bd12cd161
RESEARCH PRODUCT
MFNet: Multi-feature convolutional neural network for high-density crowd counting
Songchenchen Gong, Xuecan Yang, El-bay Bourennane

subject: 0209 industrial biotechnology; Human head; Computer science; Population; Pattern recognition; 02 engineering and technology; Convolutional neural network; Image (mathematics); Support vector machine; Task (computing); Range (mathematics); 020901 industrial engineering & automation; Feature (computer vision); 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing; Artificial intelligence; education; business

description:
Crowd counting is closely tied to security, so it is attracting growing attention. At present, the most difficult problems in crowd counting are how to make a model distinguish fine-grained human head features in densely populated regions, where heads overlap, and how to find small-scale local head features in images covering a wide range of population densities. Facing these challenges, we propose a multi-feature convolutional neural network, called MFNet. It aims to produce high-quality density maps for high-density crowd scenes and, at the same time, to estimate the crowd count. For crowd counting, we use multiple sources of information, namely HOG, LBP and Canny features. Combined with a support vector machine (SVM), each source provides not merely a separate count estimate but also additional statistical measures. To effectively extract scale-related features for crowd counting, we integrate them in the MFNet convolutional neural network architecture. Comparative experiments on multiple data sets show that MFNet outperforms other crowd counting methods.
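The abstract describes a hand-crafted feature branch (HOG, LBP, Canny) whose outputs feed an SVM-based count estimator. The sketch below is not the authors' code; it only illustrates that idea with scikit-image and scikit-learn, and the patch size, hyperparameters, and variable names (`train_images`, `train_counts`, `test_image`) are assumptions for illustration.

```python
# Minimal sketch of a hand-crafted feature + SVM count estimator (assumed setup).
import numpy as np
from skimage.feature import hog, local_binary_pattern, canny
from sklearn.svm import SVR

def handcrafted_features(gray):
    """Concatenate HOG, a uniform-LBP histogram, and Canny edge density for one fixed-size grayscale crop."""
    h = hog(gray, orientations=9, pixels_per_cell=(16, 16),
            cells_per_block=(2, 2), feature_vector=True)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    edge_density = canny(gray, sigma=1.5).mean()
    return np.concatenate([h, lbp_hist, [edge_density]])

# Hypothetical training data: same-size grayscale crops with known person counts.
X = np.stack([handcrafted_features(img) for img in train_images])
y = np.array(train_counts)

svm_counter = SVR(kernel="rbf", C=10.0)   # one regressor per feature source is equally possible
svm_counter.fit(X, y)
predicted_count = svm_counter.predict(handcrafted_features(test_image)[None, :])
```

The abstract also says MFNet is a CNN that produces density maps while handling large scale variation of heads. The exact MFNet architecture is not given in this record, so the following is only a generic multi-column sketch in PyTorch: columns with different kernel sizes cover different head scales, and a 1x1 convolution fuses them into a density map whose sum is the count. All layer sizes are illustrative assumptions.

```python
# Generic multi-column density-map network (illustrative, not the published MFNet).
import torch
import torch.nn as nn

def column(channels, k):
    """One column with a fixed receptive-field scale set by kernel size k."""
    return nn.Sequential(
        nn.Conv2d(1, channels, k, padding=k // 2), nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
        nn.Conv2d(channels, channels * 2, k, padding=k // 2), nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
        nn.Conv2d(channels * 2, channels, k, padding=k // 2), nn.ReLU(inplace=True),
    )

class MultiColumnCounter(nn.Module):
    def __init__(self):
        super().__init__()
        # Small/medium/large kernels to cover head-scale variation across the image.
        self.columns = nn.ModuleList([column(8, 3), column(10, 5), column(12, 7)])
        self.fuse = nn.Conv2d(8 + 10 + 12, 1, kernel_size=1)  # 1x1 fusion into one density map

    def forward(self, x):
        feats = torch.cat([c(x) for c in self.columns], dim=1)
        return self.fuse(feats)  # predicted density map; its spatial sum is the crowd count

model = MultiColumnCounter()
density = model(torch.randn(1, 1, 256, 256))   # -> (1, 1, 64, 64) density map
print(density.sum().item())                    # estimated count for the random input
```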
year | journal | country | edition | language
---|---|---|---|---
2020-11-04 | 2020 11th IEEE Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON) | | |